Science.gov

Sample records for reload design process

  1. Reload design process at Yankee Atomic Electric Company

    SciTech Connect

    Weader, R.J.

    1986-01-01

    Yankee Atomic Electric Company (YAEC) performs reload design and licensing for its nuclear power plants: Yankee Rowe, Maine Yankee, and Vermont Yankee. Significant savings in labor and computer costs have been achieved in the reload design process by using the SIMULATE nodal code, with inputs from the CASMO assembly burnup code or the LEOPARD pin cell burnup code, in place of the PDQ diffusion theory code for many of the calculations required for the Yankee Rowe and Maine Yankee pressurized water reactors (PWRs). An efficient process has also evolved for the design of reloads for the Vermont Yankee boiling water reactor (BWR). Because of the major differences in the core designs of the three plants, a different reload design process has evolved for each plant.

  2. Modeling and design of a reload PWR core for a 48-month fuel cycle

    SciTech Connect

    McMahon, M.V.; Driscoll, M.J.; Todreas, N.E.

    1997-05-01

    The objective of this research was to use state-of-the-art nuclear and fuel performance packages to evaluate the feasibility and costs of a 48-calendar-month core in existing pressurized water reactor (PWR) designs, considering the full range of practical design and economic considerations. The driving force behind this research is the desire to make nuclear power more economically competitive with fossil fuel options by expanding the scope for achieving higher capacity factors. Using CASMO/SIMULATE, a core design with fuel enriched to 7 w/o U-235 for a single-batch-loaded, 48-month fuel cycle has been developed. This core achieves an ultra-long cycle length without exceeding current fuel burnup limits. The design uses two different types of burnable poisons. Gadolinium, in the form of gadolinium oxide (Gd2O3) mixed with the UO2 of selected pins, is used to hold down initial reactivity and to control flux peaking throughout the life of the core. A zirconium diboride (ZrB2) integral fuel burnable absorber (IFBA) coating on the Gd2O3-UO2 fuel pellets is added to reduce the critical soluble boron concentration in the reactor coolant to within acceptable limits. Fuel performance issues of concern to this design are also outlined, and areas that will require further research are highlighted.

  3. From Reload to ReCourse: Learning from IMS Learning Design Implementations

    ERIC Educational Resources Information Center

    Griffiths, David; Beauvoir, Phillip; Liber, Oleg; Barrett-Baxendale, Mark

    2009-01-01

    The use of the Web to deliver open, distance, and flexible learning has opened up the potential for social interaction and adaptive learning, but the usability, expressivity, and interoperability of the available tools leave much to be desired. This article explores these issues as they relate to teachers and learning designers through the case of…

  4. Whorf Reloaded: Language Effects on Nonverbal Number Processing in First Grade--A Trilingual Study

    ERIC Educational Resources Information Center

    Pixner, S.; Moeller, K.; Hermanova, V.; Nuerk, H. -C.; Kaufmann, L.

    2011-01-01

    The unit-decade compatibility effect is interpreted to reflect processes of place value integration in two-digit number magnitude comparisons. The current study aimed at elucidating the influence of language properties on the compatibility effect of Arabic two-digit numbers in Austrian, Italian, and Czech first graders. The number word systems of…

  5. Optimal reload strategies for identify-and-destroy missions

    NASA Astrophysics Data System (ADS)

    Hyland, John C.; Smith, Cheryl M.

    2004-09-01

    In this problem an identification vehicle must re-acquire a fixed set of suspected targets and determine whether each suspected target is a mine or a false alarm. If a target is determined to be a mine, the identification vehicle must neutralize it, either by delivering one of a limited number of on-board bombs or by assigning the neutralization task to one of a limited number of single-shot suicide vehicles. The identification vehicle has the option to reload. The single-shot suicide vehicles, however, cannot be replenished. We have developed an optimal path planning and reload strategy for this identify-and-destroy mission that takes into account the probabilities that suspected targets are mines, the costs to move between targets, the costs to return to and from the reload point, and the cost to reload. The mission is modeled as a discrete multi-dimensional Markov process. At each target position the vehicle decides, based on the known costs, the probabilities, the number of bombs on board (r), and the number of remaining single-shot vehicles (s), whether to move directly on to the next target or to reload before continuing, and whether to destroy any mine with an on-board bomb or a single-shot suicide vehicle. The approach recursively calculates the minimum expected overall cost conditioned on all possible values of r and s. The recursion is similar to dynamic programming in that it starts at the last suspected target location and works its way backwards to the starting point. The approach also uses a suboptimal traveling salesman strategy to search over candidate deployment locations to find the best initial deployment point where the reloads will take place.
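
    The backward recursion over remaining bombs (r) and single-shot vehicles (s) can be illustrated with a short sketch. This is a hypothetical, simplified implementation (fixed per-leg travel cost, a penalty for leaving a mine un-neutralized, made-up probabilities and costs); it is not the authors' code and omits the traveling-salesman search over deployment points.

```python
from functools import lru_cache

# Hypothetical inputs (not the paper's data): per-target mine probabilities,
# fixed travel and reload-detour costs, weapon expenditure costs, and a
# penalty for leaving a confirmed mine un-neutralized.
P_MINE = [0.8, 0.3, 0.6, 0.9]   # P(suspected target i is a mine)
MOVE_COST = 1.0                 # cost to move on to the next target
RELOAD_TRIP_COST = 4.0          # cost to detour to the reload point and back
BOMB_COST = 0.5                 # cost of expending one on-board bomb
SUICIDE_COST = 2.0              # cost of expending one single-shot vehicle
MISS_PENALTY = 50.0             # cost of leaving a mine in place
MAX_BOMBS = 2                   # bombs carried after a reload

@lru_cache(maxsize=None)
def expected_cost(i, r, s):
    """Minimum expected cost from target i onward, with r bombs and s
    single-shot vehicles remaining (backward recursion over targets)."""
    if i == len(P_MINE):
        return 0.0
    p = P_MINE[i]

    def visit(r_now, s_now, extra):
        # Cheapest way to deal with target i given the current inventory,
        # then continue to target i + 1.
        if_mine = [MISS_PENALTY + expected_cost(i + 1, r_now, s_now)]
        if r_now > 0:
            if_mine.append(BOMB_COST + expected_cost(i + 1, r_now - 1, s_now))
        if s_now > 0:
            if_mine.append(SUICIDE_COST + expected_cost(i + 1, r_now, s_now - 1))
        if_false_alarm = expected_cost(i + 1, r_now, s_now)
        return extra + MOVE_COST + p * min(if_mine) + (1 - p) * if_false_alarm

    go_direct = visit(r, s, 0.0)
    go_reload = visit(MAX_BOMBS, s, RELOAD_TRIP_COST)  # suicide vehicles are never replenished
    return min(go_direct, go_reload)

print(expected_cost(0, MAX_BOMBS, 1))
```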

  6. NASA reload program

    NASA Technical Reports Server (NTRS)

    Byington, Marshall

    1993-01-01

    Atlantic Research Corporation (ARC) contracted with NASA to manufacture and deliver thirteen small-scale Solid Rocket Motors (SRMs). These motors, containing five distinct propellant formulations, will be used for plume-induced radiation studies. The information contained herein summarizes and documents the program accomplishments and results. Several modifications were made to the scope of work during the course of the program. The effort was on hold from late 1991 through August 1992 while propellant formulation changes were developed. Modifications to the baseline program were completed in late August, and Modification No. 6 was received by ARC on September 14, 1992. The modifications include changes to the propellant formulation and the nozzle design. The required motor deliveries were completed in late December 1992. However, ARC agreed to perform an additional mix and cast effort at no cost to NASA, and another motor was delivered in March 1993.

  7. The Heliogyro Reloaded

    NASA Technical Reports Server (NTRS)

    Wilkie, William K.; Warren, Jerry E.; Thompson, M. W.; Lisman, P. D.; Walkemeyer, P. E.; Guerrant, D. V.; Lawrence, D. A.

    2011-01-01

    The heliogyro is a high-performance, spinning solar sail architecture that uses long (on the order of kilometers) reflective membrane strips to produce thrust from solar radiation pressure. The heliogyro's membrane blades spin about a central hub and are stiffened by centrifugal forces only, making the design exceedingly lightweight. Blades are also stowed and deployed from rolls, eliminating the deployment and packaging problems associated with handling the extremely large, delicate membrane sheets used in most traditional square-rigged or spinning-disk solar sail designs. The heliogyro solar sail concept was first advanced in the 1960s by MacNeal. A 15 km diameter version was later extensively studied in the 1970s by JPL for an ambitious Comet Halley rendezvous mission, but it was ultimately not selected because a risk-reduction flight demonstration would have been required. Demonstrating system-level feasibility of a large, spinning heliogyro solar sail on the ground is impossible; however, recent advances in microsatellite bus technologies, coupled with the successful flight demonstration of reflectance control technologies on the JAXA IKAROS solar sail, now make an affordable, small-scale heliogyro technology flight demonstration potentially feasible. In this paper, we will present an overview of the history of the heliogyro solar sail concept, with particular attention paid to the MIT 200-meter-diameter heliogyro study of 1989, followed by a description of our updated, low-cost heliogyro flight demonstration concept. Our preliminary heliogyro concept (HELIOS) should be capable of demonstrating an order-of-magnitude characteristic-acceleration improvement over existing solar sail demonstrators (HELIOS target: 0.5 to 1.0 mm/s^2 at 1.0 AU), placing heliogyro technology in the range required to enable a variety of science and human-exploration support missions.
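
    The stated performance target can be put in rough perspective with the standard ideal-sail relation a_c = 2*P_sun/sigma, where P_sun is the solar radiation pressure on an absorbing surface at 1 AU and sigma is the spacecraft areal density (total mass divided by sail area). The short sketch below assumes a perfectly reflecting flat sail at normal incidence; it is only an order-of-magnitude illustration and is not taken from the paper.

```python
# Back-of-the-envelope sketch: areal density an ideal (perfectly reflecting)
# solar sail would need to reach the HELIOS characteristic-acceleration
# targets at 1 AU.  Real sails have reflectivity and geometry losses, so
# these numbers are optimistic bounds, not the paper's design values.
SOLAR_CONSTANT = 1361.0          # W/m^2 at 1 AU
C = 2.998e8                      # speed of light, m/s
P_SUN = SOLAR_CONSTANT / C       # ~4.54e-6 N/m^2 on an absorbing surface

for a_c in (0.5e-3, 1.0e-3):     # target characteristic accelerations, m/s^2
    sigma = 2.0 * P_SUN / a_c    # required total areal density, kg/m^2
    print(f"a_c = {a_c * 1e3:.1f} mm/s^2  ->  sigma <= {sigma * 1e3:.1f} g/m^2")
```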

  8. Future integrated design process

    NASA Technical Reports Server (NTRS)

    Meyer, D. D.

    1980-01-01

    The design process is one of the sources used to produce requirements for a computer system to integrate and manage product design data, program management information, and the technical computation and engineering data management activities of the aerospace design process. Design activities were grouped chronologically and explored for activity type, activity interface, data quantity, and data flow. The work was based on analysis of the design process of several typical aerospace products, including both conventional and supersonic airplanes and a hydrofoil design. Activities examined included research, preliminary design, detail design, manufacturing interface, product verification, and product support. The design process was then described as it would operate in a future IPAD environment.

  9. Insulin-like growth factor-1 receptor in mature osteoblasts is required for periosteal bone formation induced by reloading

    NASA Astrophysics Data System (ADS)

    Kubota, Takuo; Elalieh, Hashem Z.; Saless, Neema; Fong, Chak; Wang, Yongmei; Babey, Muriel; Cheng, Zhiqiang; Bikle, Daniel D.

    2013-11-01

    Skeletal loading and unloading have a pronounced impact on bone remodeling, a process also regulated by insulin-like growth factor-1 (IGF-1) signaling. Skeletal unloading leads to resistance to the anabolic effect of IGF-1, while reloading after unloading restores responsiveness to IGF-1. However, the importance of IGF-1 signaling in the skeletal response to mechanical loading has not been directly tested. In this study, we assessed the skeletal response of osteoblast-specific Igf-1 receptor deficient (Igf-1r-/-) mice to unloading and reloading. The mice were hindlimb unloaded for 14 days and then reloaded for 16 days. Igf-1r-/- mice displayed smaller cortical bone and diminished periosteal and endosteal bone formation at baseline. Periosteal and endosteal bone formation decreased with unloading in Igf-1r+/+ mice. However, the recovery of periosteal bone formation with reloading was completely inhibited in Igf-1r-/- mice, although reloading-induced endosteal bone formation was not hampered. These changes in bone formation resulted in the abolishment of the expected increase in total cross-sectional area with reloading in Igf-1r-/- mice compared with the control mice. These results suggest that the Igf-1r in mature osteoblasts has a critical role in periosteal bone formation in the skeletal response to mechanical loading.

  10. Reloading functionally ameliorates disuse-induced muscle atrophy by reversing mitochondrial dysfunction, and similar benefits are gained by administering a combination of mitochondrial nutrients.

    PubMed

    Liu, Jing; Peng, Yunhua; Feng, Zhihui; Shi, Wen; Qu, Lina; Li, Yinghui; Liu, Jiankang; Long, Jiangang

    2014-04-01

    We previously found that mitochondrial dysfunction occurs in disuse-induced muscle atrophy. However, the mitochondrial remodeling that occurs during reloading, an effective approach for rescuing unloading-induced atrophy, remains to be investigated. In this study, using a rat model of 3-week hindlimb unloading plus 7-day reloading, we found that reloading protected mitochondria against dysfunction, including mitochondrial loss, abnormal mitochondrial morphology, inhibited biogenesis, and activation of mitochondria-associated apoptotic signaling. Interestingly, a combination of nutrients, including α-lipoic acid, acetyl-L-carnitine, hydroxytyrosol, and CoQ10, which we designed to target mitochondria, was able to efficiently rescue muscle atrophy via a reloading-like action. It is suggested that reloading ameliorates skeletal muscle atrophy through the activation of mitochondrial biogenesis and the amelioration of oxidative stress. Nutrient administration acted similarly in unloaded rats. Here, the study of mitochondrial remodeling in rats during unloading and reloading provides a more detailed picture of the pathology of muscle atrophy. PMID:24418157

  11. Signal Processing Suite Design

    NASA Technical Reports Server (NTRS)

    Sahr, John D.; Mir, Hasan; Morabito, Andrew; Grossman, Matthew

    2003-01-01

    Our role in this project was to participate in the design of the signal processing suite to analyze plasma density measurements on board a small constellation (3 or 4 satellites) in Low Earth Orbit. As we are new to spacecraft experiments, one of the challenges was simply to gain an understanding of the quantity of data that would flow from the satellites, and possibly to interact with the design teams in generating optimal sampling patterns. For example, as the fleet of satellites was intended to fly through the same volume of space (displaced slightly in time and space), the bulk plasma structure should be common among the spacecraft. Therefore, an optimal, limited-bandwidth data downlink would take advantage of this commonality. Also, motivated by techniques in ionospheric radar, we hoped to investigate the possibility of employing aperiodic sampling in order to gain access to a wider spatial spectrum without suffering aliasing in k-space.
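
    The aliasing point can be illustrated with a toy sketch: under uniform sampling, two wavenumbers separated by a multiple of 2*pi/dt are indistinguishable, whereas aperiodic (randomized) sample positions break that ambiguity. The signal, sample counts, and wavenumbers below are hypothetical and unrelated to the mission's actual processing chain.

```python
import numpy as np

# Toy illustration (not the mission's actual processing): with uniform
# sampling at interval dt, wavenumbers separated by a multiple of 2*pi/dt
# give identical samples (aliasing).  Aperiodic (randomized) sample
# positions break that ambiguity, so a direct nonuniform DFT can tell the
# true wavenumber from its alias.
rng = np.random.default_rng(0)
dt = 1.0
k_true = 4.0                         # above the uniform Nyquist limit pi/dt
k_alias = k_true - 2.0 * np.pi / dt  # indistinguishable alias under uniform sampling

def ndft_power(k, positions, samples):
    """Power of a direct (nonuniform) DFT evaluated at wavenumber k."""
    return float(np.abs(np.sum(samples * np.exp(-1j * k * positions))) ** 2)

x_uniform = np.arange(64) * dt
x_random = np.sort(rng.uniform(0.0, 64.0 * dt, 64))

for label, x in (("uniform", x_uniform), ("aperiodic", x_random)):
    signal = np.cos(k_true * x)
    ratio = ndft_power(k_true, x, signal) / ndft_power(k_alias, x, signal)
    print(f"{label:9s}: power(k_true) / power(k_alias) = {ratio:.1f}")
```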

  12. Bassoon speeds vesicle reloading at a central excitatory synapse.

    PubMed

    Hallermann, Stefan; Fejtova, Anna; Schmidt, Hartmut; Weyhersmüller, Annika; Silver, R Angus; Gundelfinger, Eckart D; Eilers, Jens

    2010-11-18

    Sustained rate-coded signals encode many types of sensory modalities. Some sensory synapses possess specialized ribbon structures, which tether vesicles, to enable high-frequency signaling. However, central synapses lack these structures, yet some can maintain signaling over a wide bandwidth. To analyze the underlying molecular mechanisms, we investigated the function of the active zone core component Bassoon in cerebellar mossy fiber to granule cell synapses. We show that short-term synaptic depression is enhanced in Bassoon knockout mice during sustained high-frequency trains but basal synaptic transmission is unaffected. Fluctuation and quantal analysis as well as quantification with constrained short-term plasticity models revealed that the vesicle reloading rate was halved in the absence of Bassoon. Thus, our data show that the cytomatrix protein Bassoon speeds the reloading of vesicles to release sites at a central excitatory synapse.

  13. DESIGNING PROCESSES FOR ENVIRONMENTAL PROBLEMS

    EPA Science Inventory

    Designing for the environment requires consideration of environmental impacts. The Generalized WAR Algorithm is the methodology that allows the user to evaluate the potential environmental impact of the design of a chemical process. In this methodology, chemicals are assigned val...

  14. Teaching Process Design through Integrated Process Synthesis

    ERIC Educational Resources Information Center

    Metzger, Matthew J.; Glasser, Benjamin J.; Patel, Bilal; Hildebrandt, Diane; Glasser, David

    2012-01-01

    The design course is an integral part of chemical engineering education. A novel approach to the design course was recently introduced at the University of the Witwatersrand, Johannesburg, South Africa. The course aimed to introduce students to systematic tools and techniques for setting and evaluating performance targets for processes, as well as…

  15. Reengineering the Project Design Process

    NASA Technical Reports Server (NTRS)

    Casani, E.; Metzger, R.

    1994-01-01

    In response to NASA's goal of working faster, better and cheaper, JPL has developed extensive plans to minimize cost, maximize customer and employee satisfaction, and implement small- and moderate-size missions. These plans include improved management structures and processes, enhanced technical design processes, the incorporation of new technology, and the development of more economical space- and ground-system designs. The Laboratory's new Flight Projects Implementation Office has been chartered to oversee these innovations and the reengineering of JPL's project design process, including establishment of the Project Design Center and the Flight System Testbed. Reengineering at JPL implies a cultural change whereby the character of its design process will change from sequential to concurrent and from hierarchical to parallel. The Project Design Center will support missions offering high science return, design to cost, demonstrations of new technology, and rapid development. Its computer-supported environment will foster high-fidelity project life-cycle development and cost estimating.

  16. Fully Integrating the Design Process

    SciTech Connect

    T.A. Bjornard; R.S. Bean

    2008-03-01

    The basic approach to designing nuclear facilities in the United States does not currently reflect the routine consideration of proliferation resistance and international safeguards. The fully integrated design process is an approach for bringing consideration of international safeguards and proliferation resistance, together with state safeguards and security, fully into the design process from the very beginning, while integrating them sensibly and synergistically with the other project functions. In view of the recently established GNEP principles agreed to by the United States and at least eighteen other countries, this paper explores such an integrated approach, and its potential to help fulfill the new internationally driven design requirements with improved efficiencies and reduced costs.

  17. The motion after-effect reloaded

    PubMed Central

    Mather, George; Pavan, Andrea; Campana, Gianluca; Casco, Clara

    2011-01-01

    The motion after-effect is a robust illusion of visual motion resulting from exposure to a moving pattern. There is a widely accepted explanation of it in terms of changes in the response of cortical direction-selective neurons. Research has distinguished several variants of the effect. Converging recent evidence from different experimental techniques (psychophysics, single-unit recording, brain imaging, transcranial magnetic stimulation, and evoked potentials) reveals that adaptation is not confined to one or even two cortical areas, but involves up to five different sites, reflecting the multiple levels of processing involved in visual motion analysis. A tentative motion processing framework is described, based on motion after-effect research. Recent ideas on the function of adaptation see it as a form of gain control that maximises the efficiency of information transmission. PMID:18951829

  18. Myocardial Reloading after Extracorporeal Membrane Oxygenation Alters Substrate Metabolism While Promoting Protein Synthesis

    SciTech Connect

    Kajimoto, Masaki; Priddy, Colleen M.; Ledee, Dolena; Xu, Chun; Isern, Nancy G.; Olson, Aaron; Des Rosiers, Christine; Portman, Michael A.

    2013-08-19

    Extracorporeal membrane oxygenation (ECMO) unloads the heart, providing a bridge to recovery in children after myocardial stunning. Mortality after ECMO remains high. Cardiac substrate and amino acid requirements upon weaning are unknown and may impact recovery. We assessed the hypothesis that ventricular reloading modulates both substrate entry into the citric acid cycle (CAC) and myocardial protein synthesis. Fourteen immature piglets (7.8-15.6 kg) were separated into 2 groups based on ventricular loading status: 8-hour ECMO (UNLOAD) and post-wean from ECMO (RELOAD). We infused [2-13C]-pyruvate as an oxidative substrate and [13C6]-L-leucine as a tracer of amino acid oxidation and protein synthesis into the coronary artery. RELOAD showed marked elevations in myocardial oxygen consumption above baseline and UNLOAD. Pyruvate uptake was markedly increased, though RELOAD decreased the pyruvate contribution to oxidative CAC metabolism. RELOAD also increased the absolute concentrations of all CAC intermediates, while maintaining or increasing 13C-molar percent enrichment, and significantly increased cardiac fractional protein synthesis rates by >70% over UNLOAD. Conclusions: RELOAD produced high energy metabolic requirements and rebound protein synthesis. Relative pyruvate decarboxylation decreased with RELOAD while promoting anaplerotic pyruvate carboxylation and amino acid incorporation into protein rather than into the CAC for oxidation. These perturbations may serve as therapeutic targets to improve contractile function after ECMO.

  19. Reloading Continuous GPS in Northwest Mexico

    NASA Astrophysics Data System (ADS)

    Gonzalez-Garcia, J. J.; Suarez-Vidal, F.; Gonzalez-Ortega, J. A.

    2007-05-01

    For more than 10 years we have tried to follow in the footsteps of the Southern California Integrated GPS Network (SCIGN) and the Plate Boundary Observatory (PBO) in the USA, which puts us in a position to contribute to the development of a modern GPS network in Mexico. During 1998 and 2001, three stations were deployed in northwest Mexico in concert with the development of SCIGN: SPMX in north-central Baja California state at the UNAM National Astronomical Observatory in the Sierra San Pedro Martir; CORX on Isla Coronados Sur, offshore from San Diego, Ca./Tijuana, Mexico; and GUAX on Guadalupe Island, 150 miles offshore of the Baja California peninsula, which provides a unique site on the Pacific plate within the North America/Pacific boundary zone in Las Californias. The former IGS station at CICESE, Ensenada (CICE), installed in 1995, was replaced by CIC1 in 1999. In 2004 and 2005, with partial support from SCIGN and UNAVCO to the University of Arizona, a volunteer team from UNAVCO, Caltech, the U.S. Geological Survey, Universidad de la Sierra at Moctezuma, Sonora, and CICESE built two new shallow-braced GPS sites in northwest Mexico. The first site, USMX, is located in east-central Sonora, and the second, YESX, is located high in the Sierra Madre Occidental at Yecora, near the southern border of Sonora and Chihuahua. All data are openly available at SOPAC and/or UNAVCO. The existing information has been valuable for resolving the "total" plate motion between the Pacific plate (GUAX) and the North America plate (USMX and YESX) in the north-central Gulf of California. Since last year we have had the capability to process GPS data using GAMIT/GLOBK, and after gaining some practice with survey-mode data processing we can become a GPS processing center in Mexico. Currently only 2 sites are operational: CIC1 and USMX. With new energy we are ready to contribute to the establishment of a modern GPS network in Mexico for science, hazard monitoring, and infrastructure.

  20. Reengineering the project design process

    NASA Astrophysics Data System (ADS)

    Kane Casani, E.; Metzger, Robert M.

    1995-01-01

    In response to the National Aeronautics and Space Administration's goal of working faster, better, and cheaper, the Jet Propulsion Laboratory (JPL) has developed extensive plans to minimize cost, maximize customer and employee satisfaction, and implement small- and moderate-size missions. These plans include improved management structures and processes, enhanced technical design processes, the incorporation of new technology, and the development of more economical space- and ground-system designs. The Laboratory's new Flight Projects Implementation Development Office has been chartered to oversee these innovations and the reengineering of JPL's project design process, including establishment of the Project Design Center (PDC) and the Flight System Testbed (FST). Reengineering at JPL implies a cultural change whereby the character of the Laboratory's design process will change from sequential to concurrent and from hierarchical to parallel. The Project Design Center will support missions offering high science return, design to cost, demonstrations of new technology, and rapid development. Its computer-supported environment will foster high-fidelity project life-cycle development and more accurate cost estimating. These improvements signal JPL's commitment to meeting the challenges of space exploration in the next century.

  1. The Snark was a Boojum - reloaded

    PubMed Central

    2015-01-01

    In this article, we refer to an original opinion paper written by Prof. Frank Beach in 1950 ("The Snark was a Boojum"). In his manuscript, Beach explicitly criticised the field of comparative psychology because of the disparity between the original understanding of comparativeness and its practical, overly specialised implementation. Specialisation encompassed both experimental species (rats accounted for 70% of all subjects) and test paradigms (dominated by conditioning/learning experiments). Herein, we attempt to evaluate the extent to which these considerations apply to current behavioural neuroscience. Such evaluation is particularly interesting in the context of "translational research" that has recently gained growing attention. As a community, we believe that preclinical findings are intended to inform clinical practice at the level of therapies and knowledge advancements. Yet, limited reproducibility of experimental results and failures to translate preclinical research into clinical trials indicate that these expectations are not entirely fulfilled. Theoretical considerations suggest that, before concluding that a given phenomenon is of relevance to our species, it should be observed in more than a single experimental model (be it an animal strain or species) and tested in more than a single standardized test battery. Yet, current approaches appear limited in terms of variability and overspecialised in terms of operative procedures. Specifically, as in 1950, rodents (mice instead of rats) still constitute the vast majority of animal species investigated. Additionally, the scientific community strives to homogenise experimental test strategies, thereby not only limiting the generalizability of the findings, but also working against the design of innovative approaches. Finally, we discuss the importance of evolutionary-adaptive considerations within the field of laboratory research. Specifically, resting upon empirical evidence indicating that

  2. Process simulation and design '94

    SciTech Connect

    Not Available

    1994-06-01

    This first-of-a-kind report describes today's process simulation and design technology for specific applications. It includes process names, diagrams, applications, descriptions, objectives, economics, installations, licensors, and a complete list of process submissions. Processes include: alkylation, aromatics extraction, catalytic reforming, cogeneration, dehydration, delayed coking, distillation, energy integration, catalytic cracking, gas sweetening, glycol/methanol injection, hydrocracking, NGL recovery and stabilization, solvent dewaxing, visbreaking. Equipment simulations include: amine plant, ammonia plant, heat exchangers, cooling water network, crude preheat train, crude unit, ethylene furnace, nitrogen rejection unit, refinery, sulfur plant, and VCM furnace. By-product processes include: olefins, polyethylene terephthalate, and styrene.

  3. The Process Design Courses at Pennsylvania: Impact of Process Simulators.

    ERIC Educational Resources Information Center

    Seider, Warren D.

    1984-01-01

    Describes the use and impact of process design simulators in process design courses. Discusses topics covered, texts used, computer design simulations, and how they are integrated into the process survey course as well as in plant design projects. (JM)

  4. A knowledge-based system for optimization of fuel reload configurations

    SciTech Connect

    Galperin, A.; Kimhi, S.; Segev, M.

    1989-05-01

    The authors discuss a knowledge-based production system developed for generating optimal fuel reload configurations. The system was based on a heuristic search method and implemented in the Common Lisp programming language. The knowledge base embodied the reactor physics, reactor operations, and a general approach to fuel management strategy. The database included a description of the physical system involved, i.e., the core geometry and fuel storage. The fifth cycle of the Three Mile Island Unit 1 pressurized water reactor was chosen as a test case. Application of the system to the test case revealed a self-learning process by which a relatively large number of near-optimal configurations were discovered. Several selected solutions were subjected to detailed analysis and demonstrated excellent performance. To summarize, the applicability of the proposed heuristic search method in the domain of nuclear fuel management was proved unequivocally.

  5. DESIGNING ENVIRONMENTALLY FRIENDLY CHEMICAL PROCESSES

    EPA Science Inventory

    The design of a chemical process involves many aspects: from profitability, flexibility and reliability to safety to the environment. While each of these is important, in this work, the focus will be on profitability and the environment. Key to the study of these aspects is the ...

  6. ESS Cryogenic System Process Design

    NASA Astrophysics Data System (ADS)

    Arnold, P.; Hees, W.; Jurns, J.; Su, X. T.; Wang, X. L.; Weisend, J. G., II

    2015-12-01

    The European Spallation Source (ESS) is a neutron-scattering facility funded and supported in collaboration with 17 European countries in Lund, Sweden. Cryogenic cooling at ESS is vital particularly for the linear accelerator, the hydrogen target moderators, a test stand for cryomodules, the neutron instruments and their sample environments. The paper will focus on specific process design criteria, design decisions and their motivations for the helium cryoplants and auxiliary equipment. Key issues for all plants and their process concepts are energy efficiency, reliability, smooth turn-down behaviour and flexibility. The accelerator cryoplant (ACCP) and the target moderator cryoplant (TMCP) in particular need to be prepared for a range of refrigeration capacities due to the intrinsic uncertainties regarding heat load definitions. Furthermore the paper addresses questions regarding process arrangement, 2 K cooling methodology, LN2 precooling, helium storage, helium purification and heat recovery.

  7. Levitation force relaxation under reloading in a HTS Maglev system

    NASA Astrophysics Data System (ADS)

    He, Qingyong; Wang, Jiasu; Wang, Suyu; Wang, Jiansi; Dong, Hao; Wang, Yuxin; Shao, Senhao

    2009-02-01

    The loading capacity of the high-temperature superconducting (HTS) Maglev vehicle is an important parameter in practical applications. It is closely related to the levitation force of the HTS bulk. Many papers have reported that the levitation force shows a relaxation characteristic. Because different loads produce different levitation gaps and different applied magnetic fields, the levitation force relaxation differs from one load to another. For a cylindrical YBCO bulk levitated over a permanent magnet guideway, the relationship between levitation force relaxation and reloading is investigated experimentally in this paper. The decrement, the decrement rate, and the relaxation rate of the levitation force are calculated, respectively. This work might be helpful for studying the loading capacity of the HTS Maglev vehicle.

  8. Improvement of characteristic statistic algorithm and its application on equilibrium cycle reloading optimization

    SciTech Connect

    Hu, Y.; Liu, Z.; Shi, X.; Wang, B.

    2006-07-01

    A brief introduction to the characteristic statistic algorithm (CSA), a new global optimization algorithm for the PWR in-core fuel management optimization problem, is given in this paper. CSA is modified by adopting a back-propagation neural network and fast local adjustment. The modified CSA is then applied to PWR equilibrium cycle reloading optimization, and the corresponding optimization code, CSA-DYW, is developed. CSA-DYW is used to optimize the 18-month equilibrium reloading cycle of Unit 1 of the Daya Bay nuclear plant. The results show that CSA-DYW has high efficiency and good global performance on PWR equilibrium cycle reloading optimization. (authors)
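
    The abstract does not give the internal details of CSA-DYW, so the following is only a generic sketch of the kind of surrogate-assisted global search the modification suggests: candidate loading patterns are screened by a cheap back-propagation-style regressor, and only the most promising ones are evaluated by a (here fictitious) core simulator and locally adjusted. All names, the pattern encoding, and the toy objective are hypothetical, not the published algorithm.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
N_ASSEMBLIES = 20  # hypothetical: a loading pattern encoded as a permutation

def core_simulator(pattern):
    # Stand-in for an expensive core physics calculation; returns a
    # fictitious objective value to be minimized.
    x = np.asarray(pattern, dtype=float)
    return float(np.sum((x - np.arange(N_ASSEMBLIES)) ** 2))

def random_pattern():
    return rng.permutation(N_ASSEMBLIES)

def local_adjust(pattern):
    # "Fast local adjustment": swap two randomly chosen assemblies.
    p = pattern.copy()
    i, j = rng.choice(N_ASSEMBLIES, size=2, replace=False)
    p[i], p[j] = p[j], p[i]
    return p

# Seed the surrogate with a handful of simulator evaluations.
patterns = [random_pattern() for _ in range(30)]
scores = [core_simulator(p) for p in patterns]
surrogate = MLPRegressor(hidden_layer_sizes=(32,), max_iter=3000, random_state=0)

best = min(zip(scores, patterns), key=lambda t: t[0])
for generation in range(20):
    surrogate.fit(np.array(patterns), np.array(scores))
    # Screen many cheap candidates with the surrogate; simulate only the best few.
    candidates = ([local_adjust(best[1]) for _ in range(200)]
                  + [random_pattern() for _ in range(200)])
    predicted = surrogate.predict(np.array(candidates))
    for idx in np.argsort(predicted)[:5]:
        true_score = core_simulator(candidates[idx])
        patterns.append(candidates[idx])
        scores.append(true_score)
        if true_score < best[0]:
            best = (true_score, candidates[idx])

print("best objective found:", best[0])
```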

  9. Automation of Design Engineering Processes

    NASA Technical Reports Server (NTRS)

    Torrey, Glenn; Sawasky, Gerald; Courey, Karim

    2004-01-01

    A method, and a computer program that helps to implement the method, have been developed to automate and systematize the retention and retrieval of all the written records generated during the process of designing a complex engineering system. It cannot be emphasized strongly enough that "all the written records" as used here is meant to be taken literally: it signifies not only final drawings and final engineering calculations but also such ancillary documents as minutes of meetings, memoranda, requests for design changes, approval and review documents, and reports of tests. One important purpose served by the method is to make the records readily available to all involved users via their computer workstations from one computer archive, while eliminating the need for voluminous paper files stored in different places. Another important purpose served by the method is to facilitate the work of engineers who are charged with sustaining the system and were not involved in the original design decisions. The method helps the sustaining engineers retrieve information that enables them to retrace the reasoning that led to the original design decisions, thereby helping them to understand the system better and to make informed engineering choices pertaining to maintenance and/or modifications of the system. The software used to implement the method is written in Microsoft Access. All of the documents pertaining to the design of a given system are stored in one relational database in such a manner that they can be related to each other via a single tracking number.
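
    The core data structure, every design record keyed to a single tracking number in one relational database, can be sketched in a few lines. The actual system was built in Microsoft Access; the table and column names below are hypothetical and only illustrate the idea.

```python
import sqlite3

# Illustrative sketch only: the system described was built in Microsoft
# Access; this reproduces the core idea (every design record keyed to a
# single tracking number) with hypothetical table and column names.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE design_item (
    tracking_no TEXT PRIMARY KEY,      -- single key relating all records
    title       TEXT NOT NULL,
    subsystem   TEXT
);
CREATE TABLE document (
    doc_id      INTEGER PRIMARY KEY,
    tracking_no TEXT NOT NULL REFERENCES design_item(tracking_no),
    doc_type    TEXT NOT NULL,         -- drawing, memo, meeting minutes, test report, ...
    filed_on    TEXT NOT NULL,
    summary     TEXT
);
""")
conn.execute("INSERT INTO design_item VALUES ('TRK-0042', 'Reaction wheel bracket', 'ACS')")
conn.executemany(
    "INSERT INTO document (tracking_no, doc_type, filed_on, summary) VALUES (?, ?, ?, ?)",
    [
        ("TRK-0042", "meeting minutes", "2004-02-03", "Load case review"),
        ("TRK-0042", "design change request", "2004-02-17", "Increase flange thickness"),
        ("TRK-0042", "test report", "2004-03-10", "Vibration qualification"),
    ],
)
# A sustaining engineer can retrace the decision history from one key.
for row in conn.execute(
    "SELECT doc_type, filed_on, summary FROM document WHERE tracking_no = ? ORDER BY filed_on",
    ("TRK-0042",),
):
    print(row)
```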

  10. Reloading partly recovers bone mineral density and mechanical properties in hind limb unloaded rats

    NASA Astrophysics Data System (ADS)

    Zhao, Fan; Li, Dijie; Arfat, Yasir; Chen, Zhihao; Liu, Zonglin; Lin, Yu; Ding, Chong; Sun, Yulong; Hu, Lifang; Shang, Peng; Qian, Airong

    2014-12-01

    Skeletal unloading results in decreased bone formation and bone mass. Bone mass lost during long-term space flight is impossible to recover fully, so it is necessary to develop effective countermeasures to prevent spaceflight-induced bone loss. Hindlimb unloading (HLU) simulates the effects of weightlessness and is used extensively to examine the response of musculoskeletal systems to certain aspects of space flight. The purpose of this study was to investigate the effects of 4 weeks of HLU in rats, and of subsequent reloading, on the bone mineral density (BMD) and mechanical properties of load-bearing bones. After 4 weeks of HLU, the rats were reloaded for 1, 2, or 3 weeks, and the BMD of the femur, tibia, and lumbar spine was assessed by dual-energy X-ray absorptiometry (DXA) every week. The mechanical properties of the femur were determined by a three-point bending test. Dry bone and bone ash of the femur were obtained by the oven-drying method and weighed. Serum alkaline phosphatase (ALP) and serum calcium were examined by ELISA and atomic absorption spectrometry. The results showed that 4 weeks of HLU significantly decreased the body weight of the rats, and reloading for 1, 2, or 3 weeks did not recover the weight loss induced by HLU. However, after 2 weeks of reloading, the BMD of the femur and tibia of HLU rats partly recovered (+10.4%, +2.3%). After 3 weeks of reloading, the reductions in BMD, energy absorption, bone mass, and mechanical properties induced by HLU recovered to some extent. The changes in serum ALP and serum calcium induced by HLU also recovered after reloading. Our results indicate that a short period of reloading cannot completely recover bone after a period of unloading; interventions such as mechanical vibration or pharmaceuticals are therefore necessary to help bone recovery.

  11. Muscle regeneration during hindlimb unloading results in a reduction in muscle size after reloading

    NASA Technical Reports Server (NTRS)

    Mozdziak, P. E.; Pulvermacher, P. M.; Schultz, E.

    2001-01-01

    The hindlimb-unloading model was used to study the ability of muscle injured in a weightless environment to recover after reloading. Satellite cell mitotic activity and DNA unit size were determined in injured and intact soleus muscles from hindlimb-unloaded and age-matched weight-bearing rats at the conclusion of 28 days of hindlimb unloading, 2 wk after reloading, and 9 wk after reloading. The body weights of hindlimb-unloaded rats were significantly (P < 0.05) less than those of weight-bearing rats at the conclusion of hindlimb unloading, but they were the same (P > 0.05) as those of weight-bearing rats 2 and 9 wk after reloading. The soleus muscle weight, soleus muscle weight-to-body weight ratio, myofiber diameter, number of nuclei per millimeter, and DNA unit size were significantly (P < 0.05) smaller for the injured soleus muscles from hindlimb-unloaded rats than for the soleus muscles from weight-bearing rats at each recovery time. Satellite cell mitotic activity was significantly (P < 0.05) higher in the injured soleus muscles from hindlimb-unloaded rats than from weight-bearing rats 2 wk after reloading, but it was the same (P > 0.05) as in the injured soleus muscles from weight-bearing rats 9 wk after reloading. The injured soleus muscles from hindlimb-unloaded rats failed to achieve weight-bearing muscle size 9 wk after reloading, because incomplete compensation for the decrease in myonuclear accretion and DNA unit size expansion occurred during the unloading period.

  12. Design Thinking in Elementary Students' Collaborative Lamp Designing Process

    ERIC Educational Resources Information Center

    Kangas, Kaiju; Seitamaa-Hakkarainen, Pirita; Hakkarainen, Kai

    2013-01-01

    Design and Technology education is potentially a rich environment for successful learning, if the management of the whole design process is emphasised, and students' design thinking is promoted. The aim of the present study was to unfold the collaborative design process of one team of elementary students, in order to understand their multimodal…

  13. ESS Accelerator Cryoplant Process Design

    NASA Astrophysics Data System (ADS)

    Wang, X. L.; Arnold, P.; Hees, W.; Hildenbeutel, J.; Weisend, J. G., II

    2015-12-01

    The European Spallation Source (ESS) is a neutron-scattering facility being built with extensive international collaboration in Lund, Sweden. The ESS accelerator will deliver protons with 5 MW of power to the target at 2.0 GeV, with a nominal current of 62.5 mA. The superconducting part of the accelerator is about 300 meters long and contains 43 cryomodules. The ESS accelerator cryoplant (ACCP) will provide the cooling for the cryomodules and the cryogenic distribution system that delivers the helium to the cryomodules. The ACCP will cover three cryogenic circuits: Bath cooling for the cavities at 2 K, the thermal shields at around 40 K and the power couplers thermalisation with 4.5 K forced helium cooling. The open competitive bid for the ACCP took place in 2014 with Linde Kryotechnik AG being selected as the vendor. This paper summarizes the progress in the ACCP development and engineering. Current status including final cooling requirements, preliminary process design, system configuration, machine concept and layout, main parameters and features, solution for the acceptance tests, exergy analysis and efficiency is presented.

  14. Optimal design of solidification processes

    NASA Technical Reports Server (NTRS)

    Dantzig, Jonathan A.; Tortorelli, Daniel A.

    1991-01-01

    An optimal design algorithm is presented for the analysis of general solidification processes, and is demonstrated for the growth of GaAs crystals in a Bridgman furnace. The system is optimal in the sense that the prespecified temperature distribution in the solidifying materials is obtained to maximize product quality. The optimization uses traditional numerical programming techniques which require the evaluation of cost and constraint functions and their sensitivities. The finite element method is incorporated to analyze the crystal solidification problem, evaluate the cost and constraint functions, and compute the sensitivities. These techniques are demonstrated in the crystal growth application by determining an optimal furnace wall temperature distribution to obtain the desired temperature profile in the crystal, and hence to maximize the crystal's quality. Several numerical optimization algorithms are studied to determine the proper convergence criteria, effective 1-D search strategies, appropriate forms of the cost and constraint functions, etc. In particular, we incorporate the conjugate gradient and quasi-Newton methods for unconstrained problems. The efficiency and effectiveness of each algorithm is presented in the example problem.
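
    A toy example of the kind of unconstrained minimization named above (conjugate gradient and quasi-Newton/BFGS) is sketched below: a handful of furnace-wall temperature set points are chosen so that a crude stand-in forward model matches a prescribed temperature profile. The forward model, target profile, and parameter values are hypothetical and bear no relation to the paper's finite element formulation or sensitivity analysis.

```python
import numpy as np
from scipy.optimize import minimize

# Toy illustration (not the paper's FEM formulation): choose a few furnace
# wall temperature control points so that a crude 1-D steady temperature
# profile matches a prescribed target profile in the crystal.
z = np.linspace(0.0, 1.0, 50)                 # axial position along the crystal
target = 900.0 + 300.0 * z                    # desired temperature profile (hypothetical)

def profile(T_wall):
    # Stand-in forward model: linear interpolation of wall set points,
    # smoothed to mimic conductive averaging.
    ctrl_z = np.linspace(0.0, 1.0, T_wall.size)
    T = np.interp(z, ctrl_z, T_wall)
    kernel = np.ones(5) / 5.0
    return np.convolve(T, kernel, mode="same")

def cost(T_wall):
    # Least-squares mismatch with the prescribed profile.
    return float(np.sum((profile(T_wall) - target) ** 2))

T0 = np.full(6, 1000.0)                       # initial guess for wall set points
for method in ("CG", "BFGS"):                 # conjugate gradient and quasi-Newton
    result = minimize(cost, T0, method=method)
    print(method, "final cost:", round(result.fun, 3))
```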

  15. Application of a heuristic search method for generation of fuel reload configurations

    SciTech Connect

    Galperin, A.; Nissan, E.

    1988-08-01

    A computerized heuristic search method for the generation and optimization of fuel reload configurations is proposed and investigated. The heuristic knowledge is expressed modularly in the form of "IF-THEN" production rules. The method was implemented in a program coded in the Franz LISP programming language and executed under the UNIX operating system. A test problem was formulated, based on a typical light water reactor reload problem with a few simplifications assumed, in order to allow formulation of the reload strategy into a relatively small number of rules. A computer run of the problem was performed on a VAX-780 machine, and a set of 312 solutions was generated in approximately 20 min of execution time. Testing of a few arbitrarily chosen configurations demonstrated reasonably good performance for the computer-generated solutions. A computerized generator of reload configurations may be used for the fast generation or modification of reload patterns and as a tool for the formulation, tuning, and testing of the heuristic knowledge rules used by an "expert" fuel manager.
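
    The original rule base was written in Franz LISP and is not reproduced in the abstract; the sketch below only illustrates the general shape of such a system, with heuristic placement preferences written as IF-THEN rules (a condition plus a weight) that score candidate placements in a generate-and-test loop. The rules, position attributes, and weights are made up for illustration.

```python
# Hypothetical illustration of heuristic IF-THEN rules for reload pattern
# generation (the original system was written in Franz LISP; these example
# rules are not taken from the paper's knowledge base).

FRESH, ONCE_BURNED, TWICE_BURNED = "fresh", "once-burned", "twice-burned"

# A core position is described by simple attributes the rules can test.
POSITIONS = [
    {"name": "center",    "zone": "interior", "neighbors_fresh": 0},
    {"name": "mid-core",  "zone": "interior", "neighbors_fresh": 1},
    {"name": "periphery", "zone": "edge",     "neighbors_fresh": 0},
]

RULES = [
    # IF the position is on the core edge THEN prefer a fresh assembly
    # (reduces power peaking in the interior).
    (lambda pos, fuel: pos["zone"] == "edge" and fuel == FRESH, +2),
    # IF a fresh assembly would sit next to another fresh assembly
    # THEN penalize the placement.
    (lambda pos, fuel: fuel == FRESH and pos["neighbors_fresh"] > 0, -3),
    # IF the position is in the interior THEN prefer burned fuel.
    (lambda pos, fuel: pos["zone"] == "interior" and fuel != FRESH, +1),
]

def score(pos, fuel):
    """Sum the weights of all rules whose IF-part fires."""
    return sum(weight for condition, weight in RULES if condition(pos, fuel))

# Generate-and-test: rank candidate (position, fuel) placements by rule score.
candidates = [(pos["name"], fuel, score(pos, fuel))
              for pos in POSITIONS
              for fuel in (FRESH, ONCE_BURNED, TWICE_BURNED)]
for name, fuel, s in sorted(candidates, key=lambda c: -c[2]):
    print(f"{name:10s} {fuel:12s} score {s:+d}")
```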

  16. Unloading and reloading working memory: attending to one item frees capacity.

    PubMed

    Souza, Alessandra S; Rerko, Laura; Oberauer, Klaus

    2014-06-01

    During the retention interval of a working memory task, presenting a retro-cue directs attention to 1 of the items in working memory. Testing the cued item leads to faster and more accurate responses. We contrasted 5 explanations of this benefit: (a) removal of noncued items, (b) strengthening of the cued item, (c) protection from probe interference, (d) protection from degradation, and (e) prioritization during the decision process. Experiment 1 showed that retro-cues reduced the set size effect in a visual recognition task, and did so increasingly with more time available to use the retro-cue. This finding is predicted only by Hypotheses 1 and 2. Hypotheses 3 through 5 were ruled out as explanations of the retro-cue benefit in this experiment. In Experiments 2 and 3, participants encoded 2 sequentially presented memory sets. In half of the trials, 1 item from the first set was retro-cued during the interset interval. Retro-cues improved memory for the second set. This reloading benefit is predicted only by the removal hypothesis: Irrelevant contents are removed from working memory, freeing capacity to encode new contents. Experiment 3 also yielded evidence that strengthening of the cued item might contribute to the retro-cue effect. PMID:24730737

  17. Hafnium transistor process design for neural interfacing.

    PubMed

    Parent, David W; Basham, Eric J

    2009-01-01

    A design methodology is presented that uses 1-D process simulations of Metal Insulator Semiconductor (MIS) structures to design the threshold voltage of hafnium oxide based transistors used for neural recording. The methodology comprises 1-D analytical equations for threshold voltage specification and doping profiles, together with 1-D MIS Technology Computer-Aided Design (TCAD) to design a process that implements a specific threshold voltage, which minimized simulation time. The process was then verified with a 2-D process/electrical TCAD simulation. Hafnium oxide (HfO2) films were grown and characterized for dielectric constant and fixed oxide charge at various annealing temperatures, two important design variables in threshold voltage design.
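
    As a worked illustration of the kind of 1-D analytical threshold-voltage relation such a methodology rests on, the sketch below evaluates the standard long-channel MIS expression V_T = V_FB + 2*phi_F + Q_dep/C_ox. All parameter values (doping, film thickness, dielectric constant, fixed charge, work-function difference) are hypothetical placeholders, not the paper's measured HfO2 data or calibrated equations.

```python
import math

# Standard long-channel MIS threshold-voltage relation, evaluated with
# illustrative placeholder values (NOT the paper's measured HfO2 film data
# or calibrated design equations).
q       = 1.602e-19      # electron charge, C
phi_T   = 0.0259         # thermal voltage at 300 K, V
eps0    = 8.854e-12      # vacuum permittivity, F/m
eps_si  = 11.7 * eps0    # silicon permittivity
n_i     = 1.0e16         # Si intrinsic carrier density, m^-3 (1e10 cm^-3)

N_A     = 1.0e23         # assumed p-type substrate doping, m^-3 (1e17 cm^-3)
t_ox    = 10e-9          # assumed HfO2 thickness, m
k_hfo2  = 20.0           # assumed HfO2 relative dielectric constant
Q_f     = 5.0e-4         # assumed fixed oxide charge, C/m^2
phi_ms  = -0.6           # assumed gate-semiconductor work-function difference, V

C_ox   = k_hfo2 * eps0 / t_ox                              # gate capacitance per unit area
phi_F  = phi_T * math.log(N_A / n_i)                       # bulk Fermi potential
Q_dep  = math.sqrt(2.0 * eps_si * q * N_A * 2.0 * phi_F)   # depletion charge per unit area
V_FB   = phi_ms - Q_f / C_ox                               # flat-band voltage
V_T    = V_FB + 2.0 * phi_F + Q_dep / C_ox                 # threshold voltage

print(f"phi_F = {phi_F:.3f} V, C_ox = {C_ox * 1e3:.1f} mF/m^2, V_T = {V_T:.2f} V")
```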

  18. Managing Analysis Models in the Design Process

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2006-01-01

    Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.

  19. Design Expert's Participation in Elementary Students' Collaborative Design Process

    ERIC Educational Resources Information Center

    Kangas, Kaiju; Seitamaa-Hakkarainen, Pirita; Hakkarainen, Kai

    2013-01-01

    The main goal of the present study was to provide insights into how disciplinary expertise might be infused into Design and Technology classrooms and how authentic processes based on professional design practices might be constructed. We describe elementary students' collaborative lamp designing process, where the leadership was provided by a…

  20. 9 CFR 325.18 - Diverting of shipments, breaking of seals, and reloading by carrier in emergency; reporting to...

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... seals, and reloading by carrier in emergency; reporting to Regional Director. 325.18 Section 325.18... CERTIFICATION TRANSPORTATION § 325.18 Diverting of shipments, breaking of seals, and reloading by carrier in...) In case of wreck or similar extraordinary emergency, the Department seals on a railroad car or...

  1. 9 CFR 325.18 - Diverting of shipments, breaking of seals, and reloading by carrier in emergency; reporting to...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... seals, and reloading by carrier in emergency; reporting to Regional Director. 325.18 Section 325.18... CERTIFICATION TRANSPORTATION § 325.18 Diverting of shipments, breaking of seals, and reloading by carrier in...) In case of wreck or similar extraordinary emergency, the Department seals on a railroad car or...

  2. 9 CFR 325.18 - Diverting of shipments, breaking of seals, and reloading by carrier in emergency; reporting to...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... seals, and reloading by carrier in emergency; reporting to Regional Director. 325.18 Section 325.18... CERTIFICATION TRANSPORTATION § 325.18 Diverting of shipments, breaking of seals, and reloading by carrier in...) In case of wreck or similar extraordinary emergency, the Department seals on a railroad car or...

  3. Photonic IC design software and process design kits

    NASA Astrophysics Data System (ADS)

    Korthorst, Twan; Stoffer, Remco; Bakker, Arjen

    2015-04-01

    This review discusses photonic IC design software tools, examines existing design flows for photonics design and how these fit different design styles, and describes the collaboration and standardization activities within the silicon photonics group of Si2 and by members of the PDAFlow Foundation to improve design flows. Moreover, it addresses the lowering of access barriers to the technology by providing qualified process design kits (PDKs) and improved integration of photonic integrated circuit simulations, physical simulations, mask layout, and verification.

  4. Myocardial Reloading After Extracorporeal Membrane Oxygenation Alters Substrate Metabolism While Promoting Protein Synthesis

    PubMed Central

    Kajimoto, Masaki; O'Kelly Priddy, Colleen M.; Ledee, Dolena R.; Xu, Chun; Isern, Nancy; Olson, Aaron K.; Rosiers, Christine Des; Portman, Michael A.

    2013-01-01

    Background Extracorporeal membrane oxygenation (ECMO) unloads the heart, providing a bridge to recovery in children after myocardial stunning. ECMO also induces stress which can adversely affect the ability to reload or wean the heart from the circuit. Metabolic impairments induced by altered loading and/or stress conditions may impact weaning. However, cardiac substrate and amino acid requirements upon weaning are unknown. We assessed the hypothesis that ventricular reloading with ECMO modulates both substrate entry into the citric acid cycle (CAC) and myocardial protein synthesis. Methods and Results Sixteen immature piglets (7.8 to 15.6 kg) were separated into 2 groups based on ventricular loading status: 8‐hour ECMO (UNLOAD) and postwean from ECMO (RELOAD). We infused into the coronary artery [2‐13C]‐pyruvate as an oxidative substrate and [13C6]‐L‐leucine as an indicator for amino acid oxidation and protein synthesis. Upon RELOAD, each functional parameter, which were decreased substantially by ECMO, recovered to near‐baseline level with the exclusion of minimum dP/dt. Accordingly, myocardial oxygen consumption was also increased, indicating that overall mitochondrial metabolism was reestablished. At the metabolic level, when compared to UNLOAD, RELOAD altered the contribution of various substrates/pathways to tissue pyruvate formation, favoring exogenous pyruvate versus glycolysis, and acetyl‐CoA formation, shifting away from pyruvate decarboxylation to endogenous substrate, presumably fatty acids. Furthermore, there was also a significant increase of tissue concentrations for all CAC intermediates (≈80%), suggesting enhanced anaplerosis, and of fractional protein synthesis rates (>70%). Conclusions RELOAD alters both cytosolic and mitochondrial energy substrate metabolism, while favoring leucine incorporation into protein synthesis rather than oxidation in the CAC. Improved understanding of factors governing these metabolic perturbations may

  5. Intracellular Ca2+ transients in mouse soleus muscle after hindlimb unloading and reloading

    NASA Technical Reports Server (NTRS)

    Ingalls, C. P.; Warren, G. L.; Armstrong, R. B.; Hamilton, S. L. (Principal Investigator)

    1999-01-01

    The objective of this study was to determine whether altered intracellular Ca(2+) handling contributes to the specific force loss in the soleus muscle after unloading and/or subsequent reloading of mouse hindlimbs. Three groups of female ICR mice were studied: 1) unloaded mice (n = 11) that were hindlimb suspended for 14 days, 2) reloaded mice (n = 10) that were returned to their cages for 1 day after 14 days of hindlimb suspension, and 3) control mice (n = 10) that had normal cage activity. Maximum isometric tetanic force (P(o)) was determined in the soleus muscle from the left hindlimb, and resting free cytosolic Ca(2+) concentration ([Ca(2+)](i)), tetanic [Ca(2+)](i), and 4-chloro-m-cresol-induced [Ca(2+)](i) were measured in the contralateral soleus muscle by confocal laser scanning microscopy. Unloading and reloading increased resting [Ca(2+)](i) above control by 36% and 24%, respectively. Although unloading reduced P(o) and specific force by 58% and 24%, respectively, compared with control mice, there was no difference in tetanic [Ca(2+)](i). P(o), specific force, and tetanic [Ca(2+)](i) were reduced by 58%, 23%, and 23%, respectively, in the reloaded animals compared with control mice; however, tetanic [Ca(2+)](i) was not different between unloaded and reloaded mice. These data indicate that although hindlimb suspension results in disturbed intracellular Ca(2+) homeostasis, changes in tetanic [Ca(2+)](i) do not contribute to force deficits. Compared with unloading, 24 h of physiological reloading in the mouse do not result in further changes in maximal strength or tetanic [Ca(2+)](i).

  6. Reengineering the JPL Spacecraft Design Process

    NASA Technical Reports Server (NTRS)

    Briggs, C.

    1995-01-01

    This presentation describes the factors that have emerged in the evolved process of reengineering the unmanned spacecraft design process at the Jet Propulsion Laboratory in Pasadena, California. Topics discussed include: New facilities, new design factors, new system-level tools, complex performance objectives, changing behaviors, design integration, leadership styles, and optimization.

  7. Instructional Design Processes and Traditional Colleges

    ERIC Educational Resources Information Center

    Vasser, Nichole

    2010-01-01

    Traditional colleges who have implemented distance education programs would benefit from using instructional design processes to develop their courses. Instructional design processes provide the framework for designing and delivering quality online learning programs in a highly-competitive educational market. Traditional college leaders play a…

  8. Graphic Design in Libraries: A Conceptual Process

    ERIC Educational Resources Information Center

    Ruiz, Miguel

    2014-01-01

    Providing successful library services requires efficient and effective communication with users; therefore, it is important that content creators who develop visual materials understand key components of design and, specifically, develop a holistic graphic design process. Graphic design, as a form of visual communication, is the process of…

  9. NASA System Engineering Design Process

    NASA Technical Reports Server (NTRS)

    Roman, Jose

    2011-01-01

    This slide presentation reviews NASA's use of systems engineering for the complete life cycle of a project. Systems engineering is a methodical, disciplined approach for the design, realization, technical management, operations, and retirement of a system. Each phase of a NASA project is terminated with a Key decision point (KDP), which is supported by major reviews.

  10. Information-Processing Models and Curriculum Design

    ERIC Educational Resources Information Center

    Calfee, Robert C.

    1970-01-01

    "This paper consists of three sections--(a) the relation of theoretical analyses of learning to curriculum design, (b) the role of information-processing models in analyses of learning processes, and (c) selected examples of the application of information-processing models to curriculum design problems." (Author)

  11. The Architectural and Interior Design Planning Process.

    ERIC Educational Resources Information Center

    Cohen, Elaine

    1994-01-01

    Explains the planning process in designing effective library facilities and discusses library building requirements that result from electronic information technologies. Highlights include historical structures; Americans with Disabilities Act; resource allocation; electrical power; interior spaces; lighting; design development; the roles of…

  12. Biomimetic design processes in architecture: morphogenetic and evolutionary computational design.

    PubMed

    Menges, Achim

    2012-03-01

    Design computation has a profound impact on architectural design methods. This paper explains how computational design enables the development of biomimetic design processes specific to architecture, and how they need to be significantly different from established biomimetic processes in engineering disciplines. The paper first explains the fundamental difference between computer-aided and computational design in architecture, as the understanding of this distinction is of critical importance for the research presented. Thereafter, the conceptual relation and possible transfer of principles from natural morphogenesis to design computation are introduced, and the related developments of generative, feature-based, constraint-based, process-based and feedback-based computational design methods are presented. This morphogenetic design research is then related to exploratory evolutionary computation, followed by the presentation of two case studies focusing on the exemplary development of spatial envelope morphologies and urban block morphologies.
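
    For readers unfamiliar with the term, "exploratory evolutionary computation" can be illustrated with a minimal generational loop over a parametric morphology. The genome, fitness function, and operators below are hypothetical toys; the paper's morphogenetic system couples far richer form-generation and performance-feedback processes.

```python
import random

random.seed(3)

# Minimal generic evolutionary loop over a parametric envelope morphology
# (hypothetical genome and fitness; purely illustrative).
GENOME_LEN = 8          # e.g. control values describing an envelope surface

def fitness(genome):
    # Toy objective: reward a large "enclosed area" proxy while penalizing
    # abrupt variation between neighboring control values (fabricability).
    area = sum(genome)
    roughness = sum(abs(a - b) for a, b in zip(genome, genome[1:]))
    return area - 2.0 * roughness

def mutate(genome, sigma=0.1):
    return [min(1.0, max(0.0, g + random.gauss(0.0, sigma))) for g in genome]

def crossover(a, b):
    cut = random.randrange(1, GENOME_LEN)
    return a[:cut] + b[cut:]

population = [[random.random() for _ in range(GENOME_LEN)] for _ in range(40)]
for generation in range(50):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]                      # truncation selection
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(30)]
    population = parents + children

best = max(population, key=fitness)
print("best fitness:", round(fitness(best), 3))
```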

  13. Practicing universal design to actual hand tool design process.

    PubMed

    Lin, Kai-Chieh; Wu, Chih-Fu

    2015-09-01

    UD evaluation principles are difficult to implement in product design. This study proposes a methodology for implementing UD in the design process through user participation. The original UD principles and user experience are used to develop the evaluation items, and differences between product types are taken into account. Factor analysis and Quantification Theory Type I were used to eliminate evaluation items judged inappropriate and to examine the relationship between the evaluation items and product design factors. Product design specifications were established for verification. The results showed that converting user evaluations into crucial design verification factors, via a generalized evaluation scale based on product attributes, and applying the resulting design factors in product design can improve users' UD evaluations. The design process of this study is expected to contribute to user-centered UD application. PMID:25959313

  15. Transonic empirical configuration design process

    NASA Technical Reports Server (NTRS)

    Whitcomb, R. T.

    1983-01-01

    This lecture describes some of the experimental research pertaining to transonic configuration development conducted by the Transonic Aerodynamics Branch of the NASA Langley Research Center. Discussions are presented of the following: use of fluorescent oil films for the study of surface boundary layer flows; the severe effect of wind tunnel wall interference on the measured configuration drag rise near the speed of sound as determined by a comparison between wind tunnel and free air results; the development of a near sonic transport configuration incorporating a supercritical wing and an indented fuselage, designed on the basis of the area rule with a modification to account for the presence of local supersonic flow above the wing; a device for improving the transonic pitch-up of swept wings with very little added drag at the cruise condition; a means for reducing the large transonic aerodynamic interference between the wing, fuselage, nacelle and pylon for a fuselage-mounted nacelle having the inlet above the wing; and methods for reducing the transonic interference between flows over a winglet and the wing.

  16. Process Design Manual for Nitrogen Control.

    ERIC Educational Resources Information Center

    Parker, Denny S.; And Others

    This manual presents theoretical and process design criteria for the implementation of nitrogen control technology in municipal wastewater treatment facilities. Design concepts are emphasized through examination of data from full-scale and pilot installations. Design data are included on biological nitrification and denitrification, breakpoint…

  17. Reinventing The Design Process: Teams and Models

    NASA Technical Reports Server (NTRS)

    Wall, Stephen D.

    1999-01-01

    The future of space mission designing will be dramatically different from the past. Formerly, performance-driven paradigms emphasized data return with cost and schedule being secondary issues. Now and in the future, costs are capped and schedules fixed; these two variables must be treated as independent in the design process. Accordingly, JPL has redesigned its design process. At the conceptual level, design times have been reduced by properly defining the required design depth, improving the linkages between tools, and managing team dynamics. In implementation-phase design, system requirements will be held in crosscutting models, linked to subsystem design tools through a central database that captures the design and supplies needed configuration management and control. Mission goals will then be captured in timelining software that drives the models, testing their capability to execute the goals. Metrics are used to measure and control both processes and to ensure that design parameters converge through the design process within schedule constraints. This methodology manages margins controlled by acceptable risk levels. Thus, teams can evolve risk tolerance (and cost) as they would any engineering parameter. This new approach allows more design freedom for a longer time, which tends to encourage revolutionary and unexpected improvements in design.

  18. Hydrocarbon Processing's process design and optimization '96

    SciTech Connect

    1996-06-01

    This paper compiles information on hydrocarbon processes, describing the application, objective, economics, commercial installations, and licensor. Processes include: alkylation, ammonia, catalytic reformer, crude fractionator, crude unit, vacuum unit, dehydration, delayed coker, distillation, ethylene furnace, FCCU, polymerization, gas sweetening, hydrocracking, hydrogen, hydrotreating (naphtha, distillate, and resid desulfurization), natural gas processing, olefins, polyethylene terephthalate, refinery, styrene, sulfur recovery, and VCM furnace.

  19. Application of computer visualization to core reactor physics analysis and design

    SciTech Connect

    Zhu, Z.; Kropaczek, D.J.; Turinsky, P.J.

    1993-01-01

    Advances in graphical user interface (GUI) technology have been accompanied by increasingly sophisticated software applications. This is particularly evident in the area of nuclear fuel management, where the almost complete automation of the reload design process should be realized in the near future. Beyond simple automation, however, the exploitation of visualization capabilities permitted with current technology will allow the reload design engineer to make more informed decisions through broader access to the global picture of parameter interaction. Herein lies the key concept of visualization being incorporated into the System for Integrated Nuclear Fuel Management (SINFUL-MAN) under development at North Carolina State University: the comprehension of multi-dimensionality made possible through the use of such GUI tools as solid rendering, transparency, and cutting planes.

  20. WORKSHOP ON ENVIRONMENTALLY CONSCIOUS CHEMICAL PROCESS DESIGN

    EPA Science Inventory

    To encourage the consideration of environmental issues during chemical process design, the USEPA has developed techniques and software tools to evaluate the relative environmental impact of a chemical process. These techniques and tools aid in the risk management process by focus...

  1. Engineering design: A cognitive process approach

    NASA Astrophysics Data System (ADS)

    Strimel, Greg Joseph

    The intent of this dissertation was to identify the cognitive processes used by advanced pre-engineering students to solve complex engineering design problems. Students in technology and engineering education classrooms are often taught to use an ideal engineering design process that has been generated mostly by educators and curriculum developers. However, the review of literature showed that it is unclear as to how advanced pre-engineering students cognitively navigate solving a complex and multifaceted problem from beginning to end. Additionally, it was unclear how a student thinks and acts throughout their design process and how this affects the viability of their solution. Therefore, Research Objective 1 was to identify the fundamental cognitive processes students use to design, construct, and evaluate operational solutions to engineering design problems. Research Objective 2 was to determine identifiers within student cognitive processes for monitoring aptitude to successfully design, construct, and evaluate technological solutions. Lastly, Research Objective 3 was to create a conceptual technological and engineering problem-solving model integrating student cognitive processes for the improved development of problem-solving abilities. The methodology of this study included multiple forms of data collection. The participants were first given a survey to determine their prior experience with engineering and to provide a description of the subjects being studied. The participants were then presented an engineering design challenge to solve individually. While they completed the challenge, the participants verbalized their thoughts using an established "think aloud" method. These verbalizations were captured along with participant observational recordings using point-of-view camera technology. Additionally, the participant design journals, design artifacts, solution effectiveness data, and teacher evaluations were collected for analysis to help achieve the

  2. Chemical Process Design: An Integrated Teaching Approach.

    ERIC Educational Resources Information Center

    Debelak, Kenneth A.; Roth, John A.

    1982-01-01

    Reviews a one-semester senior plant design/laboratory course, focusing on course structure, student projects, laboratory assignments, and course evaluation. Includes discussion of laboratory exercises related to process waste water and sludge. (SK)

  3. Numerical simulations supporting the process design of ring rolling processes

    NASA Astrophysics Data System (ADS)

    Jenkouk, V.; Hirt, G.; Seitz, J.

    2013-05-01

    In conventional Finite Element Analysis (FEA) of radial-axial ring rolling (RAR), the motions of all tools are usually defined prior to simulation in the preprocessing step. However, the real process holds up to 8 degrees of freedom (DOF) that are controlled by industrial control systems according to actual sensor values and preselected control strategies. Since the motion histories are unknown before the experiment and depend on sensor data, conventional FEA cannot represent the process prior to the experiment. In order to enable the use of FEA in the process design stage, this approach integrates the industrially applied control algorithms of the real process, including all relevant sensors and actuators, into the FE model of ring rolling. Additionally, the process design of a novel process, axial profiling, in which a profiled roll is used for rolling axially profiled rings, is supported by FEA. Using this approach, suitable control strategies can be tested in a virtual environment before processing.
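
    The coupling described above can be pictured with a minimal closed-loop sketch (Python; the toy ring model, all names, and all gains are assumptions, not the authors' FE implementation): each pass through the loop stands in for one FE increment, and a simple control law chooses the next mandrel feed from the "sensor" reading instead of from a motion history fixed during preprocessing.

        # Minimal sketch of coupling a control law to a simulated time step.
        # The surrogate ring model and all parameters are invented for illustration.
        def fe_step(diameter, feed, dt, growth_gain=0.8):
            """Stand-in for one FE increment: ring diameter grows with mandrel feed."""
            return diameter + growth_gain * feed * dt

        def control_law(diameter, prev_diameter, target_rate, feed, dt, k_p=2.0):
            """Adjust the feed so the measured growth rate tracks a target rate."""
            measured_rate = (diameter - prev_diameter) / dt
            return max(0.0, feed + k_p * (target_rate - measured_rate) * dt)

        diameter, prev, feed, dt = 300.0, 300.0, 0.5, 0.01   # mm, mm, mm/s, s
        for _ in range(1000):
            diameter, prev = fe_step(diameter, feed, dt), diameter
            feed = control_law(diameter, prev, target_rate=2.0, feed=feed, dt=dt)
        print(round(diameter, 1), round(feed, 3))

    In the real process the control system would read several sensors and drive all eight degrees of freedom; the point here is only the structure of the loop.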

  4. Affective Norms for 4900 Polish Words Reload (ANPW_R): Assessments for Valence, Arousal, Dominance, Origin, Significance, Concreteness, Imageability and, Age of Acquisition

    PubMed Central

    Imbir, Kamil K.

    2016-01-01

    In studies that combine understanding of emotions and language, there is growing demand for good-quality experimental materials. To meet this expectation, a set of 4905 Polish words was assessed by 400 participants in order to provide a well-established research method for everyone interested in emotional word processing. The Affective Norms for Polish Words Reloaded (ANPW_R) is designed as an extension to the previously introduced ANPW dataset and provides assessments for eight different affective and psycholinguistic measures of Valence, Arousal, Dominance, Origin, Significance, Concreteness, Imageability, and subjective Age of Acquisition. The ANPW_R is now the largest available dataset of affective words for Polish, including affective scores that have not been measured in any other dataset (concreteness and age of acquisition scales). Additionally, the ANPW_R allows for testing hypotheses concerning dual-mind models of emotion and activation (origin and subjective significance scales). Participants in the current study assessed all 4905 words in the list within 1 week, at their own pace in home sessions, using eight different Self-Assessment Manikin (SAM) scales. Each measured dimension was evaluated by 25 women and 25 men. The ANPW_R norms appeared to be reliable in split-half estimation and congruent with previous normative studies in Polish. The quadratic relation between valence and arousal was found to be in line with previous findings. In addition, nine other relations appeared to be better described by a quadratic rather than a linear function. The ANPW_R provides well-established research materials for use in psycholinguistic and affective studies in Polish-speaking samples. PMID:27486423
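
    The reported quadratic relation between valence and arousal can be illustrated with a small fitting sketch (Python; the data below are synthetic, not the ANPW_R norms): compare the variance explained by a linear and a quadratic polynomial of valence.

        # Synthetic illustration of testing a linear vs. quadratic valence-arousal
        # relation; the generated ratings only mimic the U-shape reported above.
        import numpy as np

        rng = np.random.default_rng(0)
        valence = rng.uniform(1, 9, 500)                      # SAM-style 1-9 ratings
        arousal = 0.25 * (valence - 5.0) ** 2 + 3.0 + rng.normal(0, 0.5, 500)

        for degree in (1, 2):
            coeffs = np.polyfit(valence, arousal, degree)
            residuals = arousal - np.polyval(coeffs, valence)
            print(f"degree {degree}: R^2 = {1 - residuals.var() / arousal.var():.3f}")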

  5. Affective Norms for 4900 Polish Words Reload (ANPW_R): Assessments for Valence, Arousal, Dominance, Origin, Significance, Concreteness, Imageability and, Age of Acquisition.

    PubMed

    Imbir, Kamil K

    2016-01-01

    In studies that combine understanding of emotions and language, there is growing demand for good-quality experimental materials. To meet this expectation, a set of 4905 Polish words was assessed by 400 participants in order to provide a well-established research method for everyone interested in emotional word processing. The Affective Norms for Polish Words Reloaded (ANPW_R) is designed as an extension to the previously introduced ANPW dataset and provides assessments for eight different affective and psycholinguistic measures of Valence, Arousal, Dominance, Origin, Significance, Concreteness, Imageability, and subjective Age of Acquisition. The ANPW_R is now the largest available dataset of affective words for Polish, including affective scores that have not been measured in any other dataset (concreteness and age of acquisition scales). Additionally, the ANPW_R allows for testing hypotheses concerning dual-mind models of emotion and activation (origin and subjective significance scales). Participants in the current study assessed all 4905 words in the list within 1 week, at their own pace in home sessions, using eight different Self-Assessment Manikin (SAM) scales. Each measured dimension was evaluated by 25 women and 25 men. The ANPW_R norms appeared to be reliable in split-half estimation and congruent with previous normative studies in Polish. The quadratic relation between valence and arousal was found to be in line with previous findings. In addition, nine other relations appeared to be better described by a quadratic rather than a linear function. The ANPW_R provides well-established research materials for use in psycholinguistic and affective studies in Polish-speaking samples. PMID:27486423

  6. The Engineering Process in Construction & Design

    ERIC Educational Resources Information Center

    Stoner, Melissa A.; Stuby, Kristin T.; Szczepanski, Susan

    2013-01-01

    Recent research suggests that high-impact activities in science and math classes promote positive attitudinal shifts in students. By implementing high-impact activities, such as designing a school and a skate park, mathematical thinking can be linked to the engineering design process. This hands-on approach, when possible, to demonstrate or…

  7. 77 FR 41248 - Disaster Designation Process

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-13

    ... designation regulations to provide for changes in the designation process (76 FR 70368-70374). In general... the comments. Definitions Comment: Removing the list of examples of unusual and adverse weather... disaster as an unusual or severe weather condition or other natural phenomena that causes severe...

  8. Process characterization and Design Space definition.

    PubMed

    Hakemeyer, Christian; McKnight, Nathan; St John, Rick; Meier, Steven; Trexler-Schmidt, Melody; Kelley, Brian; Zettl, Frank; Puskeiler, Robert; Kleinjans, Annika; Lim, Fred; Wurth, Christine

    2016-09-01

    Quality by design (QbD) is a global regulatory initiative with the goal of enhancing pharmaceutical development through the proactive design of pharmaceutical manufacturing processes and controls to consistently deliver the intended performance of the product. The principles of pharmaceutical development relevant to QbD are described in the ICH guidance documents (ICH Q8-11). An integrated set of risk assessments and their related elements developed at Roche/Genentech were designed to provide an overview of product and process knowledge for the production of a recombinant monoclonal antibody (MAb). This chapter describes the tools used for the characterization and validation of the MAb manufacturing process under the QbD paradigm. These comprise risk assessments for the identification of potential Critical Process Parameters (pCPPs), statistically designed experimental studies, and studies assessing the linkage of the unit operations. The outcome of the studies is the classification of process parameters according to their criticality and the definition of appropriate acceptable ranges of operation. The process and product knowledge gained in these studies can lead to the approval of a Design Space. Additionally, the information gained in these studies is used to define the 'impact' that the manufacturing process can have on the variability of the CQAs, which in turn defines the testing and monitoring strategy.
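
    As a purely illustrative sketch of how statistically designed studies feed the criticality classification (Python; the factor names, levels, and responses are invented, not Roche/Genentech data), a two-level full factorial can be reduced to ranked main effects on a critical quality attribute:

        # Rank process parameters by their main-effect size on a CQA from a
        # 2^3 full-factorial study; all numbers are invented for illustration.
        import itertools

        factors = ["pH", "temperature", "feed_rate"]
        runs = list(itertools.product([-1, 1], repeat=3))
        cqa = [92, 95, 90, 94, 85, 88, 83, 87]          # assumed responses (%)

        effects = {}
        for i, name in enumerate(factors):
            high = [y for x, y in zip(runs, cqa) if x[i] == 1]
            low = [y for x, y in zip(runs, cqa) if x[i] == -1]
            effects[name] = sum(high) / len(high) - sum(low) / len(low)

        for name, effect in sorted(effects.items(), key=lambda kv: -abs(kv[1])):
            print(f"{name:12s} main effect = {effect:+.2f}")

    Parameters whose effects are large relative to the acceptable CQA variation would be candidate critical process parameters; small-effect parameters can be given wider acceptable ranges.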

  9. HYNOL PROCESS ENGINEERING: PROCESS CONFIGURATION, SITE PLAN, AND EQUIPMENT DESIGN

    EPA Science Inventory

    The report describes the design of the hydropyrolysis reactor system of the Hynol process. (NOTE: A bench scale methanol production facility is being constructed to demonstrate the technical feasibility of producing methanol from biomass using the Hynol process. The plant is bein...

  10. Chemical kinetics and oil shale process design

    SciTech Connect

    Burnham, A.K.

    1993-07-01

    Oil shale processes are reviewed with the goal of showing how chemical kinetics influences the design and operation of different processes for different types of oil shale. Reaction kinetics are presented for organic pyrolysis, carbon combustion, carbonate decomposition, and sulfur and nitrogen reactions.

  11. Planar Inlet Design and Analysis Process (PINDAP)

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Gruber, Christopher R.

    2005-01-01

    The Planar Inlet Design and Analysis Process (PINDAP) is a collection of software tools that allow the efficient aerodynamic design and analysis of planar (two-dimensional and axisymmetric) inlets. The aerodynamic analysis is performed using the Wind-US computational fluid dynamics (CFD) program. A major element in PINDAP is a Fortran 90 code named PINDAP that can establish the parametric design of the inlet and efficiently model the geometry and generate the grid for CFD analysis with design changes to those parameters. The use of PINDAP is demonstrated for subsonic, supersonic, and hypersonic inlets.

  12. Agonist-sensitive calcium pool in the pancreatic acinar cell. II. Characterization of reloading

    SciTech Connect

    Muallem, S.; Schoeffield, M.S.; Fimmel, C.J.; Pandol, S.J.

    1988-08-01

    45Ca2+ fluxes and free cytosolic Ca2+ measurements in guinea pig pancreatic acini indicated that after agonist stimulation and the release of Ca2+ from the agonist-sensitive pool at least part of the Ca2+ is extruded from the cell, resulting in 45Ca2+ efflux. In the continued presence of agonist, the pool remains permeable to Ca2+ but partially refills with Ca2+. This reloading is dependent on the concentration of extracellular Ca2+. In the absence of extracellular Ca2+, the pool is completely depleted of Ca2+. However, with increasing concentrations of CaCl2 in the incubation solution (from 0.5 to 2.0 mM) there is increasing repletion of the pool with Ca2+ during agonist stimulation. With termination of agonist stimulation, the Ca2+ permeability of the agonist-sensitive pool is rapidly reduced to that measured in the unstimulated cell. As a result, the Ca2+ incorporated into the pool during the stimulation period is rapidly trapped within the pool and exchanges poorly with medium Ca2+. Subsequently, the pool completely refills with Ca2+. The rate of Ca2+ reloading at the termination of agonist stimulation is slower than the conversion of the pool to the impermeable state. In incubation media containing 1.3 mM CaCl2, the half-time for reloading at the termination of stimulation is 5 min. These observations demonstrate the characteristics of Ca2+ reloading of the agonist-sensitive pool both during stimulation and at the termination of stimulation.
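
    If the reloading at the termination of stimulation is approximated as a first-order process (an assumption made here for illustration; the abstract reports only a half-time), the 5 min half-time corresponds to a rate constant of roughly 0.14 min^-1:

        C(t) = C_\infty\left(1 - e^{-kt}\right), \qquad
        t_{1/2} = \frac{\ln 2}{k} = 5\ \text{min} \;\Rightarrow\; k = \frac{\ln 2}{5\ \text{min}} \approx 0.14\ \text{min}^{-1},

    so that, under this approximation, the pool would be about 75% refilled 10 min after stimulation ends and about 94% refilled after 20 min.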

  13. Logical Reloading. What is it and What is a Profit from it?

    NASA Astrophysics Data System (ADS)

    Rylov, Yuri A.

    2014-07-01

    Logical reloading is the replacement of the basic statements of a conception by equivalent statements of the same conception. The logical reloading does not change the conception, but it changes the mathematical formalism and changes the results of generalizing the conception. In the paper two examples of logical reloading are considered. (1) Generalization of deterministic particle dynamics to the case of stochastic particle dynamics. As a result, a unified formalism for the description of particles of all kinds appears. This formalism allows one to explain quantum dynamics freely in terms of classical particle dynamics; in particular, one discovers the κ-field responsible for pair production. (2) Generalization of the proper Euclidean geometry to space-time geometries in which free particles move stochastically. As a result, a conception of elementary particle dynamics arises in which one can investigate the arrangement of elementary particles, rather than merely systematize them by ascribing quantum numbers. Besides, one succeeds in extending general relativity to non-Riemannian space-time geometries.

  14. Design of penicillin fermentation process simulation system

    NASA Astrophysics Data System (ADS)

    Qi, Xiaoyu; Yuan, Zhonghu; Qi, Xiaoxuan; Zhang, Wenqi

    2011-10-01

    Real-time monitoring of batch processes attracts increasing attention because it can ensure safety and provide products of consistent quality. The design of a simulation system for batch-process fault diagnosis is therefore of great significance. In this paper, penicillin fermentation, a typical non-linear, dynamic, multi-stage batch production process, is taken as the research object. A visual, human-machine interactive simulation software system based on the Windows operating system is developed. The simulation system provides an effective platform for research on batch-process fault diagnosis.
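
    A toy stand-in for the kind of model such a simulator exposes (Python; the kinetics and parameters are invented, not the paper's model) is a fed-batch fermentation with logistic biomass growth and non-growth-associated product formation, integrated with a simple Euler scheme:

        # Toy penicillin fed-batch sketch: logistic biomass growth plus
        # product formation proportional to biomass; parameters are invented.
        def simulate(hours=200.0, dt=0.1):
            x, p = 0.1, 0.0                            # biomass, penicillin (g/L)
            mu_max, x_max, q_p = 0.10, 15.0, 0.005     # assumed kinetic constants
            for _ in range(int(hours / dt)):
                dx = mu_max * x * (1.0 - x / x_max)    # logistic growth rate
                dp = q_p * x                           # non-growth-associated product
                x, p = x + dx * dt, p + dp * dt
            return x, p

        x, p = simulate()
        print(f"biomass = {x:.2f} g/L, penicillin = {p:.2f} g/L after 200 h")

    A fault-diagnosis study would perturb such a model (for example, a drifting feed or a cooling fault) and monitor the deviation of the simulated measurements from their nominal trajectories.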

  15. Functionally graded materials: Design, processing and applications

    SciTech Connect

    Miyamoto, Y.; Kaysser, W.A.; Rabin, B.H.; Kawasaki, A.; Ford, R.G.

    1999-09-01

    In a Functionally Graded Material (FGM), the composition and structure gradually change over volume, resulting in corresponding changes in the properties of the material. By applying the many possibilities inherent in the FGM concept, it is anticipated that materials will be improved and new functions for them created. A comprehensive description of design, modeling, processing, and evaluation of FGMs as well as their applications is covered in this book. The contents include: lessons from nature; graded microstructures; modeling and design; characterization of properties; processing and fabrication; applications; and summary and outlook.

  16. Teaching sustainable design: A collaborative process

    SciTech Connect

    Theis, C.C.

    1997-12-31

    This paper describes a collaborative educational experience in the Schools of Architecture and Landscape Architecture at Louisiana State University. During the Fall Semester of 1996 an upper-level architectural design studio worked with a peer group of landscape architecture students on the design of a master plan for an environmentally sensitive residential development on Cat Island, a barrier island located approximately eight miles south of Gulfport, Mississippi. This paper presents the methodology and results of the project, describes the collaborative process, and assesses both the viability of the design solutions and the value of the educational experience.

  17. Process design for Al backside contacts

    SciTech Connect

    Chalfoun, L.L.; Kimerling, L.C.

    1995-08-01

    It is known that properly alloyed aluminum backside contacts can improve silicon solar cell efficiency. To use this knowledge to the fullest advantage, we have studied the gettering process that occurs during contact formation and the microstructure of the contact and backside junction region. With an understanding of the alloying step, optimized fabrication processes can be designed. To study gettering, single crystal silicon wafers were coated with aluminum on both sides and subjected to heat treatments. Results are described.

  18. Dynamic Process Simulation for Analysis and Design.

    ERIC Educational Resources Information Center

    Nuttall, Herbert E., Jr.; Himmelblau, David M.

    A computer program for the simulation of complex continuous processes in real time in an interactive mode is described. The program is user oriented and flexible, and it provides both numerical and graphic output. The program has been used in classroom teaching and computer-aided design. Typical input and output are illustrated for a sample problem to…

  19. Flexible Processing and the Design of Grammar

    ERIC Educational Resources Information Center

    Sag, Ivan A.; Wasow, Thomas

    2015-01-01

    We explore the consequences of letting the incremental and integrative nature of language processing inform the design of competence grammar. What emerges is a view of grammar as a system of local monotonic constraints that provide a direct characterization of the signs (the form-meaning correspondences) of a given language. This…

  20. Interface design in the process industries

    NASA Technical Reports Server (NTRS)

    Beaverstock, M. C.; Stassen, H. G.; Williamson, R. A.

    1977-01-01

    Every operator runs his plant in accord with his own mental model of the process. In this sense, one characteristic of an ideal man-machine interface is that it be in harmony with that model. With this theme in mind, the paper first reviews the functions of the process operator and compares them with human operators involved in control situations previously studied outside the industrial environment (pilots, air traffic controllers, helmsmen, etc.). A brief history of the operator interface in the process industry and the traditional methodology employed in its design is then presented. Finally, a much more fundamental approach utilizing a model definition of the human operator's behavior is presented.

  1. Designing Instruction That Supports Cognitive Learning Processes

    PubMed Central

    Clark, Ruth; Harrelson, Gary L.

    2002-01-01

    Objective: To provide an overview of current cognitive learning processes, including a summary of research that supports the use of specific instructional methods to foster those processes. We have developed examples in athletic training education to help illustrate these methods where appropriate. Data Sources: Sources used to compile this information included knowledge base and oral and didactic presentations. Data Synthesis: Research in educational psychology within the past 15 years has provided many principles for designing instruction that mediates the cognitive processes of learning. These include attention, management of cognitive load, rehearsal in working memory, and retrieval of new knowledge from long-term memory. By organizing instruction in the context of tasks performed by athletic trainers, transfer of learning and learner motivation are enhanced. Conclusions/Recommendations: Scientific evidence supports instructional methods that can be incorporated into lesson design and improve learning by managing cognitive load in working memory, stimulating encoding into long-term memory, and supporting transfer of learning. PMID:12937537

  2. DESIGNING ENVIRONMENTAL, ECONOMIC AND ENERGY EFFICIENT CHEMICAL PROCESSES

    EPA Science Inventory

    The design and improvement of chemical processes can be very challenging. The earlier energy conservation, process economics and environmental aspects are incorporated into the process development, the easier and less expensive it is to alter the process design. Process emissio...

  3. Design of intelligent controllers for exothermal processes

    NASA Astrophysics Data System (ADS)

    Nagarajan, Ramachandran; Yaacob, Sazali

    2001-10-01

    Chemical industries, such as resin or soap manufacturing, have reaction systems that work with at least two chemicals. Mixing of chemicals even at room temperature can initiate an exothermic reaction, which produces a sudden increase of heat energy within the mixture. The quantity of heat and the dynamics of heat generation are unknown, unpredictable, and time varying. Proper control of heat has to be accomplished in order to achieve a high-quality product. Uncontrolled or poorly controlled heat yields an unusable product, and the process may damage materials and systems and even harm human beings. Control of the heat released by an exothermic reaction cannot be achieved using conventional methods such as PID control or identification and control, since all of these require at least an approximate mathematical model of the exothermic process, and the modeling of such a process has yet to be properly conceived. This paper discusses a design methodology for controlling such a process. A pilot plant of a reaction system has been constructed and used for designing and incorporating the proposed fuzzy-logic-based intelligent controller. Both the conventional and an adaptive form of fuzzy logic control were used in testing the performance. The test results confirm the effectiveness of the controllers in controlling exothermic heat.
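
    The flavor of the rule-based approach can be sketched as follows (Python; the membership functions, rules, and numbers are invented and are not the controller designed in the paper): triangular memberships on the temperature error and a weighted-average defuzzification of the cooling-valve command.

        # Minimal Mamdani-style sketch: fuzzify the temperature error, fire three
        # rules, and defuzzify to a valve opening in [0, 1]. Numbers are invented.
        def tri(x, a, b, c):
            """Triangular membership on [a, c] with peak at b."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

        def fuzzy_cooling(error):
            """error = T_measured - T_setpoint (K); returns valve opening in [0, 1]."""
            rules = [
                (tri(error, -10.0, -5.0, 0.0), 0.1),    # below setpoint -> little cooling
                (tri(error, -2.0, 3.0, 8.0), 0.5),      # near setpoint -> medium cooling
                (tri(error, 5.0, 15.0, 40.0), 1.0),     # runaway tendency -> full cooling
            ]
            weight = sum(w for w, _ in rules)
            return sum(w * u for w, u in rules) / weight if weight else 0.0

        for e in (-6.0, 1.0, 12.0):
            print(f"error {e:+5.1f} K -> valve {fuzzy_cooling(e):.2f}")

    An adaptive variant, as tested in the paper, might additionally tune the membership bounds or rule outputs from observed performance.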

  4. Composting process design criteria. II. Detention time

    SciTech Connect

    Haug, R.T.

    1986-09-01

    Attention has always been directed to detention time as a criterion for the design and operation of composting systems. Perhaps this is a logical outgrowth of work on liquid-phase systems, where detention time is a fundamental design parameter. Unlike liquid-phase systems, however, the interpretation of detention time and the actual values required for design have not been universally accepted in the case of composting. As a case in point, most compost systems incorporate facilities for curing the compost product; yet curing is often considered after the fact, or as an add-on with little relationship to the first-stage, high-rate phase, whether reactor (in-vessel), static pile, or windrow. Design criteria for curing and the relationships between the first-stage, high-rate and second-stage, curing phases of a composting system have been unclear. In Part 2 of this paper, the concepts of hydraulic retention time (HRT) and solids residence time (SRT) are applied to the composting process. Definitions and design criteria for each are proposed. Based on these criteria, the first and second stages can be designed and integrated into a complete composting system.
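
    For orientation, the conventional liquid-phase definitions that the paper adapts to composting are (the composting-specific definitions proposed in the paper may differ in detail):

        \mathrm{HRT} = \frac{V}{Q}, \qquad
        \mathrm{SRT} = \frac{M_{\mathrm{solids\ held\ in\ the\ system}}}{\dot{M}_{\mathrm{solids\ removed\ per\ unit\ time}}},

    so that, for example, a 500 m^3 reactor fed at 50 m^3/d has an HRT of 10 d, while recycling stabilized solids back to the first stage makes the SRT exceed the HRT, much as solids recycle does in liquid-phase treatment.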

  5. Mimicry of natural material designs and processes

    SciTech Connect

    Bond, G.M.; Richman, R.H.; McNaughton, W.P.

    1995-06-01

    Biological structural materials, although composed of unremarkable substances synthesized at low temperatures, often exhibit superior mechanical properties. In particular, the quality in which nearly all biologically derived materials excel is toughness. The advantageous mechanical properties are attributable to the hierarchical, composite, structural arrangements common to biological systems. Materials scientists and engineers have increasingly recognized that biological designs or processing approaches applied to man-made materials (biomimesis) may offer improvements in performance over conventional designs and fabrication methods. In this survey, the structures and processing routes of marine shells, avian eggshells, wood, bone, and insect cuticle are briefly reviewed, and biomimesis research inspired by these materials is discussed. In addition, this paper describes and summarizes the applications of biomineralization, self-assembly, and templating with proteins to the fabrication of thin ceramic films and nanostructure devices.

  6. A survey of the Oyster Creek reload licensing model

    SciTech Connect

    Alammar, M.A.

    1991-01-01

    The Oyster Creek RETRAN licensing model was submitted for approval by the U.S. Nuclear Regulatory Commission in September 1987. This paper discusses the technical issues and concerns that were raised during the review process and how they were resolved. The technical issues are grouped into three major categories: the adequacy of the model benchmark against plant data; uncertainty analysis and model convergence with respect to various critical parameters (code correlations, nodalization, time step, etc.); and model application and usage.

  7. Chip Design Process Optimization Based on Design Quality Assessment

    NASA Astrophysics Data System (ADS)

    Häusler, Stefan; Blaschke, Jana; Sebeke, Christian; Rosenstiel, Wolfgang; Hahn, Axel

    2010-06-01

    Nowadays, the managing of product development projects is increasingly challenging. Especially the IC design of ASICs with both analog and digital components (mixed-signal design) is becoming more and more complex, while the time-to-market window narrows at the same time. Still, high quality standards must be fulfilled. Projects and their status are becoming less transparent due to this complexity. This makes the planning and execution of projects rather difficult. Therefore, there is a need for efficient project control. A main challenge is the objective evaluation of the current development status. Are all requirements successfully verified? Are all intermediate goals achieved? Companies often develop special solutions that are not reusable in other projects. This makes the quality measurement process itself less efficient and produces too much overhead. The method proposed in this paper is a contribution to solve these issues. It is applied at a German design house for analog mixed-signal IC design. This paper presents the results of a case study and introduces an optimized project scheduling on the basis of quality assessment results.

  8. Computer-aided software development process design

    NASA Technical Reports Server (NTRS)

    Lin, Chi Y.; Levary, Reuven R.

    1989-01-01

    The authors describe an intelligent tool designed to aid managers of software development projects in planning, managing, and controlling the development process of medium- to large-scale software projects. Its purpose is to reduce uncertainties in the budget, personnel, and schedule planning of software development projects. It is based on a dynamic model of the software development and maintenance life-cycle process. This dynamic process is composed of a number of time-varying, interacting developmental phases, each characterized by its intended functions and requirements. System dynamics is used as a modeling methodology. The resulting Software LIfe-Cycle Simulator (SLICS) and the hybrid expert simulation system of which it is a subsystem are described.
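
    The system-dynamics flavor of such a simulator can be conveyed by a toy stock-and-flow loop (Python; the stocks, rates, and numbers are invented and are not SLICS itself): remaining work is completed at a staff-dependent rate, a fraction of completed work becomes hidden rework, and rework is discovered and returned to the backlog over time.

        # Toy system-dynamics sketch of a software project; all rates invented.
        def simulate(tasks=1000.0, staff=5.0, productivity=2.0,
                     error_fraction=0.15, discovery_rate=0.05, dt=1.0):
            remaining, rework, t = tasks, 0.0, 0.0
            while remaining + rework > 1.0 and t < 2000:
                done = min(remaining, staff * productivity * dt)   # tasks finished this week
                new_errors = error_fraction * done                 # flawed work, not yet visible
                discovered = discovery_rate * rework * dt          # rework surfacing this week
                remaining += discovered - done
                rework += new_errors - discovered
                t += dt
            return t

        print(f"estimated completion: {simulate():.0f} weeks")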

  9. Forging process design for risk reduction

    NASA Astrophysics Data System (ADS)

    Mao, Yongning

    In this dissertation, forging process design is investigated with a primary concern for risk reduction. Different forged components have been studied, especially those that could cause catastrophic loss if failure occurs. Finite element analysis is applied extensively as the modeling methodology. Three examples, a titanium compressor disk, a superalloy turbine disk, and a titanium hip prosthesis, are discussed to demonstrate the approach. Discrete defects such as hard-alpha anomalies are known to cause disastrous failure if they are present in stress-critical components. In this research, hard-alpha inclusion movement during forging of a titanium compressor disk is studied by finite element analysis. By combining results from the finite element method (FEM), regression modeling, and Monte Carlo simulation, it is shown that changing the forging path can mitigate the failure risk of the components during service. The second example concerns a turbine disk made of superalloy IN 718, for which the effect of forging on microstructure is the main consideration, since the microstructure defines the as-forged disk properties. For given forging conditions, the preform has its own effect on the microstructure. A sensitivity study shows that forging temperature and speed have a significant influence on the microstructure; to choose processing parameters that optimize it, the dependence of microstructure on die speed and temperature is studied thoroughly using designed numerical experiments, and optimal solutions are determined for various goals. The narrow processing window of titanium alloys makes isothermal forging a preferred way to produce forged parts without forging defects, but the cost of isothermal forging (dies at the same temperature as the workpiece) limits its wide application. In this research, it has been demonstrated that with proper process design, the die
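
    The FEM / regression / Monte Carlo chain used for the compressor-disk example can be sketched as follows (Python; the response surface, the sampling distribution, and the allowable are invented stand-ins for the dissertation's fitted models): sample the inclusion location, evaluate a surrogate local stress for a given forging-path parameter, and count exceedances of an assumed allowable.

        # Monte Carlo over an invented regression surrogate; illustration only.
        import random

        def surrogate_stress(radial_pos, path_param):
            """Invented response surface standing in for fitted FE results (MPa)."""
            return 440.0 + 60.0 * radial_pos ** 2 - 30.0 * path_param

        def failure_probability(path_param, allowable=480.0, n=100_000, seed=1):
            rng = random.Random(seed)
            exceed = sum(
                surrogate_stress(rng.uniform(0.0, 1.0), path_param) > allowable
                for _ in range(n)
            )
            return exceed / n

        for path in (0.0, 0.5, 1.0):    # 0 = baseline forging path, 1 = modified path
            print(f"path parameter {path:.1f}: P(exceed) ~ {failure_probability(path):.3f}")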

  10. Moral judgment reloaded: a moral dilemma validation study

    PubMed Central

    Christensen, Julia F.; Flexas, Albert; Calabrese, Margareta; Gut, Nadine K.; Gomila, Antoni

    2014-01-01

    We propose a revised set of moral dilemmas for studies on moral judgment. We selected a total of 46 moral dilemmas available in the literature and fine-tuned them in terms of four conceptual factors (Personal Force, Benefit Recipient, Evitability, and Intention) and methodological aspects of the dilemma formulation (word count, expression style, question formats) that have been shown to influence moral judgment. Second, we obtained normative codings of arousal and valence for each dilemma showing that emotional arousal in response to moral dilemmas depends crucially on the factors Personal Force, Benefit Recipient, and Intentionality. Third, we validated the dilemma set confirming that people's moral judgment is sensitive to all four conceptual factors, and to their interactions. Results are discussed in the context of this field of research, outlining also the relevance of our RT effects for the Dual Process account of moral judgment. Finally, we suggest tentative theoretical avenues for future testing, particularly stressing the importance of the factor Intentionality in moral judgment. Additionally, due to the importance of cross-cultural studies in the quest for universals in human moral cognition, we provide the new set dilemmas in six languages (English, French, German, Spanish, Catalan, and Danish). The norming values provided here refer to the Spanish dilemma set. PMID:25071621

  11. Reliability Methods for Shield Design Process

    NASA Technical Reports Server (NTRS)

    Tripathi, R. K.; Wilson, J. W.

    2002-01-01

    Providing protection against the hazards of space radiation is a major challenge to the exploration and development of space. The great cost of added radiation shielding is a potential limiting factor in deep space operations. In this enabling technology, we have developed methods for optimized shield design over multi-segmented missions involving multiple work and living areas in the transport and duty phases of space missions. The total shield mass over all pieces of equipment and habitats is optimized subject to career dose and dose rate constraints. An important component of this technology is the estimation of the two most commonly identified uncertainties in radiation shield design: the shielding properties of the materials used and the biological response of the astronaut to the radiation leaking through the materials into the living space. The largest uncertainty, of course, is in the biological response, particularly to the high charge and energy (HZE) ions of the galactic cosmic rays. These uncertainties are blended with the optimization design procedure to formulate reliability-based methods for shield design processes. The details of the methods will be discussed.
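
    The optimization framing (not the NASA shielding codes themselves) can be sketched with an assumed exponential attenuation model (Python; the areas, doses, attenuation coefficients, and dose limit below are all invented): minimize total shield mass over mission segments subject to a career-dose constraint.

        # Sketch of constrained shield-mass optimization; all numbers invented.
        import numpy as np
        from scipy.optimize import minimize

        area = np.array([40.0, 25.0, 15.0])      # shielded wall area per segment, m^2
        dose0 = np.array([30.0, 20.0, 10.0])     # unshielded dose contribution, cSv
        mu = np.array([0.025, 0.030, 0.028])     # assumed attenuation per g/cm^2
        dose_limit = 25.0                        # assumed career constraint, cSv

        def total_mass(x):                       # x = areal densities, g/cm^2
            return float(np.sum(area * x))

        def dose_margin(x):                      # >= 0 when the constraint is met
            return dose_limit - float(np.sum(dose0 * np.exp(-mu * x)))

        result = minimize(total_mass, x0=np.full(3, 20.0), method="SLSQP",
                          bounds=[(0.0, 100.0)] * 3,
                          constraints=[{"type": "ineq", "fun": dose_margin}])
        print(result.x.round(1), round(total_mass(result.x), 1))

    At the optimum, mass is allocated so that each segment yields roughly the same dose reduction per unit of added shield mass.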

  12. SETI reloaded: Next generation radio telescopes, transients and cognitive computing

    NASA Astrophysics Data System (ADS)

    Garrett, Michael A.

    2015-08-01

    The Search for Extra-terrestrial Intelligence (SETI) using radio telescopes is an area of research that is now more than 50 years old. Thus far, both targeted and wide-area surveys have yet to detect artificial signals from intelligent civilisations. In this paper, I argue that the incidence of co-existing intelligent and communicating civilisations is probably small in the Milky Way. While this makes successful SETI searches a very difficult pursuit indeed, the huge impact of even a single detection requires us to continue the search. A substantial increase in the overall performance of radio telescopes (and in particular future wide-field instruments such as the Square Kilometre Array - SKA), provide renewed optimism in the field. Evidence for this is already to be seen in the success of SETI researchers in acquiring observations on some of the world's most sensitive radio telescope facilities via open, peer-reviewed processes. The increasing interest in the dynamic radio sky, and our ability to detect new and rapid transient phenomena such as Fast Radio Bursts (FRB) is also greatly encouraging. While the nature of FRBs is not yet fully understood, I argue they are unlikely to be the signature of distant extra-terrestrial civilisations. As astronomers face a data avalanche on all sides, advances made in related areas such as advanced Big Data analytics, and cognitive computing are crucial to enable serendipitous discoveries to be made. In any case, as the era of the SKA fast approaches, the prospects of a SETI detection have never been better.

  13. Saving Material with Systematic Process Designs

    NASA Astrophysics Data System (ADS)

    Kerausch, M.

    2011-08-01

    Global competition is forcing the stamping industry to further increase quality, to shorten time-to-market and to reduce total cost. Continuous balancing between these classical time-cost-quality targets throughout the product development cycle is required to ensure future economic success. In today's industrial practice, die layout standards are typically assumed to implicitly ensure the balancing of company-specific time-cost-quality targets. Although die layout standards are a very successful approach, there are two methodical disadvantages. First, the capabilities for tool design have to be continuously adapted to technological innovations, e.g. to take advantage of the full forming capability of new materials. Secondly, the great variety of die design aspects has to be reduced to a generic rule or guideline, e.g. binder shape, draw-in conditions or the use of drawbeads. Therefore, it is important not to overlook cost or quality opportunities when applying die design standards. This paper describes a systematic workflow with a focus on minimizing material consumption. The starting point of the investigation is a full process plan for a typical structural part; all requirements defined by a predefined set of die design standards with industrial relevance are fulfilled. In a first step, the binder and addendum geometry is systematically checked for material-saving potential. In a second step, blank shape and draw-in are adjusted to meet thinning, wrinkling and springback targets for a minimum blank solution. Finally, the identified die layout is validated with respect to production robustness versus splits, wrinkles and springback. For all three steps the applied methodology is based on finite element simulation combined with a stochastic variation of input variables. With the proposed workflow a well-balanced (time-cost-quality) production process assuring minimal material consumption can be achieved.

  14. Flexible processing and the design of grammar.

    PubMed

    Sag, Ivan A; Wasow, Thomas

    2015-02-01

    We explore the consequences of letting the incremental and integrative nature of language processing inform the design of competence grammar. What emerges is a view of grammar as a system of local monotonic constraints that provide a direct characterization of the signs (the form-meaning correspondences) of a given language. This "sign-based" conception of grammar has provided precise solutions to the key problems long thought to motivate movement-based analyses, has supported three decades of computational research developing large-scale grammar implementations, and is now beginning to play a role in computational psycholinguistics research that explores the use of underspecification in the incremental computation of partial meanings.

  15. Designer cell signal processing circuits for biotechnology.

    PubMed

    Bradley, Robert W; Wang, Baojun

    2015-12-25

    Microorganisms are able to respond effectively to diverse signals from their environment and internal metabolism owing to their inherent sophisticated information processing capacity. A central aim of synthetic biology is to control and reprogramme the signal processing pathways within living cells so as to realise repurposed, beneficial applications ranging from disease diagnosis and environmental sensing to chemical bioproduction. To date most examples of synthetic biological signal processing have been built based on digital information flow, though analogue computing is being developed to cope with more complex operations and larger sets of variables. Great progress has been made in expanding the categories of characterised biological components that can be used for cellular signal manipulation, thereby allowing synthetic biologists to more rationally programme increasingly complex behaviours into living cells. Here we present a current overview of the components and strategies that exist for designer cell signal processing and decision making, discuss how these have been implemented in prototype systems for therapeutic, environmental, and industrial biotechnological applications, and examine emerging challenges in this promising field.

  16. Designer cell signal processing circuits for biotechnology

    PubMed Central

    Bradley, Robert W.; Wang, Baojun

    2015-01-01

    Microorganisms are able to respond effectively to diverse signals from their environment and internal metabolism owing to their inherent sophisticated information processing capacity. A central aim of synthetic biology is to control and reprogramme the signal processing pathways within living cells so as to realise repurposed, beneficial applications ranging from disease diagnosis and environmental sensing to chemical bioproduction. To date most examples of synthetic biological signal processing have been built based on digital information flow, though analogue computing is being developed to cope with more complex operations and larger sets of variables. Great progress has been made in expanding the categories of characterised biological components that can be used for cellular signal manipulation, thereby allowing synthetic biologists to more rationally programme increasingly complex behaviours into living cells. Here we present a current overview of the components and strategies that exist for designer cell signal processing and decision making, discuss how these have been implemented in prototype systems for therapeutic, environmental, and industrial biotechnological applications, and examine emerging challenges in this promising field. PMID:25579192

  17. Rolling Reloaded

    ERIC Educational Resources Information Center

    Jones, Simon A.; Nieminen, John M.

    2008-01-01

    Not so long ago a new observation about rolling motion was described: for a rolling wheel, there is a set of points with instantaneous velocities directed at or away from the centre of the wheel; these points form a circle whose diameter connects the centre of the wheel to the wheel's point of contact with the ground (Sharma 1996 "Eur. J. Phys."…

  18. Design Process Guide Method for Minimizing Loops and Conflicts

    NASA Astrophysics Data System (ADS)

    Koga, Tsuyoshi; Aoyama, Kazuhiro

    We propose a new guide method for developing an easy-to-design process for product development, one that ensures fewer wasteful iterations and fewer conflicts. The design process is modeled as a sequence of design decisions, where a design decision is defined as the determination of product attributes. A design task is represented as a calculation flow that depends on the constraints between the product attributes. We also propose an automatic planning algorithm for the execution of the design task, in order to minimize design loops and design conflicts. Further, we validate the effectiveness of the proposed guide method by developing a prototype design system and a design example of piping for a power steering system, and we find that the proposed method can successfully minimize design loops and design conflicts. This paper addresses (1) a design loop model, (2) a design conflict model, and (3) how to minimize design loops and design conflicts.
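
    The core of such a planner can be sketched as a dependency-graph ordering (Python; this illustrates the general idea only, not the authors' algorithm, and the attribute names are invented): treat each product attribute as a node and each constraint "a must be fixed before b" as an edge, so a topological order is a loop-free design sequence and any leftover nodes expose a cycle, i.e. attributes that must be iterated together.

        # Topological ordering of design decisions; leftover nodes reveal a loop.
        from collections import deque

        def plan(dependencies):
            """dependencies: {attribute: set of attributes that must be fixed first}."""
            indegree = {a: len(needs) for a, needs in dependencies.items()}
            users = {a: [] for a in dependencies}
            for a, needs in dependencies.items():
                for n in needs:
                    users[n].append(a)
            ready = deque(a for a, d in indegree.items() if d == 0)
            order = []
            while ready:
                a = ready.popleft()
                order.append(a)
                for b in users[a]:
                    indegree[b] -= 1
                    if indegree[b] == 0:
                        ready.append(b)
            loop = [a for a, d in indegree.items() if d > 0]
            return order, loop

        deps = {"pipe_route": {"pump_position"}, "pump_position": {"bracket"},
                "bracket": set(), "hose_length": {"pipe_route", "pump_position"}}
        print(plan(deps))  # (['bracket', 'pump_position', 'pipe_route', 'hose_length'], [])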

  19. Design of Nanomaterial Synthesis by Aerosol Processes

    PubMed Central

    Buesser, Beat; Pratsinis, Sotiris E.

    2013-01-01

    Aerosol synthesis of materials is a vibrant field of particle technology and chemical reaction engineering. Examples include the manufacture of carbon blacks, fumed SiO2, pigmentary TiO2, ZnO vulcanizing catalysts, filamentary Ni, and optical fibers, materials that impact transportation, construction, pharmaceuticals, energy, and communications. Parallel to this, development of novel, scalable aerosol processes has enabled synthesis of new functional nanomaterials (e.g., catalysts, biomaterials, electroceramics) and devices (e.g., gas sensors). This review provides an access point for engineers to the multiscale design of aerosol reactors for the synthesis of nanomaterials using continuum, mesoscale, molecular dynamics, and quantum mechanics models spanning 10 and 15 orders of magnitude in length and time, respectively. Key design features are the rapid chemistry; the high particle concentrations but low volume fractions; the attainment of a self-preserving particle size distribution by coagulation; the ratio of the characteristic times of coagulation and sintering, which controls the extent of particle aggregation; and the narrowing of the aggregate primary particle size distribution by sintering. PMID:22468598
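
    The design criterion mentioned above for the extent of aggregation is usually written in terms of the characteristic sintering time and the characteristic collision (coagulation) time (stated here in its common qualitative form, not as a quotation from the review):

        \frac{\tau_{\mathrm{sintering}}}{\tau_{\mathrm{collision}}} \ll 1 \;\Rightarrow\; \text{colliding particles coalesce fully (compact, near-spherical particles)}, \qquad
        \frac{\tau_{\mathrm{sintering}}}{\tau_{\mathrm{collision}}} \gg 1 \;\Rightarrow\; \text{collisions outpace sintering (aggregates of primary particles)}.

    Because sintering is thermally activated, the reactor temperature history shifts this ratio strongly, which is why it is a key design handle for controlling the degree of aggregation.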

  20. Parametric Design within an Atomic Design Process (ADP) applied to Spacecraft Design

    NASA Astrophysics Data System (ADS)

    Ramos Alarcon, Rafael

    This thesis describes research investigating the development of a model for the initial design of complex systems, with application to spacecraft design. The design model is called an atomic design process (ADP) and contains four fundamental stages (specifications, configurations, trade studies and drivers) that constitute the minimum steps of an iterative process that helps designers find a feasible solution. Representative design models from the aerospace industry are reviewed and are compared with the proposed model. The design model's relevance, adaptability and scalability features are evaluated through a focused design task exercise with two undergraduate teams and a long-term design exercise performed by a spacecraft payload team. The implementation of the design model is explained in the context in which the model has been researched. This context includes the organization (a student-run research laboratory at the University of Michigan), its culture (academically oriented), members that have used the design model and the description of the information technology elements meant to provide support while using the model. This support includes a custom-built information management system that consolidates relevant information that is currently being used in the organization. The information is divided in three domains: personnel development history, technical knowledge base and laboratory operations. The focused study with teams making use of the design model to complete an engineering design exercise consists of the conceptual design of an autonomous system, including a carrier and a deployable lander that form the payload of a rocket with an altitude range of over 1000 meters. Detailed results from each of the stages of the design process while implementing the model are presented, and an increase in awareness of good design practices in the teams while using the model are explained. A long-term investigation using the design model consisting of the

  1. High Lifetime Solar Cell Processing and Design

    NASA Technical Reports Server (NTRS)

    Swanson, R. M.

    1985-01-01

    In order to maximize efficiency, a solar cell must: (1) absorb as much light as possible in electron-hole production, (2) transport as large a fraction as possible of the electrons to the n-type terminal and of the holes to the p-type terminal without their first recombining, and (3) produce as high a terminal voltage as possible. Step (1) is largely fixed by the spectrum of sunlight and the fundamental absorption characteristics of silicon, although some improvements are possible through texturizing-induced light trapping and back-surface reflectors. Steps (2) and (3), however, depend on the recombination mechanisms of the cell, and recombination in turn is strongly influenced by cell processing and design. Some of the lessons learned during the development of the point-contact cell are discussed, including the cell's dependence on recombination, surface recombination, and contact recombination. Results show the overwhelming influence of contact recombination on the operation of the cell when the other sources of recombination are reduced by careful processing.
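
    The link between recombination and the attainable terminal voltage in step (3) can be made explicit with the standard ideal-diode relation (a textbook approximation, not taken from the lecture):

        V_{\mathrm{oc}} \approx \frac{kT}{q}\,\ln\!\left(\frac{J_{\mathrm{sc}}}{J_0} + 1\right),

    where the saturation current density J_0 grows with bulk, surface, and contact recombination; reducing recombination lowers J_0 and therefore raises the open-circuit voltage roughly logarithmically.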

  2. Universal Design: Process, Principles, and Applications

    ERIC Educational Resources Information Center

    Burgstahler, Sheryl

    2009-01-01

    Designing any product or environment involves the consideration of many factors, including aesthetics, engineering options, environmental issues, safety concerns, industry standards, and cost. Typically, designers focus their attention on the average user. In contrast, universal design (UD), according to the Center for Universal Design," is the…

  3. Feasibility study of an Integrated Program for Aerospace vehicle Design (IPAD). Volume 2: The design process

    NASA Technical Reports Server (NTRS)

    Gillette, W. B.; Turner, M. J.; Southall, J. W.; Whitener, P. C.; Kowalik, J. S.

    1973-01-01

    The extent to which IPAD is to support the design process is identified. Case studies of representative aerospace products were developed as models to characterize the design process and to provide design requirements for the IPAD computing system.

  4. Process variation analysis for MEMS design

    NASA Astrophysics Data System (ADS)

    Schenato, Luca; Wu, Wei-Chung; El Ghaoui, Laurent; Pister, Kristofer S. J.

    2001-03-01

    Process variations, incurred during the fabrication stage of MEMS structures, may lead to substantially different performance than the nominal one. This is mainly due to the small variation of the geometry of the structure with respect to the ideal design. In this paper we propose an approach to estimate performance variations for general planar suspended MEMS structures for low-frequency applications. This approach is based on two complementary techniques, one probabilistic and the other deterministic. The former technique, based on the Monte Carlo method, defines a random distribution on the geometric variables and evaluates the possible performance outcomes by sampling that distribution. The latter technique, based on robust optimization and semidefinite programming (SDP) approximations [EOL98], finds bounds on the performance parameters given the bounds on the geometric variables, i.e. it considers the worst-case scenario. Both techniques have been integrated with SUGAR, a simulation tool for MEMS devices available to the public [Zhou98, Sito], and tested on different types of folded springs.
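
    The probabilistic branch of the approach can be sketched with a cantilever-style suspension (Python; the nominal dimensions, tolerances, and the use of a closed-form stiffness in place of SUGAR are all assumptions): sample over/under-etch on width and thickness and propagate to the stiffness k = E w t^3 / (4 L^3) of an end-loaded cantilever.

        # Monte Carlo propagation of geometric process variation to stiffness.
        import random

        E = 160e9                          # Si Young's modulus, Pa (approximate)
        L, w0, t0 = 200e-6, 2e-6, 2e-6     # nominal length, width, thickness, m
        sigma = 0.05e-6                    # assumed 1-sigma dimensional variation, m

        rng = random.Random(42)
        samples = []
        for _ in range(20_000):
            w = w0 + rng.gauss(0.0, sigma)
            t = t0 + rng.gauss(0.0, sigma)
            samples.append(E * w * t ** 3 / (4 * L ** 3))

        mean = sum(samples) / len(samples)
        std = (sum((k - mean) ** 2 for k in samples) / len(samples)) ** 0.5
        nominal = E * w0 * t0 ** 3 / (4 * L ** 3)
        print(f"k = {mean:.3f} +/- {std:.3f} N/m (nominal {nominal:.3f} N/m)")

    The deterministic branch would instead bound the stiffness over the worst-case corners of the dimension intervals.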

  5. Creativity Processes of Students in the Design Studio

    ERIC Educational Resources Information Center

    Huber, Amy Mattingly; Leigh, Katharine E.; Tremblay, Kenneth R., Jr.

    2012-01-01

    The creative process is a multifaceted and dynamic path of thinking required to execute a project in design-based disciplines. The goal of this research was to test a model outlining the creative design process by investigating student experiences in a design project assignment. The study used an exploratory design to collect data from student…

  6. Launch Vehicle Design Process: Characterization, Technical Integration, and Lessons Learned

    NASA Technical Reports Server (NTRS)

    Blair, J. C.; Ryan, R. S.; Schutzenhofer, L. A.; Humphries, W. R.

    2001-01-01

    Engineering design is a challenging activity for any product. Since launch vehicles are highly complex and interconnected and have extreme energy densities, their design represents a challenge of the highest order. The purpose of this document is to delineate and clarify the design process associated with the launch vehicle for space flight transportation. The goal is to define and characterize a baseline for the space transportation design process. This baseline can be used as a basis for improving effectiveness and efficiency of the design process. The baseline characterization is achieved via compartmentalization and technical integration of subsystems, design functions, and discipline functions. First, a global design process overview is provided in order to show responsibility, interactions, and connectivity of overall aspects of the design process. Then design essentials are delineated in order to emphasize necessary features of the design process that are sometimes overlooked. Finally the design process characterization is presented. This is accomplished by considering project technical framework, technical integration, process description (technical integration model, subsystem tree, design/discipline planes, decision gates, and tasks), and the design sequence. Also included in the document are a snapshot relating to process improvements, illustrations of the process, a survey of recommendations from experienced practitioners in aerospace, lessons learned, references, and a bibliography.

  7. Process of system design and analysis

    SciTech Connect

    Gardner, B.

    1995-09-01

    The design of an effective physical protection system includes the determination of the physical protection system objectives, the initial design of a physical protection system, the evaluation of the design, and, probably, a redesign or refinement of the system. To develop the objectives, the designer must begin by gathering information about facility operations and conditions, such as a comprehensive description of the facility, operating states, and the physical protection requirements. The designer then needs to define the threat. This involves considering factors about potential adversaries: class of adversary, adversary's capabilities, and range of adversary's tactics. Next, the designer should identify targets. Determination of whether or not nuclear materials are attractive targets is based mainly on the ease or difficulty of acquisition and desirability of the material. The designer now knows the objectives of the physical protection system, that is, "What to protect against whom." The next step is to design the system by determining how best to combine such elements as fences, vaults, sensors, procedures, communication devices, and protective force personnel to meet the objectives of the system. Once a physical protection system is designed, it must be analyzed and evaluated to ensure it meets the physical protection objectives. Evaluation must allow for features working together to assure protection rather than regarding each feature separately. Due to the complexity of protection systems, an evaluation usually requires modeling techniques. If any vulnerabilities are found, the initial system must be redesigned to correct the vulnerabilities and a reevaluation conducted.

  8. Lunar fiberglass: Properties and process design

    NASA Technical Reports Server (NTRS)

    Dalton, Robert; Nichols, Todd

    1987-01-01

    A Clemson University ceramic engineering design for a lunar fiberglass plant is presented. The properties of glass fibers and metal-matrix composites are examined. Lunar geology is also discussed. A raw material and site are selected based on this information. A detailed plant design is presented, and summer experiments to be carried out at Johnson Space Center are reviewed.

  9. Computer Applications in the Design Process.

    ERIC Educational Resources Information Center

    Winchip, Susan

    Computer Assisted Design (CAD) and Computer Assisted Manufacturing (CAM) are emerging technologies now being used in home economics and interior design applications. A microcomputer in a computer network system is capable of executing computer graphic functions such as three-dimensional modeling, as well as utilizing office automation packages to…

  10. Space bioreactor: Design/process flow

    NASA Technical Reports Server (NTRS)

    Cross, John H.

    1987-01-01

    The design of the space bioreactor stems from three considerations. First, and foremost, it must sustain cells in microgravity. Closely related is the ability to take advantage of the weightlessness and microgravity. Lastly, it should fit into a bioprocess. The design of the space bioreactor is described in view of these considerations. A flow chart of the bioreactor is presented and discussed.

  11. Instructional Design and Directed Cognitive Processing.

    ERIC Educational Resources Information Center

    Bovy, Ruth Colvin

    This paper argues that the information processing model provides a promising basis on which to build a comprehensive theory of instruction. Characteristics of the major information processing constructs are outlined including attention, encoding and rehearsal, working memory, long term memory, retrieval, and metacognitive processes, and a unifying…

  12. Review of primary spaceflight-induced and secondary reloading-induced changes in slow antigravity muscles of rats.

    PubMed

    Riley, D A

    1998-01-01

    We have examined the light and electron microscopic properties of hindlimb muscles of rats flown in space for 1-2 weeks on Cosmos biosatellite flights 1887 and 2044 and Space Shuttle missions Spacelab-3, Spacelab Life Sciences-1 and Spacelab Life Sciences-2. Tissues were obtained both inflight and postflight permitting definition of primary microgravity-induced changes and secondary reentry and gravity reloading-induced alterations. Spaceflight causes atrophy and expression of fast fiber characteristics in slow antigravity muscles. The stresses of reentry and reloading reveal that atrophic muscles show increased susceptibility to interstitial edema and ischemic-anoxic necrosis as well as muscle fiber tearing with disruption of contractile proteins. These results demonstrate that the effects of spaceflight on skeletal muscle are multifaceted, and major changes occur both inflight and following return to Earth's gravity.

  13. Review of primary spaceflight-induced and secondary reloading-induced changes in slow antigravity muscles of rats

    NASA Astrophysics Data System (ADS)

    Riley, D. A.

    We have examined the light and electron microscopic properties of hindlimb muscles of rats flown in space for 1-2 weeks on Cosmos biosatellite flights 1887 and 2044 and Space Shuttle missions Spacelab-3, Spacelab Life Sciences-1 and Spacelab Life Sciences-2. Tissues were obtained both inflight and postflight permitting definition of primary microgravity-induced changes and secondary reentry and gravity reloading-induced alterations. Spaceflight causes atrophy and expression of fast fiber characteristics in slow antigravity muscles. The stresses of reentry and reloading reveal that atrophic muscles show increased susceptibility to interstitial edema and ischemic-anoxic necrosis as well as muscle fiber tearing with disruption of contractile proteins. These results demonstrate that the effects of spaceflight on skeletal muscle are multifaceted, and major changes occur both inflight and following return to Earth's gravity.

  14. Automating the design process - Progress, problems, prospects, potential.

    NASA Technical Reports Server (NTRS)

    Heldenfels, R. R.

    1973-01-01

    The design process for large aerospace vehicles is discussed, with particular emphasis on structural design. Problems with current procedures are identified. Then, the contributions possible from automating the design process (defined as the best combination of men and computers) are considered. Progress toward automated design in the aerospace and other communities is reviewed, including NASA studies of the potential development of Integrated Programs for Aerospace-Vehicle Design (IPAD). The need for and suggested directions of future research on the design process, both technical and social, are discussed. Although much progress has been made to exploit the computer in design, it is concluded that technology is available to begin using the computer to speed communications and management as well as calculations in the design process and thus build man-computer teams that can design better, faster and cheaper.

  15. Designing Educative Curriculum Materials: A Theoretically and Empirically Driven Process

    ERIC Educational Resources Information Center

    Davis, Elizabeth A.; Palincsar, Annemarie Sullivan; Arias, Anna Maria; Bismack, Amber Schultz; Marulis, Loren M.; Iwashyna, Stefanie K.

    2014-01-01

    In this article, the authors argue for a design process in the development of educative curriculum materials that is theoretically and empirically driven. Using a design-based research approach, they describe their design process for incorporating educative features intended to promote teacher learning into existing, high-quality curriculum…

  16. Knowledge and Processes in Design. DPS Final Report.

    ERIC Educational Resources Information Center

    Pirolli, Peter

    Four papers from a project concerning information-processing characterizations of the knowledge and processes involved in design are presented. The project collected and analyzed verbal protocols from instructional designers, architects, and mechanical engineers. A framework was developed for characterizing the problem spaces of design that…

  17. VCM Process Design: An ABET 2000 Fully Compliant Project

    ERIC Educational Resources Information Center

    Benyahia, Farid

    2005-01-01

    A long experience in undergraduate vinyl chloride monomer (VCM) process design projects is shared in this paper. The VCM process design is shown to be fully compliant with ABET 2000 criteria by virtue of its abundance in chemical engineering principles, integration of interpersonal and interdisciplinary skills in design, safety, economics, and…

  18. Learning Objects: A User-Centered Design Process

    ERIC Educational Resources Information Center

    Branon, Rovy F., III

    2011-01-01

    Design research systematically creates or improves processes, products, and programs through an iterative progression connecting practice and theory (Reinking, 2008; van den Akker, 2006). Developing a new instructional systems design (ISD) processes through design research is necessary when new technologies emerge that challenge existing practices…

  19. Optimality criteria design and stress constraint processing

    NASA Technical Reports Server (NTRS)

    Levy, R.

    1982-01-01

    Methods for pre-screening stress constraints into either primary or side-constraint categories are reviewed; a projection method, which is developed from prior cycle stress resultant history, is introduced as an additional screening parameter. Stress resultant projections are also employed to modify the traditional stress-ratio, side-constraint boundary. A special application of structural modification reanalysis is applied to the critical stress constraints to provide feasible designs that are preferable to those obtained by conventional scaling. Sample problem executions show relatively short run times and fewer design cycle iterations to achieve low structural weights; those attained are comparable to the minimum values developed elsewhere.

  20. Erlang Behaviours: Programming with Process Design Patterns

    NASA Astrophysics Data System (ADS)

    Cesarini, Francesco; Thompson, Simon

    Erlang processes run independently of each other, each using separate memory and communicating with each other by message passing. These processes, while executing different code, do so following a number of common patterns. By examining different examples of Erlang-style concurrency in client/server architectures, we identify the generic and specific parts of the code and extract the generic code to form a process skeleton. In Erlang, the most commonly used patterns have been implemented in library modules, commonly referred to as OTP behaviours. They contain the generic code framework for concurrency and error handling, simplifying the complexity of concurrent programming and protecting the developer from many common pitfalls.
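
    The generic/specific split described here is what OTP behaviours capture in Erlang library modules. Since this collection contains no Erlang code, the following Python sketch is only an analogy of that split, under assumed names: a generic server loop owns its mailbox and the message-passing plumbing, while a user-supplied callback (here counter_handle_call) carries the application-specific logic.

    ```python
    import queue
    import threading

    # Generic "process skeleton": a server loop that owns its own mailbox and
    # delegates request handling to a user-supplied callback -- an analogy of
    # the generic/specific split that OTP behaviours formalize in Erlang.
    def server_loop(mailbox, handle_call, state):
        while True:
            sender, request = mailbox.get()
            if request == "stop":
                break
            reply, state = handle_call(request, state)   # specific part
            sender.put(reply)                            # generic part

    # Specific callback: a tiny counter server.
    def counter_handle_call(request, count):
        if request == "increment":
            return count + 1, count + 1
        return count, count                              # "get" or anything else

    mailbox, caller = queue.Queue(), queue.Queue()
    threading.Thread(target=server_loop,
                     args=(mailbox, counter_handle_call, 0), daemon=True).start()

    mailbox.put((caller, "increment"))
    mailbox.put((caller, "get"))
    print(caller.get(), caller.get())   # 1 1
    mailbox.put((caller, "stop"))
    ```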

  1. Understanding the Processes behind Student Designing: Cases from Singapore

    ERIC Educational Resources Information Center

    Lim, Susan Siok Hiang; Lim-Ratnam, Christina; Atencio, Matthew

    2013-01-01

    A common perception of designing is that it represents a highly complex activity that is manageable by only a few. However it has also been argued that all individuals are innately capable of designing. Taking up this latter view, we explored the processes behind student designing in the context of Design and Technology (D&T), a subject taught at…

  2. Processes and Knowledge in Designing Instruction.

    ERIC Educational Resources Information Center

    Greeno, James G.; And Others

    Results from a study of problem solving in the domain of instructional design are presented. Subjects were eight teacher trainees who were recent graduates of or were enrolled in the Stanford Teacher Education Program at Stanford University (California). Subjects studied a computer-based tutorial--the VST2000--about a fictitious vehicle. The…

  3. Biochemical Engineering. Part II: Process Design

    ERIC Educational Resources Information Center

    Atkinson, B.

    1972-01-01

    Describes types of industrial techniques involving biochemical products, specifying the advantages and disadvantages of batch and continuous processes, and contrasting biochemical and chemical engineering. See SE 506 318 for Part I. (AL)

  4. Reducing Design Cycle Time and Cost Through Process Resequencing

    NASA Technical Reports Server (NTRS)

    Rogers, James L.

    2004-01-01

    In today's competitive environment, companies are under enormous pressure to reduce the time and cost of their design cycle. One method for reducing both time and cost is to develop an understanding of the flow of the design processes and the effects of the iterative subcycles that are found in complex design projects. Once these aspects are understood, the design manager can make decisions that take advantage of decomposition, concurrent engineering, and parallel processing techniques to reduce the total time and the total cost of the design cycle. One software tool that can aid in this decision-making process is the Design Manager's Aid for Intelligent Decomposition (DeMAID). The DeMAID software minimizes the feedback couplings that create iterative subcycles, groups processes into iterative subcycles, and decomposes the subcycles into a hierarchical structure. The real benefits of producing the best design in the least time and at a minimum cost are obtained from sequencing the processes in the subcycles.
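
    The core sequencing idea, reordering coupled processes so that as few couplings as possible feed back to earlier steps, can be shown with a toy example. The sketch below uses a small, made-up dependency set and a brute-force search over orderings; DeMAID itself works on much larger models and uses heuristic rather than exhaustive search, so this illustrates the objective, not the tool.

    ```python
    from itertools import permutations

    # Hypothetical dependency pairs (task, depends_on) for five design
    # processes; illustrative only.
    deps = {("A", "C"), ("B", "A"), ("C", "B"), ("D", "B"), ("E", "D"), ("A", "E")}
    tasks = ["A", "B", "C", "D", "E"]

    def feedback_count(order):
        # A coupling is "feedback" when a task depends on one scheduled after it.
        position = {t: i for i, t in enumerate(order)}
        return sum(1 for t, d in deps if position[d] > position[t])

    best = min(permutations(tasks), key=feedback_count)
    print("best sequence:", best, "feedback couplings:", feedback_count(best))
    ```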

  5. NASA Now: Engineering Design Process: Hubble Space Telescope

    NASA Video Gallery

    In this episode of NASA Now, NASA engineer Russ Werneth discusses the continuous nature of the engineering design process and shares what it was like to design and plan the spacewalks that were key...

  6. Integrating Thermal Tools Into the Mechanical Design Process

    NASA Technical Reports Server (NTRS)

    Tsuyuki, Glenn T.; Siebes, Georg; Novak, Keith S.; Kinsella, Gary M.

    1999-01-01

    The intent of mechanical design is to deliver a hardware product that meets or exceeds customer expectations, while reducing cycle time and cost. To this end, an integrated mechanical design process enables the idea of parallel development (concurrent engineering). This represents a shift from the traditional mechanical design process. With such a concurrent process, there are significant issues that have to be identified and addressed before re-engineering the mechanical design process to facilitate concurrent engineering. These issues also assist in the integration and re-engineering of the thermal design sub-process since it resides within the entire mechanical design process. With these issues in mind, a thermal design sub-process can be re-defined in a manner that has a higher probability of acceptance, thus enabling an integrated mechanical design process. However, the actual implementation is not always problem-free. Experience in applying the thermal design sub-process to actual situations provides the evidence for improvement, but more importantly, for judging the viability and feasibility of the sub-process.

  7. Conceptual design of distillation-based hybrid separation processes.

    PubMed

    Skiborowski, Mirko; Harwardt, Andreas; Marquardt, Wolfgang

    2013-01-01

    Hybrid separation processes combine different separation principles and constitute a promising design option for the separation of complex mixtures. Particularly, the integration of distillation with other unit operations can significantly improve the separation of close-boiling or azeotropic mixtures. Although the design of single-unit operations is well understood and supported by computational methods, the optimal design of flowsheets of hybrid separation processes is still a challenging task. The large number of operational and design degrees of freedom requires a systematic and optimization-based design approach. To this end, a structured approach, the so-called process synthesis framework, is proposed. This article reviews available computational methods for the conceptual design of distillation-based hybrid processes for the separation of liquid mixtures. Open problems are identified that must be addressed to finally establish a structured process synthesis framework for such processes.

  8. The application of image processing software: Photoshop in environmental design

    NASA Astrophysics Data System (ADS)

    Dong, Baohua; Zhang, Chunmi; Zhuo, Chen

    2011-02-01

    In the process of environmental design and creation, the design sketch holds a very important position in that it not only illuminates the design's idea and concept but also shows the design's visual effects to the client. In the field of environmental design, computer-aided design has made significant improvements. Many types of specialized design software for environmental performance of the drawings and post artistic processing have been implemented. Additionally, with the use of this software, working efficiency has greatly increased and drawings have become more specific and more specialized. By analyzing the application of Photoshop image processing software in environmental design and comparing and contrasting traditional hand drawing with drawing with modern technology, this essay will further explore the way for computer technology to play a bigger role in environmental design.

  9. Glucose uptake in rat soleus - Effect of acute unloading and subsequent reloading

    NASA Technical Reports Server (NTRS)

    Henriksen, Eric J.; Tischler, Marc E.

    1988-01-01

    The effect of acutely reduced weight bearing (unloading) on the in vitro uptake of 2-[1,2-³H]deoxy-D-glucose was studied in the soleus muscle by tail casting and suspending rats. After just 4 h, the uptake of 2-deoxy-D-glucose fell (-19 percent) and declined further after an additional 20 h of unloading. This diminution at 24 h was associated with slower oxidation of [¹⁴C]glucose and incorporation of [¹⁴C]glucose into glycogen. At 3 days of unloading, basal uptake of 2-deoxy-D-glucose did not differ from control. Reloading of the soleus after 1 or 3 days of unloading increased uptake of 2-deoxy-D-glucose above control and returned it to normal within 6 h and 4 days, respectively. These effects of unloading and recovery were caused by local changes in the soleus, because the extensor digitorum longus from the same hindlimbs did not display any alterations in uptake of 2-deoxy-D-glucose or metabolism of glucose.

  10. Possible involvement of 12-lipoxygenase activation in glucose-deprivation/reload-treated neurons.

    PubMed

    Nagasawa, Kazuki; Kakuda, Taichi; Higashi, Youichirou; Fujimoto, Sadaki

    2007-12-18

    The aim of this study was to clarify whether 12-lipoxygenase (12-LOX) activation was involved in reactive oxygen species (ROS) generation, extensive poly(ADP-ribose) polymerase (PARP) activation and neuronal death induced by glucose deprivation followed by glucose reload (GD/R). The decrease of neuronal viability and the accumulation of poly(ADP-ribose) induced by GD/R were prevented by 3-aminobenzamide, a representative PARP inhibitor, demonstrating that this treatment protocol caused the same oxidative stress as the previously reported one. The PARP activation, ROS generation and decrease of neuron viability induced by GD/R treatment were almost completely abolished by an extracellular zinc chelator, CaEDTA. p47(phox), a cytosolic component of NADPH oxidase, was translocated to the membrane fraction by GD/R, indicating its activation, but it did not generate detectable ROS. Surprisingly, pharmacological inhibition of NADPH oxidase with apocynin and AEBSF further decreased the already reduced neuron viability induced by GD/R. On the other hand, AA861, a 12-LOX inhibitor, prevented ROS generation and the decrease of neuron viability caused by GD/R. Interestingly, an antioxidant, N-acetyl-L-cysteine, rescued the neurons from GD/R-induced oxidative stress, implying the effectiveness of antioxidant administration. These findings suggested that activation of 12-LOX, but not NADPH oxidase, following zinc release might play an important role in ROS generation and the decrease of viability in GD/R-treated neurons.

  11. A design optimization process for Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Chamberlain, Robert G.; Fox, George; Duquette, William H.

    1990-01-01

    The Space Station Freedom Program is used to develop and implement a process for design optimization. Because the relative worth of arbitrary design concepts cannot be assessed directly, comparisons must be based on designs that provide the same performance from the point of view of station users; such designs can be compared in terms of life cycle cost. Since the technology required to produce a space station is widely dispersed, a decentralized optimization process is essential. A formulation of the optimization process is provided and the mathematical models designed to facilitate its implementation are described.

  12. CAD tool environment for MEMS process design support

    NASA Astrophysics Data System (ADS)

    Schmidt, T.; Wagener, A.; Popp, J.; Hahn, K.; Bruck, R.

    2005-07-01

    MEMS fabrication processes are characterized by numerous usable process steps, materials and effects to fabricate the intended microstructure. Up to now, CAD support in this domain concentrates mainly on structural design (e.g. simulation programs on an FEM basis). These tools often assume fixed interfaces to the fabrication process, such as material parameters or design rules. Taking into account that MEMS design concurrently requires structural design (defining the lateral 2-dim shapes) as well as process design (responsible for the third dimension), it turns out that technology interfaces consisting only of sets of static data are no longer sufficient. For successful design flows in these areas it is necessary to incorporate a higher degree of process-related data. A broader interface between process configuration on the one side and application design on the other side seems to be needed. This paper proposes a novel approach. A process management system is introduced. It allows the specification of processes for specific applications. The system is based on a dedicated database environment that is able to store and manage all process-related design constraints linked to the fabrication process data itself. The interdependencies between application-specific processes and all stages of the design flow will be discussed, and the complete software system PRINCE will be introduced, meeting the requirements of this new approach. Based on a concurrent design methodology presented in the beginning of this paper, a system is presented that supports application-specific process design. The paper will highlight the incorporated tools and the present status of the software system. A complete configuration of an Si thin-film process example will demonstrate the usage of PRINCE.

  13. Launch Vehicle Design Process Description and Training Formulation

    NASA Technical Reports Server (NTRS)

    Atherton, James; Morris, Charles; Settle, Gray; Teal, Marion; Schuerer, Paul; Blair, James; Ryan, Robert; Schutzenhofer, Luke

    1999-01-01

    A primary NASA priority is to reduce the cost and improve the effectiveness of launching payloads into space. As a consequence, significant improvements are being sought in the effectiveness, cost, and schedule of the launch vehicle design process. In order to provide a basis for understanding and improving the current design process, a model has been developed for this complex, interactive process, as reported in the references. This model requires further expansion in some specific design functions. Also, a training course for less-experienced engineers is needed to provide understanding of the process, to provide guidance for its effective implementation, and to provide a basis for major improvements in launch vehicle design process technology. The objective of this activity is to expand the description of the design process to include all pertinent design functions, and to develop a detailed outline of a training course on the design process for launch vehicles for use in educating engineers whose experience with the process has been minimal. Building on a previously-developed partial design process description, parallel sections have been written for the Avionics Design Function, the Materials Design Function, and the Manufacturing Design Function. Upon inclusion of these results, the total process description will be released as a NASA TP. The design function sections herein include descriptions of the design function responsibilities, interfaces, interactive processes, decisions (gates), and tasks. Associated figures include design function planes, gates, and tasks, along with other pertinent graphics. Also included is an expanded discussion of how the design process is divided, or compartmentalized, into manageable parts to achieve efficient and effective design. A detailed outline for an intensive two-day course on the launch vehicle design process has been developed herein, and is available for further expansion. The course is in an interactive lecture

  14. Algorithmic Processes for Increasing Design Efficiency.

    ERIC Educational Resources Information Center

    Terrell, William R.

    1983-01-01

    Discusses the role of algorithmic processes as a supplementary method for producing cost-effective and efficient instructional materials. Examines three approaches to problem solving in the context of developing training materials for the Naval Training Command: application of algorithms, quasi-algorithms, and heuristics. (EAO)

  15. Adding Users to the Website Design Process

    ERIC Educational Resources Information Center

    Tomeo, Megan L.

    2012-01-01

    Alden Library began redesigning its website over a year ago. Throughout the redesign process the students, faculty, and staff that make up the user base were added to the conversation by utilizing several usability test methods. This article focuses on the usability testing conducted at Alden Library and delves into future usability testing, which…

  16. DESIGNING ENVIRONMENTALLY FRIENDLY CHEMICAL PROCESSES WITH FUGITIVE AND OPEN EMISSIONS

    EPA Science Inventory

    Designing a chemical process normally includes aspects of economic and environmental disciplines. In this work we describe methods to quickly and easily evaluate the economics and potential environmental impacts of a process, with the hydrodealkylation of toluene as an example. ...

  17. Ceramic processing: Experimental design and optimization

    NASA Technical Reports Server (NTRS)

    Weiser, Martin W.; Lauben, David N.; Madrid, Philip

    1992-01-01

    The objectives of this paper are to: (1) gain insight into the processing of ceramics and how green processing can affect the properties of ceramics; (2) investigate the technique of slip casting; (3) learn how heat treatment and temperature contribute to density, strength, and effects of under and over firing to ceramic properties; (4) experience some of the problems inherent in testing brittle materials and learn about the statistical nature of the strength of ceramics; (5) investigate orthogonal arrays as tools to examine the effect of many experimental parameters using a minimum number of experiments; (6) recognize appropriate uses for clay based ceramics; and (7) measure several different properties important to ceramic use and optimize them for a given application.
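
    Objective (5), examining many processing parameters with a minimum number of experiments, is typically met with an orthogonal array. The sketch below is a generic illustration: it uses the standard L8(2^7) two-level array, and the four slip-casting factors and their levels are invented for demonstration rather than taken from the paper.

    ```python
    # Standard L8(2^7) orthogonal array: 8 runs, up to 7 two-level factors.
    L8 = [
        [1, 1, 1, 1, 1, 1, 1],
        [1, 1, 1, 2, 2, 2, 2],
        [1, 2, 2, 1, 1, 2, 2],
        [1, 2, 2, 2, 2, 1, 1],
        [2, 1, 2, 1, 2, 1, 2],
        [2, 1, 2, 2, 1, 2, 1],
        [2, 2, 1, 1, 2, 2, 1],
        [2, 2, 1, 2, 1, 1, 2],
    ]

    # Hypothetical two-level factors for a slip-casting study (columns 1-4).
    factors = {
        "solids_loading_pct": (40, 50),
        "deflocculant_wt_pct": (0.5, 1.0),
        "firing_temp_C": (1200, 1300),
        "soak_time_h": (1, 3),
    }

    # Map each run of the array to concrete factor settings.
    for run, row in enumerate(L8, start=1):
        settings = {name: levels[row[col] - 1]
                    for col, (name, levels) in enumerate(factors.items())}
        print(f"run {run}: {settings}")
    ```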

  18. Clutter suppression interferometry system design and processing

    NASA Astrophysics Data System (ADS)

    Knight, Chad; Deming, Ross; Gunther, Jake

    2015-05-01

    Clutter suppression interferometry (CSI) has received extensive attention due to its multi-modal capability to detect slow-moving targets, and concurrently form high-resolution synthetic aperture radar (SAR) images from the same data. The ability to continuously augment SAR images with geo-located ground moving target indicators (GMTI) provides valuable real-time situational awareness that is important for many applications. CSI can be accomplished with minimal hardware and processing resources. This makes CSI a natural candidate for applications where size, weight and power (SWaP) are constrained, such as unmanned aerial vehicles (UAVs) and small satellites. This paper will discuss the theory for optimal CSI system configuration, focusing on a sparse time-varying transmit and receive array manifold due to SWaP considerations. The underlying signal model will be presented and discussed as well as the potential benefits that a sparse time-varying transmit receive manifold provides. The high-level processing objectives will be detailed and examined on simulated data. Then actual SAR data collected with the Space Dynamic Laboratory (SDL) FlexSAR radar system will be analyzed. The simulated data contrasted with actual SAR data helps illustrate the challenges and limitations found in practice vs. theory. A novel approach incorporating sparse signal processing is discussed that has the potential to reduce false-alarm rates and improve detections.

  19. H-Coal process and plant design

    DOEpatents

    Kydd, Paul H.; Chervenak, Michael C.; DeVaux, George R.

    1983-01-01

    A process for converting coal and other hydrocarbonaceous materials into useful and more valuable liquid products. The process comprises: feeding coal and/or other hydrocarbonaceous materials with a hydrogen-containing gas into an ebullated catalyst bed reactor; passing the reaction products from the reactor to a hot separator where the vaporous and distillate products are separated from the residuals; introducing the vaporous and distillate products from the separator directly into a hydrotreater where they are further hydrogenated; passing the residuals from the separator successively through flash vessels at reduced pressures where distillates are flashed off and combined with the vaporous and distillate products to be hydrogenated; transferring the unseparated residuals to a solids concentrating and removal means to remove a substantial portion of solids therefrom and recycling the remaining residual oil to the reactor; and passing the hydrogenated vaporous and distillate products to an atmospheric fractionator where the combined products are fractionated into separate valuable liquid products. The hydrogen-containing gas is generated from sources within the process.

  20. Bioreactor and process design for biohydrogen production.

    PubMed

    Show, Kuan-Yeow; Lee, Duu-Jong; Chang, Jo-Shu

    2011-09-01

    Biohydrogen is regarded as an attractive future clean energy carrier due to its high energy content and environmentally friendly conversion. It has the potential as a renewable biofuel to replace current hydrogen production, which relies heavily on fossil fuels. While biohydrogen production is still in the early stage of development, a variety of laboratory- and pilot-scale systems with promising potential have been developed. This work presents a review of advances in bioreactor and bioprocess design for biohydrogen production. The state of the art of biohydrogen production is discussed, with emphasis on production pathways, factors affecting biohydrogen production, and bioreactor configuration and operation. Challenges and prospects of biohydrogen production are also outlined.

  1. Bioreactor and process design for biohydrogen production.

    PubMed

    Show, Kuan-Yeow; Lee, Duu-Jong; Chang, Jo-Shu

    2011-09-01

    Biohydrogen is regarded as an attractive future clean energy carrier due to its high energy content and environmentally friendly conversion. It has the potential as a renewable biofuel to replace current hydrogen production, which relies heavily on fossil fuels. While biohydrogen production is still in the early stage of development, a variety of laboratory- and pilot-scale systems with promising potential have been developed. This work presents a review of advances in bioreactor and bioprocess design for biohydrogen production. The state of the art of biohydrogen production is discussed, with emphasis on production pathways, factors affecting biohydrogen production, and bioreactor configuration and operation. Challenges and prospects of biohydrogen production are also outlined. PMID:21624834

  2. Development of Integrated Programs for Aerospace-vehicle design (IPAD): Reference design process

    NASA Technical Reports Server (NTRS)

    Meyer, D. D.

    1979-01-01

    The airplane design process and its interfaces with manufacturing and customer operations are documented to be used as criteria for the development of integrated programs for the analysis, design, and testing of aerospace vehicles. Topics cover: design process management, general purpose support requirements, design networks, and technical program elements. Design activity sequences are given for both supersonic and subsonic commercial transports, naval hydrofoils, and military aircraft.

  3. New Vistas in Chemical Product and Process Design.

    PubMed

    Zhang, Lei; Babi, Deenesh K; Gani, Rafiqul

    2016-06-01

    Design of chemicals-based products is broadly classified into those that are process centered and those that are product centered. In this article, the designs of both classes of products are reviewed from a process systems point of view; developments related to the design of the chemical product, its corresponding process, and its integration are highlighted. Although significant advances have been made in the development of systematic model-based techniques for process design (also for optimization, operation, and control), much work is needed to reach the same level for product design. Timeline diagrams illustrating key contributions in product design, process design, and integrated product-process design are presented. The search for novel, innovative, and sustainable solutions must be matched by consideration of issues related to the multidisciplinary nature of problems, the lack of data needed for model development, solution strategies that incorporate multiscale options, and reliability versus predictive power. The need for an integrated model-experiment-based design approach is discussed together with benefits of employing a systematic computer-aided framework with built-in design templates. PMID:27088667

  4. New Vistas in Chemical Product and Process Design.

    PubMed

    Zhang, Lei; Babi, Deenesh K; Gani, Rafiqul

    2016-06-01

    Design of chemicals-based products is broadly classified into those that are process centered and those that are product centered. In this article, the designs of both classes of products are reviewed from a process systems point of view; developments related to the design of the chemical product, its corresponding process, and its integration are highlighted. Although significant advances have been made in the development of systematic model-based techniques for process design (also for optimization, operation, and control), much work is needed to reach the same level for product design. Timeline diagrams illustrating key contributions in product design, process design, and integrated product-process design are presented. The search for novel, innovative, and sustainable solutions must be matched by consideration of issues related to the multidisciplinary nature of problems, the lack of data needed for model development, solution strategies that incorporate multiscale options, and reliability versus predictive power. The need for an integrated model-experiment-based design approach is discussed together with benefits of employing a systematic computer-aided framework with built-in design templates.

  5. Process Design Manual for Land Treatment of Municipal Wastewater.

    ERIC Educational Resources Information Center

    Crites, R.; And Others

    This manual presents a procedure for the design of land treatment systems. Slow rate, rapid infiltration, and overland flow processes for the treatment of municipal wastewaters are given emphasis. The basic unit operations and unit processes are discussed in detail, and the design concepts and criteria are presented. The manual includes design…

  6. Debating Professional Designations for Evaluators: Reflections on the Canadian Process

    ERIC Educational Resources Information Center

    Cousins, J. Bradley; Cullen, Jim; Malik, Sumbal; Maicher, Brigitte

    2009-01-01

    This paper provides a reflective account of a consultation process on professional designations for evaluators initiated and coordinated by the Canadian Evaluation Society (CES). Described are: (1) the forces leading CES to generate discussion and debate about professional designations for Canadian evaluators, (2) the process of developing and…

  7. Solid propellant processing factor in rocket motor design

    NASA Technical Reports Server (NTRS)

    1971-01-01

    The ways in which propellant processing is affected by choices made in designing rocket engines are described. Tradeoff studies, design proof or scaleup studies, and special design features that are required to obtain high product quality and optimum processing costs are presented. Processing is considered to include the operational steps involved with the lining and preparation of the motor case for the grain; the procurement of propellant raw materials; and propellant mixing, casting or extrusion, curing, machining, and finishing. The design criteria, recommended practices, and propellant formulations are included.

  8. The Use of Computer Graphics in the Design Process.

    ERIC Educational Resources Information Center

    Palazzi, Maria

    This master's thesis examines applications of computer technology to the field of industrial design and ways in which technology can transform the traditional process. Following a statement of the problem, the history and applications of the fields of computer graphics and industrial design are reviewed. The traditional industrial design process…

  9. Laser processing with specially designed laser beam

    NASA Astrophysics Data System (ADS)

    Asratyan, A. A.; Bulychev, N. A.; Feofanov, I. N.; Kazaryan, M. A.; Krasovskii, V. I.; Lyabin, N. A.; Pogosyan, L. A.; Sachkov, V. I.; Zakharyan, R. A.

    2016-04-01

    The possibility of using laser systems to form beams with special spatial configurations has been studied. The laser systems applied had a self-conjugate cavity based on the elements of copper vapor lasers (LT-5Cu, LT-10Cu, LT-30Cu) with an average power of 5, 10, or 30 W. The active elements were pumped by current pulses of duration 80-100 ns. The duration of laser generation pulses was up to 25 ns. The generator unit included an unstable cavity, where one reflector was a special mirror with a reflecting coating. Various original optical schemes used were capable of exploring spatial configurations and energy characteristics of output laser beams in their interaction with micro- and nanoparticles fabricated from various materials. In these experiments, the beam dimensions of the obtained zones varied from 0.3 to 5 µm, which is comparable with the minimum permissible dimensions determined by the optical elements applied. This method is useful in transforming a large amount of information at the laser pulse repetition rate of 10-30 kHz. It was possible to realize the high-precision micromachining and microfabrication of microscale details by direct writing, cutting and drilling (with the cutting width and through-hole diameters ranging from 3 to 100 µm) and produce microscale, deep, intricate and narrow grooves on substrate surfaces of metals and nonmetal materials. This system is used for producing high-quality microscale details without moving the object under treatment. It can also be used for microcutting and microdrilling in a variety of metals such as molybdenum, copper and stainless steel, with a thickness of up to 300 µm, and in nonmetals such as silicon, sapphire and diamond with a thickness ranging from 10 µm to 1 mm with different thermal parameters and specially designed laser beam.

  10. The concepts of energy, environment, and cost for process design

    SciTech Connect

    Abu-Khader, M.M.; Speight, J.G.

    2004-05-01

    The process industries (specifically, energy and chemicals) are characterized by a variety of reactors and reactions to bring about successful process operations. The design of energy-related and chemical processes and their evolution is a complex process that determines the competitiveness of these industries, as well as their environmental impact. Thus, we have developed an Enviro-Energy Concept designed to facilitate sustainable industrial development. The Complete Onion Model represents a complete methodology for chemical process design and illustrates all of the requirements to achieve the best possible design within the accepted environmental standards. Currently, NOx emissions from industrial processes continue to receive maximum attention; therefore, the problem of NOx emissions from industrial sources such as power stations and nitric acid plants is considered. Selective Catalytic Reduction (SCR) is one of the most promising and effective commercial technologies. It is considered the Best Available Control Technology (BACT) for NOx reduction. The solution to the NOx emissions problem is either through modifying the chemical process design and/or installing an end-of-pipe technology. The degree of integration between the process design and the installed technology plays a critical role in the capital cost evaluation. Therefore, integrating process units and then optimizing the design has a vital effect on the total cost. Both the environmental regulations and the cost evaluation are the boundary constraints of the optimum solution.

  11. XML-based product information processing method for product design

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen Yu

    2011-12-01

    Design knowledge of modern mechatronic products centers on information processing in knowledge-intensive engineering; thus, product design innovation is essentially an innovation in knowledge and information processing. Based on an analysis of the role of mechatronic product design knowledge and its information management features, a unified XML-based method for product information processing is proposed. The information processing model of product design includes functional knowledge, structural knowledge and their relationships. XML-based models are proposed for expressing product function elements, product structure elements, and the mapping relationship between function and structure. The information processing of a parallel friction roller is given as an example, which demonstrates that this method is helpful for knowledge-based design systems and product innovation.
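
    As a rough illustration of what an XML product-information model with function elements, structure elements and a function-structure mapping might look like, the sketch below builds and queries a tiny example with Python's standard ElementTree. The element and attribute names are assumptions for demonstration, not the schema proposed in the paper.

    ```python
    import xml.etree.ElementTree as ET

    # Minimal product-information model: functions, structures, and the
    # function-to-structure mapping. All names are illustrative only.
    product = ET.Element("product", name="parallel_friction_roller")

    functions = ET.SubElement(product, "functions")
    ET.SubElement(functions, "function", id="F1", description="transmit torque")

    structures = ET.SubElement(product, "structures")
    ET.SubElement(structures, "structure", id="S1", description="roller pair")

    mappings = ET.SubElement(product, "mappings")
    ET.SubElement(mappings, "map", function="F1", structure="S1")

    print(ET.tostring(product, encoding="unicode"))

    # Query the function-to-structure mapping back out of the model.
    for m in product.findall("./mappings/map"):
        print(m.get("function"), "->", m.get("structure"))
    ```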

  12. XML-based product information processing method for product design

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen Yu

    2012-01-01

    Design knowledge of modern mechatronic products centers on information processing in knowledge-intensive engineering; thus, product design innovation is essentially an innovation in knowledge and information processing. Based on an analysis of the role of mechatronic product design knowledge and its information management features, a unified XML-based method for product information processing is proposed. The information processing model of product design includes functional knowledge, structural knowledge and their relationships. XML-based models are proposed for expressing product function elements, product structure elements, and the mapping relationship between function and structure. The information processing of a parallel friction roller is given as an example, which demonstrates that this method is helpful for knowledge-based design systems and product innovation.

  13. Integration of MGDS design into the licensing process

    SciTech Connect

    1997-12-01

    This paper presents an overview of how the Mined Geologic Disposal System (MGDS) design for a potential repository is integrated into the licensing process. The integration process employs a two-fold approach: (1) ensure that the MGDS design complies with applicable Nuclear Regulatory Commission (NRC) licensing requirements, and (2) ensure that the MGDS design is appropriately reflected in a license application that is acceptable to the NRC for performing acceptance and compliance reviews.

  14. Process-based design of dynamical biological systems

    NASA Astrophysics Data System (ADS)

    Tanevski, Jovan; Todorovski, Ljupčo; Džeroski, Sašo

    2016-09-01

    The computational design of dynamical systems is an important emerging task in synthetic biology. Given desired properties of the behaviour of a dynamical system, the task of design is to build an in-silico model of a system whose simulated behaviour meets these properties. We introduce a new, process-based, design methodology for addressing this task. The new methodology combines a flexible process-based formalism for specifying the space of candidate designs with multi-objective optimization approaches for selecting the most appropriate among these candidates. We demonstrate that the methodology is general enough to both formulate and solve tasks of designing deterministic and stochastic systems, successfully reproducing plausible designs reported in previous studies and proposing new designs that meet the design criteria, but have not been previously considered.
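
    One ingredient of the multi-objective selection step is identifying the non-dominated (Pareto-optimal) candidate designs. The sketch below shows only that ingredient on made-up scores (fit error against the desired behaviour versus model complexity, both minimized); it is not the process-based formalism or the optimizer described in the paper.

    ```python
    # Candidate designs scored on two objectives to be minimized:
    # (fit error against the desired behaviour, model complexity).
    # Values are invented for illustration.
    candidates = {
        "design_1": (0.12, 5),
        "design_2": (0.08, 9),
        "design_3": (0.15, 3),
        "design_4": (0.08, 7),
        "design_5": (0.20, 8),
    }

    def dominates(a, b):
        """a dominates b if it is no worse on every objective and better on at least one."""
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    pareto = [name for name, score in candidates.items()
              if not any(dominates(other, score)
                         for o_name, other in candidates.items() if o_name != name)]
    print("Pareto-optimal candidates:", pareto)
    ```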

  15. Process-based design of dynamical biological systems

    PubMed Central

    Tanevski, Jovan; Todorovski, Ljupčo; Džeroski, Sašo

    2016-01-01

    The computational design of dynamical systems is an important emerging task in synthetic biology. Given desired properties of the behaviour of a dynamical system, the task of design is to build an in-silico model of a system whose simulated behaviour meets these properties. We introduce a new, process-based, design methodology for addressing this task. The new methodology combines a flexible process-based formalism for specifying the space of candidate designs with multi-objective optimization approaches for selecting the most appropriate among these candidates. We demonstrate that the methodology is general enough to both formulate and solve tasks of designing deterministic and stochastic systems, successfully reproducing plausible designs reported in previous studies and proposing new designs that meet the design criteria, but have not been previously considered. PMID:27686219

  16. Perspectives on the design of safer nanomaterials and manufacturing processes

    PubMed Central

    Geraci, Charles; Heidel, Donna; Sayes, Christie; Hodson, Laura; Schulte, Paul; Eastlake, Adrienne; Brenner, Sara

    2015-01-01

    A concerted effort is being made to insert Prevention through Design principles into discussions of sustainability, occupational safety and health, and green chemistry related to nanotechnology. Prevention through Design is a set of principles that includes solutions to design out potential hazards in nanomanufacturing including the design of nanomaterials, and strategies to eliminate exposures and minimize risks that may be related to the manufacturing processes and equipment at various stages of the lifecycle of an engineered nanomaterial. PMID:26435688

  17. Evaluating two process scale chromatography column header designs using CFD.

    PubMed

    Johnson, Chris; Natarajan, Venkatesh; Antoniou, Chris

    2014-01-01

    Chromatography is an indispensable unit operation in the downstream processing of biomolecules. Scaling of chromatographic operations typically involves a significant increase in the column diameter. At this scale, the flow distribution within a packed bed could be severely affected by the distributor design in process scale columns. Different vendors offer process scale columns with varying design features. The effect of these design features on the flow distribution in packed beds and the resultant effect on column efficiency and cleanability needs to be properly understood in order to prevent unpleasant surprises on scale-up. Computational Fluid Dynamics (CFD) provides a cost-effective means to explore the effect of various distributor designs on process scale performance. In this work, we present a CFD tool that was developed and validated against experimental dye traces and tracer injections. Subsequently, the tool was employed to compare and contrast two commercially available header designs.

  18. Evaluating two process scale chromatography column header designs using CFD.

    PubMed

    Johnson, Chris; Natarajan, Venkatesh; Antoniou, Chris

    2014-01-01

    Chromatography is an indispensable unit operation in the downstream processing of biomolecules. Scaling of chromatographic operations typically involves a significant increase in the column diameter. At this scale, the flow distribution within a packed bed could be severely affected by the distributor design in process scale columns. Different vendors offer process scale columns with varying design features. The effect of these design features on the flow distribution in packed beds and the resultant effect on column efficiency and cleanability needs to be properly understood in order to prevent unpleasant surprises on scale-up. Computational Fluid Dynamics (CFD) provides a cost-effective means to explore the effect of various distributor designs on process scale performance. In this work, we present a CFD tool that was developed and validated against experimental dye traces and tracer injections. Subsequently, the tool was employed to compare and contrast two commercially available header designs. PMID:24616438

  19. Design of experiments in Biomedical Signal Processing Course.

    PubMed

    Li, Ling; Li, Bin

    2008-01-01

    Biomedical Signal Processing is one of the most important major subjects in Biomedical Engineering. Its contents include the theory of digital signal processing, knowledge of different biomedical signals and physiology, and the ability to program computers. Based on our past five years of teaching experience, we found that the design of experiments following each algorithm was very important for letting students master the signal processing algorithms well. In this paper we present the ideas and aims behind the design of these experiments. The results showed that our methods facilitated the study of abstract signal processing algorithms and made biomedical signals easier to understand.

  20. Concurrent materials and process selection in conceptual design

    SciTech Connect

    Kleban, S.D.

    1998-07-01

    The sequential manner in which materials and processes for a manufactured product are selected is inherently less than optimal. Designers' tendency to choose processes and materials with which they are familiar exacerbates this problem. A method for concurrent selection of materials and a joining process based on product requirements, using a knowledge-based, constraint satisfaction approach, is presented.
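
    A constraint-satisfaction selection of this kind can be pictured as filtering the cross-product of material and joining-process catalogues against product requirements. The sketch below does this by brute force over a tiny, invented catalogue; the actual knowledge-based system is far richer, so treat the names, properties and compatibilities here as placeholders.

    ```python
    from itertools import product as cartesian

    # Tiny illustrative catalogues; properties and compatibilities are made up.
    materials = {
        "aluminum_6061": {"max_service_temp_C": 200, "density_g_cm3": 2.7},
        "stainless_304": {"max_service_temp_C": 800, "density_g_cm3": 8.0},
        "abs_plastic":   {"max_service_temp_C": 80,  "density_g_cm3": 1.05},
    }
    joining = {
        "tig_welding":   {"compatible": {"aluminum_6061", "stainless_304"}},
        "adhesive_bond": {"compatible": {"aluminum_6061", "abs_plastic"}},
    }

    # Product requirements act as constraints on the combined selection.
    requirements = {"min_service_temp_C": 150, "max_density_g_cm3": 3.0}

    def satisfies(material, process):
        props = materials[material]
        return (props["max_service_temp_C"] >= requirements["min_service_temp_C"]
                and props["density_g_cm3"] <= requirements["max_density_g_cm3"]
                and material in joining[process]["compatible"])

    feasible = [(m, p) for m, p in cartesian(materials, joining) if satisfies(m, p)]
    print(feasible)   # [('aluminum_6061', 'tig_welding'), ('aluminum_6061', 'adhesive_bond')]
    ```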

  1. Context-Aware Design for Process Flexibility and Adaptation

    ERIC Educational Resources Information Center

    Yao, Wen

    2012-01-01

    Today's organizations face continuous and unprecedented changes in their business environment. Traditional process design tools tend to be inflexible and can only support rigidly defined processes (e.g., order processing in the supply chain). This considerably restricts their real-world applications value, especially in the dynamic and…

  2. Nursing job process analysis from viewpoint of process design by job diagram.

    PubMed

    Dannoue, Hideo; Tsuru, Satoko; Munechika, Masahiko; Iizuka, Yoshinori

    2006-01-01

    Recently, demands for quality assurance in clinical practice have been increasing in Japan. Several aspects of these issues have been discussed to provide significant suggestions for nursing quality assurance. In the quality management field, Process Design, which is known to contribute to quality assurance, is an important framework. This study attempts to analyze the nursing job process from the viewpoint of process design. As a result, some knowledge about the nursing job process could be obtained. Process analysis from the viewpoint of Process Design is considered significant in nursing practice, and further improvement of its techniques and applications is a challenge for the future.

  3. Integrating rock mechanics issues with repository design through design process principles and methodology

    SciTech Connect

    Bieniawski, Z.T.

    1996-04-01

    A good designer needs not only knowledge for designing (technical know-how that is used to generate alternative design solutions) but also knowledge about designing (appropriate principles and a systematic methodology to follow). Concepts such as "design for manufacture" or "concurrent engineering" are widely used in industry. In the field of rock engineering, only limited attention has been paid to the design process, because the design of structures in rock masses presents unique challenges to designers as a result of the uncertainties inherent in the characterization of geologic media. However, a stage has now been reached where we are able to sufficiently characterize rock masses for engineering purposes and identify the rock mechanics issues involved, but we are still lacking engineering design principles and methodology to maximize our design performance. This paper discusses the principles and methodology of the engineering design process directed to integrating site characterization activities with design, construction and performance of an underground repository. Using the latest information from the Yucca Mountain Project on geology, rock mechanics and starter tunnel design, the current lack of integration is pointed out and it is shown how rock mechanics issues can be effectively interwoven with repository design through a systematic design process methodology leading to improved repository performance. In essence, the design process is seen as the use of design principles within an integrating design methodology, leading to innovative problem solving. In particular, a new concept of "Design for Constructibility and Performance" is introduced. This is discussed with respect to ten rock mechanics issues identified for repository design and performance.

  4. Fuel ethanol production: process design trends and integration opportunities.

    PubMed

    Cardona, Carlos A; Sánchez, Oscar J

    2007-09-01

    Current fuel ethanol research and development deals with process engineering trends for improving biotechnological production of ethanol. In this work, the key role that process design plays during the development of cost-effective technologies is recognized through the analysis of major trends in process synthesis, modeling, simulation and optimization related to ethanol production. Main directions in techno-economical evaluation of fuel ethanol processes are described as well as some prospecting configurations. The most promising alternatives for compensating ethanol production costs by the generation of valuable co-products are analyzed. Opportunities for integration of fuel ethanol production processes and their implications are underlined. Main ways of process intensification through reaction-reaction, reaction-separation and separation-separation processes are analyzed in the case of bioethanol production. Some examples of energy integration during ethanol production are also highlighted. Finally, some concluding considerations on current and future research tendencies in fuel ethanol production regarding process design and integration are presented.

  5. The New Digital Engineering Design and Graphics Process.

    ERIC Educational Resources Information Center

    Barr, R. E.; Krueger, T. J.; Aanstoos, T. A.

    2002-01-01

    Summarizes the digital engineering design process using software widely available for the educational setting. Points out that newer technology used in the field is not used in engineering graphics education. (DDR)

  6. Using GREENSCOPE for Sustainable Process Design: An Educational Opportunity

    EPA Science Inventory

    Increasing sustainability can be approached through the education of those who design, construct, and operate facilities. As chemical engineers learn elements of process systems engineering, they can be introduced to sustainability concepts. The EPA’s GREENSCOPE methodology and...

  7. Case study: Lockheed-Georgia Company integrated design process

    NASA Technical Reports Server (NTRS)

    Waldrop, C. T.

    1980-01-01

    A case study of the development of an Integrated Design Process is presented. The approach taken in preparing for the development of an integrated design process includes some of the IPAD approaches such as developing a Design Process Model, cataloging Technical Program Elements (TPE's), and examining data characteristics and interfaces between contiguous TPE's. The implementation plan is based on an incremental development of capabilities over a period of time with each step directed toward, and consistent with, the final architecture of a total integrated system. Because of time schedules and different computer hardware, this system will not be the same as the final IPAD release; however, many IPAD concepts will no doubt prove applicable as the best approach. Full advantage will be taken of the IPAD development experience. A scenario that could be typical for many companies, even outside the aerospace industry, in developing an integrated design process for an IPAD-type environment is represented.

  8. SYSTEMATIC PROCEDURE FOR DESIGNING PROCESSES WITH MULTIPLE ENVIRONMENTAL OBJECTIVES

    EPA Science Inventory

    Evaluation of multiple objectives is very important in designing environmentally benign processes. It requires a systematic procedure for solving multiobjective decision-making problems, due to the complex nature of the problems, the need for complex assessments, and complicated ...

  9. An investigation of radiometer design using digital processing techniques

    NASA Technical Reports Server (NTRS)

    Lawrence, R. W.

    1981-01-01

    The use of digital signal processing techniques in Dicke-switching radiometer design was investigated. The general approach was to develop an analytical model of the existing analog radiometer and identify factors which adversely affect its performance. A digital processor was then proposed to verify the feasibility of using digital techniques to minimize these adverse effects and improve radiometer performance. Analysis and preliminary test results comparing the digital and analog processing approaches in radiometer design are presented.
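
    As an illustrative aside, the synchronous detection at the heart of a Dicke-switching radiometer can be sketched digitally in a few lines: the processor differences the antenna and reference-load half-cycles and averages the result, which suppresses slow gain drift. The sampling scheme and all numbers below are assumptions for illustration, not the report's processor design.

```python
import numpy as np

def dicke_demodulate(samples, switch_period):
    """Toy digital Dicke-switch demodulation (illustrative only).

    `samples` alternates blocks of antenna and reference-load readings, each
    `switch_period` samples long; the output is the averaged antenna-minus-
    reference difference, which cancels slowly varying receiver gain.
    """
    n_cycles = len(samples) // (2 * switch_period)
    diffs = []
    for k in range(n_cycles):
        start = k * 2 * switch_period
        antenna = samples[start:start + switch_period]
        reference = samples[start + switch_period:start + 2 * switch_period]
        diffs.append(antenna.mean() - reference.mean())
    return float(np.mean(diffs))

# Synthetic data: 500 switching cycles, 64 samples per half-cycle.
rng = np.random.default_rng(0)
period = 64
signal = np.concatenate([
    np.concatenate([1.0 + 0.1 * rng.standard_normal(period),   # antenna half-cycle
                    0.8 + 0.1 * rng.standard_normal(period)])  # reference half-cycle
    for _ in range(500)
])
print(dicke_demodulate(signal, period))  # ~0.2, the antenna/reference offset
```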

  10. PROCESS DESIGN MANUAL: LAND TREATMENT OF MUNICIPAL WASTEWATER

    EPA Science Inventory

    The manual presents a rational procedure for the design of land treatment systems. Slow rate, rapid infiltration, and overland flow processes for the treatment of municipal wastewaters are discussed in detail, and the design concepts and criteria are presented. A two-phased plann...

  11. Risk Informed Design as Part of the Systems Engineering Process

    NASA Technical Reports Server (NTRS)

    Deckert, George

    2010-01-01

    This slide presentation reviews the importance of Risk Informed Design (RID) as an important feature of the systems engineering process. RID is based on the principle that risk is a design commodity such as mass, volume, cost or power. It also reviews Probabilistic Risk Assessment (PRA) as it is used in the product life cycle in the development of NASA's Constellation Program.

  12. METHODS FOR INTEGRATING ENVIRONMENTAL CONSIDERATIONS INTO CHEMICAL PROCESS DESIGN DECISIONS

    EPA Science Inventory

    The objective of this cooperative agreement was to postulate a means by which an engineer could routinely include environmental considerations in day-to-day conceptual design problems; a means that could easily integrate with existing design processes, and thus avoid massive retr...

  13. Applying the ID Process to the Guided Design Teaching Strategy.

    ERIC Educational Resources Information Center

    Coscarelli, William C.; White, Gregory P.

    1982-01-01

    Describes the application of the instructional development process to a teaching technique called Guided Design in a Production-Operations Management course. In Guided Design, students learn course content through self-instruction and use class time to apply this knowledge; in-class problem solving is stressed. (JJD)

  14. Relating Right Brain Studies to the Design Process.

    ERIC Educational Resources Information Center

    Hofland, John

    Intended for teachers of theatrical design who need to describe a design process for their students, this paper begins by giving a brief overview of recent research that has described the different functions of the right and left cerebral hemispheres. It then notes that although the left hemisphere tends to dominate the right hemisphere, it is the…

  15. Process Materialization Using Templates and Rules to Design Flexible Process Models

    NASA Astrophysics Data System (ADS)

    Kumar, Akhil; Yao, Wen

    The main idea in this paper is to show how flexible processes can be designed by combining generic process templates and business rules. We instantiate a process by applying rules to specific case data, and running a materialization algorithm. The customized process instance is then executed in an existing workflow engine. We present an architecture and also give an algorithm for process materialization. The rules are written in a logic-based language like Prolog. Our focus is on capturing deeper process knowledge and achieving a holistic approach to robust process design that encompasses control flow, resources and data, as well as makes it easier to accommodate changes to business policy.
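
    As an illustrative aside, a minimal Python sketch of rule-based process materialization in the spirit described above (the paper itself uses Prolog-style rules and an existing workflow engine): generic template tasks are customized by applying business rules to the case data. All task names, rules, and case fields are hypothetical.

```python
# Toy materialization of a generic process template with business rules.
# Each rule inspects the case data and may drop or insert template tasks.

template = ["receive_order", "check_credit", "ship_goods", "invoice"]

def rule_skip_credit_check(tasks, case):
    # Hypothetical rule: trusted customers skip the credit check.
    if case.get("customer_rating") == "trusted":
        return [t for t in tasks if t != "check_credit"]
    return tasks

def rule_require_approval(tasks, case):
    # Hypothetical rule: large orders need managerial approval before shipping.
    if case.get("amount", 0) > 10_000:
        i = tasks.index("ship_goods")
        return tasks[:i] + ["manager_approval"] + tasks[i:]
    return tasks

def materialize(template, rules, case):
    tasks = list(template)
    for rule in rules:            # apply the rules in order to the case data
        tasks = rule(tasks, case)
    return tasks                  # customized instance, ready for a workflow engine

case = {"customer_rating": "new", "amount": 25_000}
print(materialize(template, [rule_skip_credit_check, rule_require_approval], case))
# ['receive_order', 'check_credit', 'manager_approval', 'ship_goods', 'invoice']
```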

  16. Rates of reaction and process design data for the Hydrocarb Process

    SciTech Connect

    Steinberg, M.; Kobayashi, Atsushi; Tung, Yuanki

    1992-08-01

    In support of studies for developing the coprocessing of fossil fuels with biomass by the Hydrocarb Process, experimental and process design data are reported. The experimental work includes the hydropyrolysis of biomass and the thermal decomposition of methane in a tubular reactor. The rates of reaction and conversion were obtained at temperature and pressure conditions pertaining to a Hydrocarb Process design. A Process Simulation Computer Model was used to design the process and obtain complete energy and mass balances. Multiple feedstocks including biomass with natural gas and biomass with coal were evaluated. Additional feedstocks including green waste, sewage sludge and digester gas were also evaluated for a pilot plant unit.

  17. Sketching in Design Journals: An Analysis of Visual Representations in the Product Design Process

    ERIC Educational Resources Information Center

    Lau, Kimberly; Oehlberg, Lora; Agogino, Alice

    2009-01-01

    This paper explores the sketching behavior of designers and the role of sketching in the design process. Observations from a descriptive study of sketches provided in design journals, characterized by a protocol measuring sketching activities, are presented. A distinction is made between journals that are entirely tangible and those that contain…

  18. Reducing the complexity of the software design process with object-oriented design

    NASA Technical Reports Server (NTRS)

    Schuler, M. P.

    1991-01-01

    Designing software is a complex process. How object-oriented design (OOD), coupled with formalized documentation and tailored object diagraming techniques, can reduce the complexity of the software design process is described and illustrated. The described OOD methodology uses a hierarchical decomposition approach in which parent objects are decomposed into layers of lower level child objects. A method of tracking the assignment of requirements to design components is also included. Increases in the reusability, portability, and maintainability of the resulting products are also discussed. This method was built on a combination of existing technology, teaching experience, consulting experience, and feedback from design method users. The discussed concepts are applicable to hierarchal OOD processes in general. Emphasis is placed on improving the design process by documenting the details of the procedures involved and incorporating improvements into those procedures as they are developed.

  19. Theory and Practice Meets in Industrial Process Design -Educational Perspective-

    NASA Astrophysics Data System (ADS)

    Aramo-Immonen, Heli; Toikka, Tarja

    A software engineer should see himself or herself as a business process designer in an enterprise resource planning (ERP) system re-engineering project, and software engineers and managers should engage in design dialogue. The objective of this paper is to discuss the motives for studying design research in connection with management education, in order to envision and understand the soft human issues in the management context. A second goal is to develop means of practicing social skills between designers and managers. This article explores the affective components of design thinking in the industrial management domain. The conceptual part of the paper discusses the concepts of network and project economy, creativity, communication, the use of metaphors, and design thinking. Finally, an empirical research plan and the first empirical results from design-method experiments among multidisciplinary groups of master's-level students of industrial engineering and management and software engineering are introduced.

  20. Application of hazard assessment techniques in the CISF design process

    SciTech Connect

    Thornton, J.R.; Henry, T.

    1997-10-29

    The Department of Energy has submitted to the NRC staff for review a topical safety analysis report (TSAR) for a Centralized Interim Storage Facility (CISF). The TSAR will be used in licensing the CISF when and if a site is designated. CISF design events are identified based on a thorough review of design basis events (DBEs) previously identified by dry storage system suppliers and licensees and through the application of hazard assessment techniques. A Preliminary Hazards Assessment (PHA) is performed to identify design events applicable to a Phase 1 non-site-specific CISF. A PHA is deemed necessary because the Phase 1 CISF differs from previous dry storage applications in several significant aspects of operational scope and design basis. In addition to assuring that all design events applicable to the Phase 1 CISF are identified, the PHA served as an integral part of the CISF design process by identifying potential important-to-safety and defense-in-depth facility design and administrative control features. This paper describes the Phase 1 CISF design event identification process and summarizes significant PHA contributions to the CISF design.

  1. The start up as a phase of architectural design process.

    PubMed

    Castro, Iara Sousa; Lima, Francisco de Paula Antunes; Duarte, Francisco José de Castro Moura

    2012-01-01

    Alterations made to an architectural design can be considered a continuous process, from conception to the moment the built environment is in use. This article focuses on the "moving phase", the initial moment of environment occupation and the start-up of services. It aims to show that continuing ergonomics interventions during the moving phase, or start-up, may reveal inadequacies of the built environment, clearly showing needs not met by the design and allowing prompt decisions to solve unforeseen problems. The results reveal lessons experienced by users during a critical stage not usually included in the design process.

  2. DESIGNING CHEMICAL PROCESSES WITH OPEN AND FUGITIVE EMISSIONS

    EPA Science Inventory

    Designing a chemical process normally includes aspects of economic and environmental disciplines. In this work we describe methods to quickly and easily evaluate the economics and potential environmental impacts of a process, with the hydrodealkylation of toluene as an example. Th...

  3. Design requirements for operational earth resources ground data processing

    NASA Technical Reports Server (NTRS)

    Baldwin, C. J.; Bradford, L. H.; Burnett, E. S.; Hutson, D. E.; Kinsler, B. A.; Kugle, D. R.; Webber, D. S.

    1972-01-01

    Realistic tradeoff data and evaluation techniques were studied that permit conceptual design of operational earth resources ground processing systems. Methodology for determining user requirements that utilize the limited information available from users is presented along with definitions of sensor capabilities projected into the shuttle/station era. A tentative method is presented for synthesizing candidate ground processing concepts.

  4. Incorporating manufacturability constraints into the design process of heterogeneous objects

    NASA Astrophysics Data System (ADS)

    Hu, Yuna; Blouin, Vincent Y.; Fadel, Georges M.

    2004-11-01

    Rapid prototyping (RP) technology, such as Laser Engineering Net Shaping (LENS(TM)), can be used to fabricate heterogeneous objects with gradient variations in material composition. These objects are generally characterized by enhanced functional performance. Past research on the design of such objects has focused on representation, modeling, and functional performance. However, the inherent constraints in RP processes, such as system capability and processing time, lead to heterogeneous objects that may not meet the designer's original intent. To overcome this situation, the research presented in this paper focuses on the identification and implementation of manufacturing constraints in the design process. A node-based finite element modeling technique is used for representation and analysis, and the multicriteria design problem corresponds to finding the nodal material compositions that minimize structural weight and maximize thermal performance. The optimizer used in this research is a real-valued evolutionary strategy (ES), which is well suited for this type of multi-modal problem. Two limitations of the LENS manufacturing process that have an impact on the design process are identified and implemented. One of them is related to manufacturing time, which is treated as an additional criterion to be minimized in the design problem for a preselected tool path. A brake disc rotor made of two materials, aluminum for light weight and steel for superior thermal characteristics, is used to illustrate the tradeoff between manufacturability and functionality.
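
    As an illustrative aside, a minimal (mu + lambda) evolution strategy over nodal material fractions, loosely analogous to the optimizer described above. The two surrogate objectives and every constant are invented stand-ins for the structural-weight and thermal analyses, not the paper's finite element models.

```python
import numpy as np

rng = np.random.default_rng(1)

def weight(x):
    # Hypothetical surrogate: steel-rich nodes (x -> 1) are heavier than aluminum (x -> 0).
    return float(np.mean(1.0 + 1.9 * x))

def thermal_penalty(x):
    # Hypothetical surrogate: nodes near the friction surface (first half of the
    # vector) should be steel-rich for good thermal behavior.
    surface = x[: len(x) // 2]
    return float(np.mean((1.0 - surface) ** 2))

def fitness(x, w=0.5):
    # Simple weighted-sum scalarization of the two conflicting criteria.
    return w * weight(x) + (1.0 - w) * thermal_penalty(x)

def evolve(n_nodes=20, mu=10, lam=40, sigma=0.1, generations=200):
    # (mu + lambda) evolution strategy over nodal steel fractions in [0, 1].
    pop = rng.random((mu, n_nodes))
    for _ in range(generations):
        parents = pop[rng.integers(0, mu, lam)]
        children = np.clip(parents + sigma * rng.standard_normal((lam, n_nodes)), 0.0, 1.0)
        union = np.vstack([pop, children])
        pop = union[np.argsort([fitness(x) for x in union])[:mu]]
    return pop[0]

best = evolve()
print("surface steel fraction ~", round(float(best[:10].mean()), 2),
      "| interior steel fraction ~", round(float(best[10:].mean()), 2))
```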

  5. Computer-aided design tools for economical MEMS fabrication processes

    NASA Astrophysics Data System (ADS)

    Schneider, Christian; Priebe, Andreas; Brueck, Rainer; Hahn, Kai

    1999-03-01

    Since the early 1970s, when microsystem technology was first introduced, an enormous market for MST products has developed: airbag sensors, micro pumps, ink jet nozzles, etc., and the market is just beginning to take off. Establishing these products at a reasonable price requires mass production. Meanwhile, computer-based design tools have also been developed in order to reduce the expense of MST design. In contrast to other physical design processes, e.g. in microelectronics, MEMS physical design is characterized by the fact that each product requires a tailored sequence of fabrication steps, usually selected from a variety of processing alternatives. The selection from these alternatives is based on economic constraints. Therefore, the design has a strong influence on the money and time spent to take an MST product to market.

  6. The shielding design process--new plants to decommissioning.

    PubMed

    Jeffries, Graham; Cooper, Andrew; Hobson, John

    2005-01-01

    BNFL has over 25 years' experience of designing nuclear plants for the whole fuel cycle. In the UK, a Nuclear Decommissioning Authority (NDA) is to be set up to ensure that Britain's nuclear legacy is cleaned up safely, securely, and cost-effectively. The resulting challenges and opportunities for shielding design will be substantial, as the shielding design process was originally devised for the design of new plants. Although its underlying principles are equally applicable to the decommissioning and remediation of old plants, many aspects of detailed application need to adapt to this radically different operating environment. The paper describes both the common issues and the different challenges of shielding design at different operational phases. Sample applications are presented from both new plant and decommissioning projects that illustrate not only the robust nature of the processes being used, but also how they lead to cost-effective solutions making a substantive and appropriate contribution to radiological protection goals. PMID:16604700

  7. Information Flow in the Launch Vehicle Design/Analysis Process

    NASA Technical Reports Server (NTRS)

    Humphries, W. R., Sr.; Holland, W.; Bishop, R.

    1999-01-01

    This paper describes the results of a team effort aimed at defining the information flow between disciplines at the Marshall Space Flight Center (MSFC) engaged in the design of space launch vehicles. The information flow is modeled at a first level and is described using three types of templates: an N x N diagram, discipline flow diagrams, and discipline task descriptions. It is intended to provide engineers with an understanding of the connections between what they do and where it fits in the overall design process of the project. It is also intended to provide design managers with a better understanding of information flow in the launch vehicle design cycle.

  8. Natural gas operations: considerations on process transients, design, and control.

    PubMed

    Manenti, Flavio

    2012-03-01

    This manuscript highlights tangible benefits deriving from the dynamic simulation and control of operational transients of natural gas processing plants. Relevant improvements in safety, controllability, operability, and flexibility are obtained not only in the traditional applications, i.e. plant start-up and shutdown, but also in certain fields that appear time-independent, such as feasibility studies of gas processing plant layout and process design. Specifically, this paper examines the shortcomings of the myopic steady-state approach with respect to more detailed studies that take non-steady-state behavior into consideration. A portion of a gas processing facility is considered as a case study. Process transient, design, and control solutions that appear more appealing from a steady-state approach are compared with the corresponding dynamic simulation solutions.

  9. Concurrent materials and process selection in conceptual design

    SciTech Connect

    Kleban, Stephen D.; Knorovsky, Gerald A.

    2000-08-16

    A method for concurrent selection of materials and a joining process based on product requirements using a knowledge-based, constraint satisfaction approach facilitates the product design and manufacturing process. Using a Windows-based computer video display and a data base of materials and their properties, the designer can ascertain the preferred composition of two parts based on various operating/environmental constraints such as load, temperature, lifetime, etc. Optimum joinder of the two parts may simultaneously be determined using a joining process data base based upon the selected composition of the components as well as the operating/environmental constraints.
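
    As an illustrative aside, a minimal sketch of concurrent selection by constraint filtering, loosely in the spirit of the knowledge-based approach described above. The material properties, joining rules, and thresholds are placeholders, not values from the patented system.

```python
# Toy concurrent material / joining-process selection by constraint filtering.
# Property values below are placeholders, not engineering data.

materials = {
    "Al 6061":   {"max_temp_C": 200, "strength_MPa": 310, "weldable": True},
    "PEEK":      {"max_temp_C": 250, "strength_MPa": 100, "weldable": False},
    "304 steel": {"max_temp_C": 800, "strength_MPa": 505, "weldable": True},
}

joining = {
    "TIG welding": lambda m: m["weldable"],
    "Brazing":     lambda m: m["max_temp_C"] >= 600,   # assumed: must tolerate braze temperature
    "Bolting":     lambda m: True,
}

def select(requirements):
    # Keep materials that satisfy every product requirement...
    ok_mats = {name: props for name, props in materials.items()
               if props["max_temp_C"] >= requirements["service_temp_C"]
               and props["strength_MPa"] >= requirements["min_strength_MPa"]}
    # ...then keep joining processes compatible with at least one surviving material.
    ok_join = {j for j, feasible in joining.items()
               if any(feasible(p) for p in ok_mats.values())}
    return ok_mats, ok_join

print(select({"service_temp_C": 400, "min_strength_MPa": 300}))
# Only "304 steel" survives; all three joining processes remain feasible for it.
```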

  10. A new design concept for an automated peanut processing facility

    SciTech Connect

    Ertas, A.; Tanju, B.T.; Fair, W.T.; Butts, C.

    1996-12-31

    Peanut quality is a major concern in all phases of the peanut industry, from production to manufacturing. Postharvest processing of peanuts can have profound effects on the quality and safety of peanut food products. Curing is a key step in postharvest processing. Curing peanuts improperly can significantly reduce quality and result in significant losses to both farmers and processors. The conventional drying system designed in the 1960s is still being used in peanut processing today. The objective of this paper is to design and develop a new automated peanut drying system for dry climates capable of handling approximately 20 million lbm of peanuts per harvest season.

  11. Design of a distributed CORBA based image processing server.

    PubMed

    Giess, C; Evers, H; Heid, V; Meinzer, H P

    2000-01-01

    This paper presents the design and implementation of a distributed image processing server based on CORBA. Existing image processing tools were encapsulated in a common way with this server. Data exchange and conversion is done automatically inside the server, hiding these tasks from the user. The different image processing tools are visible as one large collection of algorithms and due to the use of CORBA are accessible via intra-/internet.

  12. POLLUTION PREVENTION IN THE DESIGN OF CHEMICAL PROCESSES USING HIERARCHICAL DESIGN AND SIMULATION

    EPA Science Inventory

    The design of chemical processes is normally an interactive process of synthesis and analysis. When one also desires or needs to limit the amount of pollution generated by the process the difficulty of the task can increase substantially. In this work, we show how combining hier...

  13. Design of a Pu-238 Waste Incineration Process

    SciTech Connect

    Charlesworth, D.L.

    2001-05-29

    Combustible Pu-238 waste is generated as a result of normal operation and decommissioning activity at the Savannah River Plant and is being retrievably stored there. As part of the long-term plan to process the stored waste and current waste in preparation for future disposition, a Pu-238 incineration process is being cold-tested at Savannah River Laboratory (SRL). The incineration process consists of a continuous-feed preparation system, a two-stage, electrically fired incinerator, and a filtration off-gas system. Process equipment has been designed, fabricated, and installed for nonradioactive testing and cold run-in. Design features to maximize the ability to remotely maintain the equipment were incorporated into the process. Interlock, alarm, and control functions are provided by a programmable controller. Cold testing is scheduled to be completed in 1986.

  14. Mould design and casting process improvement on vibrator shell

    NASA Astrophysics Data System (ADS)

    Zhang, Lipan; Fang, Ligao; Chen, Zhong; Song, Kai

    2011-12-01

    The vibrator shell is a part with a complex structure. When the vibrator shell is designed and manufactured by the traditional sand casting process, more than 80% of castings show defects of porosity, shrinkage, and pouring shortage at the top. To address the problems of traditional sand casting, this paper focuses on improving the casting structure and optimizing the casting process. A process bar designed in the gate-channel region connected with the gate improves the casting structure, and low-speed filling with solidification under high pressure, carried out on a self-made four-column hydraulic machine, optimizes the casting process. The results show that casting quality can be greatly improved by these process improvements.

  15. OSIRIS Multi-Object Spectroscopy: Mask Design Process

    NASA Astrophysics Data System (ADS)

    Gómez-Velarde, G.; García-Alvarez, D.; Cabrerra-Lavers, A.

    2016-10-01

    The OSIRIS (Optical System for Imaging and Low-Intermediate Resolution Integrated Spectroscopy) instrument at the 10.4 m GTC has offered a multi-object spectroscopic mode since March 2014. In this paper we describe the detailed process of designing a MOS mask for OSIRIS by using the Mask Designer Tool, and give some numbers on the accuracy of the mask manufacture achievable at the telescope for its scientific use.

  16. Product and Process Improvement Using Mixture-Process Variable Designs and Robust Optimization Techniques

    SciTech Connect

    Sahni, Narinder S.; Piepel, Gregory F.; Naes, Tormod

    2009-04-01

    The quality of an industrial product depends on the raw material proportions and the process variable levels, both of which need to be taken into account in designing a product. This article presents a case study from the food industry in which both kinds of variables were studied by combining a constrained mixture experiment design and a central composite process variable design. Based on the natural structure of the situation, a split-plot experiment was designed and models involving the raw material proportions and process variable levels (separately and combined) were fitted. Combined models were used to study: (i) the robustness of the process to variations in raw material proportions, and (ii) the robustness of the raw material recipes with respect to fluctuations in the process variable levels. Further, the expected variability in the robust settings was studied using the bootstrap.
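
    As an illustrative aside, the bootstrap step mentioned above can be sketched in a few lines: refit a combined model on resampled data and examine the spread of predictions at a candidate robust setting. The model form, settings, and data below are invented placeholders, not the article's case-study data.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic stand-in data: one mixture proportion x1 (with x2 = 1 - x1) and one
# process variable z, with a noisy response y.
n = 60
x1 = rng.uniform(0.2, 0.8, n)
z = rng.uniform(-1.0, 1.0, n)
X = np.column_stack([np.ones(n), x1, z, x1 * z])
y = 10 + 4 * x1 - 2 * z + 3 * x1 * z + rng.normal(0.0, 0.5, n)

def fit_predict(X, y, x_new):
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # least-squares combined model
    return x_new @ beta

# Candidate "robust" setting to study: x1 = 0.5, z = 0.
x_new = np.array([1.0, 0.5, 0.0, 0.0])

# Nonparametric bootstrap: resample rows, refit, and collect predictions.
preds = []
for _ in range(2000):
    idx = rng.integers(0, n, n)
    preds.append(fit_predict(X[idx], y[idx], x_new))

print(f"prediction ~ {np.mean(preds):.2f} +/- {np.std(preds):.2f} (bootstrap SD)")
```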

  17. Design of launch systems using continuous improvement process

    NASA Technical Reports Server (NTRS)

    Brown, Richard W.

    1995-01-01

    The purpose of this paper is to identify a systematic process for improving ground operations for future launch systems. This approach is based on the Total Quality Management (TQM) continuous improvement process. While the continuous improvement process is normally identified with making incremental changes to an existing system, it can be used on new systems if they use past experience as a knowledge base. In the case of the Reusable Launch Vehicle (RLV), Space Shuttle operations provide many lessons. The TQM methodology used for this paper is borrowed from the United States Air Force 'Quality Air Force' Program. There is a general overview of the continuous improvement process, with concentration on the formulation phase. During this phase, critical analyses are conducted to determine the strategy and goals for the remaining development process. These analyses include analyzing the mission from the customer's point of view, developing an operations concept for the future, assessing current capabilities, and determining the gap to be closed between current capabilities and future needs and requirements. A brief analysis of the RLV, relative to the Space Shuttle, is used to illustrate the concept. Using the continuous improvement design concept has many advantages. These include a customer-oriented process which will develop a more marketable product and a better integration of operations and systems during the design phase. However, the use of TQM techniques will require changes, including more discipline in the design process and more emphasis on data gathering for operational systems. The benefits will far outweigh the additional effort.

  18. Air stripping VOCs from groundwater: Process design considerations

    SciTech Connect

    Ball, B.R.; Edwards, M.D.

    1992-02-01

    Considerations for evaluating and designing the air stripping process are presented by case study. The case study involves the design of an air stripping process to remediate groundwater contaminated with volatile organic compounds (VOCs) at a National Priorities List site in Tacoma, WA. Design objectives included developing a tower with minimum volume and energy requirements while complying with discharge air and water quality standards. A two-phase resistance model using Onda Correlations to determine liquid- and gas-phase mass transfer coefficients was used to assist in the evaluation and design. Considerations for applying the two-phase resistance model to air stripping tower design are presented. The ability of the model to simulate process performance is demonstrated by comparison with actual data for 11 priority pollutant list VOCs evaluated during an onsite pilot study. Design procedures with which to develop a tower with minimum volume and energy requirements are described. Other considerations involving the evaluation of VOC emissions and the precipitation and buildup of inorganic constituents within the internal packing media are described.
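
    As an illustrative aside, a minimal sketch of the two-phase (two-film) resistance model and the HTU/NTU tower sizing referred to above. In practice the liquid- and gas-phase coefficients would come from correlations such as Onda's; every number below is a placeholder, not the case-study design data.

```python
import math

def overall_kla(kla_liquid, kga_gas, henry_dimensionless):
    # Two-film resistance model: 1/KLa = 1/kLa + 1/(H * kGa)
    return 1.0 / (1.0 / kla_liquid + 1.0 / (henry_dimensionless * kga_gas))

def packing_depth(c_in, c_out, liquid_loading_m_per_h, air_water_ratio,
                  henry_dimensionless, kla_overall_per_h):
    # Countercurrent packed tower sized with the HTU/NTU formulation.
    R = henry_dimensionless * air_water_ratio                  # stripping factor
    ntu = (R / (R - 1.0)) * math.log(((c_in / c_out) * (R - 1.0) + 1.0) / R)
    htu = liquid_loading_m_per_h / kla_overall_per_h           # meters
    return htu * ntu

# Placeholder inputs: 99% removal of a VOC with dimensionless Henry's constant 0.4.
KLa = overall_kla(kla_liquid=60.0, kga_gas=600.0, henry_dimensionless=0.4)      # 1/h
Z = packing_depth(c_in=100.0, c_out=1.0, liquid_loading_m_per_h=30.0,
                  air_water_ratio=30.0, henry_dimensionless=0.4,
                  kla_overall_per_h=KLa)
print(f"overall KLa ~ {KLa:.0f} 1/h, required packing depth ~ {Z:.1f} m")
```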

  19. System design considerations for free-fall materials processing

    NASA Technical Reports Server (NTRS)

    Seidensticker, R. G.

    1974-01-01

    The design constraints for orbiting materials processing systems are dominated by the limitations of the flight vehicle/crew and not by the processes themselves. Although weight, size and power consumption are all factors in the design of normal laboratory equipment, their importance is increased orders of magnitude when the equipment must be used in an orbital facility. As a result, equipment intended for space flight may have little resemblance to normal laboratory apparatus although the function to be performed may be identical. The same considerations influence the design of the experiment itself. The processing requirements must be carefully understood in terms of basic physical parameters rather than defined in terms of equipment operation. Preliminary experiments and analysis are much more vital to the design of a space experiment than they are on earth where iterative development is relatively easy. Examples of these various considerations are illustrated with examples from the M518 and MA-010 systems. While these are specific systems, the conclusions apply to the design of flight materials processing systems both present and future.

  20. Bates solar industrial process-steam application: preliminary design review

    SciTech Connect

    Not Available

    1980-01-07

    The design is analyzed for a parabolic trough solar process heat system for a cardboard corrugation fabrication facility in Texas. The program is briefly reviewed, including an analysis of the plant and process. The performance modeling for the system is discussed, and the solar system structural design, collector subsystem, heat transport and distribution subsystem are analyzed. The selection of the heat transfer fluid, and ullage and fluid maintenance are discussed, and the master control system and data acquisition system are described. Testing of environmental degradation of materials is briefly discussed. A brief preliminary cost analysis is included. (LEW)

  1. Improving Software Development Process through Economic Mechanism Design

    NASA Astrophysics Data System (ADS)

    Yilmaz, Murat; O'Connor, Rory V.; Collins, John

    We introduce the novel concept of applying economic mechanism design to software development process, and aim to find ways to adjust the incentives and disincentives of the software organization to align them with the motivations of the participants in order to maximize the delivered value of a software project. We envision a set of principles to design processes that allow people to be self motivated but constantly working toward project goals. The resulting economic mechanism will rely on game theoretic principles (i.e. Stackelberg games) for leveraging the incentives, goals and motivation of the participants in the service of project and organizational goals.
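
    As an illustrative aside, a toy Stackelberg (leader-follower) game of the kind invoked above, sketched in Python: the organization commits to a bonus rate, the developer best-responds with an effort level, and the organization chooses the rate anticipating that response. The utility forms and all numbers are invented for illustration.

```python
import numpy as np

efforts = np.linspace(0.0, 1.0, 101)       # follower's feasible effort levels
bonus_rates = np.linspace(0.0, 1.0, 101)   # leader's feasible incentive rates

def project_value(e):
    # Hypothetical value delivered to the organization at effort e.
    return 100.0 * np.sqrt(e)

def follower_utility(e, b):
    # Developer keeps a bonus share of the value, minus a quadratic effort cost.
    return b * project_value(e) - 60.0 * e ** 2

def best_response(b):
    # Follower's optimal effort for a given committed bonus rate.
    return efforts[np.argmax(follower_utility(efforts, b))]

def leader_payoff(b):
    # Leader anticipates the follower's best response (Stackelberg structure).
    return (1.0 - b) * project_value(best_response(b))

b_star = max(bonus_rates, key=leader_payoff)
print(f"optimal bonus rate ~ {b_star:.2f}, induced effort ~ {best_response(b_star):.2f}")
```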

  2. Distributed processing techniques: interface design for interactive information sharing.

    PubMed

    Wagner, J R; Krumbholz, S D; Silber, L K; Aniello, A J

    1978-01-01

    The Information Systems Division of the University of Iowa Hospitals and Clinics has successfully designed and implemented a set of generalized interface data-handling routines that control message traffic between a satellite minicomputer in a clinical laboratory and a large main-frame computer. A special queue status inquiry transaction has also been developed that displays the current message-processing backlog and other system performance information. The design and operation of these programs are discussed in detail, with special emphasis on the message-queuing and verification techniques required in a distributed processing environment.

  3. Aerospace structural design process improvement using systematic evolutionary structural modeling

    NASA Astrophysics Data System (ADS)

    Taylor, Robert Michael

    2000-10-01

    A multidisciplinary team tasked with an aircraft design problem must understand the problem requirements and metrics to produce a successful design. This understanding entails not only knowledge of what these requirements and metrics are, but also how they interact, which are most important (to the customer as well as to aircraft performance), and who in the organization can provide pertinent knowledge for each. In recent years, product development researchers and organizations have developed and successfully applied a variety of tools such as Quality Function Deployment (QFD) to coordinate multidisciplinary team members. The effectiveness of these methods, however, depends on the quality and fidelity of the information that team members can input. In conceptual aircraft design, structural information is of lower quality compared to aerodynamics or performance because it is based on experience rather than theory. This dissertation shows how advanced structural design tools can be used in a multidisciplinary team setting to improve structural information generation and communication through a systematic evolution of structural detail. When applied to conceptual design, finite element-based structural design tools elevate structural information to the same level as other computationally supported disciplines. This improved ability to generate and communicate structural information enables a design team to better identify and meet structural design requirements, consider producibility issues earlier, and evaluate structural concepts. A design process experiment of a wing structural layout in collaboration with an industrial partner illustrates and validates the approach.

  4. Waste receiving and processing facility module 1, detailed design report

    SciTech Connect

    Not Available

    1993-10-01

    WRAP 1 baseline documents which guided the technical development of the Title design included: (a) A/E Statement of Work (SOW) Revision 4C: this DOE-RL contractual document specified the workscope, deliverables, schedule, method of performance, and reference criteria for the Title design preparation. (b) Functional Design Criteria (FDC) Revision 1: this DOE-RL technical criteria document specified the overall operational criteria for the facility. The document was at Revision 0 at the beginning of the design and advanced to Revision 1 during the tenure of the Title design. (c) Supplemental Design Requirements Document (SDRD) Revision 3: this baseline criteria document, prepared by WHC for DOE-RL, augments the FDC by providing further definition of the process, operational safety, and facility requirements to the A/E for guidance in preparing the design. The document was at a very preliminary stage at the onset of Title design and was revised in concert with the results of the engineering studies performed to resolve the numerous technical issues the project faced when Title I was initiated, as well as by requirements established during the course of the Title II design.

  5. Process Improvement Through Tool Integration in Aero-Mechanical Design

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2010-01-01

    Emerging capabilities in commercial design tools promise to significantly improve the multi-disciplinary and inter-disciplinary design and analysis coverage for aerospace mechanical engineers. This paper explores the analysis process for two example problems of a wing and flap mechanical drive system and an aircraft landing gear door panel. The examples begin with the design solid models and include various analysis disciplines such as structural stress and aerodynamic loads. Analytical methods include CFD, multi-body dynamics with flexible bodies and structural analysis. Elements of analysis data management, data visualization and collaboration are also included.

  6. A Digital Methodology for the Design Process of Aerospace Assemblies with Sustainable Composite Processes & Manufacture

    NASA Astrophysics Data System (ADS)

    McEwan, W.; Butterfield, J.

    2011-05-01

    The well-established benefits of composite materials are driving a significant shift in design and manufacture strategies for original equipment manufacturers (OEMs). Thermoplastic composites have advantages over traditional thermosetting materials with regard to sustainability and environmental impact, features which are becoming increasingly pertinent in the aerospace arena. However, when sustainability and environmental impact are considered as design drivers, integrated methods for part design and product development must be developed so that any benefits of sustainable composite material systems can be assessed during the design process. These methods must include mechanisms to account for process-induced part variation and techniques related to re-forming, recycling, and decommissioning, which are in their infancy. It is proposed in this paper that predictive techniques related to material specification, part processing, and product cost of thermoplastic composite components be integrated within a Through Life Management (TLM) product development methodology as part of a larger strategy of product system modeling, to improve disciplinary concurrency and realistic part performance, and to place sustainability at the heart of the design process. This paper reports the enhancement of digital manufacturing tools as a means of drawing simulated part manufacturing scenarios, real-time costing mechanisms, and broader lifecycle performance data capture into the design cycle. The work demonstrates predictive processes for sustainable composite product manufacture and how a Product-Process-Resource (PPR) structure can be customised and enhanced to include design intent driven by 'real' part geometry and consequent assembly.

  7. Integrating Science into Design Technology Projects: Using a Standard Model in the Design Process.

    ERIC Educational Resources Information Center

    Zubrowski, Bernard

    2002-01-01

    Fourth graders built a model windmill using a three-step process: (1) open exploration of designs; (2) application of a standard model incorporating features of suggested designs; and (3) refinement of preliminary models. The approach required math, science, and technology teacher collaboration and adequate time. (Contains 21 references.) (SK)

  8. Design Considerations for the Construction and Operation of Flour Milling Facilities. Part II: Process Design Considerations

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Flour milling facilities have been the cornerstone of agricultural processing for centuries. Like most agri-industrial production facilities, flour milling facilities have a number of unique design requirements. Design information, to date, has been limited. In an effort to summarize state of the ...

  9. A FRAMEWORK TO DESIGN AND OPTIMIZE CHEMICAL FLOODING PROCESSES

    SciTech Connect

    Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori

    2005-07-01

    The goal of this proposed research is to provide an efficient and user-friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user-friendly interface to identify the variables that have the most impact on oil recovery using the concepts of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objective of Task 1 is to develop three primary modules representing reservoir, chemical, and well data. The modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables and to develop the response surface maps to identify the significant variables from each module. The objective of Task 3 is to develop the economic model designed specifically for the chemical processes targeted in this proposal and to interface the economic model with UTCHEM production output. Task 4 concerns the validation of the framework and performing simulations of oil reservoirs to screen, design, and optimize the chemical processes.
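
    As an illustrative aside, a minimal sketch of the experimental-design idea behind such a framework: run a two-level factorial over coded variables, fit a first-order response surface, and rank the variables by effect size. The "simulator" is a hypothetical stand-in for a reservoir simulation such as UTCHEM, and all factor names and coefficients are invented.

```python
import itertools
import numpy as np

rng = np.random.default_rng(3)
factors = ["surfactant_conc", "polymer_conc", "slug_size"]

def fake_simulator(x):
    # Hypothetical oil-recovery response standing in for a UTCHEM run.
    return 50.0 + 8.0 * x[0] + 3.0 * x[1] + 1.0 * x[2] + 2.0 * x[0] * x[1] + rng.normal(0.0, 0.5)

# Full 2^3 factorial design in coded (-1, +1) levels.
design = np.array(list(itertools.product([-1.0, 1.0], repeat=len(factors))))
response = np.array([fake_simulator(run) for run in design])

# First-order response surface (main effects only) fitted by least squares.
X = np.column_stack([np.ones(len(design)), design])
coeffs, *_ = np.linalg.lstsq(X, response, rcond=None)

# Rank variables by the magnitude of their estimated main effect.
for name, c in sorted(zip(factors, coeffs[1:]), key=lambda t: -abs(t[1])):
    print(f"{name:16s} main effect ~ {2.0 * c:+.2f}")   # effect = 2 x coded coefficient
```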

  10. A Framework to Design and Optimize Chemical Flooding Processes

    SciTech Connect

    Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori

    2006-08-31

    The goal of this proposed research is to provide an efficient and user-friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user-friendly interface to identify the variables that have the most impact on oil recovery using the concepts of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objective of Task 1 is to develop three primary modules representing reservoir, chemical, and well data. The modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables and to develop the response surface maps to identify the significant variables from each module. The objective of Task 3 is to develop the economic model designed specifically for the chemical processes targeted in this proposal and to interface the economic model with UTCHEM production output. Task 4 concerns the validation of the framework and performing simulations of oil reservoirs to screen, design, and optimize the chemical processes.

  11. A formulation of metamodel implementation processes for complex systems design

    NASA Astrophysics Data System (ADS)

    Daberkow, Debora Daniela

    Complex systems design poses an interesting as well as demanding information management problem for system level integration and design. The high interconnectivity of disciplines combined with the specific knowledge and expertise in each of these calls for a system level view that is broad, as in spanning across all disciplines, while at the same time detailed enough to do the disciplinary knowledge justice. The treatment of this requires highly evolved information management and decision approaches, which result in design methodologies that can handle this high degree of complexity. The solution is to create models within the design process, which predict meaningful metrics representative of the various disciplinary analyses that can be quickly evaluated and thus serve in system level decision making and optimization. Such models approximate the physics-based analysis codes used in each of the disciplines and are called metamodels since effectively, they model the (physics-based) models on which the disciplinary analysis codes are based. The thesis formulates a new metamodel implementation process to be used in complex systems design, utilizing a Gaussian Process prediction method. It is based on a Bayesian probability and inference approach and as such returns a variance prediction along with the most likely value, thus giving an estimate also for the confidence in the prediction. Within this thesis, the applicability and appropriateness at the theoretical as well as practical level are investigated, and proof-of-concept implementations at the disciplinary and system levels are provided.
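
    As an illustrative aside, a minimal numpy sketch of Gaussian Process prediction of the kind the thesis describes: it returns both the most likely value and a variance, the latter serving as the confidence estimate. The kernel choice, the cheap stand-in for a disciplinary analysis code, and all numbers are assumptions for illustration, not the thesis's implementation.

```python
import numpy as np

rng = np.random.default_rng(4)

def rbf_kernel(A, B, length=0.3, variance=1.0):
    # Squared-exponential covariance between two sets of 1-D inputs.
    d2 = (A[:, None] - B[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / length ** 2)

def gp_predict(x_train, y_train, x_test, noise=1e-4):
    # Standard GP regression equations: posterior mean and variance.
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf_kernel(x_test, x_train)
    Kss = rbf_kernel(x_test, x_test)
    alpha = np.linalg.solve(K, y_train)
    mean = Ks @ alpha
    cov = Kss - Ks @ np.linalg.solve(K, Ks.T)
    return mean, np.diag(cov)

def expensive_analysis(x):
    # Cheap stand-in for a physics-based disciplinary analysis code.
    return np.sin(3.0 * x) + 0.1 * x

x_train = rng.uniform(0.0, 2.0, 8)
y_train = expensive_analysis(x_train)

x_test = np.linspace(0.0, 2.0, 5)
mean, var = gp_predict(x_train, y_train, x_test)
for x, m, v in zip(x_test, mean, var):
    print(f"x={x:.2f}  prediction={m:+.3f}  +/- {np.sqrt(max(v, 0.0)):.3f}")
```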

  12. Noise control, sound, and the vehicle design process

    NASA Astrophysics Data System (ADS)

    Donavan, Paul

    2005-09-01

    For many products, noise and sound are viewed as necessary evils that need to be dealt with in order to bring the product successfully to market. They are generally not product "exciters," although some vehicle manufacturers do tune and advertise specific sounds to enhance the perception of their products. In this paper, influencing the design process for the "evils," such as wind noise and road noise, is considered in more detail. There are three ingredients to successfully dealing with the evils in the design process. The first is knowing how excess noise affects the end customer in a tangible manner, and how that affects customer satisfaction and ultimately sales. The second is having and delivering the knowledge of what is required of the design to achieve a satisfactory or even better level of noise performance. The third is having the commitment of the designers to incorporate the knowledge into their part, subsystem, or system. In this paper, the elements of each of these ingredients are discussed in some detail and the attributes of a successful design process are enumerated.

  13. INCORPORATING INDUSTRIAL ECOLOGY INTO HIERARCHICAL CHEMICAL PROCESS DESIGN

    EPA Science Inventory

    Incorporating Industrial Ecology into Hierarchical Chemical Process Design: Determining Targets for the Exchange of Waste

    The exchange of waste to be used as a recycled feed has long been encouraged by practitioners of industrial ecology. Industrial ecology is a field t...

  14. Portfolio Assessment on Chemical Reactor Analysis and Process Design Courses

    ERIC Educational Resources Information Center

    Alha, Katariina

    2004-01-01

    Assessment determines what students regard as important: if a teacher wants to change students' learning, he/she should change the methods of assessment. This article describes the use of portfolio assessment on five courses dealing with chemical reactor and process design during the years 1999-2001. Although the use of portfolio was a new…

  15. Processing and circuit design enhance a data converter's radiation tolerance

    SciTech Connect

    Heuner, R.; Zazzu, V.; Pennisi, L.

    1988-12-01

    Rad-hard CMOS/SOS processing has been applied to a novel comparator-inverter circuit design to develop 6 and 8-bit parallel (flash) ADC (analog-to-digital converter) circuits featuring high-speed operation, low power consumption, and total-dose radiation tolerances up to 1 Mrad(Si).

  16. Process Paradigms in Design and Composition: Affinities and Directions.

    ERIC Educational Resources Information Center

    Kostelnick, Charles

    1989-01-01

    Argues that comparing developments in the process approach to writing and the design methods movement sheds light on the evolution and future direction of the writing paradigm. Argues that sensitivity to the variety of writing tasks and social contexts is more effective than a single amorphous model. (RS)

  17. Quality Control through Design and Process: Gambrel Roof Truss Challenge

    ERIC Educational Resources Information Center

    Ward, Dell; Jones, James

    2011-01-01

    Customers determine whether a product fulfills their needs or satisfies them. "Quality control", then, is the process of finding out what the customer wants, along with designing, producing, delivering, and servicing the product--and ultimately satisfying the customer's expectations. For many years, people considered a product to be of good…

  18. An Exploration of Design Students' Inspiration Process

    ERIC Educational Resources Information Center

    Dazkir, Sibel S.; Mower, Jennifer M.; Reddy-Best, Kelly L.; Pedersen, Elaine L.

    2013-01-01

    Our purpose was to explore how different sources of inspiration influenced two groups of students' inspiration process and their attitudes toward their design projects. Assigned sources of inspiration and instructor's assistance in the search for inspiration varied for two groups of students completing a small culture inspired product…

  19. GREENING OF OXIDATION CATALYSIS THROUGH IMPROVED CATALYST AND PROCESS DESIGN

    EPA Science Inventory


    Greening of Oxidation Catalysis Through Improved Catalysts and Process Design
    Michael A. Gonzalez*, Thomas Becker, and Raymond Smith

    United States Environmental Protection Agency, Office of Research and Development, National Risk Management Research Laboratory, 26 W...

  20. An Alternative Approach to the Process Design Course.

    ERIC Educational Resources Information Center

    McCready, Mark J.

    1989-01-01

    A course where students were required to choose projects and provide studies of the feasibility, consumer need, and process design is discussed. Other projects such as advertising campaigns used to encourage student creativity are discussed. The need to keep second semester seniors interested is stressed. (MVL)

  1. A Process Chart to Design Experiential Learning Projects

    ERIC Educational Resources Information Center

    Zhu, Suning; Wu, Yun; Sankar, Chetan S.

    2016-01-01

    A high-impact practice is to incorporate experiential learning projects when teaching difficult subject matter so as to enhance students' understanding of and interest in the course content. But there is limited research on how to design and execute such projects. Therefore, we propose a framework based on the processes described by the Project…

  2. USING GENETIC ALGORITHMS TO DESIGN ENVIRONMENTALLY FRIENDLY PROCESSES

    EPA Science Inventory

    Genetic algorithm calculations are applied to the design of chemical processes to achieve improvements in environmental and economic performance. By finding the set of Pareto (i.e., non-dominated) solutions one can see how different objectives, such as environmental and economic ...
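
    As an illustrative aside, a minimal sketch of the Pareto (non-dominated) filtering that underlies such genetic-algorithm studies: keep the designs for which no other design is at least as good in every objective and strictly better in one. The candidate designs here are random placeholders scored on two objectives to be minimized.

```python
import numpy as np

rng = np.random.default_rng(5)

def pareto_front(points):
    """Return indices of non-dominated points (all objectives minimized)."""
    front = []
    for i, p in enumerate(points):
        dominated = any(
            np.all(q <= p) and np.any(q < p)   # q at least as good everywhere, better somewhere
            for j, q in enumerate(points) if j != i
        )
        if not dominated:
            front.append(i)
    return front

# Hypothetical candidate process designs scored on (cost, environmental impact).
designs = rng.random((50, 2))
front = pareto_front(designs)
print(f"{len(front)} non-dominated designs out of {len(designs)}")
```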

  3. A SYSTEMATIC PROCEDURE FOR DESIGNING PROCESSES WITH MULTIPLE ENVIRONMENTAL OBJECTIVES

    EPA Science Inventory

    Evaluation and analysis of multiple objectives are very important in designing environmentally benign processes. They require a systematic procedure for solving multi-objective decision-making problems due to the complex nature of the problems and the need for complex assessment....

  4. DESIGNING EFFICIENT, ECONOMIC AND ENVIRONMENTALLY FRIENDLY CHEMICAL PROCESSES

    EPA Science Inventory

    A catalytic reforming process has been studied using hierarchical design and simulation calculations. Approximations for the fugitive emissions indicate which streams allow the most value to be lost and which have the highest potential environmental impact. One can use this inform...

  5. Ingenuity in Action: Connecting Tinkering to Engineering Design Processes

    ERIC Educational Resources Information Center

    Wang, Jennifer; Werner-Avidon, Maia; Newton, Lisa; Randol, Scott; Smith, Brooke; Walker, Gretchen

    2013-01-01

    The Lawrence Hall of Science, a science center, seeks to replicate real-world engineering at the "Ingenuity in Action" exhibit, which consists of three open-ended challenges. These problems encourage children to engage in engineering design processes and problem-solving techniques through tinkering. We observed and interviewed 112…

  6. The Role of Dialogic Processes in Designing Career Expectations

    ERIC Educational Resources Information Center

    Bangali, Marcelline; Guichard, Jean

    2012-01-01

    This article examines the role played by dialogic processes in the designing or redesigning of future expectations during a career guidance intervention. It discusses a specific method ("Giving instruction to a double") developed and used during career counseling sessions with two recent doctoral graduates. It intends both to help them outline or…

  7. Developing 21st Century Process Skills through Project Design

    ERIC Educational Resources Information Center

    Yoo, Jeong-Ju; MacDonald, Nora M.

    2014-01-01

    The goal of this paper is to illustrate how the promotion of 21st Century process skills can be used to enhance student learning and workplace skill development: thinking, problem solving, collaboration, communication, leadership, and management. As an illustrative case, fashion merchandising and design students conducted research for a…

  8. Least cost process design for granular activated carbon adsorbers

    SciTech Connect

    Narbaitz, R.M.; Benedek, A.

    1983-10-01

    Although toxic organics may be removed from industrial effluents by activated carbon adsorbers, the cost of this process is relatively high. Adsorber design is also complex because of the unsteady-state nature of the process and the numerous operational variables. A package of computer programs has been developed to help minimise the ultimate cost of four types of column configurations. It determines the effect on treatment facility costs of different values for design and operational variables, such as empty bed contact time (EBCT), hydraulic loading, and column configuration. The results of a sample problem indicated that the optimum EBCT for all the column configurations was significantly higher than values typically used by designers.
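
    As an illustrative aside, a minimal sketch of the kind of cost screening such a program performs: enumerate candidate EBCT values and evaluate an annualized cost that trades larger vessels against lower carbon usage. The cost model and every constant below are invented placeholders, not the package's actual cost functions.

```python
import numpy as np

flow_m3_per_h = 100.0
ebct_minutes = np.arange(5, 61, 5)                      # candidate empty bed contact times

def annual_cost(ebct_min):
    bed_volume_m3 = flow_m3_per_h * ebct_min / 60.0     # larger EBCT -> larger vessels
    capital = 20_000 * bed_volume_m3 ** 0.7             # assumed capital cost curve, $
    carbon_kg_per_yr = 120_000 / ebct_min               # assumed: carbon usage falls with EBCT
    return 0.1 * capital + 3.0 * carbon_kg_per_yr       # annualized capital + carbon replacement

costs = [annual_cost(t) for t in ebct_minutes]
best = ebct_minutes[int(np.argmin(costs))]
print(f"least-cost EBCT ~ {best} min")
```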

  9. Design characteristics for facilities which process hazardous particulate

    SciTech Connect

    Abeln, S.P.; Creek, K.; Salisbury, S.

    1998-12-01

    Los Alamos National Laboratory is establishing a research and processing capability for beryllium. The unique properties of beryllium, including light weight, rigidity, thermal conductivity, heat capacity, and nuclear properties, make it critical to a number of US defense and aerospace programs. Concomitant with the unique engineering properties are the health hazards associated with processing beryllium in particulate form and the potential for worker inhalation of aerosolized beryllium. Beryllium has the lowest airborne standard for worker protection of all nonradioactive metals, by more than an order of magnitude. This paper describes the design characteristics of the new beryllium facility at Los Alamos as they relate to protection of the workforce. Design characteristics reviewed include: facility layout, support systems to minimize aerosol exposure and spread, and a detailed review of the ventilation system design for general room air cleanliness and extraction of particulate at the source.

  10. Development of prilling process for biodegradable microspheres through experimental designs.

    PubMed

    Fabien, Violet; Minh-Quan, Le; Michelle, Sergent; Guillaume, Bastiat; Van-Thanh, Tran; Marie-Claire, Venier-Julienne

    2016-02-10

    The prilling process offers a microparticle formulation that is easily transferable to pharmaceutical production, yielding monodisperse and highly controllable microspheres. PLGA microspheres carrying an encapsulated protein, with stem cells adhered to their surface, were proposed as a tool for regenerative therapy of injured tissue. This work focused on developing the production of PLGA microspheres by the prilling process without toxic solvents. The required production quality called for a complete optimization of the process. Seventeen parameters were studied through experimental designs, leading to an acceptable production. The key parameters and mechanisms of formation are highlighted. PMID:26656302

  11. Architectural design of heterogeneous metallic nanocrystals--principles and processes.

    PubMed

    Yu, Yue; Zhang, Qingbo; Yao, Qiaofeng; Xie, Jianping; Lee, Jim Yang

    2014-12-16

    CONSPECTUS: Heterogeneous metal nanocrystals (HMNCs) are a natural extension of simple metal nanocrystals (NCs), but as a research topic, they have been much less explored until recently. HMNCs are formed by integrating metal NCs of different compositions into a common entity, similar to the way atoms are bonded to form molecules. HMNCs can be built to exhibit an unprecedented architectural diversity and complexity by programming the arrangement of the NC building blocks ("unit NCs"). The architectural engineering of HMNCs involves the design and fabrication of the architecture-determining elements (ADEs), i.e., unit NCs with precise control of shape and size, and their relative positions in the design. Similar to molecular engineering, where structural diversity is used to create more property variations for application explorations, the architectural engineering of HMNCs can similarly increase the utility of metal NCs by offering a suite of properties to support multifunctionality in applications. The architectural engineering of HMNCs calls for processes and operations that can execute the design. Some enabling technologies already exist in the form of classical micro- and macroscale fabrication techniques, such as masking and etching. These processes, when used singly or in combination, are fully capable of fabricating nanoscopic objects. What is needed is a detailed understanding of the engineering control of ADEs and the translation of these principles into actual processes. For simplicity of execution, these processes should be integrated into a common reaction system and yet retain independence of control. The key to architectural diversity is therefore the independent controllability of each ADE in the design blueprint. The right chemical tools must be applied under the right circumstances in order to achieve the desired outcome. In this Account, after a short illustration of the infinite possibility of combining different ADEs to create HMNC design

  12. Architectural design of heterogeneous metallic nanocrystals--principles and processes.

    PubMed

    Yu, Yue; Zhang, Qingbo; Yao, Qiaofeng; Xie, Jianping; Lee, Jim Yang

    2014-12-16

    CONSPECTUS: Heterogeneous metal nanocrystals (HMNCs) are a natural extension of simple metal nanocrystals (NCs), but as a research topic, they have been much less explored until recently. HMNCs are formed by integrating metal NCs of different compositions into a common entity, similar to the way atoms are bonded to form molecules. HMNCs can be built to exhibit an unprecedented architectural diversity and complexity by programming the arrangement of the NC building blocks ("unit NCs"). The architectural engineering of HMNCs involves the design and fabrication of the architecture-determining elements (ADEs), i.e., unit NCs with precise control of shape and size, and their relative positions in the design. Similar to molecular engineering, where structural diversity is used to create more property variations for application explorations, the architectural engineering of HMNCs can similarly increase the utility of metal NCs by offering a suite of properties to support multifunctionality in applications. The architectural engineering of HMNCs calls for processes and operations that can execute the design. Some enabling technologies already exist in the form of classical micro- and macroscale fabrication techniques, such as masking and etching. These processes, when used singly or in combination, are fully capable of fabricating nanoscopic objects. What is needed is a detailed understanding of the engineering control of ADEs and the translation of these principles into actual processes. For simplicity of execution, these processes should be integrated into a common reaction system and yet retain independence of control. The key to architectural diversity is therefore the independent controllability of each ADE in the design blueprint. The right chemical tools must be applied under the right circumstances in order to achieve the desired outcome. In this Account, after a short illustration of the infinite possibility of combining different ADEs to create HMNC design

  13. Advanced process control with design-based metrology

    NASA Astrophysics Data System (ADS)

    Yang, Hyunjo; Kim, Jungchan; Hong, Jongkyun; Yim, Donggyu; Kim, Jinwoong; Hasebe, Toshiaki; Yamamoto, Masahiro

    2007-03-01

    The k1 factor for development and mass production of memory devices has decreased to below 0.30 in recent years. Process technology has responded with extreme resolution enhancement technologies (RET) and far more complex OPC than before. ArF immersion lithography is expected to remain the major patterning technology down through the 35 nm node, where process difficulty and sensitivity to process variations grow even higher. Design for manufacturing (DFM) is therefore proposed to lower the process difficulty, and advanced process control (APC) is required to reduce the process variations. However, both DFM and APC need extensive feedback from the wafer side, such as hot-spot inspection results and total CDU measurements at the lot, wafer, field, and die level. In this work, we discuss a new design-based metrology that compares SEM images with CAD data and measures all CD deviations from the original layout across a full die. It provides complete information on hot spots and the full CD distribution of various transistors in peripheral regions as well as in the cell layout, making it possible to analyze the root cause of the CD distribution of specific transistors or cell layouts, such as OPC error, mask CDU, lens aberrations, or etch process variation. The applications of this new inspection tool are introduced, and APC using the analysis results is presented in detail.

  14. Time-Course of Muscle Mass Loss, Damage, and Proteolysis in Gastrocnemius following Unloading and Reloading: Implications in Chronic Diseases

    PubMed Central

    Chacon-Cabrera, Alba; Lund-Palau, Helena; Gea, Joaquim; Barreiro, Esther

    2016-01-01

    Background Disuse muscle atrophy is a major comorbidity in patients with chronic diseases including cancer. We sought to explore the kinetics of molecular mechanisms shown to be involved in muscle mass loss throughout time in a mouse model of disuse muscle atrophy and recovery following immobilization. Methods Body and muscle weights, grip strength, muscle phenotype (fiber type composition and morphometry and muscle structural alterations), proteolysis, contractile proteins, systemic troponin I, and mitochondrial content were assessed in gastrocnemius of mice exposed to periods (1, 2, 3, 7, 15 and 30 days) of non-invasive hindlimb immobilization (plastic splint, I cohorts) and in those exposed to reloading for different time-points (1, 3, 7, 15, and 30 days, R cohorts) following a seven-day period of immobilization. Groups of control animals were also used. Results Compared to non-exposed controls, muscle weight, limb strength, slow- and fast-twitch cross-sectional areas, mtDNA/nDNA, and myosin content were decreased in mice of I cohorts, whereas tyrosine release, ubiquitin-proteasome activity, muscle injury and systemic troponin I levels were increased. Gastrocnemius reloading following splint removal improved muscle mass loss, strength, fiber atrophy, injury, myosin content, and mtDNA/nDNA, while reducing ubiquitin-proteasome activity and proteolysis. Conclusions A consistent program of molecular and cellular events leading to reduced gastrocnemius muscle mass and mitochondrial content and reduced strength, enhanced proteolysis, and injury was seen in this non-invasive mouse model of disuse muscle atrophy. Reloading of the muscle following removal of the splint significantly improved the alterations seen during unloading, characterized by a specific kinetic profile of molecular events involved in muscle regeneration. These findings have implications in patients with chronic diseases including cancer in whom physical activity may be severely compromised. PMID

  15. Designing Multi-target Compound Libraries with Gaussian Process Models.

    PubMed

    Bieler, Michael; Reutlinger, Michael; Rodrigues, Tiago; Schneider, Petra; Kriegl, Jan M; Schneider, Gisbert

    2016-05-01

    We present the application of machine learning models to selecting G protein-coupled receptor (GPCR)-focused compound libraries. The library design process was realized by ant colony optimization. A proprietary Boehringer-Ingelheim reference set consisting of 3519 compounds tested in dose-response assays at 11 GPCR targets served as training data for machine learning and activity prediction. We compared the usability of the proprietary data with a public data set from ChEMBL. Gaussian process models were trained to prioritize compounds from a virtual combinatorial library. We obtained meaningful models for three of the targets (5-HT2c , MCH, A1), which were experimentally confirmed for 12 of 15 selected and synthesized or purchased compounds. Overall, the models trained on the public data predicted the observed assay results more accurately. The results of this study motivate the use of Gaussian process regression on public data for virtual screening and target-focused compound library design.
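
    As a rough illustration of the modeling step described above (not the authors' code), the sketch below fits a Gaussian process regressor to hypothetical compound descriptors and activities and then ranks a virtual library by predicted activity; the descriptor matrix, activity values, and library are invented placeholders, and the ant colony optimization step is omitted.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, WhiteKernel

      rng = np.random.default_rng(0)
      X_train = rng.normal(size=(200, 16))     # hypothetical compound descriptors
      y_train = rng.normal(size=200)           # hypothetical activity values

      # Fit a GP with an RBF kernel plus a noise term as a stand-in for the
      # activity models described in the abstract.
      gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
      gp.fit(X_train, y_train)

      X_library = rng.normal(size=(1000, 16))  # hypothetical virtual library
      mean, std = gp.predict(X_library, return_std=True)
      ranked = np.argsort(-mean)               # prioritize by predicted activity
      print(ranked[:10])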

  16. Designing Multi-target Compound Libraries with Gaussian Process Models.

    PubMed

    Bieler, Michael; Reutlinger, Michael; Rodrigues, Tiago; Schneider, Petra; Kriegl, Jan M; Schneider, Gisbert

    2016-05-01

    We present the application of machine learning models to selecting G protein-coupled receptor (GPCR)-focused compound libraries. The library design process was realized by ant colony optimization. A proprietary Boehringer-Ingelheim reference set consisting of 3519 compounds tested in dose-response assays at 11 GPCR targets served as training data for machine learning and activity prediction. We compared the usability of the proprietary data with a public data set from ChEMBL. Gaussian process models were trained to prioritize compounds from a virtual combinatorial library. We obtained meaningful models for three of the targets (5-HT2c , MCH, A1), which were experimentally confirmed for 12 of 15 selected and synthesized or purchased compounds. Overall, the models trained on the public data predicted the observed assay results more accurately. The results of this study motivate the use of Gaussian process regression on public data for virtual screening and target-focused compound library design. PMID:27492085

  17. Information processing systems, reasoning modules, and reasoning system design methods

    DOEpatents

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2016-08-23

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.
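
    A minimal, hypothetical sketch of the dispatch pattern this claim describes (not the patented implementation): reasoning modules are registered against ontology classification types and applied only to abstractions of their own type.

      from collections import defaultdict

      class Abstraction:
          """An individual from the semantic graph plus its ontology classification type."""
          def __init__(self, individual, classification):
              self.individual = individual
              self.classification = classification

      class ReasoningSystem:
          def __init__(self):
              self.modules = defaultdict(list)   # classification type -> reasoning modules

          def register(self, classification, module):
              self.modules[classification].append(module)

          def process(self, working_memory):
              # Each module sees only the abstractions of its own classification type.
              for abstraction in working_memory:
                  for module in self.modules[abstraction.classification]:
                      module(abstraction)

      rs = ReasoningSystem()
      rs.register("Person", lambda a: print("person module saw", a.individual))
      rs.register("Event", lambda a: print("event module saw", a.individual))
      rs.process([Abstraction("alice", "Person"), Abstraction("login", "Event")])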

  18. Information processing systems, reasoning modules, and reasoning system design methods

    DOEpatents

    Hohimer, Ryan E; Greitzer, Frank L; Hampton, Shawn D

    2014-03-04

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  19. Information processing systems, reasoning modules, and reasoning system design methods

    DOEpatents

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2015-08-18

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  20. Computer aided microbial safety design of food processes.

    PubMed

    Schellekens, M; Martens, T; Roberts, T A; Mackey, B M; Nicolaï, B M; Van Impe, J F; De Baerdemaeker, J

    1994-12-01

    To reduce the time required for product development, to avoid expensive experimental tests, and to quantify safety risks for fresh products and the consequences of processing, there is a growing interest in computer-aided food process design. This paper discusses the application of hybrid object-oriented and rule-based expert system technology to represent the data and knowledge of microbial experts and food engineers. Finite element models for heat transfer calculation routines, microbial growth and inactivation models, and texture kinetics are combined with food composition data, thermophysical properties, process steps, and expert knowledge on the type and quantity of microbial contamination. A prototype system has been developed to evaluate the effects of changes in food composition, process steps, and process parameters on the microbiological safety and textural quality of foods.
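
    As a hedged illustration of the microbial inactivation kinetics such a system couples to its heat-transfer results, the sketch below uses a classical D/z-value model; the parameter values are invented for illustration and are not taken from the paper.

      def log_reduction(time_min, temp_c, d_ref_min, t_ref_c, z_c):
          """Decimal (log10) reductions achieved after time_min at a constant temp_c."""
          d_at_temp = d_ref_min * 10 ** ((t_ref_c - temp_c) / z_c)  # D-value at temp_c
          return time_min / d_at_temp

      # Illustrative parameters only: D = 0.2 min at 121.1 C, z = 10 C.
      print(log_reduction(time_min=3.0, temp_c=115.0, d_ref_min=0.2, t_ref_c=121.1, z_c=10.0))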

  1. Penetrator reliability investigation and design exploration : from conventional design processes to innovative uncertainty-capturing algorithms.

    SciTech Connect

    Martinez-Canales, Monica L.; Heaphy, Robert; Gramacy, Robert B.; Taddy, Matt; Chiesa, Michael L.; Thomas, Stephen W.; Swiler, Laura Painton; Hough, Patricia Diane; Lee, Herbert K. H.; Trucano, Timothy Guy; Gray, Genetha Anne

    2006-11-01

    This project focused on research and algorithmic development in optimization under uncertainty (OUU) problems driven by earth penetrator (EP) designs. While taking into account uncertainty, we addressed three challenges in current simulation-based engineering design and analysis processes. The first challenge required leveraging small local samples, already constructed by optimization algorithms, to build effective surrogate models. We used Gaussian Process (GP) models to construct these surrogates. We developed two OUU algorithms using 'local' GPs (OUU-LGP) and one OUU algorithm using 'global' GPs (OUU-GGP) that appear competitive with or better than current methods. The second challenge was to develop a methodical design process based on multi-resolution, multi-fidelity models. We developed a Multi-Fidelity Bayesian Auto-regressive process (MF-BAP). The third challenge involved the development of tools that are computationally feasible and accessible. We created MATLAB® and initial DAKOTA implementations of our algorithms.

  2. RATES OF REACTION AND PROCESS DESIGN DATA FOR THE HYDROCARB PROCESS

    EPA Science Inventory

    The report provides experimental and process design data in support of studies for developing the coprocessing of fossil fuels with biomass by the Hydrocarb process. The experimental work includes the hydropyrolysis of biomass and the thermal decomposition of methane in a 2.44 m ...

  3. A symbolic methodology to improve disassembly process design.

    PubMed

    Rios, Pedro; Blyler, Leslie; Tieman, Lisa; Stuart, Julie Ann; Grant, Ed

    2003-12-01

    Millions of end-of-life electronic components are retired annually due to the proliferation of new models and their rapid obsolescence. The recovery of resources such as plastics from these goods requires their disassembly. The time required for each disassembly and its associated cost are determined by the operator's familiarity with the product design and its complexity. Since model proliferation serves to complicate an operator's learning curve, it is worthwhile to investigate the benefits to be gained in a disassembly operator's preplanning process. Effective disassembly process design demands the application of green engineering principles, such as those developed by Anastas and Zimmerman (Environ. Sci. Technol. 2003, 37, 94A-101A), which include regard for product complexity, structural commonality, separation energy, material value, and waste prevention. This paper introduces the concept of design symbols to help the operator more efficiently survey product complexity with respect to the location and number of fasteners to remove a structure that is common to all electronics: the housing. With a sample of 71 different computers, printers, and monitors, we demonstrate that appropriate symbols reduce the total disassembly planning time by 13.2 min. Such an improvement could well make efficient the separation of plastic that would otherwise be destined for waste-to-energy or landfill. The symbolic methodology presented may also improve Design for Recycling and Design for Maintenance and Support.

  4. Need for image processing in infrared camera design

    NASA Astrophysics Data System (ADS)

    Allred, Lloyd G.; Jones, Martin H.

    2000-03-01

    While the value of image processing has long been recognized, it is usually applied during post-processing. For scientific applications, the presence of large noise errors, data dropout, and dead sensors would invalidate any conclusion drawn from the data until noise removal and sensor calibration have been accomplished. With the growing need for ruggedized, real-time image acquisition systems, including applications in automotive and aerospace, post-processing may not be an option, and with post-processing the operator does not have the opportunity to view the cleaned-up image. Focal plane arrays are plagued by bad sensors, high manufacturing costs, and low yields, often forcing a six-digit cost tag. Perhaps infrared camera design is too serious an issue to leave to the camera manufacturers. Alternative camera designs using a single spinning mirror can yield perfect infrared images at rates up to 12000 frames per second using a fraction of the hardware in current focal-plane arrays. Using a 768 X 5 sensor array, redundant 2048 X 768 images are produced by each row of the sensor array. Sensor arrays with flawed sensors would no longer need to be discarded because data from dead sensors can be discarded, thus increasing manufacturing yields and reducing manufacturing costs. Furthermore, very rapid image processing chips are available, allowing for real-time morphological image processing (including real-time sensor calibration), thus significantly increasing thermal precision and making thermal imaging amenable to a wider variety of applications.
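
    A small sketch of the redundancy argument above, under assumed array shapes and an invented dead-sensor map: redundant scans from the rows of the sensor array are averaged while readings from dead sensors are excluded, so a flawed array can still yield a usable image.

      import numpy as np

      rng = np.random.default_rng(1)
      scans = rng.normal(loc=300.0, scale=2.0, size=(5, 768, 2048))  # 5 redundant scans
      dead = np.zeros((5, 768), dtype=bool)
      dead[2, 100] = True                                            # hypothetical dead sensor

      # Drop dead-sensor readings, then average the remaining redundant scans.
      scans[np.broadcast_to(dead[:, :, None], scans.shape)] = np.nan
      image = np.nanmean(scans, axis=0)                              # shape (768, 2048)
      print(image.shape)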

  5. A Review of the Design Process for Implantable Orthopedic Medical Devices

    PubMed Central

    Aitchison, G.A; Hukins, D.W.L; Parry, J.J; Shepherd, D.E.T; Trotman, S.G

    2009-01-01

    The design process for medical devices is highly regulated to ensure the safety of patients. This paper will present a review of the design process for implantable orthopedic medical devices. It will cover the main stages of feasibility, design reviews, design, design verification, manufacture, design validation, design transfer and design changes. PMID:19662153

  6. Design of the HTGR for process heat applications

    SciTech Connect

    Vrable, D.L.; Quade, R.N.

    1980-05-01

    This paper discusses a design study of an advanced 842-MW(t) HTGR with a reactor outlet temperature of 850°C (1562°F), coupled with a chemical process whose product is hydrogen (or a mixture of hydrogen and carbon monoxide) generated by steam reforming of a light hydrocarbon mixture. The paper presents the plant layout and design for the major components of the primary and secondary heat transfer systems. Typical parametric system study results illustrate the capability of a computer code developed to model the plant performance and economics.

  7. Operation and design of selected industrial process heat field tests

    SciTech Connect

    Kearney, D. W.

    1981-02-01

    The DOE program of solar industrial process heat field tests has shown solar energy to be compatible with numerous industrial needs. Both the operational projects and the detailed designs of systems that are not yet operational have resulted in valuable insights into design and hardware practice. Typical of these insights are the experiences discussed for the four projects reviewed. Future solar IPH systems should benefit greatly not only from the availability of present information, but also from the wealth of operating experience from projects due to start up in 1981.

  8. A Taguchi study of the aeroelastic tailoring design process

    NASA Technical Reports Server (NTRS)

    Bohlmann, Jonathan D.; Scott, Robert C.

    1991-01-01

    A Taguchi study was performed to determine the important players in the aeroelastic tailoring design process and to find the best composition of the optimization's objective function. The Wing Aeroelastic Synthesis Procedure (TSO) was used to ascertain the effects that factors such as composite laminate constraints, roll effectiveness constraints, and built-in wing twist and camber have on the optimum, aeroelastically tailored wing skin design. The results show the Taguchi method to be a viable engineering tool for computational inquiries, and provide some valuable lessons about the practice of aeroelastic tailoring.
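
    A minimal sketch of the kind of Taguchi analysis such a study relies on, using an L4 orthogonal array for three two-level factors and a larger-is-better signal-to-noise ratio; the response values are invented placeholders, not TSO results.

      import numpy as np

      L4 = np.array([[1, 1, 1],
                     [1, 2, 2],
                     [2, 1, 2],
                     [2, 2, 1]])               # factor levels for each of the 4 runs
      responses = np.array([[0.82, 0.85],      # hypothetical repeated responses per run
                            [0.78, 0.80],
                            [0.88, 0.86],
                            [0.91, 0.90]])

      # Larger-is-better signal-to-noise ratio for each run.
      sn = -10.0 * np.log10(np.mean(1.0 / responses**2, axis=1))

      for factor in range(L4.shape[1]):
          means = [sn[L4[:, factor] == level].mean() for level in (1, 2)]
          print(f"factor {factor + 1}: mean S/N at levels 1, 2 = {means}")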

  9. Power processing methodology. [computerized design of spacecraft electric power systems

    NASA Technical Reports Server (NTRS)

    Fegley, K. A.; Hansen, I. G.; Hayden, J. H.

    1974-01-01

    Discussion of the interim results of a program to investigate the feasibility of formulating a methodology for the modeling and analysis of aerospace electrical power processing systems. The object of the total program is to develop a flexible engineering tool which will allow the power processor designer to effectively and rapidly assess and analyze the tradeoffs available by providing, in one comprehensive program, a mathematical model, an analysis of expected performance, simulation, and a comparative evaluation with alternative designs. This requires an understanding of electrical power source characteristics and the effects of load control, protection, and total system interaction.

  10. Remote Maintenance Design Guide for Compact Processing Units

    SciTech Connect

    Draper, J.V.

    2000-07-13

    Oak Ridge National Laboratory (ORNL) Robotics and Process Systems (RPSD) personnel have extensive experience working with remotely operated and maintained systems. These systems require expert knowledge in teleoperation, human factors, telerobotics, and other robotic devices so that remote equipment may be manipulated, operated, serviced, surveyed, and moved about in a hazardous environment. The RPSD staff has a wealth of experience in this area, including knowledge in the broad topics of human factors, modular electronics, modular mechanical systems, hardware design, and specialized tooling. Examples of projects that illustrate and highlight RPSD's unique experience in remote systems design and application include the following: (1) design of a remote shear and remote dissolver systems in support of U.S. Department of Energy (DOE) fuel recycling research and nuclear power missions; (2) building remotely operated mobile systems for metrology and characterizing hazardous facilities in support of remote operations within those facilities; (3) construction of modular robotic arms, including the Laboratory Telerobotic Manipulator, which was designed for the National Aeronautics and Space Administration (NASA) and the Advanced ServoManipulator, which was designed for the DOE; (4) design of remotely operated laboratories, including chemical analysis and biochemical processing laboratories; (5) construction of remote systems for environmental clean up and characterization, including underwater, buried waste, underground storage tank (UST) and decontamination and dismantlement (D&D) applications. Remote maintenance has played a significant role in fuel reprocessing because of combined chemical and radiological contamination. Furthermore, remote maintenance is expected to play a strong role in future waste remediation. The compact processing units (CPUs) being designed for use in underground waste storage tank remediation are examples of improvements in systems processing

  11. VLSI systems design for digital signal processing. Volume 1 - Signal processing and signal processors

    NASA Astrophysics Data System (ADS)

    Bowen, B. A.; Brown, W. R.

    This book is concerned with the design of digital signal processing systems which utilize VLSI (Very Large Scale Integration) components. The presented material is intended for use by electrical engineers at the senior undergraduate or introductory graduate level. It is the purpose of this volume to present an overview of the important elements of background theory, processing techniques, and hardware evolution. Digital signals are considered along with linear systems and digital filters, taking into account the transform analysis of deterministic signals, a statistical signal model, time domain representations of discrete-time linear systems, and digital filter design techniques and implementation issues. Attention is given to aspects of detection and estimation, digital signal processing algorithms and techniques, issues which must be resolved in a processor design methodology, the fundamental concepts of high performance processing in terms of two early super computers, and the extension of these concepts to more recent processors.
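
    As a small, hedged example of the digital filter design topics the volume covers, the snippet below designs a linear-phase FIR low-pass filter by windowing; the tap count and cutoff are arbitrary illustration values.

      import numpy as np
      from scipy.signal import firwin, freqz

      taps = firwin(numtaps=63, cutoff=0.25)        # cutoff normalized to the Nyquist frequency
      w, h = freqz(taps)                            # frequency response of the designed filter
      print(taps[:5])
      print(20 * np.log10(np.abs(h[0]) + 1e-12))    # DC gain in dB (approximately 0)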

  12. Robust process design and springback compensation of a decklid inner

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaojing; Grimm, Peter; Carleer, Bart; Jin, Weimin; Liu, Gang; Cheng, Yingchao

    2013-12-01

    Springback compensation is one of the key topics in current die face engineering. The accuracy of the springback simulation, the robustness of the method planning, and the springback itself are considered the main factors that influence the effectiveness of springback compensation. In the present paper, the basic principles of springback compensation are presented first; they rest on an accurate full-cycle simulation with final validation settings. Robust process design and optimization are then discussed in detail via an industrial example, a decklid inner. Moreover, an effective compensation strategy based on the springback analysis is put forward, and simulation-based springback compensation is introduced in the process design phase. Finally, the verification and comparison in tryout and production are given, confirming that the robust springback compensation methodology is effective during die development.

  13. Virtual Welded - Joint Design Integrating Advanced Materials and Processing Technology

    SciTech Connect

    Yang, Zhishang; Ludewig, Howard W.; Babu, S. Suresh

    2005-06-30

    Virtual Welded-Joint Design, a systematic modeling approach, has been developed in this project to predict the relationships among welding process, microstructure, properties, residual stress, and ultimate weld fatigue strength. This systematic modeling approach was applied to the welding of high strength steel. A special welding wire was developed in this project to introduce compressive residual stress at the weld toe. The results from both modeling and experiments demonstrated that more than a 10x fatigue life improvement can be achieved in high strength steel welds by combining the compressive residual stress from the special welding wire with the desired weld bead shape from a unique welding process. The results indicate a technology breakthrough in the design of lightweight, high fatigue performance welded structures using high strength steels.

  14. IFR fuel cycle process equipment design environment and objectives

    SciTech Connect

    Rigg, R.H.

    1993-03-01

    Argonne National Laboratory (ANL) is refurbishing the hot cell facility originally constructed with the EBR-II reactor. When refurbishment is complete, the facility will demonstrate the complete fuel cycle for current generation high burnup metallic fuel elements. These are sodium bonded, stainless steel clad fuel pins of U-Zr or U-Pu-Zr composition typical of the fuel type proposed for a future Integral Fast Reactor (IFR) design. To the extent possible, the process equipment is being built at full commercial scale, and the facility is being modified to incorporate current DOE facility design requirements and modern remote maintenance principles. The current regulatory and safety environment has affected the design of the fuel fabrication equipment, most of which will be described in greater detail in subsequent papers in this session.

  15. IFR fuel cycle process equipment design environment and objectives

    SciTech Connect

    Rigg, R.H.

    1993-01-01

    Argonne National Laboratory (ANL) is refurbishing the hot cell facility originally constructed with the EBR-II reactor. When refurbishment is complete, the facility will demonstrate the complete fuel cycle for current generation high burnup metallic fuel elements. These are sodium bonded, stainless steel clad fuel pins of U-Zr or U-Pu-Zr composition typical of the fuel type proposed for a future Integral Fast Reactor (IFR) design. To the extent possible, the process equipment is being built at full commercial scale, and the facility is being modified to incorporate current DOE facility design requirements and modern remote maintenance principles. The current regulatory and safety environment has affected the design of the fuel fabrication equipment, most of which will be described in greater detail in subsequent papers in this session.

  16. Reliability-based design optimization under stationary stochastic process loads

    NASA Astrophysics Data System (ADS)

    Hu, Zhen; Du, Xiaoping

    2016-08-01

    Time-dependent reliability-based design ensures the satisfaction of reliability requirements for a given period of time, but with a high computational cost. This work improves the computational efficiency by extending the sequential optimization and reliability analysis (SORA) method to time-dependent problems with both stationary stochastic process loads and random variables. The challenge of the extension is the identification of the most probable point (MPP) associated with time-dependent reliability targets. Since a direct relationship between the MPP and reliability target does not exist, this work defines the concept of equivalent MPP, which is identified by the extreme value analysis and the inverse saddlepoint approximation. With the equivalent MPP, the time-dependent reliability-based design optimization is decomposed into two decoupled loops: deterministic design optimization and reliability analysis, and both are performed sequentially. Two numerical examples are used to show the efficiency of the proposed method.
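
    A hedged sketch of the most-probable-point (MPP) search that such decoupled reliability loops depend on: in standard normal space, find the point on the limit state g(u) = 0 closest to the origin. The limit-state function below is a made-up illustration, not one of the paper's examples.

      import numpy as np
      from scipy.optimize import minimize

      def g(u):
          # Hypothetical limit state; failure corresponds to g(u) < 0.
          return 3.0 - u[0] - 0.5 * u[1] ** 2

      res = minimize(lambda u: np.dot(u, u),        # squared distance to the origin
                     x0=np.array([1.0, 1.0]),
                     constraints=[{"type": "eq", "fun": g}])
      u_mpp = res.x
      beta = np.linalg.norm(u_mpp)                  # reliability index
      print(u_mpp, beta)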

  17. Using Process Visualizations to Validate Electronic Form Design

    PubMed Central

    Marquard, Jenna L.; Mei, Yi You

    2010-01-01

    Electronic reporting systems have the potential to support health care quality improvement initiatives across varied health care settings, specifically in low-technology settings such as long-term residential care facilities (LTRCFs). Yet, these organizations face financial barriers to implementing such systems and the LTRCF workforce is generally not as technology-ready as larger organizations’ workforces. Electronic reporting systems implemented in these settings must therefore be inexpensive and easy-to-use. This paper outlines a novel technique – process visualization – for systematically assessing the order in which users complete electronic forms, an inexpensively-developed patient falls reporting form in this case. These visualizations can help designers uncover usage patterns not evident via other usability methods. Based on this knowledge, designers can validate the design of the electronic forms, informing their subsequent redesign. PMID:21347028
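
    A minimal sketch of the idea behind such process visualizations, assuming a hypothetical event log of field-completion sequences: transitions between fields are tallied so that common completion paths can be drawn as a graph.

      from collections import Counter
      from itertools import pairwise  # Python 3.10+

      sessions = [                    # hypothetical field-completion orders
          ["date", "location", "injury", "witness"],
          ["date", "injury", "location", "witness"],
          ["date", "location", "injury", "witness"],
      ]

      transitions = Counter(t for s in sessions for t in pairwise(s))
      for (src, dst), count in transitions.most_common():
          print(f"{src} -> {dst}: {count}")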

  18. Improving Tools and Processes in Mechanical Design Collaboration

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2009-01-01

    Cooperative product development projects in the aerospace and defense industry are held hostage to high cost and risk due to poor alignment of collaborative design tools and processes. This impasse can be broken if companies will jointly develop implementation approaches and practices in support of high value working arrangements. The current tools can be used to better advantage in many situations and there is reason for optimism that tool vendors will provide significant support.

  19. Energy codes and the building design process: Opportunities for improvement

    SciTech Connect

    Sandahl, L.J.; Shankle, D.L.; Rigler, E.J.

    1994-05-01

    The Energy Policy Act (EPAct), passed by Congress in 1992, requires states to adopt building energy codes for new commercial buildings that meet or exceed the American Society of Heating, Refrigerating, and Air Conditioning Engineers (ASHRAE) and Illuminating Engineering Society of North America (IES) Standard 90.1-1989 by October 24, 1994. In response to EPAct, many states will be adopting a state-wide energy code for the first time. Understanding the role of stakeholders in the building design process is key to the successful implementation of these codes. In 1993, the Pacific Northwest Laboratory (PNL) conducted a survey of architects and designers to determine how much they know about energy codes, to what extent energy-efficiency concerns influence the design process, and how they convey information about energy-efficient designs and products to their clients. Findings of the PNL survey, together with related information from a survey by the American Institute of Architects (AIA) and other reports, are presented in this report. This information may be helpful for state and utility energy program managers and others who will be involved in promoting the adoption and implementation of state energy codes that meet the requirements of EPAct.

  20. Space Station Freedom pressurized element interior design process

    NASA Technical Reports Server (NTRS)

    Hopson, George D.; Aaron, John; Grant, Richard L.

    1990-01-01

    The process used to develop the on-orbit working and living environment of the Space Station Freedom has some very unique constraints and conditions to satisfy. The goal is to provide maximum efficiency and utilization of the available space, in on-orbit, zero G conditions that establishes a comfortable, productive, and safe working environment for the crew. The Space Station Freedom on-orbit living and working space can be divided into support for three major functions: (1) operations, maintenance, and management of the station; (2) conduct of experiments, both directly in the laboratories and remotely for experiments outside the pressurized environment; and (3) crew related functions for food preparation, housekeeping, storage, personal hygiene, health maintenance, zero G environment conditioning, and individual privacy, and rest. The process used to implement these functions, the major requirements driving the design, unique considerations and constraints that influence the design, and summaries of the analysis performed to establish the current configurations are described. Sketches and pictures showing the layout and internal arrangement of the Nodes, U.S. Laboratory and Habitation modules identify the current design relationships of the common and unique station housekeeping subsystems. The crew facilities, work stations, food preparation and eating areas (galley and wardroom), and exercise/health maintenance configurations, waste management and personal hygiene area configuration are shown. U.S. Laboratory experiment facilities and maintenance work areas planned to support the wide variety and mixtures of life science and materials processing payloads are described.

  1. Design of multichannel image processing on the Space Solar Telescope

    NASA Astrophysics Data System (ADS)

    Zhang, Bin

    2000-07-01

    The multi-channel image processing system on the Space Solar Telescope (SST) is described in this paper. This system is the main part of the science data unit (SDU), which is designed to handle the science data from every payload on the SST. First, each payload on the SST and its scientific objective are introduced: the main optical telescope, four soft X-ray telescopes, an H-alpha and white-light (full disc) telescope, a coronagraph, a wide-band X-ray and gamma-ray spectrometer, and a solar and interplanetary radio spectrometer. Then the hardware and software structure of the SDU, which is designed for multiple payloads, is presented, and the science data stream of each payload is summarized. Solar magnetic and velocity field processing, which accounts for more than 90% of the SDU's data processing, is discussed, including the polarizing unit, the image receiver, and the image-adding unit. Finally, the plan for image data compression and the mass memory designed for science data storage are presented.

  2. Process Cost Modeling for Multi-Disciplinary Design Optimization

    NASA Technical Reports Server (NTRS)

    Bao, Han P.; Freeman, William (Technical Monitor)

    2002-01-01

    For early design concepts, the conventional approach to cost is normally some kind of parametric weight-based cost model. There is now ample evidence that this approach can be misleading and inaccurate. By the nature of its development, a parametric cost model requires historical data and is valid only if the new design is analogous to those for which the model was derived. Advanced aerospace vehicles have no historical production data and are nowhere near the vehicles of the past. Using an existing weight-based cost model would only lead to errors and distortions of the true production cost. This report outlines the development of a process-based cost model in which the physical elements of the vehicle are costed according to a first-order dynamics model. This theoretical cost model, first advocated by early work at MIT, has been expanded to cover the basic structures of an advanced aerospace vehicle. Elemental costs based on the geometry of the design can be summed up to provide an overall estimation of the total production cost for a design configuration. This capability to directly link any design configuration to realistic cost estimation is a key requirement for high payoff MDO problems. Another important consideration in this report is the handling of part or product complexity. Here the concept of cost modulus is introduced to take into account variability due to different materials, sizes, shapes, precision of fabrication, and equipment requirements. The most important implication of the development of the proposed process-based cost model is that different design configurations can now be quickly related to their cost estimates in a seamless calculation process easily implemented on any spreadsheet tool. In successive sections, the report addresses the issues of cost modeling as follows. First, an introduction is presented to provide the background for the research work. Next, a quick review of cost estimation techniques is made with the intention to
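
    A toy sketch of the process-based costing idea described above, with all element data and coefficients invented: each element is costed from its geometry and a process-time coefficient, scaled by a cost modulus for complexity, and the elemental costs are summed.

      # (name, surface area m^2, process hours per m^2, cost modulus)
      elements = [
          ("wing skin", 42.0, 1.8, 1.3),  # composite layup, higher complexity
          ("fuselage",  95.0, 1.1, 1.0),
          ("bulkhead",   6.5, 2.4, 1.6),  # precision machining
      ]
      labor_rate = 120.0                  # $/hr, hypothetical

      total = sum(area * hours_per_m2 * modulus * labor_rate
                  for _, area, hours_per_m2, modulus in elements)
      print(f"estimated production cost: ${total:,.0f}")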

  3. Making an Aquarium Environment Interactive: A Design Research Analysis of Exhibit Design Processes

    NASA Astrophysics Data System (ADS)

    Hanshumaker, William

    The purpose of this study was to investigate the development of an interactive aquarium design motivated by the employment of an innovative technology used in scientific research. The study was informed by research on free-choice learning describing the effects of interactive devices on visitor learning, engagement, and attitudes. The researcher used design research methods to conduct multiple iterations of aquarium environment modifications. Observation data of visitor interactions were analyzed in the development of three different aquarium environments. The researcher used survey, interview, and observation data to study visitor interactions in the three contrasting aquarium environments. Results describe exhibit factors associated with visitor behaviors using the scientific instrument and social or individual interactions in the exhibit environments. Results also present an analysis of design processes that were shaped by data on desired visitor interactions and adult learning. Through design research methods, this study contributes to theory of exhibit design for visitor engagement and learning.

  4. ArF processing of 90-nm design rule lithography achieved through enhanced thermal processing

    NASA Astrophysics Data System (ADS)

    Kagerer, Markus; Miller, Daniel; Chang, Wayne; Williams, Daniel J.

    2006-03-01

    As the lithography community has moved to ArF processing on 300 mm wafers for 90 nm design rules the process characterization of the components of variance continues to highlight the thermal requirements for the post exposure bake (PEB) processing step. In particular as the thermal systems have become increasingly uniform, the transient behavior of the thermal processing system has received the focus of attention. This paper demonstrates how a newly designed and patented thermal processing system was optimized for delivering improved thermal uniformity during a typical 90 second PEB processing cycle, rather than being optimized for steady state performance. This was accomplished with the aid of a wireless temperature measurement wafer system for obtaining real time temperature data and by using a response surface model (RSM) experimental design for optimizing parameters of the temperature controller of the thermal processing system. The new units were field retrofitted seamlessly in <2 days at customer sites without disruption to process recipes or flows. After evaluating certain resist parameters such as PEB temperature sensitivity and post exposure delay (PED) - stability of the baseline process, the new units were benchmarked against the previous PEB plates by processing a split lot experiment. Additional hardware characterization included environmental factors such as air velocity in the vicinity of the PEB plates and transient time between PEB and chill plate. At the completion of the optimization process, the within wafer CD uniformity displayed a significant improvement when compared to the previous hardware. The demonstrated within wafer CD uniformity improved by 27% compared to the initial hardware and baseline process. ITRS requirements for the 90 nm node were exceeded.

  5. Development of the Planar Inlet Design and Analysis Process (PINDAP)

    NASA Technical Reports Server (NTRS)

    Gruber, Christopher R.

    2004-01-01

    The aerodynamic development of an engine inlet requires a comprehensive program of both wind tunnel testing and Computational Fluid Dynamics (CFD) simulations. To save time and resources, much "testing" is done using CFD before any design ever enters a wind tunnel. The focus of my project this summer is on CFD analysis tool development. In particular, I am working to further develop the capabilities of the Planar Inlet Design and Analysis Process (PINDAP). "PINDAP" is a collection of computational tools that allow for efficient and accurate design and analysis of the aerodynamics about and through inlets that can make use of a planar (two-dimensional or axisymmetric) geometric and flow assumption. PINDAP utilizes the WIND CFD flow solver, which is capable of simulating the turbulent, compressible flow field. My project this summer is a continuation of work that I performed for two previous summers. Two years ago, I used basic features of the PINDAP to design a Mach 5 hypersonic scramjet engine inlet and to demonstrate the feasibility of the PINDAP. The following summer, I worked to develop its geometry and grid generation capabilities to include subsonic and supersonic inlets, complete bodies and cowls, conic leading and trailing edges, as well as airfoils. These additions allowed for much more design flexibility when using the program.

  6. Moving bed biofilm reactor technology: process applications, design, and performance.

    PubMed

    McQuarrie, James P; Boltz, Joshua P

    2011-06-01

    The moving bed biofilm reactor (MBBR) can operate as a 2- (anoxic) or 3-(aerobic) phase system with buoyant free-moving plastic biofilm carriers. These systems can be used for municipal and industrial wastewater treatment, aquaculture, potable water denitrification, and, in roughing, secondary, tertiary, and sidestream applications. The system includes a submerged biofilm reactor and liquid-solids separation unit. The MBBR process benefits include the following: (1) capacity to meet treatment objectives similar to activated sludge systems with respect to carbon-oxidation and nitrogen removal, but requires a smaller tank volume than a clarifier-coupled activated sludge system; (2) biomass retention is clarifier-independent and solids loading to the liquid-solids separation unit is reduced significantly when compared with activated sludge systems; (3) the MBBR is a continuous-flow process that does not require a special operational cycle for biofilm thickness, L(F), control (e.g., biologically active filter backwashing); and (4) liquid-solids separation can be achieved with a variety of processes, including conventional and compact high-rate processes. Information related to system design is fragmented and poorly documented. This paper seeks to address this issue by summarizing state-of-the art MBBR design procedures and providing the reader with an overview of some commercially available systems and their components. PMID:21751715
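
    A rough sizing sketch in the spirit of the design procedures summarized above: the media volume follows from a surface-area loading rate (SALR) and the carrier's specific surface area. All numbers are illustrative assumptions, not values from the paper.

      bod_load_kg_d = 500.0        # assumed influent BOD load
      salr_g_m2_d = 7.5            # assumed design surface-area loading rate
      carrier_ssa_m2_m3 = 500.0    # assumed carrier specific surface area
      fill_fraction = 0.5          # assumed carrier fill fraction of the reactor

      media_volume_m3 = (bod_load_kg_d * 1000.0) / (salr_g_m2_d * carrier_ssa_m2_m3)
      reactor_volume_m3 = media_volume_m3 / fill_fraction
      print(round(media_volume_m3, 1), round(reactor_volume_m3, 1))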

  7. Moving bed biofilm reactor technology: process applications, design, and performance.

    PubMed

    McQuarrie, James P; Boltz, Joshua P

    2011-06-01

    The moving bed biofilm reactor (MBBR) can operate as a 2- (anoxic) or 3-(aerobic) phase system with buoyant free-moving plastic biofilm carriers. These systems can be used for municipal and industrial wastewater treatment, aquaculture, potable water denitrification, and, in roughing, secondary, tertiary, and sidestream applications. The system includes a submerged biofilm reactor and liquid-solids separation unit. The MBBR process benefits include the following: (1) capacity to meet treatment objectives similar to activated sludge systems with respect to carbon-oxidation and nitrogen removal, but requires a smaller tank volume than a clarifier-coupled activated sludge system; (2) biomass retention is clarifier-independent and solids loading to the liquid-solids separation unit is reduced significantly when compared with activated sludge systems; (3) the MBBR is a continuous-flow process that does not require a special operational cycle for biofilm thickness, L(F), control (e.g., biologically active filter backwashing); and (4) liquid-solids separation can be achieved with a variety of processes, including conventional and compact high-rate processes. Information related to system design is fragmented and poorly documented. This paper seeks to address this issue by summarizing state-of-the art MBBR design procedures and providing the reader with an overview of some commercially available systems and their components.

  8. System design and performances of ASTER Level-1 data processing

    NASA Astrophysics Data System (ADS)

    Nishida, Sumiyuki; Hachiya, Jun; Matsumoto, Ken; Fujisada, Hiroyuki; Kato, Masatane

    1998-12-01

    ASTER is a multispectral imager that covers a wide spectral region, from visible to thermal infrared, with 14 spectral bands; it will fly on EOS-AM1 in 1999. To provide this wide spectral coverage, ASTER has three optical sensing subsystems (a multi-telescope system): VNIR, SWIR, and TIR. This multi-telescope configuration requires highly refined ground processing for the generation of Level-1 data products that are radiometrically calibrated and geometrically corrected. A prototype Level-1 processing software system was developed to satisfy these requirements. The adopted system design concepts include: (1) 'Automatic Processing'; (2) an 'All-in-One' concept, in which processing is carried out using only information contained in the Level-0 data product; (3) 'Module Independence', in which only the process-control module controls the other modules, so any operational condition can be changed independently; and (4) 'Flexibility', in which important operational parameters are set from an external component to make changes to the processing conditions easier. The adaptability and performance of the developed software system are evaluated using simulation data.

  9. Low-cost EUV collector development: design, process, and fabrication

    NASA Astrophysics Data System (ADS)

    Venables, Ranju D.; Goldstein, Michael; Engelhaupt, Darell; Lee, Sang H.; Panning, Eric M.

    2007-03-01

    Cost of ownership (COO) is an area of concern that may limit the adoption and usage of Extreme Ultraviolet Lithography (EUVL). One of the key optical components that contribute to the COO budget is the collector. The collectors being fabricated today are based on existing x-ray optic design and fabrication processes. The main contributors to collector COO are fabrication cost and lifetime. We present experimental data and optical modeling to demonstrate a roadmap for optimized efficiency and a possible approach for a significant reduction in collector COO. Current state-of-the-art collectors are based on a Wolter type-1 design adapted from x-ray telescopes; this long format is suitable for imaging distant light sources such as stars. As applied to industrial equipment and very bright nearby sources, however, a Wolter collector tends to be expensive and requires significant debris shielding and integrated cooling solutions due to the source proximity and the length of the collector shells. Three collector concepts are discussed in this work. The elliptical collector, which has been used as a test bed to demonstrate an alternative, cost-effective fabrication method, has been optimized for collection efficiency; this fabrication method can be applied to other optical designs as well. The number of shells and their design may be modified to increase the collection efficiency and to accommodate different EUV sources. The fabrication process used in this work starts with a glass mandrel, which is elliptical on the inside. A seed layer is coated on the inside of the glass mandrel, followed by electroplating of nickel. The inside (exposed) surface of the electroformed nickel is then polished to meet the figure and finish requirements for the particular shell and finally coated with Ru or a multilayer film, depending on the angle of incidence of the EUV light. Finally, the collector shell is released from the inside surface of the mandrel. There are

  10. Concurrent Image Processing Executive (CIPE). Volume 1: Design overview

    NASA Technical Reports Server (NTRS)

    Lee, Meemong; Groom, Steven L.; Mazer, Alan S.; Williams, Winifred I.

    1990-01-01

    The design and implementation of a Concurrent Image Processing Executive (CIPE), which is intended to become the support system software for a prototype high performance science analysis workstation are described. The target machine for this software is a JPL/Caltech Mark 3fp Hypercube hosted by either a MASSCOMP 5600 or a Sun-3, Sun-4 workstation; however, the design will accommodate other concurrent machines of similar architecture, i.e., local memory, multiple-instruction-multiple-data (MIMD) machines. The CIPE system provides both a multimode user interface and an applications programmer interface, and has been designed around four loosely coupled modules: user interface, host-resident executive, hypercube-resident executive, and application functions. The loose coupling between modules allows modification of a particular module without significantly affecting the other modules in the system. In order to enhance hypercube memory utilization and to allow expansion of image processing capabilities, a specialized program management method, incremental loading, was devised. To minimize data transfer between host and hypercube, a data management method which distributes, redistributes, and tracks data set information was implemented. The data management also allows data sharing among application programs. The CIPE software architecture provides a flexible environment for scientific analysis of complex remote sensing image data, such as planetary data and imaging spectrometry, utilizing state-of-the-art concurrent computation capabilities.

  11. Safeguards design strategies: designing and constructing new uranium and plutonium processing facilities in the United States

    SciTech Connect

    Scherer, Carolynn P; Long, Jon D

    2010-09-28

    In the United States, the Department of Energy (DOE) is transforming its outdated and oversized complex of aging nuclear material facilities into a smaller, safer, and more secure National Security Enterprise (NSE). Environmental concerns, worker health and safety risks, material security, and the goal of reducing the role of nuclear weapons in our national security strategy while maintaining an effective nuclear deterrent are influencing this transformation. As part of the nation's Uranium Center of Excellence (UCE), the Uranium Processing Facility (UPF) at the Y-12 National Security Complex in Oak Ridge, Tennessee, will advance the U.S. capability to meet all concerns when processing uranium and is located adjacent to the Highly Enriched Uranium Materials Facility (HEUMF), designed for consolidated storage of enriched uranium. The HEUMF became operational in March 2010, and the UPF is currently entering its final design phase. Both facilities are designed to meet the anticipated security challenges of the 21st century. For plutonium research, development, and manufacturing, the Chemistry and Metallurgy Research Replacement (CMRR) building at Los Alamos National Laboratory (LANL) in Los Alamos, New Mexico is now under construction. The first phase of the CMRR Project is the design and construction of a Radiological Laboratory/Utility/Office Building. The second phase consists of the design and construction of the Nuclear Facility (NF). The National Nuclear Security Administration (NNSA) selected these two sites as part of the national plan to consolidate nuclear materials and to meet nuclear deterrence and nonproliferation mission requirements. This work examines these two projects' independent approaches to design requirements and objectives for safeguards, security, and safety (3S) systems, as well as the subsequent construction of these modern processing facilities. Emphasis is on the use of Safeguards-by-Design (SBD

  12. Risk-based process safety assessment and control measures design for offshore process facilities.

    PubMed

    Khan, Faisal I; Sadiq, Rehan; Husain, Tahir

    2002-09-01

    Process operation is the most hazardous activity after transportation and drilling operations on an offshore oil and gas (OOG) platform. Past experience of onshore and offshore oil and gas activities has revealed that a small mishap in the process operation can escalate to a catastrophe. This is of special concern on an OOG platform due to the limited space and compact geometry of the process area, limited ventilation, and difficult escape routes. On an OOG platform, each extra control measure that is implemented not only occupies space on the platform and increases congestion but also adds extra load to the platform. Eventualities in OOG platform process operation can be avoided by incorporating the appropriate control measures at the early design stage. In this paper, the authors describe a methodology for risk-based process safety decision making for OOG activities. The methodology is applied to various offshore process units, that is, the compressor, separators, flash drum and driers of an OOG platform. Based on the risk potential, appropriate safety measures are designed for each unit. This paper also illustrates that implementation of the designed safety measures reduces high fatal accident rate (FAR) values to an acceptable level. PMID:12141993

  14. Design of educational artifacts as support to learning process.

    PubMed

    Resende, Adson Eduardo; Vasconcelos, Flávio Henrique

    2012-01-01

    The aim of this paper is to identify utilization schemes developed by students and teachers in their interaction with educational workstations in the electronic measurement and instrumentation laboratory of the Department of Electrical Engineering at the Federal University of Minas Gerais (UFMG), Brazil. These schemes were then used to design a new workstation. For this, it was important to bear in mind that such artifacts combine two key ingredients: (1) the contribution of the designers themselves, resulting from their experience and their technical knowledge of what they are designing, and (2) the experience of the users and the means through which they take advantage of and develop these artifacts, rendering them appropriate for the proposed task - the utilization schemes developed in the process of mediation between the user and the artifact. The satisfactory fusion of these two ingredients turns these artifacts into functional units - instruments. This research aims to demonstrate that identifying utilization schemes by drawing on user experience, and incorporating them into the design, facilitates the artifact's appropriation and, consequently, its efficiency as an instrument of learning.

  15. The FEYNMAN tools for quantum information processing: Design and implementation

    NASA Astrophysics Data System (ADS)

    Fritzsche, S.

    2014-06-01

    The FEYNMAN tools have been re-designed with the goal of establishing and implementing a high-level (computer) language capable of dealing with the physics of finite, n-qubit systems, from frequently required computations to mathematically advanced tasks in quantum information processing. In particular, emphasis has been placed on introducing a small but powerful set of keystring-driven commands in order to support both symbolic and numerical computations. Though the current design is again implemented within the framework of MAPLE, it is general and flexible enough to be utilized and combined with other languages and computational environments. The present implementation facilitates a large number of computational tasks, including the definition, manipulation and parametrization of quantum states, the evaluation of quantum measures and quantum operations, the evolution of quantum noise in discrete models, quantum measurements and state estimation, and several others. The design is based on a few high-level commands, with a syntax close to the mathematical notation and its use in the literature, which can be generalized quite readily in order to solve computational tasks of even higher complexity. In this work, I present and discuss the (re-)designed FEYNMAN tools and make major parts of the code available for public use. Moreover, a few selected examples are shown to demonstrate possible applications of this toolbox. The FEYNMAN tools are provided as a MAPLE library and can hence be used on all platforms on which this computer-algebra system is accessible.
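
    The FEYNMAN commands themselves are MAPLE keystring calls and are not reproduced here; as a language-neutral illustration of the kind of n-qubit bookkeeping such a toolbox automates, the following minimal Python/numpy sketch builds a 3-qubit state, applies single-qubit gates by tensor products, and evaluates a simple expectation value.

```python
# Minimal numpy sketch of n-qubit state/operator bookkeeping (this is NOT the
# FEYNMAN/Maple syntax, just an illustration of the underlying computations).
import numpy as np

def kron_all(ops):
    """Tensor product of a list of single-qubit operators."""
    out = np.array([[1.0 + 0j]])
    for op in ops:
        out = np.kron(out, op)
    return out

I = np.eye(2, dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)

n = 3
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0                               # |000>

# Apply a Hadamard on qubit 0 and an X on qubit 2.
U = kron_all([H, I, X])
state = U @ state

# Expectation value of Z on qubit 1 as a simple "quantum measure".
Z = np.diag([1.0, -1.0]).astype(complex)
print(np.real(state.conj() @ kron_all([I, Z, I]) @ state))   # -> 1.0
```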

  16. Dynamic simulation for IGCC process and control design

    SciTech Connect

    Depew, C.; Martinez, A.; Collodi, G.; Meloni, R.

    1998-01-01

    Detailed dynamic simulation analysis is a valuable tool that increases the understanding of unit interactions and control system performance in a complex integrated gasification combined-cycle (IGCC) plant. The Sarlux IGCC plant must simultaneously satisfy electrical power and refinery hydrogen and steam demands (trigeneration gasification). The plant's gasifier, heat recovery, sulfur removal, hydrogen recovery and steam power generation units are highly integrated and require coordinated control. In this study, dynamic simulation provides insights into the behavior of the process and combined-cycle units during normal and upset conditions. The dynamic simulation is used to design a control system that drives the gasifiers to satisfy power, steam and hydrogen demands before a load change or upset is detected by the syngas pressure controller. At the study conclusion, the model will demonstrate how the IGCC plant responds to the contractual maximum load change rate and to process upsets. The study tests the basic process and control system design during the project engineering phase to minimize startup troubleshooting and expensive field changes.
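
    As a toy illustration of the idea (not the Sarlux plant model), the sketch below represents the syngas pressure as an assumed first-order lag under a PI controller and applies a step load change, the kind of test a full dynamic simulation would run at much higher fidelity; all parameters are invented.

```python
# Toy dynamic simulation: assumed first-order syngas-pressure response with a
# PI controller, used to check control behaviour during a load change.
import numpy as np

tau, gain = 30.0, 1.0          # assumed process time constant [s] and gain
Kp, Ki = 2.0, 0.05             # assumed PI tuning
dt, t_end = 0.5, 600.0

setpoint = 1.0                 # normalised syngas pressure
p, u, integral = 1.0, 0.0, 0.0
disturbance = 0.0

history = []
for k in range(int(t_end / dt)):
    t = k * dt
    if t >= 100.0:             # step load change (e.g. extra hydrogen draw)
        disturbance = -0.3
    err = setpoint - p
    integral += err * dt
    u = Kp * err + Ki * integral            # controller output
    dpdt = (-(p - 1.0) + gain * u + disturbance) / tau
    p += dpdt * dt
    history.append((t, p))

print("pressure 5 min after the upset: %.3f" % history[int(400 / dt)][1])
```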

  17. Waste receiving and processing plant control system; system design description

    SciTech Connect

    LANE, M.P.

    1999-02-24

    The Plant Control System (PCS) is a heterogeneous computer system composed of numerous sub-systems. The PCS represents every major computer system that is used to support operation of the Waste Receiving and Processing (WRAP) facility. This document, the System Design Description (PCS SDD), includes several chapters and appendices. Each chapter is devoted to a separate PCS sub-system. Typically, each chapter includes an overview description of the system, a list of associated documents related to operation of that system, and a detailed description of relevant system features. Each appendix provides configuration information for selected PCS sub-systems. The appendices are maintained as separate sections to make the document easier to update, since system configurations change frequently. This document is intended to serve as the primary reference for configuration of PCS computer systems. The use of this document is further described in the WRAP System Configuration Management Plan, WMH-350, Section 4.1.

  18. Process design of press hardening with gradient material property influence

    SciTech Connect

    Neugebauer, R.; Schieck, F.; Rautenstrauch, A.

    2011-05-04

    Press hardening is currently used in the production of automotive structures that require very high strength and controlled deformation during crash tests. Press hardening can achieve significant reductions of sheet thickness at constant strength and is therefore a promising technology for the production of lightweight and energy-efficient automobiles. The manganese-boron steel 22MnB5 has been implemented in sheet press hardening owing to its excellent hot formability, high hardenability, and good temperability even at low cooling rates. However, press-hardened components have shown poor ductility and cracking at relatively small strains. A possible solution to this problem is a selective increase of steel sheet ductility, through press hardening process design, in areas where the component is required to deform plastically during crash tests. To this end, process designers require information about microstructure and mechanical properties as a function of the wide spectrum of cooling rates, cooling sequences and austenitizing treatment conditions that can be encountered in production environments. In the present work, a Continuous Cooling Transformation (CCT) diagram with corresponding material properties of sheet steel 22MnB5 was determined for a wide spectrum of cooling rates. Heating and cooling programs were conducted in a quenching dilatometer. Motivated by the importance of residual elasticity in crash test performance, this property was measured using a micro-bending test and the results were integrated into the CCT diagrams to complement the hardness testing results. This information is essential for the process design of press hardening of sheet components with gradient material properties.

  19. Design and programming of systolic array cells for signal processing

    SciTech Connect

    Smith, R.A.W.

    1989-01-01

    This thesis presents a new methodology for the design, simulation, and programming of systolic arrays in which the algorithms and architecture are simultaneously optimized. The algorithms determine the initial architecture, and simulation is used to optimize the architecture. The simulator provides a register-transfer level model of a complete systolic array computation. To establish the validity of this design methodology two novel programmable systolic array cells were designed and programmed. The cells were targeted for applications in high-speed signal processing and associated matrix computations. A two-chip programmable systolic array cell using a 16-bit multiplier-accumulator chip and a semi-custom VLSI controller chip was designed and fabricated. A low chip count allows large arrays to be constructed, but the cell is flexible enough to be a building-block for either one- or two-dimensional systolic arrays. Another more flexible and powerful cell using a 32-bit floating-point processor and a second VLSI controller chip was also designed. It contains several architectural features that are unique in a systolic array cell: (1) each instruction is 32 bits, yet all resources can be updated every cycle, (2) two on-chip interchangeable memories are used, and (3) one input port can be used as either a global or local port. The key issues involved in programming the cells are analyzed in detail. A set of modules is developed which can be used to construct large programs in an effective manner. The utility of this programming approach is demonstrated with several important examples.
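
    To make the systolic principle concrete, the following cycle-accurate toy simulation (Python, not the thesis code) shows a linear, output-stationary systolic array computing a matrix-vector product: each cell keeps one accumulator while the input vector is pumped through one cell per cycle on a skewed schedule.

```python
import numpy as np

def systolic_matvec(A, x):
    """Cycle-accurate toy simulation of a linear output-stationary systolic
    array computing y = A @ x. Cell i keeps accumulator y[i]; the x stream
    enters cell 0 and shifts right one cell per cycle, while A[i][j] is fed
    to cell i exactly when x[j] arrives there (the usual skewed schedule)."""
    n, m = A.shape
    acc = np.zeros(n)              # one accumulator per cell
    x_reg = [None] * n             # x value currently latched in each cell
    for t in range(m + n):         # enough cycles to drain the pipeline
        # shift x one cell to the right (rightmost value simply retires)
        for i in range(n - 1, 0, -1):
            x_reg[i] = x_reg[i - 1]
        x_reg[0] = x[t] if t < m else None
        # each cell does one multiply-accumulate with its scheduled A element
        for i in range(n):
            j = t - i              # index of the x element now sitting in cell i
            if 0 <= j < m and x_reg[i] is not None:
                acc[i] += A[i, j] * x_reg[i]
    return acc

A = np.array([[1., 2., 3.], [4., 5., 6.]])
x = np.array([1., 0., -1.])
print(systolic_matvec(A, x), A @ x)   # both: [-2. -2.]
```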

  20. Design of experiments for thermal protection system process optimization

    NASA Astrophysics Data System (ADS)

    Longani, Hans R.

    2000-01-01

    Solid Rocket Booster (SRB) structures were protected from heating due to aeroshear, radiation and plume impingement by a Thermal Protection System (TPS) known as Marshall Sprayable Ablative (MSA-2). MSA-2 contains Volatile Organic Compounds (VOCs) and, owing to strict environmental legislation, was eliminated. MSA-2 was also classified as hazardous waste, which made its disposal very costly. Marshall Convergent Coating (MCC-1) replaced MSA-2 and eliminated the use of solvents by delivering the dry filler materials and the fluid resin system to a patented spray gun which utilizes the Convergent Spray Technologies spray process. The selection of the TPS material was based on risk assessment, performance comparisons, processing, application and cost. A Design of Experiments technique was used to optimize the spraying parameters.

  1. Design and Process Considerations for a Tunneling Tip Accelerometer

    NASA Technical Reports Server (NTRS)

    Paul M. Zavracky, Bob McClelland, Keith Warner, Neil Sherman, Frank Hartley

    1995-01-01

    In this paper, we discuss issues related to the fabrication of a bulk-micromachined single-axis accelerometer. The accelerometer is designed to have a full-scale range of ten milli-g and a sensitivity of tens of nano-g. During the process, three distinctly different dies are fabricated. These are subsequently assembled using an alloy bonding technique. During the bonding operation, electrical contacts are made between layers. The accelerometer is controlled by electrostatic force plates above and below the proof mass. The lower electrode has a dual role: in operation, it provides a necessary control electrode; when not in operation, it is used to clamp the proof mass and prevent its motion. Results of the fabrication process and initial testing of the clamping function are reported.

  2. Thermoplastics as engineering materials: The mechanics, materials, design, processing link

    SciTech Connect

    Stokes, V.K.

    1995-10-01

    While the use of plastics has been growing at a significant pace because of weight reduction, ease of fabrication of complex shapes, and cost reduction resulting from function integration, the engineering applications of plastics have only become important in the past fifteen years. An inadequate understanding of the mechanics issues underlying the close coupling among the design, the processing (fabrication), and the assembly with these materials is a barrier to their use in structural applications. Recent progress on some issues relating to the engineering uses of plastics is surveyed, highlighting the need for a better understanding of plastics and how processing affects the performance of plastic parts. Topics addressed include the large deformation behavior of ductile resins, fiber orientation in chopped-fiber filled materials, structural foams, random glass mat composites, modeling of thickness distributions in blow-molded and thermoformed parts, dimensional stability (shrinkage, warpage, and residual stresses) in injection-molded parts, and welding of thermoplastics.

  3. Calderon coal gasification Process Development Unit design and test program

    SciTech Connect

    Calderon, A.; Madison, E.; Probert, P.

    1992-11-01

    The Process Development Unit (PDU) was designed and constructed to demonstrate the novel Calderon gasification/hot gas cleanup process. In the process, run-of-mine high-sulfur coal is first pyrolyzed to recover a rich gas (medium-Btu gas), after which the resulting char is subjected to air-blown gasification to yield a lean gas (low-Btu gas). The process incorporates a proprietary integrated system for the conversion of coal to gases and for the hot cleanup of the gases, which removes both particulate and sulfur components of the gaseous products. The yields are a syngas (CO and H{sub 2} mix) suitable for further conversion to liquid fuel (e.g., methanol/gasoline) and a lean gas suitable to fuel the combustion turbine of a combined-cycle power generation plant with very low levels of NO{sub x} (15 ppmv). The fused slag (from the ash content of the gasified char) and the sulfur recovered during the hot gas cleanup will be sold as by-products. The small quantity of spent sorbent generated will be combined with the coal feed as a fluxing agent for the slag. The small quantity of wastewater from slag drainings and steam generation blowdown will be mixed with the coal feed for disposal. The Calderon gasification/hot gas cleanup process, which is a completely closed system, operates at a pressure suitable for combined-cycle power generation.

  5. [Design of an HACCP program for a cocoa processing facility].

    PubMed

    López D'Sola, Patrizia; Sandia, María Gabriela; Bou Rached, Lizet; Hernández Serrano, Pilar

    2012-12-01

    The HACCP plan is a food safety management tool used to control physical, chemical and biological hazards associated with food processing throughout the processing chain. The aim of this work is to design a HACCP plan for a Venezuelan cocoa processing facility. The production of safe food products requires that the HACCP system be built upon a solid foundation of prerequisite programs such as Good Manufacturing Practices (GMP) and Sanitation Standard Operating Procedures (SSOP). The existence and effectiveness of these prerequisite programs were previously assessed, and Good Agricultural Practices (GAP) audits of the cocoa nib suppliers were performed. To develop the HACCP plan, the five preliminary tasks and the seven HACCP principles were accomplished according to Codex Alimentarius procedures. Three Critical Control Points (CCPs) were identified using a decision tree: winnowing (control of ochratoxin A), roasting (Salmonella control) and metallic particle detection. For each CCP, critical limits were established, along with monitoring procedures, corrective actions, verification procedures, and documentation covering all procedures and records appropriate to these principles and their application. Implementation and maintenance of a HACCP plan for this processing plant is suggested. Recently, ochratoxin A (OTA) has been linked to cocoa beans. Although separation of the shell from the nib has been reported as an effective measure to control this chemical hazard, a study of ochratoxin prevalence in cocoa beans produced in the country is recommended, as is validation of the winnowing step. PMID:24020255

  6. Using Instructional Design Process to Improve Design and Development of Internet Interventions

    PubMed Central

    Hilgart, Michelle M; Thorndike, Frances P; Kinzie, Mable B

    2012-01-01

    Given the wide reach and extensive capabilities of the Internet, it is increasingly being used to deliver comprehensive behavioral and mental health intervention and prevention programs. Their goals are to change user behavior, reduce unwanted complications or symptoms, and improve health status and health-related quality of life. Internet interventions have been found efficacious in addressing a wide range of behavioral and mental health problems, including insomnia, nicotine dependence, obesity, diabetes, depression, and anxiety. Despite the existence of many Internet-based interventions, there is little research to inform their design and development. A model for behavior change in Internet interventions has been published to help guide future Internet intervention development and to help predict and explain behavior changes and symptom improvement outcomes through the use of Internet interventions. An argument is made for grounding the development of Internet interventions within a scientific framework. To that end, the model highlights a multitude of design-related components, areas, and elements, including user characteristics, environment, intervention content, level of intervention support, and targeted outcomes. However, more discussion is needed regarding how the design of the program should be developed to address these issues. While there is little research on the design and development of Internet interventions, there is a rich, related literature in the field of instructional design (ID) that can be used to inform Internet intervention development. ID models are prescriptive models that describe a set of activities involved in the planning, implementation, and evaluation of instructional programs. Using ID process models has been shown to increase the effectiveness of learning programs in a broad range of contexts. ID models specify a systematic method for assessing the needs of learners (intervention users) to determine the gaps between current

  7. Design, processing, and testing of LSI arrays for space station

    NASA Technical Reports Server (NTRS)

    Lile, W. R.; Hollingsworth, R. J.

    1972-01-01

    The design of a MOS 256-bit Random Access Memory (RAM) is discussed. Technological achievements comprise computer simulations that accurately predict performance; aluminum-gate COS/MOS devices including a 256-bit RAM with current sensing; and a silicon-gate process that is being used in the construction of a 256-bit RAM with voltage sensing. The Si-gate process increases speed by reducing the overlap capacitance between gate and source-drain, thus reducing the crossover capacitance and allowing shorter interconnections. The design of a Si-gate RAM, which is pin-for-pin compatible with an RCA bulk silicon COS/MOS memory (type TA 5974), is discussed in full. The Integrated Circuit Tester (ICT) is limited to dc evaluation, but the diagnostics and data collecting are under computer control. The Silicon-on-Sapphire Memory Evaluator (SOS-ME, previously called SOS Memory Exerciser) measures power supply drain and performs a minimum number of tests to establish operation of the memory devices. The Macrodata MD-100 is a microprogrammable tester which has capabilities of extensive testing at speeds up to 5 MHz. Beam-lead technology was successfully integrated with SOS technology to make a simple device with beam leads. This device and the scribing are discussed.

  8. Fiber optic sensor design for chemical process and environmental monitoring

    NASA Astrophysics Data System (ADS)

    Mahendran, R. S.; Harris, D.; Wang, L.; Machavaram, V. R.; Chen, R.; Kukureka, St. N.; Fernando, G. F.

    2007-07-01

    Cure monitoring is a term used to describe the tracking of the cross-linking reactions in a thermosetting resin system. Advanced fiber-reinforced composites are being used increasingly in a number of industrial sectors including aerospace, marine, sport, automotive and civil engineering. There is a general realization that the processing conditions used to manufacture a composite can have a major influence on its hot-wet mechanical properties. This paper is concerned with the design and demonstration of a number of sensor configurations for in-situ cure monitoring of a model thermosetting resin system. Simple fixtures were constructed to enable a pair of cleaved optical fibers with a defined gap between the end-faces to be held in position. The resin system was introduced into this gap and the cure kinetics were followed by transmission infrared spectroscopy. A semi-empirical model was used to describe the cure process using the data obtained at different cure temperatures. The same sensor system was used to detect the ingress of moisture into the cured resin system.
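
    The exact semi-empirical model used in the paper is not given in the abstract; as an illustration of the general approach, the sketch below integrates one common form (an nth-order cure law with an Arrhenius rate constant) at several isothermal cure temperatures. All parameter values are assumed.

```python
# One common semi-empirical cure-kinetics form (nth-order with Arrhenius rate
# constant) integrated over time -- shown only to illustrate the kind of model
# fitted to infrared cure data; the paper's exact model is not specified.
import numpy as np

A_pre, Ea, n_order = 1.0e7, 60e3, 1.5   # assumed pre-exponential [1/s], Ea [J/mol], order
R = 8.314

def cure_profile(T_kelvin, t_end=3600.0, dt=1.0):
    """Degree of cure alpha(t) at an isothermal cure temperature."""
    k = A_pre * np.exp(-Ea / (R * T_kelvin))
    alpha = 0.0
    out = []
    for step in range(int(t_end / dt)):
        dadt = k * (1.0 - alpha) ** n_order
        alpha = min(alpha + dadt * dt, 1.0)
        out.append(alpha)
    return np.array(out)

for T in (330.0, 350.0, 370.0):          # three isothermal cure temperatures
    profile = cure_profile(T)
    print(f"T = {T:.0f} K: alpha after 30 min = {profile[1799]:.2f}")
```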

  9. Integrating optical fabrication and metrology into the optical design process.

    PubMed

    Harvey, James E

    2015-03-20

    The recent validation of a generalized linear-systems formulation of surface scatter theory, and an analysis of image degradation due to surface scatter in the presence of aberrations, have lent credence to the development of a systems engineering analysis of image quality as degraded not only by diffraction effects and geometrical aberrations but also by scattering due to residual optical fabrication errors. This generalized surface scatter theory provides insight and understanding by characterizing surface scatter behavior with a surface transfer function closely related to the modulation transfer function of classical image formation theory. Incorporating the inherently band-limited relevant surface roughness into the surface scatter theory provides mathematical rigor to surface scatter analysis, and implementing a fast Fourier transform algorithm with logarithmically spaced data points facilitates the practical calculation of scatter behavior from surfaces with a large dynamic range of relevant spatial frequencies. These advances, combined with the continuing increase in computer speed, leave the optical design community in a position to routinely derive the optical fabrication tolerances necessary to satisfy specific image quality requirements during the design phase of a project; i.e., to integrate optical metrology and fabrication into the optical design process.
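
    As a first-order reference point (much simpler than the generalized transfer-function treatment described above), the classical smooth-surface estimate of total integrated scatter links rms roughness directly to the scattered fraction of reflected light; the sketch below evaluates it for a few roughness values at an assumed visible wavelength.

```python
# Classical smooth-surface estimate of total integrated scatter (TIS), a
# first-order link between residual surface roughness and image degradation;
# the paper's generalized transfer-function treatment goes well beyond this.
import numpy as np

def total_integrated_scatter(sigma_nm, wavelength_nm, incidence_deg=0.0):
    """TIS ~ (4*pi*sigma*cos(theta_i)/lambda)^2 for sigma << lambda."""
    theta = np.deg2rad(incidence_deg)
    return (4.0 * np.pi * sigma_nm * np.cos(theta) / wavelength_nm) ** 2

# Example: scattered fraction for a visible-light mirror at several assumed
# rms roughness values.
for sigma in (2.0, 5.0, 10.0):            # rms roughness in nm
    print(sigma, "nm ->", f"{total_integrated_scatter(sigma, 550.0):.3%}")
```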

  10. Sampling design for spatially distributed hydrogeologic and environmental processes

    USGS Publications Warehouse

    Christakos, G.; Olea, R.A.

    1992-01-01

    A methodology for the design of sampling networks over space is proposed. The methodology is based on spatial random field representations of nonhomogeneous natural processes, and on optimal spatial estimation techniques. One of the most important results of random field theory for the physical sciences is its rationalization of correlations in the spatial variability of natural processes. This correlation is extremely important both for interpreting spatially distributed observations and for predictive performance. The extent of site sampling and the types of data to be collected will depend on the relationship of subsurface variability to predictive uncertainty. While hypothesis formulation and initial identification of spatial variability characteristics are based on scientific understanding (such as knowledge of the physics of the underlying phenomena, geological interpretations, intuition and experience), the support offered by field data is statistically modelled. This model is not limited by the geometric nature of sampling and covers a wide range of subsurface uncertainties. A factorization scheme for the sampling error variance is derived, which possesses certain attractive properties allowing significant savings in computation. By means of this scheme, a practical sampling design procedure providing suitable indices of the sampling error variance is established. These indices can be used by way of multiobjective decision criteria to obtain the best sampling strategy. Neither the actual implementation of the in-situ sampling nor the solution of the large spatial estimation systems of equations is necessary. The required values of the accuracy parameters involved in the network design are derived using reference charts (readily available for various combinations of data configurations and spatial variability parameters) and certain simple yet accurate analytical formulas. Insight is gained by applying the proposed sampling procedure to realistic examples related

  11. Simulative design and process optimization of the two-stage stretch-blow molding process

    SciTech Connect

    Hopmann, Ch.; Rasche, S.; Windeck, C.

    2015-05-22

    The total production costs of PET bottles are significantly affected by the cost of raw material; approximately 70 % of the total cost is spent on the raw material. Therefore, the stretch-blow molding industry aims to reduce total production costs through improved material efficiency. However, there is often a trade-off between optimized material efficiency and the required product properties. Due to a multitude of complex boundary conditions, the design process for new stretch-blow molded products is still a challenging task and is often based on empirical knowledge. Application of current CAE tools supports the design process by reducing development time and costs. This paper describes an approach to determine an optimized preform geometry and corresponding process parameters iteratively. The wall thickness distribution and the local stretch ratios of the blown bottle are calculated in a three-dimensional process simulation. The wall thickness distribution is evaluated through an objective function, and the preform geometry as well as the process parameters are varied by an optimization algorithm. Taking into account the correlation between material usage, process history and resulting product properties, integrative coupled simulation steps, e.g. structural analyses or barrier simulations, are performed. The approach is applied to a 0.5 liter PET bottle of Krones AG, Neutraubling, Germany. The investigations show that the design process can be supported by applying this simulative optimization approach. In an optimization study the total bottle weight was reduced from 18.5 g to 15.5 g. The validation of the computed results is in progress.
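
    Schematically, the iterative loop described above couples an expensive forward simulation to an optimizer. The sketch below shows the structure only: simulate_wall_thickness is a crude stand-in for the 3-D stretch-blow simulation, and the thickness target, penalty weight and preform parametrization are invented for the example.

```python
# Schematic optimization loop: a process simulation predicts the bottle
# wall-thickness distribution for a candidate preform, and an optimizer
# adjusts the preform to meet a minimum-thickness target with minimal material.
import numpy as np
from scipy.optimize import minimize

t_min_required = 0.25          # assumed minimum wall thickness [mm]

def simulate_wall_thickness(preform):
    """Placeholder for the 3-D stretch-blow simulation: returns the wall
    thickness at a few bottle heights for a (thickness, taper) preform."""
    base, taper = preform
    heights = np.linspace(0.0, 1.0, 5)
    return base * (1.0 - 0.4 * heights) + taper * heights

def objective(preform):
    t = simulate_wall_thickness(preform)
    material = t.sum()                          # proxy for preform weight
    shortfall = np.clip(t_min_required - t, 0.0, None).sum()
    return material + 1e3 * shortfall           # penalize under-thickness

result = minimize(objective, x0=np.array([0.5, 0.1]), method="Nelder-Mead")
print("optimized preform parameters:", result.x)
print("predicted wall thicknesses:", simulate_wall_thickness(result.x))
```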

  12. Risk-based decision making for staggered bioterrorist attacks : resource allocation and risk reduction in "reload" scenarios.

    SciTech Connect

    Lemaster, Michelle Nicole; Gay, David M.; Ehlen, Mark Andrew; Boggs, Paul T.; Ray, Jaideep

    2009-10-01

    Staggered bioterrorist attacks with aerosolized pathogens on population centers present a formidable challenge to resource allocation and response planning. The response and planning will commence immediately after detection of the first attack, with little or no information about the second attack. In this report, we outline a method by which resource allocation may be performed. It involves probabilistic reconstruction of the bioterrorist attack from partial observations of the outbreak, followed by an optimization-under-uncertainty approach to perform resource allocation. We consider both single-site and time-staggered multi-site attacks (i.e., a reload scenario) under conditions where resources (personnel and equipment that are difficult to gather and transport) are insufficient. Both communicable (plague) and non-communicable (anthrax) diseases are addressed, and we also consider cases where the data, the time series of people reporting with symptoms, are confounded by a reporting delay. We demonstrate how our approach develops allocation profiles that have the potential to reduce the probability of an extremely adverse outcome in exchange for a more certain, but less adverse, outcome. We explore the effect of placing limits on daily allocations. Further, since our method is data-driven, the resource allocation progressively improves as more data become available.
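
    A drastically simplified illustration of the allocation trade-off (not the authors' algorithm): split a fixed stock of treatment courses between the site already attacked and a reserve for a possible second attack, and compare candidate splits by the mean and 95th percentile of untreated cases over sampled scenarios. All numbers are invented.

```python
# Toy scenario-based comparison of allocation policies for a "reload" threat.
import numpy as np

rng = np.random.default_rng(0)
total_courses = 50_000
cases_site1 = 30_000                                 # known outbreak size
n_scen = 10_000
second_attack = rng.random(n_scen) < 0.3             # assumed 30 % reload chance
cases_site2 = np.where(second_attack,
                       rng.lognormal(mean=9.5, sigma=0.6, size=n_scen), 0.0)

def untreated(frac_to_site1):
    """Untreated cases in each scenario for a given split of the stockpile."""
    to_site1 = frac_to_site1 * total_courses
    reserve = total_courses - to_site1
    short1 = np.maximum(cases_site1 - to_site1, 0.0)
    short2 = np.maximum(cases_site2 - reserve, 0.0)
    return short1 + short2

for frac in (1.0, 0.8, 0.6):
    u = untreated(frac)
    print(f"send {frac:.0%} to site 1: mean untreated = {u.mean():8.0f}, "
          f"95th percentile = {np.quantile(u, 0.95):8.0f}")
```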

  13. Software Design Improvements. Part 2; Software Quality and the Design and Inspection Process

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R.; Packard, Michael H.; Ziemianski, Tom

    1997-01-01

    The application of assurance engineering techniques improves the duration of failure-free performance of software. The totality of features and characteristics of a software product is what determines its ability to satisfy customer needs. Software in safety-critical systems is very important to NASA. We follow the System Safety Working Group's definition of system safety software: 'The optimization of system safety in the design, development, use and maintenance of software and its integration with safety-critical systems in an operational environment.' 'If it is not safe, say so' has become our motto. This paper reviews methods that have been used by NASA to make software design improvements by focusing on software quality and the design and inspection process.

  14. Space Shuttle Ascent Flight Design Process: Evolution and Lessons Learned

    NASA Technical Reports Server (NTRS)

    Picka, Bret A.; Glenn, Christopher B.

    2011-01-01

    The Space Shuttle Ascent Flight Design team is responsible for defining a launch-to-orbit trajectory profile that satisfies all programmatic mission objectives and for defining the ground and onboard reconfiguration requirements for this high-speed and demanding flight phase. This design, verification and reconfiguration process ensures that all applicable mission scenarios are enveloped within integrated vehicle and spacecraft certification constraints and criteria, and includes the design of the nominal ascent profile and of trajectory profiles for both uphill and ground-to-ground aborts. The team also develops a wide array of associated training, avionics flight software verification, onboard crew and operations facility products. These key ground and onboard products provide the ultimate users and operators the necessary insight and situational awareness for trajectory dynamics, performance and event sequences, abort mode boundaries and moding, flight performance and impact predictions for launch vehicle stages for use in range safety, and flight software performance. These products also support the reconfiguration of communications and tracking systems, launch collision avoidance requirements, and day-of-launch crew targeting and onboard guidance, navigation and flight control updates that incorporate the final vehicle configuration and environmental conditions for the mission. Over the course of the Space Shuttle Program, ascent trajectory design and mission planning evolved in order to improve program flexibility and reduce cost, while maintaining outstanding data quality. Along the way, the team implemented innovative solutions and technologies in order to overcome significant challenges. A number of these solutions may have applicability to future human spaceflight programs.

  15. Climate Monitoring Satellite Designed in a Concurrent Engineering Process

    NASA Astrophysics Data System (ADS)

    Bauer, Waldemar; Braukhane, A.; Quantius, D.; Dumont, E.; Grundmann, J. T.; Romberg, O.

    An effective method of detecting greenhouse gases (GHGs, CO2 and CH4) is the use of satellites operating in Low Earth Orbit (LEO). Satellite-based greenhouse gas emission monitoring is challenging and imposes an ambitious level of requirements. Until now it has been common to fly the corresponding scientific payload on a purpose-built satellite bus, or to install it on board a larger conventional satellite. These approaches fulfil all customer requirements but can be critical from a financial point of view. Between 2014 and 2020, no space-based CH4 detection capability, and at best limited CO2 detection capability, is planned internationally. In order to fill this gap, the Institute for Environmental Physics (IUP) of the University of Bremen plans a GHG satellite mission with near-surface sensitivity called "CarbonSat". It shall perform synchronous global atmospheric CO2 and CH4 observations with the accuracy, precision and coverage needed to significantly advance our knowledge about the sources and sinks of greenhouse gases. In order to verify the technical and financial feasibility of a small satellite, a Concurrent Engineering study (CE study) has been performed at DLR Bremen, Germany. To reuse knowledge in compact satellite design, the Compact/SSB (Standard Satellite Bus) was chosen as the baseline design. The SSB has been developed by DLR and was already used for the BIRD (Bispectral Infra-Red Detection) mission, and has also been adapted to ongoing missions such as TET (Technologie-Erprobungsträger) and AsteroidFinder. This paper deals with the highly effective design process within the DLR CE facility and with the outcomes of the CE study. It gives an overview of the design status as well as an outlook for comparable missions.

  16. Preconceptual design of a salt splitting process using ceramic membranes

    SciTech Connect

    Kurath, D.E.; Brooks, K.P.; Hollenberg, G.W.; Clemmer, R.; Balagopal, S.; Landro, T.; Sutija, D.P.

    1997-01-01

    Inorganic ceramic membranes for salt splitting of radioactively contaminated sodium salt solutions are being developed for treating U.S. Department of Energy tank wastes. The process consists of electrochemical separation of sodium ions from the salt solution using sodium (Na) Super Ion Conductor (NaSICON) membranes. The primary NaSICON compositions being investigated are based on rare-earth ions (RE-NaSICON). Potential applications include: caustic recycling for sludge leaching, regenerating ion exchange resins, inhibiting corrosion in carbon-steel tanks, or retrieving tank wastes; reducing the volume of low-level waste to be disposed of; adjusting pH and reducing competing cations to enhance cesium ion exchange processes; reducing sodium in high-level-waste sludges; and removing sodium from acidic wastes to facilitate calcining. These applications encompass wastes stored at the Hanford, Savannah River, and Idaho National Engineering Laboratory sites. The overall project objective is to supply a salt splitting process unit that impacts the waste treatment and disposal flowsheets and meets user requirements. The potential flowsheet impacts include improving the efficiency of the waste pretreatment processes, reducing waste volume, and increasing the quality of the final waste disposal forms. Meeting user requirements implies developing the technology to the point where it is available as standard equipment with predictable and reliable performance. This report presents two preconceptual designs for a full-scale salt splitting process based on the RE-NaSICON membranes to distinguish critical items for testing and to provide a vision that site users can evaluate.

  17. From Safe Nanomanufacturing to Nanosafe-by-Design processes

    NASA Astrophysics Data System (ADS)

    Schuster, F.; Lomello, F.

    2013-04-01

    Industrial demand for multifunctional components is increasing. Many sectors are concerned, from integrated direct nanoparticle production to emerging combinations that include metal matrix composites (MMC), ductile ceramics and ceramic matrix composites, and polymer matrix composites (PMC) for bulk applications, as well as advanced surface coatings in the automotive, aerospace, energy production and building fields. Moreover, domains with a planetary impact, such as environmental issues, as well as aspects such as health (toxicity) and hazard assessment (ignition and explosion severity), were also taken into account. Nanotechnologies play an important role in promoting innovation in the design and realization of multifunctional products for the future, either by improving usual products or by creating new functions and/or new products. Nevertheless, this evolution in materials can only be promoted by increasing social acceptance, addressing the main technological and economic challenges, and developing safety-oriented processes. Nowadays, a large number of nanoparticle developments are potentially scalable to industrial production. However, doubts remain about the handling safety of the current technologies. For these reasons, the main purpose was to develop self-monitored automation of the production line, coupling different techniques in order to simplify processes such as in-situ growth of nanoparticles in a nanostructured matrix over different substrates and/or nanopowder synthesis, functionalization, dry or wet safe recovery, granulation and consolidation in a single step, while monitoring in real time processing parameters such as powder stoichiometry. Assuring the traceability of the product during its whole life, starting from conception and including R&D, distribution and use, was also considered. The optimization in terms of processing, recovery and conditioning

  18. Superior metallic alloys through rapid solidification processing (RSP) by design

    SciTech Connect

    Flinn, J.E.

    1995-05-01

    Rapid solidification processing using powder atomization methods and the control of minor elements such as oxygen, nitrogen, and carbon can provide metallic alloys with superior properties and performance compared to conventionally processed alloys. Previous studies on nickel- and iron-base superalloys have provided the baseline information needed to properly couple RSP with alloy composition and, therefore, enable alloys to be designed for performance improvements. The RSP approach produces powders, which need to be consolidated into suitable monolithic forms. This normally involves canning, consolidation, and decanning of the powders. Canning/decanning is expensive and raises the fabrication cost significantly above that of conventional, ingot metallurgy production methods. The cost differential can be offset by the superior performance of the RSP metallic alloys. However, without the performance database, it is difficult to convince potential users to adopt the RSP approach. Spray casting of the atomized molten droplets into suitable preforms for subsequent fabrication can be cost-competitive with conventional processing. If the fine and stable microstructural features observed for the RSP approach are preserved during spray casting, a cost-competitive product can be obtained that has superior properties and performance that cannot be obtained by conventional methods.

  19. On the optimal design of the disassembly and recovery processes

    SciTech Connect

    Xanthopoulos, A.; Iakovou, E.

    2009-05-15

    This paper tackles the problem of the optimal design of the recovery processes for end-of-life (EOL) electrical and electronic products, with a special focus on disassembly issues. The objective is to recover as much ecological and economic value as possible, and to reduce the overall quantity of waste produced. In this context, a medium-range tactical problem is defined and a novel two-phase algorithm is presented for a remanufacturing-driven reverse supply chain. In the first phase, we propose a multicriteria/goal-programming analysis for the identification and optimal selection of the most 'desirable' subassemblies and components to be disassembled for recovery from a set of different types of EOL products. In the second phase, a multi-product, multi-period mixed-integer linear programming (MILP) model is presented, which addresses the optimization of the recovery processes while explicitly taking into account the lead times of the disassembly and recovery processes. Moreover, a simulation-based solution approach is proposed for capturing the uncertainties in reverse logistics. The overall approach leads to an easy-to-use methodology that can effectively support middle-level management decisions. Finally, the applicability of the developed methodology is illustrated by its application to a specific case study.
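
    As a toy, single-period cousin of the component-selection step (far simpler than the paper's multi-period MILP), the sketch below uses the PuLP modelling library to pick which subassemblies to disassemble so that recovered value is maximized within a disassembly-time budget; all component data are invented.

```python
# Toy single-period component-selection model (illustrative only).
from pulp import LpProblem, LpMaximize, LpVariable, LpBinary, lpSum

components = {
    # name: (recovered value [euro], disassembly time [min]) -- invented data
    "PCB":        (12.0, 6.0),
    "LCD panel":  (9.0,  8.0),
    "Hard drive": (7.0,  4.0),
    "PSU":        (3.0,  5.0),
    "Housing":    (1.5,  2.0),
}
time_budget = 15.0

prob = LpProblem("eol_component_selection", LpMaximize)
x = {c: LpVariable(f"take_{i}", cat=LpBinary) for i, c in enumerate(components)}

prob += lpSum(components[c][0] * x[c] for c in components)          # value
prob += lpSum(components[c][1] * x[c] for c in components) <= time_budget

prob.solve()
chosen = [c for c in components if x[c].value() == 1]
print("disassemble:", chosen)
```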

  20. Tools for efficient design of multicomponent separation processes

    NASA Astrophysics Data System (ADS)

    Huff, Joshua Lee

    formulation and the relative effect of capital and operating cost is weighed for an example feed. Previous methods based on Underwood's equations do not account for the temperature at which utilities are required. To account for this, a thermodynamic efficiency function is developed which allows the complete search space to be rank-listed in order of the exergy loss occurring within each configuration. Examining these results shows that this objective function favors configurations which move their reboiler and condenser duties to milder-temperature exchangers. A graphical interface is presented which allows any of the above results to be interpreted in a quick and intuitive fashion, complete with system flow and composition data and the ability to filter the complete search space based on numerical and structural criteria. This provides a unique way to compare and contrast configurations and allows considerations such as column retrofit and maximum controllability to be taken into account. Using all five of these screening techniques, the traditional intuition-based methods of separation process design can be augmented with analytical and algorithmic tools which enable selection of a process design with low cost and high efficiency.
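
    The exergy-based ranking can be illustrated with the standard ideal-mixture expressions: the minimum work of separation depends only on the feed composition and the surroundings temperature, and a thermodynamic efficiency can be formed against each configuration's lost work (exergy loss). The feed composition and lost-work values below are invented, and the dissertation's own efficiency function may differ in detail.

```python
# Hedged sketch of exergy-based rank-listing of candidate configurations.
import math

R = 8.314          # J/(mol K)
T0 = 300.0         # K, assumed surroundings temperature

def w_min_per_mol(feed_fractions):
    """Minimum work [J/mol feed] to split an ideal mixture into pure products."""
    return -R * T0 * sum(x * math.log(x) for x in feed_fractions if x > 0.0)

feed = [0.3, 0.3, 0.2, 0.2]                 # hypothetical 4-component feed
wmin = w_min_per_mol(feed)

# Exergy loss (lost work) of three candidate column configurations [J/mol feed].
lost_work = {"config A": 9000.0, "config B": 7200.0, "config C": 11500.0}

for name in sorted(lost_work, key=lost_work.get):
    eta = wmin / (wmin + lost_work[name])   # thermodynamic efficiency
    print(f"{name}: thermodynamic efficiency = {eta:.2%}")
```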

  1. High Throughput Atomic Layer Deposition Processes: High Pressure Operations, New Reactor Designs, and Novel Metal Processing

    NASA Astrophysics Data System (ADS)

    Mousa, MoatazBellah Mahmoud

    Atomic Layer Deposition (ALD) is a vapor-phase nano-coating process that deposits very uniform and conformal thin-film materials with sub-angstrom-level thickness control on various substrates. These unique properties have made ALD a platform technology for numerous products and applications. However, most of these applications are limited to the lab scale due to the low process throughput relative to other deposition techniques, which hinders industrial adoption. In addition to the low throughput, process development for certain applications usually faces other obstacles, such as a required new processing mode (e.g., batch vs. continuous) or process conditions (e.g., low temperature), the absence of an appropriate reactor design for a specific substrate, and sometimes the lack of a suitable chemistry. This dissertation studies different aspects of ALD process development for prospective applications in the semiconductor, textile, and battery industries, as well as novel organic-inorganic hybrid materials. The investigation of a high-pressure, low-temperature ALD process for metal oxide deposition using multiple process chemistries revealed the vital importance of the gas velocity over the substrate for achieving fast deposition under these challenging processing conditions. Also in this work, two unique high-throughput ALD reactor designs are reported. The first is a continuous roll-to-roll ALD reactor for ultra-fast coatings on porous, flexible substrates with very high surface area. The second is an ALD delivery head that allows in loco ALD coatings to be executed under ambient conditions (even outdoors) on large surfaces while still maintaining very high deposition rates. As a proof of concept, part of a parked automobile window was coated using the ALD delivery head. Another process development shown herein is the improvement achieved in the selective synthesis of organic-inorganic materials using an ALD-based process called sequential vapor

  2. Process and Prospects for the Designed Hydrograph, Lower Missouri River

    NASA Astrophysics Data System (ADS)

    Jacobson, R. B.; Galat, D. L.; Hay, C. H.

    2005-05-01

    The flow regime of the Lower Missouri River (LMOR, Gavins Point, SD to St. Louis, MO) is being redesigned to restore elements of natural variability while maintaining project purposes such as power production, flood control, water supply, and navigation. Presently, an experimental hydrograph alteration is planned for spring 2006. As with many large, multi-purpose rivers, the ongoing design process involves negotiation among many management and stakeholder groups. The negotiated process has simplified the hydrograph into two key elements -- the spring rise and the summer low -- with emphasis on the influence of these elements on three threatened or endangered species. The spring rise has been hypothesized to perform three functions: build sandbars for nesting of the interior least tern and piping plover, provide episodic connectivity with the low-lying flood plain, and provide a behavioral spawning cue for the pallid sturgeon. Among these, most emphasis has been placed on the spawning cue because concerns about downstream flood hazards have limited flow magnitudes to those that are thought to be geomorphically ineffective, and channelization and incision provide little opportunity for moderate flows to connect to the flood plain. Our analysis of the natural hydrologic regime provides some insight into possible spring rise design elements, including timing, rate of rise and fall, and length of spring flow pulses. The summer low has been hypothesized to expose sandbars for nesting and to maximize the area of shallow, slow water for rearing of larval and juvenile fish. Re-engineering of the navigation channel to provide greater diversity of habitat during navigation flows has been offered as an alternative to the summer low. Our analysis indicates that re-engineering has the potential to increase habitat availability substantially, but the ecological results are so far unknown. The designed hydrograph that emerges from the multi-objective process will likely represent a

  3. Conceptual Design for the Pilot-Scale Plutonium Oxide Processing Unit in the Radiochemical Processing Laboratory

    SciTech Connect

    Lumetta, Gregg J.; Meier, David E.; Tingey, Joel M.; Casella, Amanda J.; Delegard, Calvin H.; Edwards, Matthew K.; Jones, Susan A.; Rapko, Brian M.

    2014-08-05

    This report describes a conceptual design for a pilot-scale capability to produce plutonium oxide for use as exercise and reference materials, and for use in identifying and validating nuclear forensics signatures associated with plutonium production. This capability is referred to as the Pilot-scale Plutonium oxide Processing Unit (P3U), and it will be located in the Radiochemical Processing Laboratory at the Pacific Northwest National Laboratory. The key unit operations are described, including plutonium dioxide (PuO2) dissolution, purification of the Pu by ion exchange, precipitation, and conversion to oxide by calcination.

  4. Process design and evaluation of production of bioethanol and β-lactam antibiotic from lignocellulosic biomass.

    PubMed

    Kim, Sung Bong; Park, Chulhwan; Kim, Seung Wook

    2014-11-01

    To design biorefinery processes producing bioethanol from lignocellulosic biomass with dilute acid pretreatment, the processes were simulated using the SuperPro Designer program. To improve the efficiency of biomass use and the economics of the biorefinery, additional pretreatment processes were designed and evaluated, in which a combined process of dilute acid and aqueous ammonia pretreatments was used, and waste media containing xylose were used for the production of 7-aminocephalosporanic acid. Finally, the productivity and economics of the designed processes were compared.

  5. Lignocellulosic ethanol: Technology design and its impact on process efficiency.

    PubMed

    Paulova, Leona; Patakova, Petra; Branska, Barbora; Rychtera, Mojmir; Melzoch, Karel

    2015-11-01

    This review provides current information on the production of ethanol from lignocellulosic biomass, with the main focus on relationships between process design and efficiency, expressed as ethanol concentration, yield and productivity. In spite of the unquestionable advantages of lignocellulosic biomass as a feedstock for ethanol production (availability, price, non-competitiveness with food, waste material), many technological bottlenecks hinder its wide industrial application and its competitiveness with 1st-generation ethanol production. Among the main technological challenges are the recalcitrant structure of the material, and thus the need for extensive pretreatment (usually physico-chemical followed by enzymatic hydrolysis) to yield fermentable sugars, and a relatively low concentration of monosaccharides in the medium that hinders the achievement of ethanol concentrations comparable with those obtained using 1st-generation feedstocks (e.g. corn or molasses). The presence of both pentose and hexose sugars in the fermentation broth, the price of cellulolytic enzymes, and the presence of toxic compounds that can inhibit cellulolytic enzymes and microbial producers of ethanol are major issues. In this review, different process configurations of the main technological steps (enzymatic hydrolysis, fermentation of hexose and/or pentose sugars) are discussed and their efficiencies are compared. The main features, benefits and drawbacks of simultaneous saccharification and fermentation (SSF), simultaneous saccharification and fermentation with delayed inoculation (dSSF), consolidated bioprocesses (CBP) combining production of cellulolytic enzymes, hydrolysis of biomass and fermentation into one step, together with an approach combining utilization of both pentose and hexose sugars, are discussed and compared with separate hydrolysis and fermentation (SHF) processes. The impact of individual technological steps on final process efficiency is emphasized and the potential for use
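
    The three efficiency measures named above (ethanol concentration, yield, productivity) follow directly from the fermentation data; the short worked example below uses the stoichiometric maximum of 0.511 g ethanol per g of glucose or xylose, with hypothetical broth numbers.

```python
# Worked example of the efficiency metrics: titre, yield as a fraction of the
# stoichiometric maximum, and volumetric productivity. Broth data are invented.
MAX_YIELD = 0.511          # g ethanol per g glucose or xylose (stoichiometric)

glucose_g_per_l = 60.0     # fermentable sugars in the hydrolysate
xylose_g_per_l = 30.0
ethanol_g_per_l = 38.0     # measured final ethanol concentration
fermentation_h = 48.0

theoretical = (glucose_g_per_l + xylose_g_per_l) * MAX_YIELD
yield_pct = 100.0 * ethanol_g_per_l / theoretical
productivity = ethanol_g_per_l / fermentation_h      # g/(L h)

print(f"ethanol titre        : {ethanol_g_per_l:.1f} g/L")
print(f"yield of theoretical : {yield_pct:.1f} %")           # ~82.6 %
print(f"productivity         : {productivity:.2f} g/(L h)")  # ~0.79
```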

  6. Hairy root culture: bioreactor design and process intensification.

    PubMed

    Stiles, Amanda R; Liu, Chun-Zhao

    2013-01-01

    The cultivation of hairy roots for the production of secondary metabolites offers numerous advantages; hairy roots have a fast growth rate, are genetically stable, and are relatively simple to maintain in phytohormone-free media. Hairy roots provide a continuous source of secondary metabolites, and are useful for the production of chemicals for pharmaceuticals, cosmetics, and food additives. In order for hairy roots to be utilized on a commercial scale, it is necessary to scale up their production. Over the last several decades, significant research has been conducted on the cultivation of hairy roots in various types of bioreactor systems. In this review, we discuss the advantages and disadvantages of various bioreactor systems and the major factors related to large-scale bioreactor cultures, describe process intensification technologies, and give an overview of the mathematical models and computer-aided methods that have been utilized for bioreactor design and development.

  7. fMRI paradigm designing and post-processing tools

    PubMed Central

    James, Jija S; Rajesh, PG; Chandran, Anuvitha VS; Kesavadas, Chandrasekharan

    2014-01-01

    In this article, we first review some aspects of functional magnetic resonance imaging (fMRI) paradigm design for major cognitive functions using stimulus delivery systems like Cogent, E-Prime, Presentation, etc., along with their technical aspects. We also review the stimulus presentation possibilities (block, event-related) for visual or auditory paradigms and their advantages in both clinical and research settings. The second part mainly focuses on various fMRI data post-processing tools such as Statistical Parametric Mapping (SPM) and Brain Voyager, and discusses the particulars of the various preprocessing steps involved (realignment, co-registration, normalization, smoothing) in these software packages, as well as the statistical analysis principles of general linear modeling for final interpretation of a functional activation result. PMID:24851001
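
    As a minimal illustration of the general linear modeling step mentioned above (not the SPM or Brain Voyager implementation), the sketch below builds a boxcar regressor for a block design, convolves it with a canonical-like hemodynamic response function, and estimates the task effect by least squares on a synthetic voxel time series.

```python
# Minimal GLM illustration for a block-design fMRI time series (synthetic data).
import numpy as np

tr, n_scans = 2.0, 120                      # repetition time [s], volumes
frame_times = np.arange(n_scans) * tr

# Block design: 20 s task / 20 s rest.
boxcar = ((frame_times % 40.0) < 20.0).astype(float)

# Simple canonical-like HRF (difference of gamma-shaped curves), sampled at TR.
t = np.arange(0.0, 30.0, tr)
hrf = (t ** 5 * np.exp(-t)) / 120.0 - 0.35 * (t ** 10 * np.exp(-t)) / 3628800.0
regressor = np.convolve(boxcar, hrf)[:n_scans]

# Design matrix: task regressor + constant; synthetic voxel time series.
X = np.column_stack([regressor, np.ones(n_scans)])
rng = np.random.default_rng(1)
y = 2.5 * regressor + 10.0 + rng.normal(0.0, 1.0, n_scans)

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print("estimated task effect:", round(beta[0], 2))   # ~2.5 if the model fits
```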

  8. An advanced microcomputer design for processing of semiconductor materials

    NASA Technical Reports Server (NTRS)

    Bjoern, L.; Lindkvist, L.; Zaar, J.

    1988-01-01

    In the Get Away Special 330 payload, two germanium samples doped with gallium will be processed. The aim of the experiments is to create a planar solid/liquid interface and to study the breakdown of this interface as the crystal growth rate increases. For the experiments a gradient furnace was designed which is heated by resistive heaters. Cooling is provided by circulating gas from the atmosphere in the canister through cooling channels in the furnace. The temperatures along the sample are measured by platinum/rhodium thermocouples. The furnace is controlled by a microcomputer system based on the 80C88 processor. A data acquisition system is integrated into the system. In order to synchronize the different actions in time, a multitask manager is used.

  9. Working on the Boundaries: Philosophies and Practices of the Design Process

    NASA Technical Reports Server (NTRS)

    Ryan, R.; Blair, J.; Townsend, J.; Verderaime, V.

    1996-01-01

    While the systems engineering process is a formal program management technique and is contractually binding, the design process is the informal practice of achieving the design project requirements throughout all design phases of the systems engineering process. The design process and organization are system and component dependent. Informal reviews include technical information meetings and concurrent engineering sessions, while formal technical discipline reviews are conducted through the systems engineering process. This paper discusses and references major philosophical principles in the design process, identifies its role in interacting systems and discipline analyses and integrations, and illustrates the process application in experienced aerostructural designs.

  10. Identifying User Needs and the Participative Design Process

    NASA Astrophysics Data System (ADS)

    Meiland, Franka; Dröes, Rose-Marie; Sävenstedt, Stefan; Bergvall-Kåreborn, Birgitta; Andersson, Anna-Lena

    As the number of persons with dementia increases, and with it the demands on care and support at home, additional solutions to support persons with dementia are needed. The COGKNOW project aims to develop an integrated, user-driven cognitive prosthetic device to help persons with dementia. The project focuses on support in the areas of memory, social contact, daily living activities and feelings of safety. The design process is user-participatory and consists of iterative cycles at three test sites across Europe. In the first cycle, persons with dementia and their carers (n = 17) actively participated in the development process. Based on their prioritized needs and solutions and on their disabilities, and after discussion within the team, a top-four list of Information and Communication Technology (ICT) solutions was drawn up and now serves as the basis for development: in the area of remembering, day and time orientation support, a find-mobile service and a reminding service; in the area of social contact, telephone support by picture dialling; in the area of daily activities, media control support through a music playback and radio function; and finally, in the area of safety, a warning service to indicate when the front door is open and an emergency contact service to enhance feelings of safety. The results of this first project phase show that, in general, the people with mild dementia as well as their carers were able to express and prioritize their (unmet) needs and the kind of technological assistance they preferred in the selected areas. In the next phases it will be tested whether the user-participatory design and multidisciplinary approach employed in the COGKNOW project result in a user-friendly, useful device that positively impacts the autonomy and quality of life of persons with dementia and their carers.

  11. Design Process of an Area-Efficient Photobioreactor

    PubMed Central

    Janssen, Marcel; Tramper, Johannes; Wijffels, René H.

    2008-01-01

    This article describes the design process of the Green Solar Collector (GSC), an area-efficient photobioreactor for the outdoor cultivation of microalgae. The overall goal has been to design a system in which all incident sunlight on the area covered by the reactor is delivered to the algae at such intensities that the light energy can be efficiently used for biomass formation. A statement of goals is formulated and constraints are specified to which the GSC needs to comply. Specifications are generated for a prototype whose form and function achieve the stated goals and satisfy the specified constraints. This results in a design in which sunlight is captured into vertical plastic light guides. Sunlight reflects internally in the guide and eventually scatters out of the light guide into flat-panel photobioreactor compartments. Sunlight is focused on top of the light guides by dual-axis positioning of linear Fresnel lenses. The shape and material of the light guide are such that light is maintained in the guides when surrounded by air. The bottom part of a light guide is sandblasted to obtain a more uniform distribution of light inside the bioreactor compartment and is triangular in shape to ensure the efflux of all light out of the guide. Dimensions of the guide are such that light enters the flat-panel photobioreactor compartment at intensities that can be efficiently used by the biomass present. The integration of light capturing, transportation, distribution and usage is such that high biomass productivities per area can be achieved. PMID:18266033

  12. Design process of an area-efficient photobioreactor.

    PubMed

    Zijffers, Jan-Willem F; Janssen, Marcel; Tramper, Johannes; Wijffels, René H

    2008-01-01

    This article describes the design process of the Green Solar Collector (GSC), an area-efficient photobioreactor for the outdoor cultivation of microalgae. The overall goal has been to design a system in which all incident sunlight on the area covered by the reactor is delivered to the algae at such intensities that the light energy can be efficiently used for biomass formation. A statement of goals is formulated and constraints are specified to which the GSC needs to comply. Specifications are generated for a prototype whose form and function achieve the stated goals and satisfy the specified constraints. This results in a design in which sunlight is captured into vertical plastic light guides. Sunlight reflects internally in the guide and eventually scatters out of the light guide into flat-panel photobioreactor compartments. Sunlight is focused on top of the light guides by dual-axis positioning of linear Fresnel lenses. The shape and material of the light guide are such that light is maintained in the guides when surrounded by air. The bottom part of a light guide is sandblasted to obtain a more uniform distribution of light inside the bioreactor compartment and is triangular in shape to ensure the efflux of all light out of the guide. Dimensions of the guide are such that light enters the flat-panel photobioreactor compartment at intensities that can be efficiently used by the biomass present. The integration of light capturing, transportation, distribution and usage is such that high biomass productivities per area can be achieved. PMID:18266033

  13. Optimum Design Of Addendum Surfaces In Sheet Metal Forming Process

    NASA Astrophysics Data System (ADS)

    Debray, K.; Sun, Z. C.; Radjai, R.; Guo, Y. Q.; Dai, L.; Gu, Y. X.

    2004-06-01

    The design of addendum surfaces in the sheet forming process is very important for product quality, but it is very time-consuming and requires tedious trial-and-error corrections. In this paper, we propose a methodology to automatically generate the addendum surfaces and then optimize them using a forming modelling solver. The surface parameters are taken as design variables and modified in the course of optimization. The finite element mesh is created on the initial addendum surfaces and mapped onto the modified surfaces without a remeshing operation. Feasible Sequential Quadratic Programming (FSQP) is adopted as the optimization algorithm. Two objective functions are used: the first is a thickness function to minimize the thickness variation over the workpiece; the second is an appearance function aiming to avoid scratching defects on the external surfaces of panels. The FSQP is combined with our "Inverse Approach" or "One Step Approach", which is a very fast forming solver. This leads to a very efficient optimization procedure. The present methodology is applied to a square box. The addendum surfaces are characterised by four geometrical variables. The influence of the optimization criteria is studied and discussed.
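
    For illustration only, the loop below mimics the structure of this kind of optimization using SciPy's SLSQP solver in place of FSQP, with a placeholder objective standing in for the fast forming solver. The four bounded variables echo the four geometrical parameters of the addendum surfaces, but the objective function, starting point and bounds are invented.

```python
# Hedged sketch of the optimization loop described above. FSQP is not used here;
# SciPy's SLSQP stands in for it, and the "forming solver" is a placeholder
# function, not the Inverse Approach cited in the abstract.
import numpy as np
from scipy.optimize import minimize

def thickness_variation(params):
    """Placeholder for the fast forming solver: returns a scalar measure of
    thickness variation for a given set of 4 addendum-surface parameters."""
    target = np.array([10.0, 5.0, 2.0, 30.0])   # hypothetical 'ideal' geometry
    return float(np.sum((params - target) ** 2))

x0 = np.array([12.0, 4.0, 3.0, 25.0])           # initial addendum-surface parameters
bounds = [(8.0, 15.0), (3.0, 8.0), (1.0, 4.0), (20.0, 40.0)]
result = minimize(thickness_variation, x0, method="SLSQP", bounds=bounds)
print(result.x, result.fun)
```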

  14. Design of dual working electrodes for concentration process in metalloimmunoassay.

    PubMed

    Hori, Nobuyasu; Chikae, Miyuki; Kirimura, Hiroya; Takamura, Yuzuru

    2016-10-01

    Electrochemical immunosensing, particularly through a metalloimmunoassay, is a promising approach for the development of point-of-care (POC) diagnostic devices. This study investigated the structure of dual working electrodes (W1 and W2), used in a silver nanoparticle-labeled sandwich-type immunoassay and silver concentration process, paying special attention to the position of W1 relative to W2. New structures of the dual working electrodes were fabricated for efficient silver concentration and evaluated experimentally, which showed that the duration of prereduction before current measurement decreased from 480 s to 300 s when the layout of W1 was changed from one line to two lines or six parts. The experimental results were also compared with numerical simulations based on three-dimensional diffusion, and the prereduction step approximately followed the three-dimensional diffusion equation. Using numerical simulations, ideal structures of the dual working electrodes were designed based on relationships between the structures and the duration of prereduction or the LOD. In the case of 36 lines at an area ratio of W1 to W1 + W2 of 1 to 10, the prereduction duration decreased to 96 s. The dual working electrodes designed in this study promise to shorten the total analysis time and lower the LOD for POC diagnostics. PMID:27572238

  15. Process and reactor design for biophotolytic hydrogen production.

    PubMed

    Tamburic, Bojan; Dechatiwongse, Pongsathorn; Zemichael, Fessehaye W; Maitland, Geoffrey C; Hellgardt, Klaus

    2013-07-14

    The green alga Chlamydomonas reinhardtii has the ability to produce molecular hydrogen (H2), a clean and renewable fuel, through the biophotolysis of water under sulphur-deprived anaerobic conditions. The aim of this study was to advance the development of a practical and scalable biophotolytic H2 production process. Experiments were carried out using a purpose-built flat-plate photobioreactor, designed to facilitate green algal H2 production at the laboratory scale and equipped with a membrane-inlet mass spectrometry system to accurately measure H2 production rates in real time. The nutrient control method of sulphur deprivation was used to achieve spontaneous H2 production following algal growth. Sulphur dilution and sulphur feed techniques were used to extend algal lifetime in order to increase the duration of H2 production. The sulphur dilution technique proved effective at encouraging cyclic H2 production, resulting in alternating Chlamydomonas reinhardtii recovery and H2 production stages. The sulphur feed technique enabled photobioreactor operation in chemostat mode, resulting in a small improvement in H2 production duration. A conceptual design for a large-scale photobioreactor was proposed based on these experimental results. This photobioreactor has the capacity to enable continuous and economical H2 and biomass production using green algae. The success of these complementary approaches demonstrates that engineering advances can lead to improvements in the scalability and affordability of biophotolytic H2 production, giving increased confidence that H2 can fulfil its potential as a sustainable fuel of the future. PMID:23689756

  16. Design process for NIF laser alignment and beam diagnostics

    SciTech Connect

    Grey, A., LLNL

    1998-06-09

    In a controller for an adaptive optic system designed to correct phase aberrations in a high power laser, the wavefront sensor is a discrete Hartmann-Shack design. It uses an array of lenslets (like a fly's eye) to focus the laser into 77 spots on a CCD camera. The average local tilt of the wavefront across each lenslet changes the position of its focal spot. The system requires 0.1 pixel accuracy in determining the focal spot location. We determine a small area around each spot's previous location. Within this area, we calculate the centroid of the light intensity in x and y. This calculation fails if the spot regions overlap. Especially during initial acquisition of a highly distorted beam, distinguishing overlapping spots is difficult. However, low resolution analysis of the overlapping spots allows the system to estimate their positions. With this estimate, it can use the deformable mirror to correct the beam enough so we can detect the spots using conventional image processing.
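
    A minimal sketch of the sub-pixel centroid step described above (illustrative Python, not the LLNL code): within a small window around a spot's previous location, the intensity-weighted centroid gives the focal-spot position to a fraction of a pixel. The window size, synthetic frame and spot parameters below are hypothetical.

```python
# Sketch of the sub-pixel centroid step described above (illustrative only):
# within a small window around a spot's previous location, compute the
# intensity-weighted centroid in x and y.
import numpy as np

def spot_centroid(image, x_prev, y_prev, half_width=5):
    """Return sub-pixel (x, y) centroid of intensity in a window around (x_prev, y_prev)."""
    y0, y1 = y_prev - half_width, y_prev + half_width + 1
    x0, x1 = x_prev - half_width, x_prev + half_width + 1
    window = image[y0:y1, x0:x1].astype(float)
    ys, xs = np.mgrid[y0:y1, x0:x1]
    total = window.sum()
    return (xs * window).sum() / total, (ys * window).sum() / total

# Synthetic 64x64 frame with one Gaussian spot near (30.3, 21.7)
yy, xx = np.mgrid[0:64, 0:64]
frame = np.exp(-((xx - 30.3) ** 2 + (yy - 21.7) ** 2) / 4.0)
print(spot_centroid(frame, x_prev=30, y_prev=22))
```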

  17. Using experimental design modules for process characterization in manufacturing/materials processes laboratories

    NASA Technical Reports Server (NTRS)

    Ankenman, Bruce; Ermer, Donald; Clum, James A.

    1994-01-01

    Modules dealing with statistical experimental design (SED), process modeling and improvement, and response surface methods have been developed and tested in two laboratory courses. One course was a manufacturing processes course in Mechanical Engineering and the other was a materials processing course in Materials Science and Engineering. Each module is used as an 'experiment' in the course, with the intent that subsequent course experiments will use SED methods for analysis and interpretation of data. Evaluation of the modules' effectiveness has been done by both survey questionnaires and inclusion of the module methodology in course examination questions. Results of the evaluation have been very positive. Those evaluation results and details of the modules' content and implementation are presented. The modules represent an important component for updating laboratory instruction and for providing training in quality for improved engineering practice.

  18. HYBRID SULFUR PROCESS REFERENCE DESIGN AND COST ANALYSIS

    SciTech Connect

    Gorensek, M.; Summers, W.; Boltrunis, C.; Lahoda, E.; Allen, D.; Greyvenstein, R.

    2009-05-12

    This report documents a detailed study to determine the expected efficiency and product costs for producing hydrogen via water-splitting using energy from an advanced nuclear reactor. It was determined that the overall efficiency from nuclear heat to hydrogen is high, and the cost of hydrogen is competitive under a high energy cost scenario. It would require over 40% more nuclear energy to generate an equivalent amount of hydrogen using conventional water-cooled nuclear reactors combined with water electrolysis compared to the proposed plant design described herein. There is a great deal of interest worldwide in reducing dependence on fossil fuels, while also minimizing the impact of the energy sector on global climate change. One potential opportunity to contribute to this effort is to replace the use of fossil fuels for hydrogen production by the use of water-splitting powered by nuclear energy. Hydrogen production is required for fertilizer (e.g. ammonia) production, oil refining, synfuels production, and other important industrial applications. It is typically produced by reacting natural gas, naphtha or coal with steam, which consumes significant amounts of energy and produces carbon dioxide as a byproduct. In the future, hydrogen could also be used as a transportation fuel, replacing petroleum. New processes are being developed that would permit hydrogen to be produced from water using only heat or a combination of heat and electricity produced by advanced, high temperature nuclear reactors. The U.S. Department of Energy (DOE) is developing these processes under a program known as the Nuclear Hydrogen Initiative (NHI). The Republic of South Africa (RSA) also is interested in developing advanced high temperature nuclear reactors and related chemical processes that could produce hydrogen fuel via water-splitting. This report focuses on the analysis of a nuclear hydrogen production system that combines the Pebble Bed Modular Reactor (PBMR), under development by

  19. [New process modeling, design and control strategies for energy efficiency, high product quality and improved productivity in the process industries

    SciTech Connect

    Not Available

    1991-12-31

    Highlights are reported of work to date on: resilient design and control of chemical reactors (polymerization, packed bed), operation of complex processing systems (compensators for multivariable systems with delays and Right Half Plane zeroes, process identification and controller design for multivariable systems, nonlinear systems control, distributed parameter systems), and computer-aided design software (CONSYD, POLYRED, expert systems). 15 figs, 54 refs. (DLC)

  20. (New process modeling, design and control strategies for energy efficiency, high product quality and improved productivity in the process industries)

    SciTech Connect

    Not Available

    1991-01-01

    Highlights are reported of work to date on: resilient design and control of chemical reactors (polymerization, packed bed), operation of complex processing systems (compensators for multivariable systems with delays and Right Half Plane zeroes, process identification and controller design for multivariable systems, nonlinear systems control, distributed parameter systems), and computer-aided design software (CONSYD, POLYRED, expert systems). 15 figs, 54 refs. (DLC)

  1. Universal Design in Postsecondary Education: Process, Principles, and Applications

    ERIC Educational Resources Information Center

    Burgstahler, Sheryl

    2009-01-01

    Designing any product or environment involves the consideration of many factors, including aesthetics, engineering options, environmental issues, safety concerns, industry standards, and cost. Typically, designers focus their attention on the average user. In contrast, universal design (UD), according to the Center for Universal Design, "is the…

  2. A Concept Transformation Learning Model for Architectural Design Learning Process

    ERIC Educational Resources Information Center

    Wu, Yun-Wu; Weng, Kuo-Hua; Young, Li-Ming

    2016-01-01

    Generally, in the foundation course of architectural design, much emphasis is placed on teaching of the basic design skills without focusing on teaching students to apply the basic design concepts in their architectural designs or promoting students' own creativity. Therefore, this study aims to propose a concept transformation learning model to…

  3. Design of a tomato packing system by image processing and optimization processing

    NASA Astrophysics Data System (ADS)

    Li, K.; Kumazaki, T.; Saigusa, M.

    2016-02-01

    In recent years, with the development of environmental control systems in plant factories, tomato production has rapidly increased in Japan. However, with the decline in the availability of agricultural labor, there is a need to automate grading, sorting and packing operations. In this research, we designed an automatic packing program with which tomato weight can be estimated by image processing and the tomatoes can then be packed in an optimized configuration. The weight was estimated using pixel area properties after L*a*b* color model conversion, noise rejection, hole filling and boundary preprocessing. The packing optimization program was designed around a 0-1 knapsack algorithm for dynamic combinatorial optimization.
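
    The 0-1 knapsack step can be sketched in a few lines. The example below (hypothetical weights in grams, illustrative only) chooses the subset of tomatoes whose estimated weights best fill a target pack weight without exceeding it, using the standard dynamic-programming formulation.

```python
# Sketch of the 0-1 knapsack packing step named above (illustrative only):
# choose a subset of tomatoes whose estimated weights best fill a target pack
# weight without exceeding it. Weights are in grams and are hypothetical.

def pack_tomatoes(weights, capacity):
    """Classic 0-1 knapsack DP where each item's 'value' equals its weight."""
    best = [0] * (capacity + 1)                 # best achievable weight for each capacity
    choice = [[False] * (capacity + 1) for _ in weights]
    for i, w in enumerate(weights):
        for c in range(capacity, w - 1, -1):    # iterate downward so each item is used once
            if best[c - w] + w > best[c]:
                best[c] = best[c - w] + w
                choice[i][c] = True
    # Backtrack to recover which tomatoes were packed
    packed, c = [], capacity
    for i in range(len(weights) - 1, -1, -1):
        if choice[i][c]:
            packed.append(i)
            c -= weights[i]
    return best[capacity], packed

print(pack_tomatoes([152, 148, 171, 133, 160, 145], capacity=600))
```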

  4. The design of a distributed image processing and dissemination system

    SciTech Connect

    Rafferty, P.; Hower, L.

    1990-01-01

    The design and implementation of a distributed image processing and dissemination system was undertaken and accomplished as part of a prototype communication and intelligence (CI) system, the contingency support system (CSS), which is intended to support contingency operations of the Tactical Air Command. The system consists of six (6) Sun 3/180C workstations with integrated ITEX image processors and three (3) 3/50 diskless workstations located at four (4) system nodes (INEL, base, and mobiles). All 3/180C workstations are capable of image system server functions, whereas the 3/50s are image system clients only. Distribution is accomplished via both local and wide area networks using standard Defense Data Network (DDN) protocols (i.e., TCP/IP, et al.) and Defense Satellite Communication Systems (DSCS) compatible SHF Transportable Satellite Earth Terminals (TSET). Image applications utilize Sun's Remote Procedure Call (RPC) to facilitate the image system client and server relationships. The system provides functions to acquire, display, annotate, process, transfer, and manage images via an icon, panel, and menu oriented SunView (trademark) based user interface. Image spatial resolution is 512 × 480 with 8 bits/pixel black and white and 12/24 bits/pixel color, depending on system configuration. Compression is used during various image display and transmission functions to reduce the dynamic range of image data to 12/6/3/2 bits/pixel, depending on the application. Image acquisition is accomplished in real-time or near-real-time by special purpose ITEX image hardware. As a result, all image displays are highly interactive, with attention given to subsecond response time. 3 refs., 7 figs.

  5. Critical Zone Experimental Design to Assess Soil Processes and Function

    NASA Astrophysics Data System (ADS)

    Banwart, Steve

    2010-05-01

    experimental design studies soil processes across the temporal evolution of the soil profile, from its formation on bare bedrock, through managed use as productive land to its degradation under longstanding pressures from intensive land use. To understand this conceptual life cycle of soil, we have selected 4 European field sites as Critical Zone Observatories. These are to provide data sets of soil parameters, processes and functions which will be incorporated into the mathematical models. The field sites are 1) the BigLink field station which is located in the chronosequence of the Damma Glacier forefield in alpine Switzerland and is established to study the initial stages of soil development on bedrock; 2) the Lysina Catchment in the Czech Republic which is representative of productive soils managed for intensive forestry, 3) the Fuchsenbigl Field Station in Austria which is an agricultural research site that is representative of productive soils managed as arable land and 4) the Koiliaris Catchment in Crete, Greece which represents degraded Mediterranean region soils, heavily impacted by centuries of intensive grazing and farming, under severe risk of desertification.

  6. Design of the Laboratory-Scale Plutonium Oxide Processing Unit in the Radiochemical Processing Laboratory

    SciTech Connect

    Lumetta, Gregg J.; Meier, David E.; Tingey, Joel M.; Casella, Amanda J.; Delegard, Calvin H.; Edwards, Matthew K.; Orton, Robert D.; Rapko, Brian M.; Smart, John E.

    2015-05-01

    This report describes a design for a laboratory-scale capability to produce plutonium oxide (PuO2) for use in identifying and validating nuclear forensics signatures associated with plutonium production, as well as for use as exercise and reference materials. This capability will be located in the Radiochemical Processing Laboratory at the Pacific Northwest National Laboratory. The key unit operations are described, including PuO2 dissolution, purification of the Pu by ion exchange, precipitation, and re-conversion to PuO2 by calcination.

  7. Multidisciplinary Design Optimization (MDO) Methods: Their Synergy with Computer Technology in Design Process

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw

    1998-01-01

    The paper identifies speed, agility, human interface, generation of sensitivity information, task decomposition, and data transmission (including storage) as important attributes for a computer environment to have in order to support engineering design effectively. It is argued that when examined in terms of these attributes the presently available environment can be shown to be inadequate; a radical improvement is needed, and it may be achieved by combining new methods that have recently emerged from multidisciplinary design optimization (MDO) with massively parallel processing computer technology. The caveat is that, for successful use of that technology in engineering computing, new paradigms for computing will have to be developed - specifically, innovative algorithms that are intrinsically parallel so that their performance scales up linearly with the number of processors. It may be speculated that the idea of simulating a complex behavior by interaction of a large number of very simple models may be an inspiration for the above algorithms; the cellular automata are an example. Because of the long lead time needed to develop and mature new paradigms, development should begin now, even though the widespread availability of massively parallel processing is still a few years away.

  8. Multidisciplinary Design Optimisation (MDO) Methods: Their Synergy with Computer Technology in the Design Process

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw

    1999-01-01

    The paper identifies speed, agility, human interface, generation of sensitivity information, task decomposition, and data transmission (including storage) as important attributes for a computer environment to have in order to support engineering design effectively. It is argued that when examined in terms of these attributes the presently available environment can be shown to be inadequate. A radical improvement is needed, and it may be achieved by combining new methods that have recently emerged from multidisciplinary design optimisation (MDO) with massively parallel processing computer technology. The caveat is that, for successful use of that technology in engineering computing, new paradigms for computing will have to be developed - specifically, innovative algorithms that are intrinsically parallel so that their performance scales up linearly with the number of processors. It may be speculated that the idea of simulating a complex behaviour by interaction of a large number of very simple models may be an inspiration for the above algorithms; the cellular automata are an example. Because of the long lead time needed to develop and mature new paradigms, development should begin now, even though the widespread availability of massively parallel processing is still a few years away.
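
    As a toy illustration of the "many very simple models" idea invoked above (not taken from the paper), the sketch below steps a one-dimensional elementary cellular automaton: each cell updates from only its immediate neighbourhood, so a whole generation can in principle be computed for all cells independently and in parallel.

```python
# Illustrative sketch of the cellular-automaton idea cited above (not from the paper):
# every cell is a trivially simple model updated only from its local neighbourhood,
# so one generation can be computed for all cells independently and in parallel.
import numpy as np

def step(cells, rule=30):
    """One synchronous update of a 1D elementary cellular automaton."""
    left = np.roll(cells, 1)
    right = np.roll(cells, -1)
    pattern = 4 * left + 2 * cells + right          # neighbourhood code 0..7
    table = (rule >> np.arange(8)) & 1              # rule bits form the lookup table
    return table[pattern]

cells = np.zeros(64, dtype=int)
cells[32] = 1                                       # single seed cell
for _ in range(10):
    cells = step(cells)
print(cells.sum())                                  # number of live cells after 10 steps
```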

  9. Optimal design of the satellite constellation arrangement reconfiguration process

    NASA Astrophysics Data System (ADS)

    Fakoor, Mahdi; Bakhtiari, Majid; Soleymani, Mahshid

    2016-08-01

    In this article, a novel approach is introduced for satellite constellation reconfiguration based on Lambert's theorem. Several critical problems arise in the reconfiguration phase, such as minimization of the overall fuel cost, collision avoidance between the satellites on the final orbital pattern, and the maneuvers necessary for the satellites to be deployed in the desired positions on the target constellation. To implement the reconfiguration phase of the satellite constellation arrangement at minimal cost, the hybrid Invasive Weed Optimization/Particle Swarm Optimization (IWO/PSO) algorithm is used to design sub-optimal transfer orbits for the satellites existing in the constellation. The problem is also modeled in such a way that optimal assignment of the satellites to the initial and target orbits and optimal orbital transfer are combined in one step. Finally, we claim that the presented idea, i.e. coupled non-simultaneous flight of satellites from the initial orbital pattern, leads to minimal cost. The obtained results show that by employing the presented method, the cost of the reconfiguration process is clearly reduced.
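
    As a rough, illustrative sketch of the metaheuristic side only, the loop below runs a plain particle swarm (not the hybrid IWO/PSO of the paper) over a made-up "transfer cost" function standing in for Lambert-based delta-v; no orbital dynamics is included. It only shows how candidate reconfiguration parameters would be driven toward minimal cost.

```python
# Heavily simplified sketch of the optimization side of the approach above: a plain
# particle swarm (not the hybrid IWO/PSO of the paper) minimizing a placeholder
# "transfer cost" function; no Lambert solver or orbital dynamics is included.
import numpy as np

rng = np.random.default_rng(1)

def transfer_cost(x):
    """Hypothetical stand-in for the total delta-v of a reconfiguration plan."""
    return np.sum((x - 0.3) ** 2, axis=-1)

n_particles, n_dims, iters = 30, 4, 200
pos = rng.uniform(-1.0, 1.0, (n_particles, n_dims))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_cost = transfer_cost(pbest)
gbest = pbest[np.argmin(pbest_cost)].copy()

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, n_dims))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    cost = transfer_cost(pos)
    improved = cost < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], cost[improved]
    gbest = pbest[np.argmin(pbest_cost)].copy()

print(gbest, pbest_cost.min())
```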

  10. An acceptable role for computers in the aircraft design process

    NASA Technical Reports Server (NTRS)

    Gregory, T. J.; Roberts, L.

    1980-01-01

    Some of the reasons why the computerization trend is not wholly accepted are explored for two typical cases: computer use in the technical specialties and computer use in aircraft synthesis. The factors that limit acceptance are traced, in part, to the large resources needed to understand the details of computer programs, the inability to include measured data as input to many of the theoretical programs, and the presentation of final results without supporting intermediate answers. Other factors are due solely to technical issues such as limited detail in aircraft synthesis and major simplifying assumptions in the technical specialties. These factors and others can be influenced by the technical specialist and aircraft designer. Some of these factors may become less significant as the computerization process evolves, but some issues, such as understanding large integrated systems, may remain issues in the future. Suggestions for improved acceptance include publishing computer programs so that they may be reviewed, edited, and read. Other mechanisms include extensive modularization of programs and ways to include measured information as part of the input to theoretical approaches.

  11. Type-2 fuzzy model based controller design for neutralization processes.

    PubMed

    Kumbasar, Tufan; Eksin, Ibrahim; Guzelkaya, Mujde; Yesil, Engin

    2012-03-01

    In this study, an inverse controller based on a type-2 fuzzy model control design strategy is introduced, and this main controller is embedded within an internal model control structure. The overall proposed control structure is then implemented in a pH neutralization experimental setup. The inverse fuzzy control signal generation is handled as an optimization problem and solved online at each sampling time. Although inverse fuzzy model controllers may produce perfect control in the case of a perfect model match and/or the absence of disturbances, such open-loop control would not be sufficient in the presence of modeling mismatches or disturbances. Therefore, an internal model control structure, in which the basic controller is an inverse type-2 fuzzy model, is proposed to compensate for these errors and overcome this deficiency. This feature improves the closed-loop disturbance rejection performance, as shown through the real-time control of the pH neutralization process. Experimental results demonstrate the superiority of the inverse type-2 fuzzy model controller structure compared to the inverse type-1 fuzzy model controller and conventional control structures. PMID:22036014

  12. Articulating the Resources for Business Process Analysis and Design

    ERIC Educational Resources Information Center

    Jin, Yulong

    2012-01-01

    Effective process analysis and modeling are important phases of the business process management lifecycle. When many activities and multiple resources are involved, it is very difficult to build a correct business process specification. This dissertation provides a resource perspective of business processes. It aims at a better process analysis…

  13. Process and Design: Applying Technical Writing Theory in the Writing Classroom.

    ERIC Educational Resources Information Center

    Jenkins, Robin David

    1987-01-01

    Suggests that technical writing theory, which views the writing process as a process of design, can be applied in the writing classroom. Presents several strategies for teaching design, including teaching editing by levels, making better assignments, and stressing organization. (MM)

  14. DECISION SUPPORT SYSTEM TO ENHANCE AND ENCOURAGE SUSTAINABLE CHEMICAL PROCESS DESIGN

    EPA Science Inventory

    There is an opportunity to minimize the potential environmental impacts (PEIs) of industrial chemical processes by providing process designers with timely data and models elucidating environmentally favorable design options. The second generation of the Waste Reduction (WAR) algo...

  15. Predictive Modeling in Plasma Reactor and Process Design

    NASA Technical Reports Server (NTRS)

    Hash, D. B.; Bose, D.; Govindan, T. R.; Meyyappan, M.; Arnold, James O. (Technical Monitor)

    1997-01-01

    Research continues toward the improvement and increased understanding of high-density plasma tools. Such reactor systems are lauded for their independent control of ion flux and energy, enabling high etch rates with low ion damage, and for their improved ion velocity anisotropy resulting from thin collisionless sheaths and low neutral pressures. Still, with the transition to 300 mm processing, achieving etch uniformity and high etch rates concurrently may be a formidable task for such large diameter wafers, and computational modeling can play an important role in successful reactor and process design. The inductively coupled plasma (ICP) reactor is the focus of the present investigation. The present work attempts to understand the fundamental physical phenomena of such systems through computational modeling. Simulations will be presented using both computational fluid dynamics (CFD) techniques and the direct simulation Monte Carlo (DSMC) method for argon and chlorine discharges. ICP reactors generally operate at pressures on the order of 1 to 10 mTorr. At such low pressures, rarefaction can be significant to the degree that the constitutive relations used in typical CFD techniques become invalid and a particle simulation must be employed. This work will assess the extent to which CFD can be applied and evaluate the degree to which accuracy is lost in predicting the phenomenon of interest, i.e., etch rate. If the CFD approach is found reasonably accurate and benchmarked against DSMC and experimental results, it has the potential to serve as a design tool because of its rapid turnaround relative to DSMC. The continuum CFD simulation solves the governing equations for plasma flow using a finite difference technique with an implicit Gauss-Seidel Line Relaxation method for time marching toward a converged solution. The equation set consists of mass conservation for each species, separate energy equations for the electrons and heavy species, and momentum equations for the gas
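
    As a greatly simplified illustration of the relaxation idea mentioned above (point Gauss-Seidel sweeps on a model 2D Laplace problem, rather than the coupled plasma equations with line relaxation), the sketch below updates the solution in place until the largest change per sweep falls below a tolerance.

```python
# Much-simplified illustration of the relaxation idea mentioned above: point
# Gauss-Seidel sweeps on a 2D Laplace problem (the real solver couples plasma
# continuity, energy and momentum equations and uses line relaxation).
import numpy as np

n = 32
phi = np.zeros((n, n))
phi[0, :] = 1.0                        # fixed boundary value on one edge

for sweep in range(500):
    max_change = 0.0
    for i in range(1, n - 1):
        for j in range(1, n - 1):
            new = 0.25 * (phi[i + 1, j] + phi[i - 1, j] + phi[i, j + 1] + phi[i, j - 1])
            max_change = max(max_change, abs(new - phi[i, j]))
            phi[i, j] = new            # in-place update is what makes it Gauss-Seidel
    if max_change < 1e-5:              # simple convergence check
        break

print(sweep, phi[n // 2, n // 2])
```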

  16. Singlet oxygen sensitizing materials based on porous silicone: photochemical characterization, effect of dye reloading and application to water disinfection with solar reactors.

    PubMed

    Manjón, Francisco; Santana-Magaña, Montserrat; García-Fresnadillo, David; Orellana, Guillermo

    2010-06-01

    Photogeneration of singlet molecular oxygen ((1)O(2)) is applied to organic synthesis (photooxidations), atmosphere/water treatment (disinfection), antibiofouling materials and in photodynamic therapy of cancer. In this paper, (1)O(2) photosensitizing materials containing the dyes tris(4,4'-diphenyl-2,2'-bipyridine)ruthenium(II) (1, RDB(2+)) or tris(4,7-diphenyl-1,10-phenanthroline)ruthenium(II) (2, RDP(2+)), immobilized on porous silicone (abbreviated RDB/pSil and RDP/pSil), have been produced and tested for waterborne Enterococcus faecalis inactivation using a laboratory solar simulator and a compound parabolic collector (CPC)-based solar photoreactor. In order to investigate the feasibility of its reuse, the sunlight-exposed RDP/pSil sensitizing material (RDP/pSil-a) has been reloaded with RDP(2+) (RDP/pSil-r). Surprisingly, results for bacteria inactivation with the reloaded material have demonstrated a 4-fold higher efficiency compared to those of either RDP/pSil-a, unused RDB/pSil and the original RDP/pSil. Surface and bulk photochemical characterization of the new material (RDP/pSil-r) has shown that the bactericidal efficiency enhancement is due to aggregation of the silicone-supported photosensitizer on the surface of the polymer, as evidenced by confocal fluorescence lifetime imaging microscopy (FLIM). Photogenerated (1)O(2) lifetimes in the wet sensitizer-doped silicone have been determined to be ten times longer than in water. These facts, together with the water rheology in the solar reactor and the interfacial production of the biocidal species, account for the more effective disinfection observed with the reloaded photosensitizing material. These results extend and improve the operational lifetime of photocatalytic materials for point-of-use (1)O(2)-mediated solar water disinfection.

  17. Singlet oxygen sensitizing materials based on porous silicone: photochemical characterization, effect of dye reloading and application to water disinfection with solar reactors.

    PubMed

    Manjón, Francisco; Santana-Magaña, Montserrat; García-Fresnadillo, David; Orellana, Guillermo

    2010-06-01

    Photogeneration of singlet molecular oxygen ((1)O(2)) is applied to organic synthesis (photooxidations), atmosphere/water treatment (disinfection), antibiofouling materials and in photodynamic therapy of cancer. In this paper, (1)O(2) photosensitizing materials containing the dyes tris(4,4'-diphenyl-2,2'-bipyridine)ruthenium(II) (1, RDB(2+)) or tris(4,7-diphenyl-1,10-phenanthroline)ruthenium(II) (2, RDP(2+)), immobilized on porous silicone (abbreviated RDB/pSil and RDP/pSil), have been produced and tested for waterborne Enterococcus faecalis inactivation using a laboratory solar simulator and a compound parabolic collector (CPC)-based solar photoreactor. In order to investigate the feasibility of its reuse, the sunlight-exposed RDP/pSil sensitizing material (RDP/pSil-a) has been reloaded with RDP(2+) (RDP/pSil-r). Surprisingly, results for bacteria inactivation with the reloaded material have demonstrated a 4-fold higher efficiency compared to those of either RDP/pSil-a, unused RDB/pSil and the original RDP/pSil. Surface and bulk photochemical characterization of the new material (RDP/pSil-r) has shown that the bactericidal efficiency enhancement is due to aggregation of the silicone-supported photosensitizer on the surface of the polymer, as evidenced by confocal fluorescence lifetime imaging microscopy (FLIM). Photogenerated (1)O(2) lifetimes in the wet sensitizer-doped silicone have been determined to be ten times longer than in water. These facts, together with the water rheology in the solar reactor and the interfacial production of the biocidal species, account for the more effective disinfection observed with the reloaded photosensitizing material. These results extend and improve the operational lifetime of photocatalytic materials for point-of-use (1)O(2)-mediated solar water disinfection. PMID:20393668

  18. The Changing Metropolitan Designation Process and Rural America

    ERIC Educational Resources Information Center

    Slifkin, Rebecca T.; Randolph, Randy; Ricketts, Thomas C.

    2004-01-01

    In June 2003, the Office of Management and Budget (OMB) released new county-based designations of Core Based Statistical Areas (CBSAs), replacing Metropolitan Statistical Area designations that were last revised in 1990. In this article, the new designations are briefly described, and counties that have changed classifications are identified.…

  19. Reliability and the design process at Honeywell Avionics Division

    NASA Technical Reports Server (NTRS)

    Bezat, A.

    1981-01-01

    The division's philosophy for designed-in reliability and a comparison of reliability programs for space, manned military aircraft, and commercial aircraft, are presented. Topics include: the reliability interface with design and production; the concept phase through final proposal; the design, development, test and evaluation phase; the production phase; and the commonality among space, military, and commercial avionics.

  20. Human Factors Inputs to the Training Device Design Process.

    ERIC Educational Resources Information Center

    Smode, Alfred F.

    Guidelines are presented for achieving human factors inputs to the design of synthetic training systems. A method is developed for design and organization of training concepts and data supportive to the human factors specialist in deriving the functional specifications for the design of any complex training device. Three major sections are…

  1. Measuring the development process: A tool for software design evaluation

    NASA Technical Reports Server (NTRS)

    Moy, S. S.

    1980-01-01

    The design metrics evaluator (DME), a component of an automated software design analysis system, is described. The DME quantitatively evaluates software design attributes. Its use directs attention to areas of a procedure, module, or complete program having a high potential for error.

  2. Using GREENSCOPE Indicators for Sustainable Computer-Aided Process Evaluation and Design

    EPA Science Inventory

    Manufacturing sustainability can be increased by educating those who design, construct, and operate facilities, and by using appropriate tools for process evaluation and design. The U.S. Environmental Protection Agency's GREENSCOPE methodology and tool, for evaluation and design ...

  3. A novel process for recovery of fermentation-derived succinic acid: process design and economic analysis.

    PubMed

    Orjuela, Alvaro; Orjuela, Andrea; Lira, Carl T; Miller, Dennis J

    2013-07-01

    Recovery and purification of organic acids produced in fermentation constitute a significant fraction of total production cost. In this paper, the design and economic analysis of a process to recover succinic acid (SA) via dissolution and acidification of succinate salts in ethanol, followed by reactive distillation to form succinate esters, is presented. Process simulation was performed for a range of plant capacities (13-55 million kg/yr SA) and SA fermentation titers (50-100 kg/m(3)). Economics were evaluated for a recovery system installed within an existing fermentation facility producing succinate salts at a cost of $0.66/kg SA. For an SA processing capacity of 54.9 million kg/yr and a titer of 100 kg/m(3) SA, the model predicts a capital investment of $75 million and a net processing cost of $1.85 per kg SA. The required selling price of diethyl succinate for a 30% annual return on investment is $1.57 per kg.
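
    For orientation only, the back-of-envelope sketch below shows how a required selling price follows from a processing cost plus a target annual return on capital. It is not the paper's economic model: the paper's $1.57/kg figure is on a diethyl succinate mass basis and rests on a full process simulation, whereas this sketch ignores the SA-to-ester conversion, taxes and depreciation and simply reuses the capital, capacity and processing-cost figures quoted above.

```python
# Back-of-envelope sketch of the economic figure of merit discussed above. It only
# illustrates how a required selling price follows from processing cost plus a
# target return on capital; it is NOT the paper's economic model (taxes, ester
# mass conversion, depreciation, etc. are ignored).

def required_selling_price(capital_usd, annual_output_kg, processing_cost_per_kg, annual_roi):
    """Price per kg needed so that the margin over processing cost returns annual_roi on capital."""
    required_margin = annual_roi * capital_usd / annual_output_kg
    return processing_cost_per_kg + required_margin

# Capital, capacity and processing cost taken from the abstract; 30% ROI target.
print(required_selling_price(capital_usd=75e6,
                             annual_output_kg=54.9e6,
                             processing_cost_per_kg=1.85,
                             annual_roi=0.30))
```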

  4. Design, experimentation, and modeling of a novel continuous biodrying process

    NASA Astrophysics Data System (ADS)

    Navaee-Ardeh, Shahram

    Massive production of sludge in the pulp and paper industry has made the effective sludge management increasingly a critical issue for the industry due to high landfill and transportation costs, and complex regulatory frameworks for options such as sludge landspreading and composting. Sludge dewatering challenges are exacerbated at many mills due to improved in-plant fiber recovery coupled with increased production of secondary sludge, leading to a mixed sludge with a high proportion of biological matter which is difficult to dewater. In this thesis, a novel continuous biodrying reactor was designed and developed for drying pulp and paper mixed sludge to economic dry solids level so that the dried sludge can be economically and safely combusted in a biomass boiler for energy recovery. In all experimental runs the economic dry solids level was achieved, proving the process successful. In the biodrying process, in addition to the forced aeration, the drying rates are enhanced by biological heat generated through the microbial activity of mesophilic and thermophilic microorganisms naturally present in the porous matrix of mixed sludge. This makes the biodrying process more attractive compared to the conventional drying techniques because the reactor is a self-heating process. The reactor is divided into four nominal compartments and the mixed sludge dries as it moves downward in the reactor. The residence times were 4-8 days, which are 2-3 times shorter than the residence times achieved in a batch biodrying reactor previously studied by our research group for mixed sludge drying. A process variable analysis was performed to determine the key variable(s) in the continuous biodrying reactor. Several variables were investigated, namely: type of biomass feed, pH of biomass, nutrition level (C/N ratio), residence times, recycle ratio of biodried sludge, and outlet relative humidity profile along the reactor height. The key variables that were identified in the continuous

  5. Design, experimentation, and modeling of a novel continuous biodrying process

    NASA Astrophysics Data System (ADS)

    Navaee-Ardeh, Shahram

    Massive production of sludge in the pulp and paper industry has made the effective sludge management increasingly a critical issue for the industry due to high landfill and transportation costs, and complex regulatory frameworks for options such as sludge landspreading and composting. Sludge dewatering challenges are exacerbated at many mills due to improved in-plant fiber recovery coupled with increased production of secondary sludge, leading to a mixed sludge with a high proportion of biological matter which is difficult to dewater. In this thesis, a novel continuous biodrying reactor was designed and developed for drying pulp and paper mixed sludge to economic dry solids level so that the dried sludge can be economically and safely combusted in a biomass boiler for energy recovery. In all experimental runs the economic dry solids level was achieved, proving the process successful. In the biodrying process, in addition to the forced aeration, the drying rates are enhanced by biological heat generated through the microbial activity of mesophilic and thermophilic microorganisms naturally present in the porous matrix of mixed sludge. This makes the biodrying process more attractive compared to the conventional drying techniques because the reactor is a self-heating process. The reactor is divided into four nominal compartments and the mixed sludge dries as it moves downward in the reactor. The residence times were 4-8 days, which are 2-3 times shorter than the residence times achieved in a batch biodrying reactor previously studied by our research group for mixed sludge drying. A process variable analysis was performed to determine the key variable(s) in the continuous biodrying reactor. Several variables were investigated, namely: type of biomass feed, pH of biomass, nutrition level (C/N ratio), residence times, recycle ratio of biodried sludge, and outlet relative humidity profile along the reactor height. The key variables that were identified in the continuous

  6. Direct selective laser sintering of high performance metals: Machine design, process development and process control

    NASA Astrophysics Data System (ADS)

    Das, Suman

    1998-11-01

    development of machine, processing and control technologies during this research effort enabled successful production of a number of integrally canned test specimens in Alloy 625 (InconelRTM 625 superalloy) and Ti-6Al-4V alloy. The overall goal of this research was to develop direct SLS of metals armed with a fundamental understanding of the underlying physics. The knowledge gained from experimental and analytical work is essential for three key objectives: machine design, process development and process control. (Abstract shortened by UMI.)

  7. Rethinking Design Process: Using 3D Digital Models as an Interface in Collaborative Session

    ERIC Educational Resources Information Center

    Ding, Suining

    2008-01-01

    This paper describes a pilot study for an alternative design process by integrating a designer-user collaborative session with digital models. The collaborative session took place in a 3D AutoCAD class for a real world project. The 3D models served as an interface for designer-user collaboration during the design process. Students not only learned…

  8. Collaborative Course Design: Changing the Process, Acknowledging the Context, and Implications for Academic Development

    ERIC Educational Resources Information Center

    Ziegenfuss, Donna Harp; Lawler, Patricia A.

    2008-01-01

    This research study describes the experiences and perceptions of an instructor and an instructional design specialist who collaborated on the design and implementation of a university course using a new course design process. Findings uncovered differences between an informal collaboration process and the adaptation of that process for…

  9. 46 CFR 164.019-9 - Procedure for acceptance of revisions of design, process, or materials.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    § 164.019-9 Procedure for acceptance of revisions of design, process, or materials. (a) The manufacturer shall not change the design, material, manufacturing process, or construction of...

  10. A study of optimizing processes for metallized textile design application

    NASA Astrophysics Data System (ADS)

    Guo, Ronghui

    The purpose of this research is to find an optimum electroless plating process in order to obtain relatively low surface resistance and improve the functional properties and appearance of nickel-plated and copper-plated polyester fabrics. The optimization results indicate that the NiSO4 concentration and the bath temperature are the most important factors influencing the surface resistance of electroless nickel-plated polyester fabric, whereas the NiSO4 concentration and the pH of the plating bath are the most significant factors affecting electroless copper plating. The micro-structures and properties of nickel-, copper-, and nickel/copper multi-layer-plated polyester fabrics have been studied. In the case of electroless nickel plating, the nickel deposit layer becomes more uniform and continuous when prepared at higher NiSO4 concentration and higher bath temperature. As for electroless copper plating, the surface morphology of the copper deposits indicates that the average diameter of the particles increases with the rise of NiSO4 concentration and pH. The surface morphology of the nickel/copper multi-layer deposits reveals the presence of ultra-fine nodules, and the deposits are compact and uniform in size. EMI SE increases with the rise of Ni2+ concentration and bath temperature for electroless nickel plating, and with the rise of Ni2+ concentration and pH of the plating solution for electroless copper plating on polyester fabric. At the same deposit weight, the EMI SE of nickel/copper-plated fabric is much higher than that of the nickel-plated fabric, but slightly lower than that of the copper-plated fabric. However, the anti-corrosive property of nickel/copper-plated fabrics is significantly superior to that of the copper-plated fabrics, but slightly inferior to that of the nickel-plated fabric. Design application effects have been explored by controlling the plating conditions. The electroless plating parameters play an

  11. Development of Integrated Programs for Aerospace-vehicle Design (IPAD): Product manufacture interactions with the design process

    NASA Technical Reports Server (NTRS)

    Crowell, H. A.

    1979-01-01

    The product manufacturing interactions with the design process and the IPAD requirements to support the interactions are described. The data requirements supplied to manufacturing by design are identified and quantified. Trends in computer-aided manufacturing are discussed and the manufacturing process of the 1980's is anticipated.

  12. Integrating optical fabrication and metrology into the optical design process

    NASA Astrophysics Data System (ADS)

    Harvey, James E.

    2014-12-01

    Image degradation due to scattered radiation from residual optical fabrication errors is a serious problem in many short wavelength (X-ray/EUV) imaging systems. Most commercially-available image analysis codes (ZEMAX, Code V, ASAP, FRED, etc.) currently require the scatter behavior (BSDF data) to be provided as input in order to calculate the image quality of such systems. This BSDF data is difficult to measure and rarely available for the operational wavelengths of interest. Since the smooth-surface approximation is often not satisfied at these short wavelengths, the classical Rayleigh-Rice expression that indicates the BRDF is directly proportional to the surface PSD cannot be used to calculate BRDFs from surface metrology data for even slightly rough surfaces. However, an FFTLog numerical Hankel transform algorithm enables the practical use of the computationally intensive Generalized Harvey-Shack (GHS) surface scatter theory [1] to calculate BRDFs from surface PSDs for increasingly short wavelengths that violate the smooth surface approximation implicit in the Rayleigh-Rice surface scatter theory [2-3]. The recent numerical validation [4] of the GHS theory (a generalized linear systems formulation of surface scatter theory) and an analysis of image degradation due to surface scatter in the presence of aberrations [5] have provided credence to the development of a systems engineering analysis of image quality as degraded not only by diffraction effects and geometrical aberrations, but also by scattering effects due to residual optical fabrication errors. These advances, combined with the continuing increase in computer speed, leave us poised to fully integrate optical metrology and fabrication into the optical design process.

  13. Developing a 3D Game Design Authoring Package to Assist Students' Visualization Process in Design Thinking

    ERIC Educational Resources Information Center

    Kuo, Ming-Shiou; Chuang, Tsung-Yen

    2013-01-01

    The teaching of 3D digital game design requires the development of students' meta-skills, from story creativity to 3D model construction, and even the visualization process in design thinking. The characteristics a good game designer should possess have been identified as including redesign things, creativity thinking and the ability to…

  14. Design Ideas, Reflection, and Professional Identity: How Graduate Students Explore the Idea Generation Process

    ERIC Educational Resources Information Center

    Hutchinson, Alisa; Tracey, Monica W.

    2015-01-01

    Within design thinking, designers are responsible for generating, testing, and refining design ideas as a means to refine the design problem and arrive at an effective solution. Thus, understanding one's individual idea generation experiences and processes can be seen as a component of professional identity for designers, which involves the…

  15. An Examination of the Decision-Making Process Used by Designers in Multiple Disciplines

    ERIC Educational Resources Information Center

    Stefaniak, Jill E.; Tracey, Monica W.

    2014-01-01

    Design-thinking is an inductive and participatory process in which designers are required to manage constraints, generate solutions, and follow project timelines in order to complete project goals. The researchers used this exploration study to look at how designers in various disciplinary fields approach design projects. Designers were asked to…

  16. Active pharmaceutical ingredient (API) production involving continuous processes--a process system engineering (PSE)-assisted design framework.

    PubMed

    Cervera-Padrell, Albert E; Skovby, Tommy; Kiil, Søren; Gani, Rafiqul; Gernaey, Krist V

    2012-10-01

    A systematic framework is proposed for the design of continuous pharmaceutical manufacturing processes. Specifically, the design framework focuses on organic chemistry based, active pharmaceutical ingredient (API) synthetic processes, but could potentially be extended to biocatalytic and fermentation-based products. The method exploits the synergic combination of continuous flow technologies (e.g., microfluidic techniques) and process systems engineering (PSE) methods and tools for faster process design and increased process understanding throughout the whole drug product and process development cycle. The design framework structures the many different and challenging design problems (e.g., solvent selection, reactor design, and design of separation and purification operations), driving the user from the initial drug discovery steps--where process knowledge is very limited--toward the detailed design and analysis. Examples from the literature of PSE methods and tools applied to pharmaceutical process design and novel pharmaceutical production technologies are provided along the text, assisting in the accumulation and interpretation of process knowledge. Different criteria are suggested for the selection of batch and continuous processes so that the whole design results in low capital and operational costs as well as low environmental footprint. The design framework has been applied to the retrofit of an existing batch-wise process used by H. Lundbeck A/S to produce an API: zuclopenthixol. Some of its batch operations were successfully converted into continuous mode, obtaining higher yields that allowed a significant simplification of the whole process. The material and environmental footprint of the process--evaluated through the process mass intensity index, that is, kg of material used per kg of product--was reduced to half of its initial value, with potential for further reduction. The case-study includes reaction steps typically used by the pharmaceutical
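
    As a tiny, hypothetical illustration of the process mass intensity (PMI) index referred to above (kg of total material input per kg of product), the snippet below compares a notional batch route with a notional continuous retrofit; the numbers are invented and only mirror the "reduced to half" outcome qualitatively.

```python
# Tiny illustration of the process mass intensity (PMI) index used above:
# kg of total material input per kg of API produced. All numbers are hypothetical.

def process_mass_intensity(material_inputs_kg, product_kg):
    """PMI = total mass of raw materials, solvents and reagents / mass of product."""
    return sum(material_inputs_kg) / product_kg

batch = process_mass_intensity([120.0, 45.0, 300.0], product_kg=10.0)      # e.g. batch route
continuous = process_mass_intensity([60.0, 25.0, 150.0], product_kg=10.0)  # e.g. after retrofit
print(batch, continuous, continuous / batch)   # ratio near 0.5 mirrors the 'reduced to half' claim
```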

  17. Affordable Design: A Methodology to Implement Process-Based Manufacturing Cost into the Traditional Performance-Focused Multidisciplinary Design Optimization

    NASA Technical Reports Server (NTRS)

    Bao, Han P.; Samareh, J. A.

    2000-01-01

    The primary objective of this paper is to demonstrate the use of process-based manufacturing and assembly cost models in a traditional performance-focused multidisciplinary design and optimization process. The use of automated cost-performance analysis is an enabling technology that could bring realistic process-based manufacturing and assembly cost into multidisciplinary design and optimization. In this paper, we present a new methodology for incorporating process costing into a standard multidisciplinary design optimization process. Material, manufacturing process, and assembly process costs could then be used as the objective function for the optimization method. A case study involving forty-six different configurations of a simple wing is presented, indicating that a design based on performance criteria alone may not necessarily be the most affordable as far as manufacturing and assembly cost is concerned.
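
    As a minimal illustration of treating cost as the objective rather than performance alone, the sketch below screens a set of candidate wing configurations and selects the cheapest one that still satisfies a performance constraint. The configuration names, numbers, and cost breakdown are invented for illustration and are not the models used in the paper.

        from dataclasses import dataclass

        @dataclass
        class WingConfig:
            name: str
            performance: float        # normalized figure of merit (hypothetical)
            material_cost: float
            manufacturing_cost: float
            assembly_cost: float

            @property
            def total_cost(self) -> float:
                return self.material_cost + self.manufacturing_cost + self.assembly_cost

        # Hypothetical candidate configurations (illustrative numbers only).
        candidates = [
            WingConfig("stiffened-skin", 0.97, material_cost=1.0, manufacturing_cost=2.1, assembly_cost=1.4),
            WingConfig("sandwich-panel", 1.02, material_cost=1.3, manufacturing_cost=1.6, assembly_cost=1.1),
            WingConfig("integrally-machined", 1.05, material_cost=1.1, manufacturing_cost=2.8, assembly_cost=0.6),
        ]

        # Cost is the objective; performance enters only as a constraint.
        feasible = [c for c in candidates if c.performance >= 1.0]
        most_affordable = min(feasible, key=lambda c: c.total_cost)
        print(most_affordable.name, round(most_affordable.total_cost, 2))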

  18. Modeling Web-Based Educational Systems: Process Design Teaching Model

    ERIC Educational Resources Information Center

    Rokou, Franca Pantano; Rokou, Elena; Rokos, Yannis

    2004-01-01

    Using modeling languages is essential to the construction of educational systems based on software engineering principles and methods. Furthermore, the instructional design is undoubtedly the cornerstone of the design and development of educational systems. Although several methodologies and languages have been proposed for the specification of…

  19. Incorporating Academic Standards in Instructional Systems Design Process.

    ERIC Educational Resources Information Center

    Wang, Charles Xiaoxue

    Almost every state is "imposing" academic standards. Helping students to meet those standards is a key task for teachers and school administrators, as well as instructional systems designers. Thus, instructional designers in K-12 environments face the challenge of using academic standards appropriately and effectively in their…

  20. Student Evaluation of CALL Tools during the Design Process

    ERIC Educational Resources Information Center

    Nesbitt, Dallas

    2013-01-01

    This article discusses the comparative effectiveness of student input at different times during the design of CALL tools for learning kanji, the Japanese characters of Chinese origin. The CALL software "package" consisted of tools to facilitate the writing, reading and practising of kanji characters in context. A pre-design questionnaire…

  1. Role of Graphics Tools in the Learning Design Process

    ERIC Educational Resources Information Center

    Laisney, Patrice; Brandt-Pomares, Pascale

    2015-01-01

    This paper discusses the design activities of students in secondary school in France. Graphics tools are now part of the capacity of design professionals. It is therefore apt to reflect on their integration into the technological education. Has the use of intermediate graphical tools changed students' performance, and if so in what direction,…

  2. Authenticity in the Process of Learning about Instructional Design

    ERIC Educational Resources Information Center

    Wilson, Jay R.; Schwier, Richard A.

    2009-01-01

    Authentic learning is touted as a powerful learning approach, particularly in the context of problem-based learning (Savery, 2006). Teaching and learning in the area of instructional design appears to offer a strong fit between the tenets of authentic learning and the practice of instructional design. This paper details the efforts to broaden and…

  3. The Matrix Reloaded: How Sensing the Extracellular Matrix Synchronizes Bacterial Communities

    PubMed Central

    Steinberg, Nitai

    2015-01-01

    In response to chemical communication, bacterial cells often organize themselves into complex multicellular communities that carry out specialized tasks. These communities are frequently referred to as biofilms, which involve the collective behavior of different cell types. Like cells of multicellular eukaryotes, the biofilm cells are surrounded by self-produced polymers that constitute the extracellular matrix (ECM), which binds them to each other and to the surface. In multicellular eukaryotes, it has been evident for decades that cell-ECM interactions control multiple cellular processes during development. While cells both in biofilms and in multicellular eukaryotes are surrounded by ECM and activate various genetic programs, until recently it has been unclear whether cell-ECM interactions are recruited in bacterial communicative behaviors. In this review, we describe the examples reported thus far for ECM involvement in control of cell behavior throughout the different stages of biofilm formation. The studies presented in this review have provided a newly emerging perspective of the bacterial ECM as an active player in regulation of biofilm development. PMID:25825428

  4. Development of Chemical Process Design and Control for Sustainability

    EPA Science Inventory

    This contribution describes a novel process systems engineering framework that couples advanced control with sustainability evaluation and decision making for the optimization of process operations to minimize environmental impacts associated with products, materials, and energy....

  5. Cooperation System Design for the XMDR-Based Business Process

    NASA Astrophysics Data System (ADS)

    Moon, Seokjae; Jung, Gyedong; Hwang, Chigon; Choi, Youngkeun

    This paper proposes a cooperation system for the XMDR-based business process. The proposed system solves the problem of heterogeneity that may arise in the interoperability of queries in an XMDR-based business process. Heterogeneity in the operation of a business process may involve metadata collision, schema collision, or data collision. This can be handled by operating the business process using XMDR-based Global Query and Local Query.

  6. Innovation Process Design: A Change Management and Innovation Dimension Perspective

    NASA Astrophysics Data System (ADS)

    Peisl, Thomas; Reger, Veronika; Schmied, Juergen

    The authors propose an innovative approach to the management of innovation that integrates business, process, and maturity dimensions. The core element of the concept is the adaptation of ISO/IEC 15504 to the innovation process, including 14 innovation drivers. Two managerial models, the Balanced Scorecard and a Barriers in Change Processes Model, are applied to conceptualize and visualize the respective innovation strategies. An illustrative case study shows a practical implementation process.

  7. PROCESS DESIGN MANUAL FOR LAND TREATMENT OF MUNICIPAL WASTEWATER

    EPA Science Inventory

    The USEPA guidance on land treatment of municipal and industrial wastewater is updated for the first time since 1984. The significant new technological changes include phytoremediation, vadose zone monitoring, new design approaches to surface irrigation, center pivot irrigation,...

  8. Bed occupancy monitoring: data processing and clinician user interface design.

    PubMed

    Pouliot, Melanie; Joshi, Vilas; Goubran, Rafik; Knoefel, Frank

    2012-01-01

    Unobtrusive and continuous monitoring of patients, especially at their place of residence, is becoming a significant part of the healthcare model. A variety of sensors are being used to monitor different patient conditions. Bed occupancy monitoring provides clinicians with a quantitative measure of bed entry/exit patterns and may provide information relating to sleep quality. This paper presents a bed occupancy monitoring system using a bed pressure mat sensor. A clinical trial was performed involving 8 patients to collect bed occupancy data. The trial period for each patient ranged from 5 to 10 weeks. This data was analyzed using a participatory design methodology incorporating clinician feedback to obtain bed occupancy parameters. The parameters extracted include the number of bed exits per night, the bed exit weekly average (including minimum and maximum), the time of day of a particular exit, and the amount of uninterrupted bed occupancy per night. The design of a clinical user interface plays a significant role in the acceptance of such patient monitoring systems by clinicians. The clinician user interface proposed in this paper was designed to be intuitive, easy to navigate, and not to cause information overload. An iterative design methodology was used for the interface design. The interface design is extendible to incorporate data from multiple sensors. This allows the interface to be part of a comprehensive remote patient monitoring system.
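
    A minimal sketch of how bed-exit counts and uninterrupted occupancy could be derived from a sampled occupancy signal is shown below. It assumes the pressure-mat data have already been reduced to a binary occupied/empty series at a fixed sampling interval, which is an assumption for illustration rather than a description of the authors' pipeline.

        from typing import Sequence

        def count_bed_exits(occupied: Sequence[int]) -> int:
            """Count occupied -> empty transitions in a binary occupancy series."""
            return sum(1 for prev, cur in zip(occupied, occupied[1:]) if prev == 1 and cur == 0)

        def longest_uninterrupted_occupancy(occupied: Sequence[int], sample_minutes: float) -> float:
            """Longest continuous run of occupancy, in minutes."""
            longest = run = 0
            for value in occupied:
                run = run + 1 if value == 1 else 0
                longest = max(longest, run)
            return longest * sample_minutes

        # One night sampled at 1-minute intervals (toy data): two exits, longest run 300 min.
        night = [1] * 300 + [0] * 10 + [1] * 200 + [0] * 15 + [1] * 150
        print(count_bed_exits(night), longest_uninterrupted_occupancy(night, 1.0))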

  9. DESIGNING SUSTAINABLE PROCESSES WITH SIMULATION: THE WASTE REDUCTION (WAR) ALGORITHM

    EPA Science Inventory

    The WAR Algorithm, a methodology for determining the potential environmental impact (PEI) of a chemical process, is presented with modifications that account for the PEI of the energy consumed within that process. From this theory, four PEI indexes are used to evaluate the envir...

  10. Designing and Securing an Event Processing System for Smart Spaces

    ERIC Educational Resources Information Center

    Li, Zang

    2011-01-01

    Smart spaces, or smart environments, represent the next evolutionary development in buildings, banking, homes, hospitals, transportation systems, industries, cities, and government automation. By riding the tide of sensor and event processing technologies, the smart environment captures and processes information about its surroundings as well as…

  11. Impact of Process Protocol Design on Virtual Team Effectiveness

    ERIC Educational Resources Information Center

    Cordes, Christofer Sean

    2013-01-01

    This dissertation examined the influence of action process dimensions on team decision performance, and attitudes toward team work environment and procedures given different degrees of collaborative technology affordance. Process models were used to provide context for understanding team behavior in the experimental task, and clarify understanding…

  12. Data processing with microcode designed with source coding

    DOEpatents

    McCoy, James A; Morrison, Steven E

    2013-05-07

    Programming for a data processor to execute a data processing application is provided using microcode source code. The microcode source code is assembled to produce microcode that includes digital microcode instructions with which to signal the data processor to execute the data processing application.

  14. Knowledge management and process monitoring of pharmaceutical processes in the quality by design paradigm.

    PubMed

    Rathore, Anurag S; Bansal, Anshuman; Hans, Jaspinder

    2013-01-01

    Pharmaceutical processes are complex and highly variable in nature. The complexity and variability associated with these processes result in inconsistent and sometimes unpredictable process outcomes. To deal with the complexity and understand the causes of variability in these processes, in-depth knowledge and thorough understanding of the process and the various factors affecting the process performance become critical. This makes knowledge management and process monitoring an indispensable part of the process improvement efforts for any pharmaceutical organization. PMID:23275947
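
    Such routine monitoring is often implemented with control charts; the sketch below computes simple Shewhart limits (mean plus or minus 3 standard deviations) from a baseline period and flags an out-of-control batch. It is a generic illustration of the idea, not a method taken from the article, and the yield figures are invented.

        import statistics

        def shewhart_limits(baseline: list[float]) -> tuple[float, float, float]:
            """Return (lower control limit, centre line, upper control limit)."""
            centre = statistics.mean(baseline)
            sigma = statistics.stdev(baseline)
            return centre - 3 * sigma, centre, centre + 3 * sigma

        # Hypothetical batch yields (%) from a period of well-understood operation.
        baseline_yields = [91.2, 90.8, 91.5, 92.0, 91.1, 90.9, 91.7, 91.3]
        lcl, centre, ucl = shewhart_limits(baseline_yields)

        new_batch = 89.4
        if not (lcl <= new_batch <= ucl):
            print(f"Batch yield {new_batch}% falls outside the control limits ({lcl:.1f}, {ucl:.1f})")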

  15. Process Design of Wastewater Treatment for the NREL Cellulosic Ethanol Model

    SciTech Connect

    Steinwinder, T.; Gill, E.; Gerhardt, M.

    2011-09-01

    This report describes a preliminary process design for treating the wastewater from NREL's cellulosic ethanol production process to quality levels required for recycle. In this report, Brown and Caldwell describe three main tasks: 1) characterization of the effluent from NREL's ammonia-conditioned hydrolyzate fermentation process; 2) development of the wastewater treatment process design; and 3) development of a capital and operational cost estimate for the treatment concept option. This wastewater treatment design was incorporated into NREL's cellulosic ethanol process design update published in May 2011 (NREL/TP-5100-47764).

  16. Design alternatives for process group membership and multicast

    NASA Technical Reports Server (NTRS)

    Birman, Kenneth P.; Cooper, Robert; Gleeson, Barry

    1991-01-01

    Process groups are a natural tool for distributed programming, and are increasingly important in distributed computing environments. However, there is little agreement on the most appropriate semantics for process group membership and group communication. These issues are of special importance in the Isis system, a toolkit for distributed programming. Isis supports several styles of process group, and a collection of group communication protocols spanning a range of atomicity and ordering properties. This flexibility makes Isis adaptable to a variety of applications, but is also a source of complexity that limits performance. This paper reports on a new architecture that arose from an effort to simplify Isis process group semantics. Our findings include a refined notion of how the clients of a group should be treated, what the properties of a multicast primitive should be when systems contain large numbers of overlapping groups, and a new construct called the causality domain. As an illustration, we apply the architecture to the problem of converting processes into fault-tolerant process groups in a manner that is 'transparent' to other processes in the system.

  17. Preparing Instructional Designers for Game-Based Learning: Part III. Game Design as a Collaborative Process

    ERIC Educational Resources Information Center

    Hirumi, Atsusi; Appelman, Bob; Rieber, Lloyd; Van Eck, Richard

    2010-01-01

    In this three part series, four professors who teach graduate level courses on the design of instructional video games discuss their perspectives on preparing instructional designers to optimize game-based learning. Part I set the context for the series and one of four panelists discussed what he believes instructional designers should know about…

  18. AMS-02 antiprotons reloaded

    SciTech Connect

    Kappl, Rolf; Reinert, Annika; Winkler, Martin Wolfgang E-mail: areinert@th.physik.uni-bonn.de

    2015-10-01

    The AMS-02 collaboration has released preliminary data on the antiproton fraction in cosmic rays. The surprisingly hard antiproton spectrum at high rigidity has triggered speculations about a possible primary antiproton component originating from dark matter annihilations. In this note, we employ newly available AMS-02 boron to carbon data to update the secondary antiproton flux within the standard two-zone diffusion model. The new background permits a considerably better fit to the measured antiproton fraction compared to previous estimates. This is mainly a consequence of the smaller slope of the diffusion coefficient favored by the new AMS-02 boron to carbon data.

  19. Oral Insulin Reloaded

    PubMed Central

    Heinemann, Lutz; Plum-Mörschel, Leona

    2014-01-01

    Optimal coverage of insulin needs is the paramount aim of insulin replacement therapy in patients with diabetes mellitus. To apply insulin without breaking the skin barrier by a needle and/or to allow a more physiological provision of insulin are the main reasons triggering the continuous search for alternative routes of insulin administration. Despite numerous attempts over the past 9 decades to develop an insulin pill, no insulin for oral dosing is commercially available. By way of a structured approach, we aim to provide a systematic update on the most recent developments toward an orally available insulin formulation with a clear focus on data from clinical-experimental and clinical studies. Thirteen companies that claim to be working on oral insulin formulations were identified. However, only 6 of these companies published new clinical trial results within the past 5 years. Interestingly, these clinical data reports make up a mere 4% of the considerably high total number of publications on the development of oral insulin formulations within this time period. While this picture clearly reflects the rising research interest in orally bioavailable insulin formulations, it also highlights the fact that the lion’s share of research efforts is still allocated to the preclinical stages. PMID:24876606

  20. The Role of Collaboration in a Comprehensive Programme Design Process in Inclusive Education

    ERIC Educational Resources Information Center

    Zundans-Fraser, Lucia; Bain, Alan

    2016-01-01

    This study focused on the role of collaboration in a comprehensive programme design process in inclusive education. The participants were six members of an inclusive education team and an educational designer who together comprised the design team. The study examined whether collaboration was evident in the practice of programme design and…

  1. PROCESS DESIGN MANUAL FOR SLUDGE TREATMENT AND DISPOSAL

    EPA Science Inventory

    The purpose of this manual is to provide the engineering community and related industry with a new source of information to be used in the planning, design, and operation of present and future wastewater pollution control facilities. This manual supplements the existing knowledg...

  2. Process Design Manual: Wastewater Treatment Facilities for Sewered Small Communities.

    ERIC Educational Resources Information Center

    Leffel, R. E.; And Others

    This manual attempts to describe new treatment methods, and discuss the application of new techniques for more effectively removing a broad spectrum of contaminants from wastewater. Topics covered include: fundamental design considerations, flow equalization, headworks components, clarification of raw wastewater, activated sludge, package plants,…

  3. Investigating Preservice Mathematics Teachers' Manipulative Material Design Processes

    ERIC Educational Resources Information Center

    Sandir, Hakan

    2016-01-01

    Students use concrete manipulatives to form an imperative affiliation between conceptual and procedural knowledge (Balka, 1993). Hence, it is necessary to design specific mathematics manipulatives that focus on different mathematical concepts. Preservice teachers need to know how to make and use manipulatives that stimulate students' thinking as…

  4. High School Student Modeling in the Engineering Design Process

    ERIC Educational Resources Information Center

    Mentzer, Nathan; Huffman, Tanner; Thayer, Hilde

    2014-01-01

    A diverse group of 20 high school students from four states in the US were individually provided with an engineering design challenge. Students chosen were in capstone engineering courses and had taken multiple engineering courses. As students considered the problem and developed a solution, observational data were recorded and artifacts…

  5. Conjecture Mapping to Optimize the Educational Design Research Process

    ERIC Educational Resources Information Center

    Wozniak, Helen

    2015-01-01

    While educational design research promotes closer links between practice and theory, reporting its outcomes from iterations across multiple contexts is often constrained by the volumes of data generated, and the context bound nature of the research outcomes. Reports tend to focus on a single iteration of implementation without further research to…

  6. Transparent process migration: Design alternatives and the Sprite implementation

    NASA Technical Reports Server (NTRS)

    Douglis, Fred; Ousterhout, John

    1991-01-01

    The Sprite operating system allows executing processes to be moved between hosts at any time. We use this process migration mechanism to offload work onto idle machines, and also to evict migrated processes when idle workstations are reclaimed by their owners. Sprite's migration mechanism provides a high degree of transparency both for migrated processes and for users. Idle machines are identified, and eviction is invoked, automatically by daemon processes. On Sprite it takes up to a few hundred milliseconds on SPARCstation 1 workstations to perform a remote exec, while evictions typically occur in a few seconds. The pmake program uses remote invocation to invoke tasks concurrently. Compilations commonly obtain speedup factors in the range of three to six; they are limited primarily by contention for centralized resources such as file servers. CPU-bound tasks such as simulations can make more effective use of idle hosts, obtaining as much as eight-fold speedup over a period of hours. Process migration has been in regular service for over two years.

  7. Image processing of correlated data by experimental design techniques

    SciTech Connect

    Stern, D.

    1987-01-01

    New classes of algorithms are developed for processing of two-dimensional image data embedded in correlated noise. The algorithms are based on modifications of standard analysis of variance (ANOVA) techniques ensuring their proper operation in dependent noise. The approach taken in the development of procedures is deductive. First, the theory of modified ANOVA (MANOVA) techniques involving one- and two-way layouts is considered for noise models with an autocorrelation matrix (ACM) formed by direct multiplication of rows and columns or tensored correlation matrices (TCM), stressing the special case of the first-order Markov process. Next, the techniques are generalized to include arbitrary, wide-sense stationary (WSS) processes. This permits dealing with diagonal masks which have an ACM of a general form even for TCM. As a further extension, the theory of Latin square (LS) masks is generalized to include dependent noise with TCM. This permits dealing with three different effects of m levels using only m^2 observations rather than m^3. Since in many image-processing problems replication of data is possible, the masking techniques are generalized to replicated data for which the replication is TCM dependent. For all procedures developed, algorithms are implemented which ensure real-time processing of images.
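
    For reference, the sketch below computes the classical one-way ANOVA F statistic, which assumes independent noise; the contribution of the work summarized above is precisely to modify such procedures for correlated (e.g. first-order Markov) noise, which this baseline sketch does not attempt.

        import numpy as np

        def one_way_anova_f(groups: list[np.ndarray]) -> float:
            """Classical one-way ANOVA F statistic (independent, equal-variance noise)."""
            all_values = np.concatenate(groups)
            grand_mean = all_values.mean()
            k, n = len(groups), all_values.size
            ss_between = sum(g.size * (g.mean() - grand_mean) ** 2 for g in groups)
            ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
            return (ss_between / (k - 1)) / (ss_within / (n - k))

        # Three image rows treated as groups; the last has a shifted mean (toy data).
        rng = np.random.default_rng(0)
        rows = [rng.normal(loc=mu, scale=1.0, size=64) for mu in (0.0, 0.0, 0.5)]
        print(one_way_anova_f(rows))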

  8. Microwave sensor design for noncontact process monitoring at elevated temperature

    NASA Astrophysics Data System (ADS)

    Yadam, Yugandhara Rao; Arunachalam, Kavitha

    2016-02-01

    In this work we present a microwave sensor for noncontact monitoring of liquid level at high temperatures. The sensor is a high gain, directional conical lensed horn antenna with narrow beam width (BW) designed for operation over 10 GHz - 15 GHz. Sensor design and optimization was carried out using 3D finite element method based electromagnetic (EM) simulation software HFSS®. A rectangular to circular waveguide feed was designed to convert TE10 to TE11 mode for wave propagation in the conical horn. Swept frequency simulations were carried out to optimize antenna flare angle and length to achieve better than -10 dB return loss (S11), standing wave ratio (SWR) less than 2.0, 20° half power BW (HPBW) and 15 dB gain over 10 GHz - 15 GHz. The sensor was fabricated using Aluminum and was characterized in an anechoic test box using a vector network analyzer (E5071C, Agilent Technologies, USA). Experimental results of noncontact level detection are presented for boiling water in a metal canister.

  9. SmartWeld/SmartProcess - intelligent model based system for the design and validation of welding processes

    SciTech Connect

    Mitchner, J.

    1996-04-01

    Diagrams are presented on an intelligent model based system for the design and validation of welding processes. Key capabilities identified include 'right the first time' manufacturing, continuous improvement, and on-line quality assurance.

  10. Design of a High-Throughput Plasma-Processing System

    SciTech Connect

    Darkazalli, Ghazi; Matthei, Keith; Ruby, Douglas S.

    1999-07-20

    Sandia National Laboratories has demonstrated significant performance gains in crystalline silicon solar cell technology through the use of plasma-processing for the deposition of silicon nitride by Plasma Enhanced Chemical Vapor Deposition (PECVD), plasma-hydrogenation of the nitride layer, and reactive-ion etching of the silicon surface prior to the deposition to decrease the reflectivity of the surface. One of the major problems of implementing plasma processing into a cell production line is the batch configuration and/or low throughput of the systems currently available. This report describes the concept of a new in-line plasma processing system that could meet the industrial requirements for a high-throughput and cost effective solution for mass production of solar cells.

  11. Microstructure Sensitive Design and Processing in Solid Oxide Electrolyzer Cell

    SciTech Connect

    Dr. Hamid Garmestani; Dr. Stephen Herring

    2009-06-12

    The aim of this study was to develop an inexpensive manufacturing process for deposition of functionally graded thin films of LSM oxides with porosity-graded microstructures for use as IT-SOFC cathodes. The spray pyrolysis method was chosen as a low-temperature processing technique for deposition of porous LSM films onto dense YXZ substrates. The effort was directed toward the optimization of the processing conditions for deposition of high quality LSM films with a variety of morphologies in the range of dense to porous microstructures. Results of optimization studies of the spray parameters revealed that the substrate surface temperature is the most critical parameter influencing the roughness, morphology, porosity, cracking, and crystallinity of the film.

  12. Process design for wastewater treatment: catalytic ozonation of organic pollutants.

    PubMed

    Derrouiche, S; Bourdin, D; Roche, P; Houssais, B; Machinal, C; Coste, M; Restivo, J; Orfão, J J M; Pereira, M F R; Marco, Y; Garcia-Bordeje, E

    2013-01-01

    Emerging micropollutants have been recently the target of interest for their potential harmful effects in the environment and their resistance to conventional water treatments. Catalytic ozonation is an advanced oxidation process consisting of the formation of highly reactive radicals from the decomposition of ozone promoted by a catalyst. Nanocarbon materials have been shown to be effective catalysts for this process, either in powder form or grown on the surface of a monolithic structure. In this work, carbon nanofibers grown on the surface of a cordierite honeycomb monolith are tested as catalyst for the ozonation of five selected micropollutants: atrazine (ATZ), bezafibrate, erythromycin, metolachlor, and nonylphenol. The process is tested both in laboratorial and real conditions. Later on, ATZ was selected as a target pollutant to further investigate the role of the catalytic material. It is shown that the inclusion of a catalyst improves the mineralization degree compared to single ozonation. PMID:24056437

  13. Seventeen Projects Carried out by Students Designing for and with Disabled Children: Identifying Designers' Difficulties during the Whole Design Process

    ERIC Educational Resources Information Center

    Magnier, Cecile; Thomann, Guillaume; Villeneuve, Francois

    2012-01-01

    This article aims to identify the difficulties that may arise when designing assistive devices for disabled children. Seventeen design projects involving disabled children, engineering students, and special schools were analysed. A content analysis of the design reports was performed. For this purpose, a coding scheme was built based on a review…

  14. Systematic Approach to Computational Design of Gene Regulatory Networks with Information Processing Capabilities.

    PubMed

    Moskon, Miha; Mraz, Miha

    2014-01-01

    We present several measures that can be used in de novo computational design of biological systems with information processing capabilities. Their main purpose is to objectively evaluate the behavior and identify the biological information processing structures with the best dynamical properties. They can be used to define constraints that allow one to simplify the design of more complex biological systems. These measures can be applied to existent computational design approaches in synthetic biology, i.e., rational and automatic design approaches. We demonstrate their use on a) the computational models of several basic information processing structures implemented with gene regulatory networks and b) on a modular design of a synchronous toggle switch.

  15. Cost model relationships between textile manufacturing processes and design details for transport fuselage elements

    NASA Technical Reports Server (NTRS)

    Metschan, Stephen L.; Wilden, Kurtis S.; Sharpless, Garrett C.; Andelman, Rich M.

    1993-01-01

    Textile manufacturing processes offer potential cost and weight advantages over traditional composite materials and processes for transport fuselage elements. In the current study, design cost modeling relationships between textile processes and element design details were developed. Such relationships are expected to help future aircraft designers to make timely decisions on the effect of design details and overall configurations on textile fabrication costs. The fundamental advantage of a design cost model is to insure that the element design is cost effective for the intended process. Trade studies on the effects of processing parameters also help to optimize the manufacturing steps for a particular structural element. Two methods of analyzing design detail/process cost relationships developed for the design cost model were pursued in the current study. The first makes use of existing databases and alternative cost modeling methods (e.g. detailed estimating). The second compares design cost model predictions with data collected during the fabrication of seven foot circumferential frames for ATCAS crown test panels. The process used in this case involves 2D dry braiding and resin transfer molding of curved 'J' cross section frame members having design details characteristic of the baseline ATCAS crown design.

  16. Process-Oriented Design: Conversational Interfaces for Global Accessibility

    ERIC Educational Resources Information Center

    Robertson, Amanda

    2005-01-01

    The ability of the Internet to serve as a bridge to cultural understanding relies in great part on issues related to accessibility. My focus in this article is on accessibility as it relates to providing individuals with the full capabilities of the Internet to facilitate a process of association and learning, which can alleviate many issues that…

  17. EVALUATING AND DESIGNING CHEMICAL PROCESSES FOR ENVIRONMENTAL SUSTAINABILITY

    EPA Science Inventory

    Chemicals and chemical processes are at the heart of most environmental problems. This isn't surprising since chemicals make up all of the products we use in our lives. The common use of chemicals makes them of high interest for systems analysis, particularly because of environ...

  18. Encyclopedia of chemical processing and design. Volume 19

    SciTech Connect

    Maiketta, J.J.; Cunningham, W.A

    1983-01-01

    This volume contains contributions from chemists and engineers from academia and industry, and covers the environmental impact of energy, epoxy resins, essential oils, and ethanol, among other subjects beginning with the letter 'e'. Contents, abridged, are: conversion to SI units; energy, low heat sources; engineering contractors; enhanced oil recovery costs; enzyme processing; equipment, used; essential oils; and esterification.

  19. Rethinking the Purposes and Processes for Designing Digital Portfolios

    ERIC Educational Resources Information Center

    Hicks, Troy; Russo, Anne; Autrey, Tara; Gardner, Rebecca; Kabodian, Aram; Edington, Cathy

    2007-01-01

    As digital portfolios become more prevalent in teacher education, the purposes and processes for creating them have become contested. Originally meant to be critical and reflective spaces for learning about multimedia and conceived as contributing to professional growth, research shows that digital portfolios are now increasingly being used to…

  20. Waste Feed Delivery Purex Process Connector Design Pressure

    SciTech Connect

    BRACKENBURY, P.J.

    2000-04-11

    The pressure retaining capability of the PUREX process connector is documented. A context is provided for the connector's current use within existing Projects. Previous testing and structural analyses campaigns are outlined. The deficient condition of the current inventory of connectors and assembly wrenches is highlighted. A brief history of the connector is provided. A bibliography of pertinent references is included.

  1. Design for human factors (DfHF): a grounded theory for integrating human factors into production design processes.

    PubMed

    Village, Judy; Searcy, Cory; Salustri, Filipo; Patrick Neumann, W

    2015-01-01

    The 'design for human factors' grounded theory explains 'how' human factors (HF) went from a reactive, after-injury programme in safety, to being proactively integrated into each step of the production design process. In this longitudinal case study collaboration with engineers and HF Specialists in a large electronics manufacturer, qualitative data (e.g. meetings, interviews, observations and reflections) were analysed using a grounded theory methodology. The central tenet in the theory is that when HF Specialists acclimated to the engineering process, language and tools, and strategically aligned HF to the design and business goals of the organisation, HF became a means to improve business performance. This led to engineers 'pulling' HF Specialists onto their team. HF targets were adopted into engineering tools to communicate HF concerns quantitatively, drive continuous improvement, visibly demonstrate change and lead to benchmarking. Senior management held engineers accountable for HF as a key performance indicator, thus integrating HF into the production design process. Practitioner Summary: Research and practice lack explanations about how HF can be integrated early in design of production systems. This three-year case study and the theory derived demonstrate how ergonomists changed their focus to align with design and business goals to integrate HF into the design process.

  2. INCORPORATING ENVIRONMENTAL AND ECONOMIC CONSIDERATIONS INTO PROCESS DESIGN: THE WASTE REDUCTION (WAR) ALGORITHM

    EPA Science Inventory

    A general theory known as the WAste Reduction (WAR) algorithm has been developed to describe the flow and the generation of potential environmental impact through a chemical process. This theory integrates environmental impact assessment into chemical process design. Potential en...

  3. Robust design of binary countercurrent adsorption separation processes

    SciTech Connect

    Storti, G.; Mazzotti, M.; Morbidelli, M.; Carra, S.

    1993-03-01

    The separation of a binary mixture, using a third component having intermediate adsorptivity as desorbent, in a four-section countercurrent adsorption separation unit is considered. A procedure for the optimal and robust design of the unit is developed within the framework of Equilibrium Theory, using a model where the adsorption equilibria are described through the constant selectivity stoichiometric model, while mass-transfer resistances and axial mixing are neglected. By requiring that the unit achieves complete separation, it is possible to identify a set of implicit constraints on the operating parameters, that is, the flow rate ratios in the four sections of the unit. From these constraints, explicit bounds on the operating parameters are obtained, thus yielding a region in the operating parameter space which can be drawn a priori in terms of the adsorption equilibrium constants and the feed composition. This result provides a very convenient tool to determine both optimal and robust operating conditions. The latter issue is addressed by first analyzing the various possible sources of disturbances, as well as their effect on the separation performance. Next, the criteria for the robust design of the unit are discussed. Finally, these theoretical findings are compared with a set of experimental results obtained in a six-port simulated moving bed adsorption separation unit operated in the vapor phase.
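
    The explicit bounds on the four flow-rate ratios take a particularly simple form when the constant-selectivity model is replaced by linear adsorption isotherms (the classical "triangle theory" conditions); the sketch below only checks a candidate operating point against those simplified bounds and is not the design procedure developed in the paper.

        def complete_separation_linear(m: tuple[float, float, float, float],
                                       H_strong: float, H_weak: float) -> bool:
            """Linear-isotherm bounds for complete separation in a four-section
            countercurrent unit: m1 > H_strong, H_weak < m2 < m3 < H_strong, m4 < H_weak."""
            m1, m2, m3, m4 = m
            return m1 > H_strong and H_weak < m2 < m3 < H_strong and m4 < H_weak

        # Hypothetical Henry constants and operating point (illustrative only).
        print(complete_separation_linear((3.2, 1.4, 2.3, 0.8), H_strong=2.5, H_weak=1.0))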

  4. Process Design Report for Stover Feedstock: Lignocellulosic Biomass to Ethanol Process Design and Economics Utilizing Co-Current Dilute Acid Prehydrolysis and Enzymatic Hydrolysis for Corn Stover

    SciTech Connect

    Aden, A.; Ruth, M.; Ibsen, K.; Jechura, J.; Neeves, K.; Sheehan, J.; Wallace, B.; Montague, L.; Slayton, A.; Lukas, J.

    2002-06-01

    The U.S. Department of Energy (DOE) is promoting the development of ethanol from lignocellulosic feedstocks as an alternative to conventional petroleum-based transportation fuels. DOE funds both fundamental and applied research in this area and needs a method for predicting cost benefits of many research proposals. To that end, the National Renewable Energy Laboratory (NREL) has modeled many potential process designs and estimated the economics of each process during the last 20 years. This report is an update of the ongoing process design and economic analyses at NREL.

  5. INTEC CPP-603 Basin Water Treatment System Closure: Process Design

    SciTech Connect

    Kimmitt, Raymond Rodney; Faultersack, Wendell Gale; Foster, Jonathan Kay; Berry, Stephen Michael

    2002-09-01

    This document describes the engineering activities that have been completed in support of the closure plan for the Idaho Nuclear Technology and Engineering Center (INTEC) CPP-603 Basin Water Treatment System. This effort includes detailed assessments of methods and equipment for performing work in four areas: 1. A cold (nonradioactive) mockup system for testing equipment and procedures for vessel cleanout and vessel demolition. 2. Cleanout of process vessels to meet standards identified in the closure plan. 3. Dismantlement and removal of vessels, should it not be possible to clean them to required standards in the closure plan. 4. Cleanout or removal of pipelines and pumps associated with the CPP-603 basin water treatment system. Cleanout standards for the pipes will be the same as those used for the process vessels.

  6. Design, processing and testing of LSI arrays: Hybrid microelectronics task

    NASA Technical Reports Server (NTRS)

    Himmel, R. P.; Stuhlbarg, S. M.; Ravetti, R. G.; Zulueta, P. J.

    1979-01-01

    Mathematical cost factors were generated for both hybrid microcircuit and printed wiring board packaging methods. A mathematical cost model was created for analysis of microcircuit fabrication costs. The costing factors were refined and reduced to formulae for computerization. Efficient methods were investigated for low cost packaging of LSI devices as a function of density and reliability. Technical problem areas such as wafer bumping, inner/outer lead bonding, testing on tape, and tape processing were investigated.

  7. Process options for nominal 2-K helium refrigeration system designs

    NASA Astrophysics Data System (ADS)

    Knudsen, Peter; Ganni, Venkatarao

    2012-06-01

    Nominal 2-K helium refrigeration systems are frequently used for superconducting radio frequency and magnet string technologies used in accelerators. This paper examines the trade-offs and approximate performance of four basic types of processes used for the refrigeration of these technologies; direct vacuum pumping on a helium bath, direct vacuum pumping using full or partial refrigeration recovery, cold compression, and hybrid compression (i.e., a blend of cold and warm sub-atmospheric compression).

  9. Geothermal injection treatment: process chemistry, field experiences, and design options

    SciTech Connect

    Kindle, C.H.; Mercer, B.W.; Elmore, R.P.; Blair, S.C.; Myers, D.A.

    1984-09-01

    The successful development of geothermal reservoirs to generate electric power will require the injection disposal of approximately 700,000 gal/h (2.6 x 10^6 l/h) of heat-depleted brine for every 50,000 kW of generating capacity. To maintain injectability, the spent brine must be compatible with the receiving formation. The factors that influence this brine/formation compatibility, and tests to quantify them, are discussed in this report. Some form of treatment will be necessary prior to injection for most situations; the process chemistry involved to avoid and/or accelerate the formation of precipitate particles is also discussed. The treatment processes, either avoidance or controlled precipitation approaches, are described in terms of their principles and demonstrated applications in the geothermal field and, when such experience is limited, in other industrial use. Monitoring techniques for tracking particulate growth, and the effects of process parameters on corrosion and well injectability, are presented. Examples of brine injection, preinjection treatment, and recovery from injectivity loss are examined and related to the aspects listed above.
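
    The quoted disposal requirement scales linearly with generating capacity; the short sketch below simply applies that ratio to an arbitrary plant size, using the figures from the abstract.

        GAL_PER_H_PER_50MW = 700_000        # heat-depleted brine disposal per 50,000 kW (from the abstract)
        LITRES_PER_GALLON = 3.785

        def brine_disposal_rate(generating_capacity_kw: float) -> float:
            """Approximate injection requirement in gallons per hour."""
            return GAL_PER_H_PER_50MW * generating_capacity_kw / 50_000

        rate_gal_h = brine_disposal_rate(110_000)   # e.g. a hypothetical 110 MW plant
        print(f"{rate_gal_h:,.0f} gal/h ({rate_gal_h * LITRES_PER_GALLON:,.0f} l/h)")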

  11. A Lead User Approach to Universal Design - Involving Older Adults in the Design Process.

    PubMed

    Raviselvam, Sujithra; Wood, Kristin L; Hölttä-Otto, Katja; Tam, Victoria; Nagarajan, Kamya

    2016-01-01

    The concept of Universal Design has received increasing appreciation over the past two decades. Yet, there are very few existing designs that cater to the needs of extraordinary users who experience some form of physical challenge. Previous work has shown promising results on involving users with physical challenges as lead users - users who have the potential to identify needs that could be latent among the general population. It has also been shown that older adults can act as such lead users. They can help design universal product ideas that satisfy both older adults and the general population. In this paper we build on this and examine if involving older adults in the design phase can result in universal products, products preferred by both older adults and the general population over a current option. Eighty-nine older adult participants and thirty-four general population participants took part in the study. Products were redesigned and prototyped based on the needs of older adults and tested among both populations. Results show that, although older adults and the general population did share certain needs and demands, the majority of older adults had needs and demands that were different from those of the general population. However, even though the needs differed between the populations, on average 89% of the general population participants preferred products designed based on design needs expressed by older adults over the current option. This provides further evidence supporting the use of older adults in designing products for all. PMID:27534296

  12. An interfaces approach to TES ground data system processing design with the Science Investigator-led Processing System (SIPS)

    NASA Technical Reports Server (NTRS)

    Kurian, R.; Grifin, A.

    2002-01-01

    Developing production-quality software to process the large volumes of scientific data is the responsibility of the TES Ground Data System, which is being developed at the Jet Propulsion Laboratory together with support contractor Raytheon/ITSS. The large data volume and processing requirements of the TES pose significant challenges to the design.

  13. Advanced Simulation Technology to Design Etching Process on CMOS Devices

    NASA Astrophysics Data System (ADS)

    Kuboi, Nobuyuki

    2015-09-01

    Prediction and control of plasma-induced damage is needed to mass-produce high performance CMOS devices. In particular, side-wall (SW) etching with low damage is a key process for the next generation of MOSFETs and FinFETs. To predict and control the damage, we have developed a SiN etching simulation technique for CHxFy/Ar/O2 plasma processes using a three-dimensional (3D) voxel model. This model includes new concepts for the gas transportation in the pattern, detailed surface reactions on the SiN reactive layer divided into several thin slabs and on the C-F polymer layer dependent on the H/N ratio, and the use of "smart voxels". We successfully predicted etching properties such as the etch rate, polymer layer thickness, and selectivity for Si, SiO2, and SiN films along with process variations, and demonstrated the time-dependent 3D damage distribution during SW etching on MOSFETs and FinFETs. We confirmed that a large amount of Si damage was caused in the source/drain region over time in spite of the existing 15 nm SiO2 layer in the over-etch step, and that the Si fin was directly damaged by a large amount of high energy H during the removal step of the parasitic fin spacer, leading to Si fin damage to a depth of 14 to 18 nm. By analyzing the results of these simulations and our previous simulations, we found that it is important to carefully control the dose of high energy H, the incident energy of H, the polymer layer thickness, and the over-etch time, considering the effects of the pattern structure, chamber-wall condition, and wafer open area ratio. In collaboration with Masanaga Fukasawa and Tetsuya Tatsumi, Sony Corporation. We thank Mr. T. Shigetoshi and Mr. T. Kinoshita of Sony Corporation for their assistance with the experiments.

  14. Power of experimental design studies for the validation of pharmaceutical processes: case study of a multilayer tablet manufacturing process.

    PubMed

    Goutte, F; Guemguem, F; Dragan, C; Vergnault, G; Wehrlé, P

    2002-08-01

    Experimental design studies (EDS) are already widely used in the pharmaceutical industry for drug formulation or process optimization. Rare are the situations in which this methodology is applied for validation purposes. The power of this statistical tool, a key element of a global validation strategy, is demonstrated for a multilayer tablet manufacturing process. Applied to the Geomatrix system, generally composed of one compression and three granulation processes, the gains in time and rigor are non-negligible. Experimental design studies are not used in this work for modeling. Introduced at each important step of process development, they allow for the evaluation of process ruggedness at pilot scale and of specifications for full production. A demonstration of the complete control of the key process parameters, identified throughout preliminary studies, is given.
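
    For readers unfamiliar with experimental design studies, the sketch below generates a two-level full factorial plan for three process factors, the kind of design matrix that underlies such ruggedness evaluations; the factor names are invented and do not correspond to the actual Geomatrix process parameters.

        from itertools import product

        # Hypothetical two-level factors for a tableting process (illustrative names only).
        factors = {
            "compression_force": ("low", "high"),
            "granulation_time": ("short", "long"),
            "binder_amount": ("-", "+"),
        }

        runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
        for i, run in enumerate(runs, start=1):    # 2**3 = 8 runs
            print(i, run)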

  15. Nencki Affective Word List (NAWL): the cultural adaptation of the Berlin Affective Word List-Reloaded (BAWL-R) for Polish.

    PubMed

    Riegel, Monika; Wierzba, Małgorzata; Wypych, Marek; Żurawski, Łukasz; Jednoróg, Katarzyna; Grabowska, Anna; Marchewka, Artur

    2015-12-01

    In the present article, we introduce the Nencki Affective Word List (NAWL), created in order to provide researchers with a database of 2,902 Polish words, including nouns, verbs, and adjectives, with ratings of emotional valence, arousal, and imageability. Measures of several objective psycholinguistic features of the words (frequency, grammatical class, and number of letters) are also controlled. The database is a Polish adaptation of the Berlin Affective Word List-Reloaded (BAWL-R; Võ et al., Behavior Research Methods 41:534-538, 2009), commonly used to investigate the affective properties of German words. Affective normative ratings were collected from 266 Polish participants (136 women and 130 men). The emotional ratings and psycholinguistic indexes provided by NAWL can be used by researchers to better control the verbal materials they apply and to adjust them to specific experimental questions or issues of interest. The NAWL is freely accessible to the scientific community for noncommercial use as supplementary material to this article.

  16. Is biomass fractionation by Organosolv-like processes economically viable? A conceptual design study.

    PubMed

    Viell, Jörn; Harwardt, Andreas; Seiler, Jan; Marquardt, Wolfgang

    2013-12-01

    In this work, the conceptual designs of the established Organosolv process and a novel biphasic, so-called Organocat process are developed and analyzed. Solvent recycling and energy integration are emphasized to properly assess economic viability. Both processes show a similar energy consumption (approximately 5 MJ/kg(dry biomass)). However, they still show a lack of economic attractiveness even at larger scale. The Organocat process is more favorable due to more efficient lignin separation. The analysis uncovers the remaining challenges toward an economically viable design. They largely originate from by-products formation, product isolation, and solvent recycling. Necessary improvements in process chemistry, equipment design, energy efficiency and process design are discussed to establish economically attractive Organosolv-like processes of moderate capacity as a building block of a future biorefinery. PMID:24157680

  18. The anaerobic SBR process: basic principles for design and automation.

    PubMed

    Ruiz, C; Torrijos, M; Sousbie, P; Lebrato Martinez, J; Moletta, R

    2001-01-01

    This study has determined the purification performance and the basic principles for the design of an anaerobic SBR (ASBR) to be used to treat wastewater generated in the food industries. Two ASBR's were set up and one fed with a slaughterhouse effluent at low concentration, the other with concentrated dairy wastewater. The maximum loading rate applied should not exceed 4.5 g of COD/L/day for the dilute effluent and 6 g of COD/L/day for the concentrated effluent. At higher loading rates, the reactors become difficult to operate, mainly because of sludge removal problems, and purification efficiency declines. A detailed study of the kinetics (TOC, VFA, rate of biogas production) throughout one treatment cycle led to the development of a simple control strategy based on the monitoring of the biogas production rate which was then applied to the reactor treating the dairy wastewater. After automation, the reactor worked free of problems at an average pollution load of 5.4 g of COD/L/day.
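
    One simple way to realize a control strategy based on monitoring the biogas production rate is to end the react phase once the rate has fallen to a fixed fraction of its observed peak; the threshold rule sketched below is an assumption for illustration, not the controller actually implemented by the authors.

        def react_phase_complete(rates: list[float], fraction_of_peak: float = 0.2) -> bool:
            """End the ASBR react phase when the latest biogas production rate
            drops below a fixed fraction of the peak rate observed so far."""
            if len(rates) < 2:
                return False
            return rates[-1] < fraction_of_peak * max(rates)

        # Biogas production rate samples over one cycle (L/h, toy data).
        cycle = [0.5, 2.1, 4.8, 5.2, 4.0, 2.2, 1.1, 0.9]
        print(react_phase_complete(cycle))    # True: 0.9 < 0.2 * 5.2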

  19. Resistance identification and rational process design in Capacitive Deionization.

    PubMed

    Dykstra, J E; Zhao, R; Biesheuvel, P M; van der Wal, A

    2016-01-01

    Capacitive Deionization (CDI) is an electrochemical method for water desalination employing porous carbon electrodes. To enhance the performance of CDI, identification of electronic and ionic resistances in the CDI cell is important. In this work, we outline a method to identify these resistances. We illustrate our method by calculating the resistances in a CDI cell with membranes (MCDI) and by using this knowledge to improve the cell design. To identify the resistances, we derive a full-scale MCDI model. This model is validated against experimental data and used to calculate the ionic resistances across the MCDI cell. We present a novel way to measure the electronic resistances in a CDI cell, as well as the spacer channel thickness and porosity after assembly of the MCDI cell. We identify that for inflow salt concentrations of 20 mM the resistance is mainly located in the spacer channel and the external electrical circuit, not in the electrodes. Based on these findings, we show that the carbon electrode thickness can be increased without significantly increasing the energy consumption per mol salt removed, which has the advantage that the desalination time can be lengthened significantly.
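
    The outcome of such an identification can be summarized with a simple series-resistance picture; the sketch below distributes the ohmic voltage drop of an MCDI cell over electronic and ionic contributions in series. The resistance values and current density are invented for illustration and are not the fitted values from the full-scale model.

        # Hypothetical series resistances in an MCDI cell, in ohm*cm^2 (illustrative only).
        resistances = {
            "external circuit / contacts": 8.0,
            "electrodes (electronic + ionic)": 2.0,
            "ion-exchange membranes": 3.0,
            "spacer channel": 12.0,
        }

        current_density = 0.02   # A/cm^2, assumed constant-current operation
        total = sum(resistances.values())
        for name, r in resistances.items():
            print(f"{name:32s} {r / total:5.1%} of the ohmic drop ({r * current_density * 1000:.1f} mV)")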

  20. Algorithm design for a gun simulator based on image processing

    NASA Astrophysics Data System (ADS)

    Liu, Yu; Wei, Ping; Ke, Jun

    2015-08-01

    In this paper, an algorithm is designed for shooting games under strong background light. Six LEDs are uniformly distributed on the edge of a game machine screen. They are located at the four corners and in the middle of the top and the bottom edges. Three LEDs are lit in the odd frames, and the other three are lit in the even frames. The simulator is furnished with one camera, which is used to obtain an image of the LEDs by applying an inter-frame difference between the even and odd frames. In the resulting images, the six LEDs appear as six bright spots. To obtain the LEDs' coordinates rapidly, we propose a method based on the area of the bright spots. After calibrating the camera based on a pinhole model, four equations can be found using the relationship between the image coordinate system and the world coordinate system under perspective transformation. The center point of the image of the LEDs is taken to be the virtual shooting point. The perspective transformation matrix is applied to the coordinates of the center point. We can then obtain the virtual shooting point's coordinates in the world coordinate system. When a game player shoots at a target about two meters away, the coordinate error calculated using the method discussed in this paper is less than 10 mm. We can obtain 65 coordinate results per second, which meets the requirement of a real-time system. This proves the algorithm is reliable and effective.
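
    The final mapping step, in which the centroid of the detected LED spots is taken as the virtual aim point and sent through the perspective transformation, can be sketched as follows. The homography matrix is assumed to come from a prior calibration against the known LED positions, and all numbers are illustrative.

        import numpy as np

        def aim_point_on_screen(led_pixels: np.ndarray, H: np.ndarray) -> tuple[float, float]:
            """Map the centroid of the detected LED spots (image pixels) to screen
            coordinates using a 3x3 perspective transformation (homography) H."""
            cx, cy = led_pixels.mean(axis=0)
            x, y, w = H @ np.array([cx, cy, 1.0])
            return x / w, y / w

        # Illustrative pixel coordinates of the six detected LED spots.
        led_pixels = np.array([[102, 80], [320, 78], [540, 82],
                               [100, 400], [322, 402], [538, 398]], dtype=float)

        # Hypothetical calibration result; in practice H is estimated from the
        # correspondence between LED image positions and their known screen positions.
        H = np.array([[1.05, 0.02, -15.0],
                      [0.01, 1.10, -20.0],
                      [1e-4, 2e-4, 1.0]])

        print(aim_point_on_screen(led_pixels, H))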

  1. Concepts and designs of ion implantation equipment for semiconductor processing

    NASA Astrophysics Data System (ADS)

    Rose, Peter H.; Ryding, Geoffrey

    2006-11-01

    Manufacturing ion implantation equipment for doping semiconductors has grown into a two-billion-dollar business. The accelerators developed for nuclear physics research and isotope separation provided the technology from which ion implanters have been developed, but the unique requirements of the semiconductor industry defined the evolution of the architecture of these small accelerators. Key elements will be described, including ion generation and beam transport systems, as well as the techniques used to achieve uniform doping over large wafers. The wafers are processed one at a time or in batches and are moved in and out of the vacuum by automated handling systems. The productivity of an implanter is of economic importance, and there is a continuing need to increase the usable beam current, especially at low energies.

  2. Design of processes with reactive distillation line diagrams

    SciTech Connect

    Bessling, B.; Schembecker, G.; Simmrock, K.H.

    1997-08-01

    On the basis of the transformation of concentration coordinates, the concept of reactive distillation lines is developed. It is applied to study the feasibility of a reactive distillation with an equilibrium reaction on all trays of a distillation column. The singular points in the distillation line diagrams are characterized in terms of nodes and saddles. Depending on the characterization of the reactive distillation line diagrams, it can be decided whether a column with two feed stages is required. On the basis of the reaction space concept, a procedure for identification of reactive distillation processes is developed, in which the reactive distillation column has to be divided into reactive and nonreactive sections. This can be necessary to overcome the limitations in separation which result from the chemical equilibrium. The concentration profile of this combined reactive/nonreactive distillation column is estimated using combined reactive/nonreactive distillation lines.

  3. Development of Conceptual Design Support Tool Founded on Formalization of Conceptual Design Process for Regenerative Life Support Systems

    NASA Astrophysics Data System (ADS)

    Miyajima, Hiroyuki; Yuhara, Naohiro

    Regenerative Life Support Systems (RLSS), which maintain human lives by recycling substances essential for living, are comprised of humans, plants, and material circulation systems. The plants supply food to the humans or regenerate water and gases by photosynthesis, while the material circulation systems physicochemically recycle and circulate the substances disposed of by humans and plants. RLSS has attracted attention as manned space activities have shifted from short trips to long-term stays at such bases as a space station, a lunar base, or a Mars base. The present typical space base is the International Space Station (ISS), a manned experimental base for prolonged stays, where the RLSS recycles only water and air. To accommodate prolonged and extended manned activity at future space bases, the development of RLSS that implements food production and resource regeneration using plants is expected. The configuration of an RLSS should be designed to suit its own duty, so design requirements may arise for RLSS with unprecedented configurations. Accordingly, it is necessary to establish a conceptual design method for generalized RLSS. It is difficult, however, to systematize the design process by analyzing previous designs, because there are only a few ground-experimental facilities, namely CEEF (Closed Ecology Experiment Facilities) of Japan, BIO-Plex (Bioregenerative Planetary Life Support Systems Test Complex) of the U.S., and BIOS3 of Russia. For the above reasons, a conceptual design method that does not rely on previous design examples is required for generalized RLSS. This study formalizes a conceptual design process and develops a conceptual design support tool for RLSS based on this design process.

  4. 5 CFR Appendix A to Part 581 - List of Agents Designated To Accept Legal Process

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 5 Administrative Personnel 1 2012-01-01 2012-01-01 false List of Agents Designated To Accept Legal Process A Appendix A to Part 581 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS PROCESSING GARNISHMENT ORDERS FOR CHILD SUPPORT AND/OR ALIMONY Pt. 581, App. A Appendix A to Part 581—List of Agents Designated...

  5. 5 CFR Appendix A to Part 581 - List of Agents Designated To Accept Legal Process

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 5 Administrative Personnel 1 2013-01-01 2013-01-01 false List of Agents Designated To Accept Legal Process A Appendix A to Part 581 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS PROCESSING GARNISHMENT ORDERS FOR CHILD SUPPORT AND/OR ALIMONY Pt. 581, App. A Appendix A to Part 581—List of Agents Designated...

  6. 30 CFR 903.764 - Process for designating areas unsuitable for surface coal mining operations.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... surface coal mining operations. 903.764 Section 903.764 Mineral Resources OFFICE OF SURFACE MINING... WITHIN EACH STATE ARIZONA § 903.764 Process for designating areas unsuitable for surface coal mining operations. Part 764 of this chapter, State Processes for Designating Areas Unsuitable for Surface...

  7. A Tutorial Design Process Applied to an Introductory Materials Engineering Course

    ERIC Educational Resources Information Center

    Rosenblatt, Rebecca; Heckler, Andrew F.; Flores, Katharine

    2013-01-01

    We apply a "tutorial design process", which has proven to be successful for a number of physics topics, to design curricular materials or "tutorials" aimed at improving student understanding of important concepts in a university-level introductory materials science and engineering course. The process involves the identification…

  8. A Digital Tool Set for Systematic Model Design in Process-Engineering Education

    ERIC Educational Resources Information Center

    van der Schaaf, Hylke; Tramper, Johannes; Hartog, Rob J.M.; Vermue, Marian

    2006-01-01

    One of the objectives of the process technology curriculum at Wageningen University is that students learn how to design mathematical models in the context of process engineering, using a systematic problem analysis approach. Students find it difficult to learn to design a model and little material exists to meet this learning objective. For these…

  9. Analyzing Team Based Engineering Design Process in Computer Supported Collaborative Learning

    ERIC Educational Resources Information Center

    Lee, Dong-Kuk; Lee, Eun-Sang

    2016-01-01

    The engineering design process has been largely implemented in a collaborative project format. Recently, technological advancement has helped collaborative problem solving processes such as engineering design to have efficient implementation using computers or online technology. In this study, we investigated college students' interaction and…

  10. 30 CFR 905.764 - Process for designating areas unsuitable for surface coal mining operations.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... surface coal mining operations. 905.764 Section 905.764 Mineral Resources OFFICE OF SURFACE MINING... WITHIN EACH STATE CALIFORNIA § 905.764 Process for designating areas unsuitable for surface coal mining operations. Part 764 of this chapter, State Processes for Designating Areas Unsuitable for Surface...

  11. The Use of Executive Control Processes in Engineering Design by Engineering Students and Professional Engineers

    ERIC Educational Resources Information Center

    Dixon, Raymond A.; Johnson, Scott D.

    2012-01-01

    A cognitive construct that is important when solving engineering design problems is executive control process, or metacognition. It is a central feature of human consciousness that enables one "to be aware of, monitor, and control mental processes." The framework for this study was conceptualized by integrating the model for creative design, which…

  12. A Comparison of Diary Method Variations for Enlightening Form Generation in the Design Process

    ERIC Educational Resources Information Center

    Babapour, Maral; Rehammar, Bjorn; Rahe, Ulrike

    2012-01-01

    This paper presents two studies in which an empirical approach was taken to understand and explain form generation and decisions taken in the design process. In particular, the activities addressing aesthetic aspects when exteriorising form ideas in the design process have been the focus of the present study. Diary methods were the starting point…

  13. 30 CFR 912.764 - Process for designating areas unsuitable for surface coal mining operations.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... surface coal mining operations. 912.764 Section 912.764 Mineral Resources OFFICE OF SURFACE MINING... WITHIN EACH STATE IDAHO § 912.764 Process for designating areas unsuitable for surface coal mining operations. Part 764 of this chapter, State Processes for Designating Areas Unsuitable for Surface...

  14. 30 CFR 905.764 - Process for designating areas unsuitable for surface coal mining operations.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... surface coal mining operations. 905.764 Section 905.764 Mineral Resources OFFICE OF SURFACE MINING... WITHIN EACH STATE CALIFORNIA § 905.764 Process for designating areas unsuitable for surface coal mining operations. Part 764 of this chapter, State Processes for Designating Areas Unsuitable for Surface...

  15. 30 CFR 903.764 - Process for designating areas unsuitable for surface coal mining operations.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... surface coal mining operations. 903.764 Section 903.764 Mineral Resources OFFICE OF SURFACE MINING... WITHIN EACH STATE ARIZONA § 903.764 Process for designating areas unsuitable for surface coal mining operations. Part 764 of this chapter, State Processes for Designating Areas Unsuitable for Surface...

  16. 30 CFR 910.764 - Process for designating areas unsuitable for surface coal mining operations.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... surface coal mining operations. 910.764 Section 910.764 Mineral Resources OFFICE OF SURFACE MINING... WITHIN EACH STATE GEORGIA § 910.764 Process for designating areas unsuitable for surface coal mining operations. Part 764 of this chapter, State Processes for Designating Areas Unsuitable for Surface...

  17. Integrating ergonomics in design processes: a case study within an engineering consultancy firm.

    PubMed

    Sørensen, Lene Bjerg; Broberg, Ole

    2012-01-01

    This paper reports on a case study within an engineering consultancy firm, where engineering designers and ergonomists were working together on the design of a new hospital sterile processing plant. The objective of the paper is to gain a better understanding of the premises for integrating ergonomics into engineering design processes and how different factors either promote or limit the integration. Based on a grounded theory approach, a model illustrating these factors is developed, and different hypotheses about how these factors promote and/or limit the integration of ergonomics into design processes are presented along with the model.

  18. Residential Interior Design as Complex Composition: A Case Study of a High School Senior's Composing Process

    ERIC Educational Resources Information Center

    Smagorinsky, Peter; Zoss, Michelle; Reed, Patty M.

    2006-01-01

    This research analyzed the composing processes of one high school student as she designed the interiors of homes for a course in interior design. Data included field notes, an interview with the teacher, artifacts from the class, and the focal student's concurrent and retrospective protocols in relation to her design of home interiors. The…

  19. Design and Construction Process of Two LEED Certified University Buildings: A Collective Case Study

    ERIC Educational Resources Information Center

    Rich, Kim

    2011-01-01

    This study was conducted at the early stages of integrating LEED into the design process, in which a clearer understanding of what sustainable and ecological design is about emerged over the duration of designing and building two academic buildings on a university campus. In this case study, due to utilizing a grounded theory…

  20. Design for Review - Applying Lessons Learned to Improve the FPGA Review Process

    NASA Technical Reports Server (NTRS)

    Figueiredo, Marco A.; Li, Kenneth E.

    2014-01-01

    Flight Field Programmable Gate Array (FPGA) designs are required to be independently reviewed. This paper provides recommendations to Flight FPGA designers on how to properly prepare their designs for review in order to facilitate the review process and reduce the impact of the review time on the overall project schedule.

  1. A Process Model for Developing Learning Design Patterns with International Scope

    ERIC Educational Resources Information Center

    Lotz, Nicole; Law, Effie Lai-Chong; Nguyen-Ngoc, Anh Vu

    2014-01-01

    This paper investigates the process of identifying design patterns in international collaborative learning environments. In this context, design patterns are referred to as structured descriptions of best practice with pre-defined sections such as problem, solution and consequences. We pay special attention to how the scope of a design pattern is…

  2. Design considerations for solar industrial process heat systems: nontracking and line focus collector technologies

    SciTech Connect

    Kutscher, C.F.

    1981-03-01

    Items are listed that should be considered in each aspect of the design of a solar industrial process heat system. The collector technologies covered are flat-plate, evacuated tube, and line focus. Qualitative design considerations are stressed rather than specific design recommendations. (LEW)

  3. Design and fabrication of an integrated bonding system for tape processing of slapper detonator cables

    SciTech Connect

    Schurman, W.R.

    1986-08-15

    A unique bonding system was designed and built as an addition to the other reel-to-reel processing equipment in the Tape Process Laboratory at Mound. This bonder supports an in-house flexible circuit manufacturing capability which has been established by Mound's Tape Processing Group. Like most of the equipment in the Tape Process Laboratory, the bonder was designed and built to Mound's specifications. It integrates into one machine all operations necessary to prepare, bond, and test the joint between a flexible printed circuit cable and an etched (copper/Kapton) bridge. Design specifications were established for all operations (cable cleaning, assembly, bonding, resistance testing, vacuum stress testing, and closed-circuit TV inspection). The design concepts used for the Integrated Bonding System provide for the bonding of different detonator cables and bridge designs with minimal tooling changes. 16 figs.

  4. Operational concepts and implementation strategies for the design configuration management process.

    SciTech Connect

    Trauth, Sharon Lee

    2007-05-01

    This report describes operational concepts and implementation strategies for the Design Configuration Management Process (DCMP). It presents a process-based systems engineering model for the successful configuration management of the products generated during the operation of the design organization as a business entity. The DCMP model focuses on Pro/E and associated activities and information. It can serve as the framework for interconnecting all essential aspects of the product design business. A design operation scenario offers a sense of how to do business at a time when DCMP is second nature within the design organization.

  5. Yucca Mountain Project: ESF Title I design control process review report

    SciTech Connect

    1989-01-19

    The Exploratory Shaft Facility (ESF) Title 1 Design Control Process Review was initiated in response to direction from the Office of Civilian Radioactive Waste Management (OCRWM) (letter: Kale to Gertz, NRC Concerns on Title 1 Design Control Process, November 17, 1988). The direction was to identify the existing documentation that described "… the design control process and the quality assurance that governed …" (a) the development of the requirements documents for the ESF design, (b) the various interfaces between activities, (c) analyses and definitions leading to additional requirements in the System Design Requirements Documents, and (d) completion of Title 1 Design. This report provides historical information for general use in determining the extent of the quality assurance program in existence during the ESF Title 1 Design.

  6. Conversion of microalgae to jet fuel: process design and simulation.

    PubMed

    Wang, Hui-Yuan; Bluck, David; Van Wie, Bernard J

    2014-09-01

    Microalgae's aquatic, non-edible, highly genetically modifiable nature and fast growth rate are considered ideal for biomass conversion to liquid fuels, offering promise against future shortages of fossil fuels and for reducing greenhouse gas and pollutant emissions from combustion. We demonstrate the adaptability of PRO/II software by simulating a microalgae photo-bioreactor and thermolysis with fixed-conversion isothermal reactors, adding a heat exchanger for thermolysis. We model a cooling tower and gas floatation with zero-duty flash drums, adding solids removal for floatation. Properties data are from PRO/II's thermodynamic data manager. Hydrotreating is analyzed within PRO/II's case study option, made subject to Jet B fuel constraints, and we determine an optimal 6.8% bioleum bypass ratio, 230°C hydrotreater temperature, and 20:1 bottoms-to-overhead distillation ratio. Process economic feasibility occurs if cheap CO2, H2O, and nutrient resources are available, along with solar energy and energy from byproduct combustion, and hydrotreater H2 from product reforming.

  8. Design criteria for Waste Coolant Processing Facility and preliminary proposal 722 for Waste Coolant Processing Facility

    SciTech Connect

    Not Available

    1991-09-27

    This document contains the design criteria to be used by the architect-engineer (A-E) in the performance of Titles 1 and 2 design for the construction of a facility to treat the biodegradable, water soluble, waste machine coolant generated at the Y-12 plant. The purpose of this facility is to reduce the organic loading of coolants prior to final treatment at the proposed West Tank Farm Treatment Facility.

  9. HAL/SM system functional design specification. [systems analysis and design analysis of central processing units

    NASA Technical Reports Server (NTRS)

    Ross, C.; Williams, G. P. W., Jr.

    1975-01-01

    The functional design of a preprocessor and subsystems is described. A structure chart and a data flow diagram are included for each subsystem. A group of intermodule interface definitions (one definition per module) is also included immediately following the structure chart and data flow diagram for each subsystem. Each of these intermodule interface definitions consists of the identification of the module, the function the module is to perform, the identification and definition of parameter interfaces to the module, and any design notes associated with the module. Compilers and computer libraries are also described.
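
    A minimal sketch of the kind of intermodule interface record described above (module identification, function, parameter interfaces, and design notes), expressed as Python dataclasses; the field and module names are illustrative and are not taken from the HAL/SM specification.

        from dataclasses import dataclass, field
        from typing import List

        @dataclass
        class ParameterInterface:
            name: str
            direction: str       # e.g. "in", "out", or "inout"
            description: str

        @dataclass
        class ModuleInterfaceDefinition:
            module_id: str                                    # identification of the module
            function: str                                     # what the module is to perform
            parameters: List[ParameterInterface] = field(default_factory=list)
            design_notes: str = ""

        # Example record for a hypothetical preprocessor module (names are invented).
        example = ModuleInterfaceDefinition(
            module_id="PREPROC.SCAN",
            function="Tokenize HAL/SM source lines for the parser",
            parameters=[ParameterInterface("source_line", "in", "raw source text"),
                        ParameterInterface("token_list", "out", "tokens produced for this line")],
            design_notes="Illustrative only; the real interfaces follow the structure charts.")
        print(example.module_id, "-", example.function)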

  10. Application of the cost-per-good-die metric for process design co-optimization

    NASA Astrophysics Data System (ADS)

    Jhaveri, Tejas; Arslan, Umut; Rovner, Vyacheslav; Strojwas, Andrzej; Pileggi, Larry

    2010-03-01

    The semiconductor industry has pursued a rapid pace of technology scaling to achieve an exponential reduction in component cost. Over the years, the goal of technology scaling has been distilled down to two discrete targets: process engineers focus on sustaining wafer costs while manufacturing smaller dimensions, whereas design engineers work towards creating newer IC designs that can feed the next generation of electronic products. In doing so, the impact of process choices made by the manufacturing community on the design of ICs, and vice versa, was conveniently ignored. However, with the lack of cost-effective lithography solutions at the forefront, the process and design communities are struggling to minimize IC die costs by following the traditional scaling practices described above. In this paper we discuss a framework for quantifying the economic impact of design and process decisions on the overall product by comparing the cost-per-good-die. We discuss the intricacies involved in computing the cost-per-good-die as we make design and technology choices. We also discuss the impact of design and lithography choices for the 32nm and 22nm technology nodes. The results demonstrate a strong volume dependence of the optimum design style and the corresponding lithography strategy. Most importantly, using this framework, process and design engineers can collaborate to define design-style and lithography solutions that will lead to continued IC cost scaling.
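
    For readers unfamiliar with the metric, the sketch below computes a cost-per-good-die figure from wafer cost, a first-order gross-die-per-wafer estimate, and a simple Poisson yield model; both the yield model and the example numbers are generic assumptions, not the framework or data used in the paper.

        import math

        def gross_die_per_wafer(wafer_diameter_mm, die_area_mm2):
            """Common first-order estimate of gross die per wafer, including edge losses."""
            r = wafer_diameter_mm / 2.0
            return int(math.pi * r ** 2 / die_area_mm2
                       - math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2))

        def cost_per_good_die(wafer_cost, wafer_diameter_mm, die_area_mm2, defect_density_per_mm2):
            """Wafer cost divided by yielded die, using a simple Poisson yield model (an assumption)."""
            yield_fraction = math.exp(-defect_density_per_mm2 * die_area_mm2)
            good_die = gross_die_per_wafer(wafer_diameter_mm, die_area_mm2) * yield_fraction
            return wafer_cost / good_die

        # Placeholder inputs only; a larger die raises cost both through fewer gross die and lower yield.
        cpgd = cost_per_good_die(wafer_cost=4000.0, wafer_diameter_mm=300,
                                 die_area_mm2=80.0, defect_density_per_mm2=0.002)
        print(f"Cost per good die: ${cpgd:.2f}")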

  11. An integral design strategy combining optical system and image processing to obtain high resolution images

    NASA Astrophysics Data System (ADS)

    Wang, Jiaoyang; Wang, Lin; Yang, Ying; Gong, Rui; Shao, Xiaopeng; Liang, Chao; Xu, Jun

    2016-05-01

    In this paper, an integral design that combines the optical system with image processing is introduced to obtain high resolution images, and its performance is evaluated and demonstrated. Traditional imaging methods often separate the two technical procedures of optical system design and image processing, resulting in failures of efficient cooperation between the optical and digital elements. Therefore, an innovative approach is presented that combines the merit function during optical design with the constraint conditions of the image processing algorithms. Specifically, an optical imaging system with low resolution is designed to collect the image signals that are indispensable for image processing, while the ultimate goal is to obtain high resolution images from the final system. In order to optimize the global performance, the optimization function of the ZEMAX software is utilized and the number of optimization cycles is controlled. Then a Wiener filter algorithm is adopted to process the simulated images, and the mean squared error (MSE) is taken as the evaluation criterion. The results show that, although the optical figures of merit of the optical imaging system are not the best, it can provide image signals that are more suitable for image processing. In conclusion, the integral design of the optical system and image processing can find the overall optimal solution that is missed by traditional design methods. Especially when designing complex optical systems, this integral design strategy has obvious advantages in simplifying structure and reducing cost, as well as obtaining high resolution images, which gives it a promising perspective for industrial application.
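
    The sketch below illustrates the image-processing half of such a joint design: a frequency-domain Wiener deconvolution of a blurred, noisy test image followed by an MSE comparison, with a Gaussian blur standing in for the low-resolution optic's point spread function. The PSF, noise level, and noise-to-signal constant are assumptions for illustration; the paper's actual ZEMAX-coupled optimization is not reproduced here.

        import numpy as np

        def gaussian_psf(shape, sigma):
            """Centered Gaussian point-spread function standing in for a low-resolution optic."""
            y, x = np.indices(shape)
            cy, cx = shape[0] // 2, shape[1] // 2
            psf = np.exp(-((y - cy) ** 2 + (x - cx) ** 2) / (2.0 * sigma ** 2))
            return psf / psf.sum()

        def wiener_deconvolve(blurred, psf, nsr=1e-2):
            """Frequency-domain Wiener filter: F_hat = conj(H) / (|H|^2 + NSR) * G."""
            H = np.fft.fft2(np.fft.ifftshift(psf))
            G = np.fft.fft2(blurred)
            return np.real(np.fft.ifft2(np.conj(H) / (np.abs(H) ** 2 + nsr) * G))

        rng = np.random.default_rng(0)
        truth = np.zeros((64, 64)); truth[24:40, 24:40] = 1.0        # simple synthetic scene
        psf = gaussian_psf(truth.shape, sigma=2.0)
        blurred = np.real(np.fft.ifft2(np.fft.fft2(truth) * np.fft.fft2(np.fft.ifftshift(psf))))
        noisy = blurred + 0.01 * rng.standard_normal(truth.shape)
        restored = wiener_deconvolve(noisy, psf)
        print("MSE blurred+noisy:", np.mean((noisy - truth) ** 2))
        print("MSE restored:     ", np.mean((restored - truth) ** 2))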

  12. Some trends and proposals for the inclusion of sustainability in the design of manufacturing process

    NASA Astrophysics Data System (ADS)

    Fradinho, J.; Nedelcu, D.; Gabriel-Santos, A.; Gonçalves-Coelho, A.; Mourão, A.

    2015-11-01

    Production processes are designed to meet requirements of three different natures: quality, cost, and time. Environmental concerns have expanded the field of conceptual design through the introduction of sustainability requirements that are driven by the growing societal thoughtfulness about environmental issues. One could say that the major concern has been the definition of metrics or indices for sustainability. However, those metrics usually lack consistency. More than ever, there is a need for an all-inclusive view at any level of decision-making, from establishing the design requirements to implementing the solutions. According to the Axiomatic Design Theory, sustainable designs are usually coupled designs, which should be avoided. This raises a concern related to the very nature of sustainability: the cross effects between the actions that should be considered in the attempt to decouple the design solutions. In terms of production, one should clarify the characterization of the sustainability of production systems. The objectives of this paper are: i) to analyse some trends for approaching the sustainability of production processes; ii) to define sustainability in terms of requirements for the design of production processes; iii) to make some proposals based on the Axiomatic Design Theory, in order to establish the principles with which the guidelines for designing production processes must comply; iv) to discuss how to introduce this matter in teaching both manufacturing technology and the design of production systems.

  13. Working with Faculty toward Universally Designed Instruction: The Process of Dynamic Course Design

    ERIC Educational Resources Information Center

    Harrisson, Elizabeth G.

    2006-01-01

    Both learner-centered education (LCE) and universal design (UD) require an instructor to be constantly reflective and flexible. But although both focus on the needs of different types of learners, until now LCE has not explicitly included students with disabilities within the array of learners it seeks to serve. And the UD movement, while it…

  14. Virtual Display Design and Evaluation of Clothing: A Design Process Support System

    ERIC Educational Resources Information Center

    Zhang, Xue-Fang; Huang, Ren-Qun

    2014-01-01

    This paper proposes a new computer-aided educational system for clothing visual merchandising and display. It aims to provide an operating environment that supports the various stages of display design in a user-friendly and intuitive manner. First, this paper provides a brief introduction to current software applications in the field of…

  15. Elementary teachers' mental models of engineering design processes: A comparison of two communities of practice

    NASA Astrophysics Data System (ADS)

    McMahon, Ann P.

    Educating K-12 students in the processes of design engineering is gaining popularity in public schools. Several states have adopted standards for engineering design despite the fact that no common agreement exists on what should be included in the K-12 engineering design process. Furthermore, little pre-service and in-service professional development exists that will prepare teachers to teach a design process that is fundamentally different from the science teaching process found in typical public schools. This study provides a glimpse into what teachers think happens in engineering design compared to articulated best practices in engineering design. Wenger's communities of practice work and van Dijk's multidisciplinary theory of mental models provide the theoretical bases for comparing the mental models of two groups of elementary teachers (one group that teaches engineering and one that does not) to the mental models of design engineers (including this engineer/researcher/educator and professionals described elsewhere). The elementary school teachers and this engineer/researcher/educator observed the design engineering process enacted by professionals, then answered questions designed to elicit their mental models of the process they saw in terms of how they would teach it to elementary students. The key finding is this: Both groups of teachers embedded the cognitive steps of the design process into the matrix of the social and emotional roles and skills of students. Conversely, the engineers embedded the social and emotional aspects of the design process into the matrix of the cognitive steps of the design process. In other words, teachers' mental models show that they perceive that students' social and emotional communicative roles and skills in the classroom drive their cognitive understandings of the engineering process, while the mental models of this engineer/researcher/educator and the engineers in the video show that we perceive that cognitive

  16. Design Process of Flight Vehicle Structures for a Common Bulkhead and an MPCV Spacecraft Adapter

    NASA Technical Reports Server (NTRS)

    Aggarwal, Pravin; Hull, Patrick V.

    2015-01-01

    Designing and manufacturing space flight vehicle structures is a skill set that has grown considerably at NASA during the last several years. Beginning with the Ares program and followed by the Space Launch System (SLS), in-house designs were produced for both the Upper Stage and the SLS Multipurpose Crew Vehicle (MPCV) spacecraft adapter. Specifically, critical design review (CDR) level analysis and flight production drawings were produced for the above-mentioned hardware. In particular, the experience of this in-house design work led to increased manufacturing infrastructure for both Marshall Space Flight Center (MSFC) and the Michoud Assembly Facility (MAF), improved skill sets in both analysis and design, and hands-on experience in building and testing full-scale (MSA) hardware. The hardware design and development processes, from initiation to CDR and finally flight, resulted in many challenges and experiences that produced valuable lessons. This paper builds on these experiences of NASA in recent years in designing and fabricating flight hardware and examines the design and development processes used, as well as the challenges and lessons learned, from the initial design, loads estimation, and mass constraints, to structural optimization and affordability, to the release of production drawings, to hardware manufacturing. While there are many documented design processes which a design engineer can follow, these unique experiences can offer insight into designing hardware in current program environments and present solutions to many of the challenges experienced by the engineering team.

  17. Materials and Process Design for High-Temperature Carburizing: Integrating Processing and Performance

    SciTech Connect

    D. Apelian

    2007-07-23

    The objective of the project is to develop an integrated process for fast, high-temperature carburizing. The new process results in an order-of-magnitude reduction in cycle time compared to conventional carburizing and represents significant energy savings, in addition to a corresponding reduction of scrap associated with distortion-free carburizing of steels.

  18. Integrating a Genetic Algorithm Into a Knowledge-Based System for Ordering Complex Design Processes

    NASA Technical Reports Server (NTRS)

    Rogers, James L.; McCulley, Collin M.; Bloebaum, Christina L.

    1996-01-01

    The design cycle associated with large engineering systems requires an initial decomposition of the complex system into design processes which are coupled through the transference of output data. Some of these design processes may be grouped into iterative subcycles. In analyzing or optimizing such a coupled system, it is essential to be able to determine the best ordering of the processes within these subcycles to reduce design cycle time and cost. Many decomposition approaches assume the capability is available to determine what design processes and couplings exist and what order of execution will be imposed during the design cycle. Unfortunately, this is often a complex problem and beyond the capabilities of a human design manager. A new feature, a genetic algorithm, has been added to DeMAID (Design Manager's Aid for Intelligent Decomposition) to allow the design manager to rapidly examine many different combinations of ordering processes in an iterative subcycle and to optimize the ordering based on cost, time, and iteration requirements. Two sample test cases are presented to show the effects of optimizing the ordering with a genetic algorithm.
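
    A minimal permutation genetic algorithm in the spirit described above, assuming a made-up feedback-coupling penalty matrix in which cost[i][j] > 0 means process j sends output back to process i, so executing i before j forces extra iteration; DeMAID's actual cost, time, and iteration model is not reproduced here.

        import random

        random.seed(1)
        N = 6
        # Hypothetical feedback-coupling penalties: cost[i][j] > 0 means process j sends
        # output back to process i, so executing i before j forces extra iteration.
        cost = [[0, 0, 3, 0, 0, 1],
                [2, 0, 0, 0, 4, 0],
                [0, 1, 0, 0, 0, 0],
                [0, 0, 2, 0, 0, 0],
                [0, 0, 0, 3, 0, 0],
                [0, 0, 0, 0, 1, 0]]

        def feedback_penalty(order):
            """Sum of couplings that point backwards for a given execution order."""
            pos = {p: k for k, p in enumerate(order)}
            return sum(cost[i][j] for i in range(N) for j in range(N)
                       if cost[i][j] and pos[i] < pos[j])

        def crossover(a, b):
            """Order crossover: keep a slice of parent a, fill the rest in parent b's order."""
            i, j = sorted(random.sample(range(N), 2))
            child = [None] * N
            child[i:j] = a[i:j]
            rest = [g for g in b if g not in child]
            for k in range(N):
                if child[k] is None:
                    child[k] = rest.pop(0)
            return child

        def mutate(order, rate=0.2):
            if random.random() < rate:
                i, j = random.sample(range(N), 2)
                order[i], order[j] = order[j], order[i]
            return order

        pop = [random.sample(range(N), N) for _ in range(30)]
        for _ in range(100):
            pop.sort(key=feedback_penalty)
            parents = pop[:10]                           # simple truncation selection
            pop = parents + [mutate(crossover(*random.sample(parents, 2))) for _ in range(20)]

        best = min(pop, key=feedback_penalty)
        print("best execution order:", best, "with penalty", feedback_penalty(best))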

  19. Function-based design process for an intelligent ground vehicle vision system

    NASA Astrophysics Data System (ADS)

    Nagel, Robert L.; Perry, Kenneth L.; Stone, Robert B.; McAdams, Daniel A.

    2010-10-01

    An engineering design framework for an autonomous ground vehicle vision system is discussed. We present both the conceptual and physical design by following the design process, development and testing of an intelligent ground vehicle vision system constructed for the 2008 Intelligent Ground Vehicle Competition. During conceptual design, the requirements for the vision system are explored via functional and process analysis considering the flows into the vehicle and the transformations of those flows. The conceptual design phase concludes with a vision system design that is modular in both hardware and software and is based on a laser range finder and camera for visual perception. During physical design, prototypes are developed and tested independently, following the modular interfaces identified during conceptual design. Prototype models, once functional, are implemented into the final design. The final vision system design uses a ray-casting algorithm to process camera and laser range finder data and identify potential paths. The ray-casting algorithm is a single thread of the robot's multithreaded application. Other threads control motion, provide feedback, and process sensory data. Once integrated, both hardware and software testing are performed on the robot. We discuss the robot's performance and the lessons learned.
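
    The abstract does not give the ray-casting algorithm itself, so the sketch below shows a generic version of the idea: laser returns are binned into an occupancy grid around the vehicle, and rays are stepped outward along candidate headings to report the clear distance in each direction. The grid size, resolution, and synthetic scan are assumptions for illustration.

        import math
        import numpy as np

        def occupancy_grid(ranges, angles, size=101, resolution=0.1):
            """Mark laser returns in a square grid centered on the vehicle (cells of `resolution` meters)."""
            grid = np.zeros((size, size), dtype=bool)
            c = size // 2
            for r, a in zip(ranges, angles):
                x = int(round(c + r * math.cos(a) / resolution))
                y = int(round(c + r * math.sin(a) / resolution))
                if 0 <= x < size and 0 <= y < size:
                    grid[y, x] = True
            return grid

        def cast_ray(grid, heading, resolution=0.1, max_range=5.0):
            """Step outward along `heading` (radians) and return the clear distance in meters."""
            c = grid.shape[0] // 2
            for k in range(1, int(max_range / resolution)):
                x = int(round(c + k * math.cos(heading)))
                y = int(round(c + k * math.sin(heading)))
                if grid[y, x]:
                    return k * resolution
            return max_range

        # Synthetic scan: an obstacle roughly 2 m straight ahead, clear elsewhere out to 5 m.
        angles = np.linspace(-math.pi / 2, math.pi / 2, 181)
        ranges = [2.0 if abs(a) < 0.15 else 5.0 for a in angles]
        grid = occupancy_grid(ranges, angles)
        for h in (-math.pi / 4, 0.0, math.pi / 4):
            print(f"heading {math.degrees(h):6.1f} deg -> clear for {cast_ray(grid, h):.1f} m")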

  20. Design Considerations of Polishing Lap for Computer-Controlled Cylindrical Polishing Process

    NASA Technical Reports Server (NTRS)

    Khan, Gufran S.; Gubarev, Mikhail; Arnold, William; Ramsey, Brian D.

    2009-01-01

    This paper establishes a relationship between the polishing process parameters and the generation of mid-spatial-frequency error. Considerations for the design of the polishing lap and for the optimization of the process parameters (speeds, stroke, etc.), so as to keep the residual mid-spatial-frequency error to a minimum, are also presented.

  1. A Framework for Preliminary Design of Aircraft Structures Based on Process Information. Part 1

    NASA Technical Reports Server (NTRS)

    Rais-Rohani, Masoud

    1998-01-01

    This report discusses the general framework and development of a computational tool for preliminary design of aircraft structures based on process information. The described methodology is suitable for multidisciplinary design optimization (MDO) activities associated with integrated product and process development (IPPD). The framework consists of three parts: (1) product and process definitions; (2) engineering synthesis, and (3) optimization. The product and process definitions are part of input information provided by the design team. The backbone of the system is its ability to analyze a given structural design for performance as well as manufacturability and cost assessment. The system uses a database on material systems and manufacturing processes. Based on the identified set of design variables and an objective function, the system is capable of performing optimization subject to manufacturability, cost, and performance constraints. The accuracy of the manufacturability measures and cost models discussed here depends largely on the available data on specific methods of manufacture and assembly and associated labor requirements. As such, our focus in this research has been on the methodology itself and not so much on its accurate implementation in an industrial setting. A three-tier approach is presented for an IPPD-MDO based design of aircraft structures. The variable-complexity cost estimation methodology and an approach for integrating manufacturing cost assessment into design process are also discussed. This report is presented in two parts. In the first part, the design methodology is presented, and the computational design tool is described. In the second part, a prototype model of the preliminary design Tool for Aircraft Structures based on Process Information (TASPI) is described. Part two also contains an example problem that applies the methodology described here for evaluation of six different design concepts for a wing spar.

  2. The MSFC Collaborative Engineering Process for Preliminary Design and Concept Definition Studies

    NASA Technical Reports Server (NTRS)

    Mulqueen, Jack; Jones, David; Hopkins, Randy

    2011-01-01

    This paper describes a collaborative engineering process developed by the Marshall Space Flight Center's Advanced Concepts Office for performing rapid preliminary design and mission concept definition studies for potential future NASA missions. The process has been developed and demonstrated for a broad range of mission studies including human space exploration missions, space transportation system studies and in-space science missions. The paper will describe the design team structure and specialized analytical tools that have been developed to enable a unique rapid design process. The collaborative engineering process consists of an integrated analysis approach for mission definition, vehicle definition, and system engineering. The relevance of the collaborative process elements to the standard NASA NPR 7120.1 system engineering process will be demonstrated. The study definition process flow for each study discipline will be outlined, beginning with the study planning process, followed by the definition of ground rules and assumptions, the definition of study trades, mission analysis, and subsystem analyses leading to a standardized set of mission concept study products. The flexibility of the collaborative engineering design process to accommodate a wide range of study objectives, from technology definition and requirements definition to preliminary design studies, will be addressed. The paper will also describe the applicability of the collaborative engineering process to include an integrated systems analysis approach for evaluating the functional requirements of evolving system technologies and capabilities needed to meet the needs of future NASA programs.

  3. Coal conversion systems design and process modeling. Volume 1: Application of MPPR and Aspen computer models

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The development of a coal gasification system design and mass and energy balance simulation program for the TVA and other similar facilities is described. The materials-process-product model (MPPM) and the advanced system for process engineering (ASPEN) computer program were selected from available steady state and dynamic models. The MPPM was selected to serve as the basis for development of system level design model structure because it provided the capability for process block material and energy balance and high-level systems sizing and costing. The ASPEN simulation serves as the basis for assessing detailed component models for the system design modeling program. The ASPEN components were analyzed to identify particular process blocks and data packages (physical properties) which could be extracted and used in the system design modeling program. While ASPEN physical properties calculation routines are capable of generating physical properties required for process simulation, not all required physical property data are available, and must be user-entered.

  4. A novel double loop control model design for chemical unstable processes.

    PubMed

    Cong, Er-Ding; Hu, Ming-Hui; Tu, Shan-Tung; Xuan, Fu-Zhen; Shao, Hui-He

    2014-03-01

    In this manuscript, based on the Smith predictor control scheme for unstable processes in industry, an improved double-loop control model is proposed for chemical unstable processes. The inner loop stabilizes the integrating unstable process and transforms the original process into a stable first-order-plus-dead-time process. The outer loop enhances the performance of the set-point response. A disturbance controller is designed to enhance the performance of the disturbance response. The improved control system is simple and has a clear physical meaning, and its characteristic equation is easy to stabilize. The three controllers are designed separately in the improved scheme; each controller is easy to design and gives good control performance for its respective closed-loop transfer function. The robust stability of the proposed control scheme is analyzed. Finally, case studies illustrate that the improved method can give better system performance than existing design methods.
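
    As a rough illustration of the double-loop idea (an inner proportional loop that stabilizes an integrating dead-time process, with an outer PI loop shaping the set-point response), the sketch below uses the python-control package with a Pade approximation of the dead time; the process model and all tuning values are arbitrary and do not reproduce the authors' design or their disturbance controller.

        import numpy as np
        import control

        # Integrating process with dead time:  P(s) = K * e^(-theta*s) / s
        K, theta = 1.0, 0.5
        integrator = control.tf([K], [1, 0])
        delay = control.tf(*control.pade(theta, 3))      # 3rd-order Pade approximation of the dead time
        P = integrator * delay

        # Inner loop: proportional feedback turns the integrating process into a stable,
        # roughly first-order-plus-dead-time process (gain chosen ad hoc here).
        kc_inner = 0.8
        inner = control.feedback(kc_inner * P, 1)

        # Outer loop: PI controller shaping the set-point response (tuning is illustrative).
        kp, ki = 1.2, 0.4
        C = control.tf([kp, ki], [1, 0])
        closed = control.feedback(C * inner, 1)

        t = np.linspace(0.0, 40.0, 800)
        t, y = control.step_response(closed, t)
        print(f"final value ~ {y[-1]:.3f}, peak {y.max():.3f} at t = {t[np.argmax(y)]:.1f} s")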

  5. The Impact of Building Information Modeling on the Architectural Design Process

    NASA Astrophysics Data System (ADS)

    Moreira, P. F.; Silva, Neander F.; Lima, Ecilamar M.

    Many benefits of Building Information Modeling, BIM, have been suggested by several authors and by software vendors. In this paper we describe an experiment in which two groups of designers were observed developing an assigned design task. One of the groups used a BIM system, while the other used a standard computer-aided drafting system. The results show that some of the promises of BIM hold true, such as consistency maintenance and error avoidance in the design documentation process. Other promises, such as changing the design process itself, also appear plausible, but more research is needed to determine the depth of such changes.

  6. Detailed design procedure for solar industrial-process-heat systems: overview

    SciTech Connect

    Kutscher, C F

    1982-12-01

    A large number of handbooks have been written on the subject of designing solar heating and cooling systems for buildings. One of these is summarized here. Design Approaches for Solar Industrial Process Heat Systems, published in September 1982, addresses the complete spectrum of problems associated with the design of a solar IPH system. A highly general method, derived from computer simulations, is presented for determining actual energy delivered to the process load. Also covered are siting and selection of subsystem components, cost estimation, safety and environmental considerations, and installation concerns. An overview of the design methodology developed is given and some specific examples of technical issues addressed are provided.

  7. Improving the Aircraft Design Process Using Web-Based Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.; Follen, Gregory J. (Technical Monitor)

    2000-01-01

    Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.

  8. Improving the Aircraft Design Process Using Web-based Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.

    2003-01-01

    Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.

  9. The Iterative Design Process in Research and Development: A Work Experience Paper

    NASA Technical Reports Server (NTRS)

    Sullivan, George F. III

    2013-01-01

    The iterative design process is one of many strategies used in new product development. Top-down development strategies, like waterfall development, place a heavy emphasis on planning and simulation. The iterative process, on the other hand, is better suited to the management of small to medium scale projects. Over the past four months, I have worked with engineers at Johnson Space Center on a multitude of electronics projects. By describing the work I have done these last few months, analyzing the factors that have driven design decisions, and examining the testing and verification process, I will demonstrate that iterative design is the obvious choice for research and development projects.

  10. Human-system interface design review guideline -- Process and guidelines: Final report. Revision 1, Volume 1

    SciTech Connect

    1996-06-01

    NUREG-0700, Revision 1, provides human factors engineering (HFE) guidance to the US Nuclear Regulatory Commission staff for its: (1) review of the human-system interface (HSI) design submittals prepared by licensees or applicants for a license or design certification of commercial nuclear power plants, and (2) performance of HSI reviews that could be undertaken as part of an inspection or other type of regulatory review involving HSI design or incidents involving human performance. The guidance consists of a review process and HFE guidelines. The document describes those aspects of the HSI design review process that are important to the identification and resolution of human engineering discrepancies that could adversely affect plant safety. Guidance is provided that could be used by the staff to review an applicant's HSI design review process or to guide the development of an HSI design review plan, e.g., as part of an inspection activity. The document also provides detailed HFE guidelines for the assessment of HSI design implementations. NUREG-0700, Revision 1, consists of three stand-alone volumes. Volume 1 consists of two major parts. Part 1 describes those aspects of the HSI design review process that are important to identifying and resolving human engineering discrepancies. Part 2 contains detailed guidelines for a human factors engineering review, which identify criteria for assessing the implementation of an applicant's or licensee's HSI design.

  11. Developing Elementary Math and Science Process Skills Through Engineering Design Instruction

    NASA Astrophysics Data System (ADS)

    Strong, Matthew G.

    This paper examines how elementary students can develop math and science process skills through an engineering design approach to instruction. The performance and development of individual process skills overall and by gender were also examined. The study, preceded by a pilot, took place in a grade four extracurricular engineering design program in a public, suburban school district. Students worked in pairs and small groups to design and construct airplane models from styrofoam, paper clips, and toothpicks. The development and performance of process skills were assessed through a student survey of learning gains, an engineering design packet rubric (student work), observation field notes, and focus group notes. The results indicate that students can significantly develop process skills, that female students may develop process skills through engineering design better than male students, and that engineering design is most helpful for developing the measuring, suggesting improvements, and observing process skills. The study suggests that a more regular engineering design program or curriculum could be beneficial for students' math and science abilities both in this school and for the elementary field as a whole.

  12. Holistic and Consistent Design Process for Hollow Structures Based on Braided Textiles and RTM

    NASA Astrophysics Data System (ADS)

    Gnädinger, Florian; Karcher, Michael; Henning, Frank; Middendorf, Peter

    2014-06-01

    The present paper elaborates a holistic and consistent design process for 2D braided composites in conjunction with Resin Transfer Moulding (RTM). These technologies allow a cost-effective production of composites due to their high degree of automation. Literature can be found that deals with specific tasks of the respective technologies, but no work is available that embraces the complete process chain. Therefore, an overall design process is developed within the present paper. It is based on a correlated conduction of sub-design processes for the braided preform, RTM injection, mandrel plus mould, and manufacturing. For each sub-process, both the individual tasks and reasonable methods to accomplish them are presented. The information flow within the design process is specified and interdependences are illustrated. Composite designers will be equipped with an efficient set of tools because the respective methods take the complexity of the part into account. The design process is applied to a demonstrator in a case study. The individual sub-design processes are carried out exemplarily to judge the feasibility of the presented work. For validation, predicted braiding angles and fibre volume fractions are compared with measured ones, and a filling and curing simulation based on PAM-RTM is checked against mould filling studies. Tool concepts for an RTM mould and mandrels that realise undercuts are tested. The individual process parameters for manufacturing are derived from previous design steps. Furthermore, the compatibility of the chosen fibre and matrix system is investigated based on scanning electron microscope (SEM) images. The annual production volume of the demonstrator part is estimated based on these findings.

  13. Optimal cure cycle design for autoclave processing of thick composites laminates: A feasibility study

    NASA Technical Reports Server (NTRS)

    Hou, Jean W.

    1985-01-01

    The thermal analysis and the calculation of the thermal sensitivity of a cure cycle in autoclave processing of thick composite laminates were studied. A finite element program for the thermal analysis and for the calculation of the design derivatives of the temperature distribution and the degree of cure was developed and verified. It was found that direct differentiation was the best approach for the thermal design sensitivity analysis. In addition, the direct differentiation approach provides time histories of the design derivatives, which are of great value to cure cycle designers. The direct differentiation approach is to be used for further study, i.e., optimal cure cycle design.
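
    As a toy example of the direct-differentiation idea, the sketch below integrates a simple nth-order cure kinetics model together with the sensitivity of the degree of cure to the hold temperature, obtained by differentiating the rate equation with respect to that parameter and integrating the augmented system; the kinetic constants and the isothermal hold are illustrative assumptions unrelated to the report.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Illustrative nth-order cure kinetics: da/dt = A * exp(-E / (R * T)) * (1 - a)^n
        A, E, R, n = 1.0e5, 6.0e4, 8.314, 1.5        # placeholder kinetic constants
        T_hold = 450.0                               # isothermal hold temperature (K), the design parameter

        def rhs(t, y):
            """Augmented system: y[0] = degree of cure a, y[1] = sensitivity S = da/dT_hold."""
            a, S = y
            x = max(1.0 - a, 0.0)                    # clamp against tiny numerical overshoot past full cure
            k = A * np.exp(-E / (R * T_hold))
            f = k * x ** n                           # cure rate
            df_da = -n * k * x ** (n - 1) if x > 0 else 0.0
            df_dT = f * E / (R * T_hold ** 2)        # explicit dependence of the rate on T_hold
            return [f, df_da * S + df_dT]            # direct differentiation of the governing ODE

        sol = solve_ivp(rhs, (0.0, 3600.0), [0.0, 0.0], t_eval=np.linspace(0.0, 3600.0, 7))
        for t, a, S in zip(sol.t, sol.y[0], sol.y[1]):
            print(f"t = {t:6.0f} s   cure = {a:.3f}   d(cure)/dT = {S:.2e} per K")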

  14. New process modeling [sic], design, and control strategies for energy efficiency, high product quality, and improved productivity in the process industries. Final project report

    SciTech Connect

    Ray, W. Harmon

    2002-06-05

    This project was concerned with the development of process design and control strategies for improving energy efficiency, product quality, and productivity in the process industries. In particular, (i) the resilient design and control of chemical reactors, and (ii) the operation of complex processing systems, was investigated. Specific topics studied included new process modeling procedures, nonlinear controller designs, and control strategies for multiunit integrated processes. Both fundamental and immediately applicable results were obtained. The new design and operation results from this project were incorporated into computer-aided design software and disseminated to industry. The principles and design procedures have found their way into industrial practice.

  15. Design of a high-speed digital processing element for parallel simulation

    NASA Technical Reports Server (NTRS)

    Milner, E. J.; Cwynar, D. S.

    1983-01-01

    A prototype of a custom designed computer to be used as a processing element in a multiprocessor based jet engine simulator is described. The purpose of the custom design was to give the computer the speed and versatility required to simulate a jet engine in real time. Real time simulations are needed for closed loop testing of digital electronic engine controls. The prototype computer has a microcycle time of 133 nanoseconds. This speed was achieved by: prefetching the next instruction while the current one is executing, transporting data using high speed data busses, and using state of the art components such as a very large scale integration (VLSI) multiplier. Included are discussions of processing element requirements, design philosophy, the architecture of the custom designed processing element, the comprehensive instruction set, the diagnostic support software, and the development status of the custom design.

  16. An Effective Design Process for the Successful Development of Medical Devices

    NASA Astrophysics Data System (ADS)

    Colvin, Mike

    The most important point in the successful development of a medical device is the proper overall design. The quality, safety, and effectiveness of a device are established during the design phase. The design process is the foundation of the medical device and will be the basis for the device from its inception till the end of its lifetime. There are domestic and international guidelines on the proper steps to develop a medical device. However, these are guides; they do not specify when and how to implement each phase of design control. The guides also do not specify to what depth an organization must go as it progresses in the overall developmental process. The challenge that faces development organizations is to create a design process plan that is simple, straightforward, and not overburdening.

  17. Design and implementation of the parallel processing system of multi-channel polarization images

    NASA Astrophysics Data System (ADS)

    Li, Zhi-yong; Huang, Qin-chao

    2013-08-01

    Compared with traditional optical intensity image processing, polarization image processing poses two main problems: the amount of data is larger, and the processing tasks are more complex. To resolve these problems, a parallel processing system for multi-channel polarization images is designed using a multi-DSP technique. It contains a communication control unit (CCU) and a data processing array (DPA). The CCU controls communications inside and outside the system; its logic is implemented in an FPGA. The DPA is made up of four Digital Signal Processor (DSP) chips, which are interlinked by a loose coupling method. The DPA implements processing tasks, including image registration and image synthesis, with parallel processing methods. The parallel processing model for polarization images is designed on multiple levels, including the system task, the algorithm, and the operation, and its program is written in assembly language. In the experiment, the polarization image resolution is 782x582 pixels and the pixel data length is 12 bits. After receiving three channels of polarization images simultaneously, the system executes parallel tasks to acquire the target's polarization characteristics. Experimental results show that the system has good real-time performance and reliability. The processing time for image registration is 293.343 ms with a registration accuracy of 0.5 pixel, and the processing time for image synthesis is 3.199 ms.

  18. Comprehensive design and process flow configuration for micro and nano tech devices

    NASA Astrophysics Data System (ADS)

    Hahn, Kai; Schmidt, Thilo; Mielke, Matthias; Ortloff, Dirk; Popp, Jens; Brück, Rainer

    2010-04-01

    The development of micro and nano tech devices based on semiconductor manufacturing processes comprises the structural design as well as the definition of the manufacturing process flow. The approach is characterized by application-specific fabrication flows, i.e., fabrication processes (built up from a large variety of process steps and materials) that depend on the final product. Technology constraints have a great impact on the device design and vice versa. In this paper we introduce a comprehensive methodology and, based on it, an environment for customer-oriented product engineering of MEMS products. The development is currently carried out in an international multi-site research project.
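
    To make "process flow configuration" concrete, a hedged sketch: a fabrication flow can be represented as an ordered list of process steps, each tied to a material and parameters, against which simple design rules can be checked. The step names, materials, and the single rule below are invented for illustration and are not taken from the environment described in the record.

    # Hedged sketch: a fabrication process flow as an ordered list of steps, with a
    # trivial consistency check. Step names, materials, and the rule are invented.
    from dataclasses import dataclass, field

    @dataclass
    class ProcessStep:
        name: str
        material: str
        params: dict = field(default_factory=dict)

    flow = [
        ProcessStep("deposit", "poly-Si", {"thickness_um": 2.0}),
        ProcessStep("pattern", "photoresist", {"mask": "M1"}),
        ProcessStep("etch", "poly-Si", {"depth_um": 2.0}),
        ProcessStep("release", "sacrificial oxide", {}),
    ]

    def check_etch_after_pattern(steps):
        """Toy design rule: every etch step must follow a pattern step."""
        seen_pattern = False
        for step in steps:
            if step.name == "pattern":
                seen_pattern = True
            if step.name == "etch" and not seen_pattern:
                raise ValueError(f"etch of {step.material} has no preceding pattern step")
        return True

    print(check_etch_after_pattern(flow))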

  19. Process design of a ball joint, considering caulking and pull-out strength.

    PubMed

    Sin, Bong-Su; Lee, Kwon-Hee

    2014-01-01

    A ball joint for an automobile steering system is a pivot component connected to the knuckle and the lower control arm. Its caulking is manufactured by a spinning and deforming process. In this study, the process was simulated using flexible multibody dynamics. The caulking was evaluated qualitatively through numerical analysis and inspection of the plastically deformed shape. The structural responses of a ball joint, namely pull-out strength and stiffness, are commonly investigated in the development process, so these responses were considered after the caulking analysis. In addition, three design variables related to the manufacturing process were defined, and their effects on pull-out strength, caulking depth, and maximum stress were obtained through a design of experiments (DOE) using an L9 orthogonal array. Finally, the optimum design maximizing the pull-out strength was suggested; for this final design, the caulking quality and the pull-out strength were verified by fabricating and testing six samples.
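
    The L9 orthogonal array mentioned above accommodates up to four three-level factors in nine runs; the study uses three. As a minimal sketch (the response function below is a made-up placeholder, not the ball-joint simulation), the following Python code lays out the standard L9 array for three factors and computes the mean response at each factor level, which is the usual first step toward picking the level combination that maximizes pull-out strength.

    # Minimal DOE sketch: standard L9(3^4) orthogonal array restricted to 3 factors.
    # The 'response' function is a placeholder; in the study it would be the
    # simulated pull-out strength for each run.
    import numpy as np

    # Rows: the 9 runs; columns: levels (0, 1, 2) of factors A, B, C.
    L9 = np.array([
        [0, 0, 0], [0, 1, 1], [0, 2, 2],
        [1, 0, 1], [1, 1, 2], [1, 2, 0],
        [2, 0, 2], [2, 1, 0], [2, 2, 1],
    ])

    def response(run: np.ndarray) -> float:
        """Placeholder response (assumed), standing in for pull-out strength [kN]."""
        a, b, c = run
        return 30.0 + 2.0 * a + 1.5 * b - 0.5 * c

    strengths = np.array([response(run) for run in L9])

    # Mean response at each level of each factor (main effects).
    for f, name in enumerate("ABC"):
        means = [float(strengths[L9[:, f] == lvl].mean()) for lvl in range(3)]
        best = int(np.argmax(means))
        print(f"factor {name}: level means {means}, best level {best}")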

  20. Climbing The Knowledge Mountain - The New Solids Processing Design And Management Manual (Presentation)

    EPA Science Inventory

    The USEPA, Water Environment Federation (WEF) and Water Environment Research Foundation (WERF), under a Cooperative Research and Development Agreement (CRADA), are undertaking a massive effort to produce a Solids Processing Design and Management Manual (Manual). The Manual, repr...

  1. Climbing The Knowledge Mountain - The New Solids Processing Design And Management Manual

    EPA Science Inventory

    The USEPA, Water Environment Federation (WEF) and Water Environment Research Foundation (WERF), under a Cooperative Research and Development Agreement (CRADA), are undertaking a massive effort to produce a Solids Processing Design and Management Manual (Manual). The Manual, repr...

  2. EVALUATING THE ECONOMICS AND ENVIRONMENTAL FRIENDLINESS OF NEWLY DESIGNED OR RETROFITTED CHEMICAL PROCESSES

    EPA Science Inventory

    This work describes a method for using spreadsheet analyses of process designs and retrofits to provide simple and quick economic and environmental evaluations simultaneously. The method focuses attention on those streams and components that have the largest monetary values and...
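
    The abstract is truncated, but its core idea, flagging the streams with the largest monetary values so attention goes where it matters, is easy to mimic. Purely as a hedged illustration (the stream names, flow rates, and prices below are invented), this Python snippet does the spreadsheet-style calculation: value per stream = flow rate x unit price, sorted in descending order.

    # Hedged spreadsheet-style sketch: rank process streams by monetary value.
    # All stream data below are invented for illustration.
    streams = [
        # (name, flow in kg/h, price in $/kg)
        ("product",       1200.0,  1.80),
        ("solvent purge",  300.0,  0.90),
        ("off-gas",        150.0,  0.05),
        ("waste water",   2000.0, -0.02),  # negative price = disposal cost
    ]

    valued = [(name, flow * price) for name, flow, price in streams]
    for name, dollars_per_h in sorted(valued, key=lambda x: x[1], reverse=True):
        print(f"{name:15s} {dollars_per_h:9.2f} $/h")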

  3. 30 CFR 912.764 - Process for designating areas unsuitable for surface coal mining operations.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    Excerpt (30 CFR 912.764, Mineral Resources, Office of Surface Mining, programs within each State: Idaho): process for designating areas unsuitable for surface coal mining and reclamation operations.

  4. PROCESS DESIGN FOR ENVIRONMENT: A MULTI-OBJECTIVE FRAMEWORK UNDER UNCERTAINTY

    EPA Science Inventory

    Designing chemical processes for the environment requires consideration of several indexes of environmental impact, including ozone depletion and global warming potentials, human and aquatic toxicity, photochemical oxidation, and acid rain potentials. Current methodologies like t...
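
    As a hedged sketch of the kind of multi-objective trade-off the abstract describes (the impact categories come from the abstract, but the scores, weights, and the Monte Carlo treatment of uncertainty are assumptions of this example, not the framework in the record), the following Python code aggregates several normalized impact indexes into one score and propagates a simple uncertainty on each index.

    # Hedged sketch: weighted aggregation of environmental impact indexes with a
    # simple Monte Carlo treatment of uncertainty. Scores, weights, and spreads
    # are invented; only the category names come from the abstract.
    import numpy as np

    rng = np.random.default_rng(42)

    # Normalized nominal impact scores for one candidate design (assumed values).
    impacts = {
        "ozone depletion":         0.10,
        "global warming":          0.55,
        "human toxicity":          0.30,
        "aquatic toxicity":        0.25,
        "photochemical oxidation": 0.20,
        "acid rain":               0.15,
    }
    weights = {k: 1.0 / len(impacts) for k in impacts}  # equal weights (assumed)

    def sampled_score(n_samples: int = 10_000, rel_sigma: float = 0.2) -> np.ndarray:
        """Draw weighted-sum scores with +/-20% (assumed) Gaussian noise per index, clipped at zero."""
        total = np.zeros(n_samples)
        for name, nominal in impacts.items():
            noisy = nominal * rng.normal(1.0, rel_sigma, n_samples).clip(min=0.0)
            total += weights[name] * noisy
        return total

    scores = sampled_score()
    print(f"aggregate impact: mean {scores.mean():.3f}, 95th pct {np.percentile(scores, 95):.3f}")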

  5. Automated systems for creative processes in scientific research, design, and robotics

    SciTech Connect

    Glushkov, V.M.; Stognii, A.A.; Biba, I.G.; Vashchenko, N.D.; Galagan, N.I.; Gladun, V.P.; Rabinovich, Z.L.; Sakunov, I.A.; Khomenko, L.V.

    1981-11-01

    The authors give a general description of software that was developed to automate the creative processes in scientific research, design and robotics. The systems APROS, SSP, Analizator-ES and Analizator are discussed. 12 references.

  6. GREENER CHEMICAL PROCESS DESIGN ALTERNATIVES ARE REVEALED USING THE WASTE REDUCTION DECISION SUPPORT SYSTEM (WAR DSS)

    EPA Science Inventory

    The Waste Reduction Decision Support System (WAR DSS) is a Java-based software product providing comprehensive modeling of potential adverse environmental impacts (PEI) predicted to result from newly designed or redesigned chemical manufacturing processes. The purpose of this so...
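
    The WAR approach behind the DSS scores a process by the potential environmental impact (PEI) it releases. As a hedged and heavily simplified sketch only (the stream data and per-chemical impact potentials below are invented, and this is not the actual WAR DSS bookkeeping), the snippet compares a base design and a redesign by PEI per kilogram of product.

    # Hedged, simplified sketch of a WAR-style comparison: potential environmental
    # impact (PEI) leaving each design, normalized per kg of product.
    # Stream flows and impact potentials are invented for illustration.

    # psi: assumed PEI per kg of each chemical (dimensionless, category-summed).
    psi = {"toluene": 1.8, "methanol": 0.6, "water": 0.0, "product": 0.1}

    def pei_per_kg_product(out_streams, product_rate_kg_h):
        """Sum mass-flow-weighted impact of non-product output streams,
        divided by the product rate."""
        total = sum(flow * psi[chem] for chem, flow in out_streams)
        return total / product_rate_kg_h

    base     = [("toluene", 40.0), ("methanol", 15.0), ("water", 500.0)]
    redesign = [("toluene",  8.0), ("methanol", 12.0), ("water", 450.0)]

    print("base    :", round(pei_per_kg_product(base, 1000.0), 3), "PEI/kg product")
    print("redesign:", round(pei_per_kg_product(redesign, 1000.0), 3), "PEI/kg product")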

  7. Design of visual prosthesis image processing system based on SoC

    NASA Astrophysics Data System (ADS)

    Guo, Fei; Yang, Yuan; Gao, Yong; Wu, Chuan Ke

    2014-07-01

    This paper presents a visual prosthesis image processing system based on the Leon3 SoC (System on Chip) platform. The system is built with the GRLIB system development platform and integrates the image preprocessing IP core, image encoder IP core, and image data modulation IP core that we designed. We ported the system to an FPGA development board and verified its functions. The results show that the designed system can effectively perform the functions of a visual prosthesis image processing system.
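
    The three IP cores form a fixed pipeline (preprocess, encode, modulate). As a purely illustrative software analogue (the operations below are placeholders, not the hardware cores; the "encoding" here is simple downsampling to an assumed electrode grid plus 4-bit quantization), this Python sketch shows the same staged structure.

    # Hedged software analogue of the three-stage pipeline (preprocess -> encode ->
    # modulate). The operations are placeholders standing in for the hardware IP cores.
    import numpy as np

    def preprocess(frame: np.ndarray) -> np.ndarray:
        """Placeholder preprocessing: normalize to [0, 1]."""
        frame = frame.astype(np.float64)
        return (frame - frame.min()) / (np.ptp(frame) + 1e-12)

    def encode(frame: np.ndarray, grid: int = 32) -> np.ndarray:
        """Placeholder encoding: downsample to a grid x grid electrode map, 4-bit levels."""
        h, w = frame.shape
        small = frame[: h - h % grid, : w - w % grid]
        small = small.reshape(grid, h // grid, grid, w // grid).mean(axis=(1, 3))
        return np.round(small * 15).astype(np.uint8)

    def modulate(levels: np.ndarray) -> np.ndarray:
        """Placeholder modulation: map each 4-bit level to a pulse amplitude (a.u.)."""
        return levels.astype(np.float64) / 15.0

    if __name__ == "__main__":
        frame = np.random.default_rng(1).integers(0, 256, (480, 640), dtype=np.uint16)
        out = modulate(encode(preprocess(frame)))
        print(out.shape, out.min(), out.max())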

  8. ALARA Design Review for the Resumption of the Plutonium Finishing Plant (PFP) Cementation Process Project Activities

    SciTech Connect

    DAYLEY, L.

    2000-06-14

    The requirements for the performance of radiological design reviews are codified in 10CFR835, Occupational Radiation Protection. The basic requirements for the performance of ALARA design reviews are presented in the Hanford Site Radiological Control Manual (HSRCM). The HSRCM has established trigger levels requiring radiological reviews of non-routine or complex work activities. These requirements are implemented in site procedures HNF-PRO-1622 and 1623. HNF-PRO-1622, Radiological Design Review Process, requires that "radiological design reviews [be performed] of new facilities and equipment and modifications of existing facilities and equipment". In addition, HNF-PRO-1623, Radiological Work Planning Process, requires a formal ALARA Review for planned activities that are estimated to exceed 1 person-rem total Dose Equivalent (DE). The purpose of this review is to validate that the principles of ALARA (As Low As Reasonably Achievable) were included in the original design of the PFP Cementation Process; that is, that the design and operation of the existing Cementation Process equipment and processes allow for minimizing personnel exposure during operation, maintenance, and decommissioning, and that the generation of radioactive waste is kept to a minimum.

  9. Influences of Training and Strategical Information Processing Style on Spatial Performance in Apparel Design

    ERIC Educational Resources Information Center

    Gitimu, Priscilla N.; Workman, Jane E.; Anderson, Marcia A.

    2005-01-01

    The study investigated how performance on a spatial task in apparel design was influenced by training and strategical information processing style. The sample consisted of 278 undergraduate apparel design students from six universities in the U.S. Instruments used to collect data were the Apparel Spatial Visualization Test (ASVT) and the…

  10. Learning from Experts: Fostering Extended Thinking in the Early Phases of the Design Process

    ERIC Educational Resources Information Center

    Haupt, Grietjie

    2015-01-01

    Empirical evidence on the way in which expert designers from different domains cognitively connect their internal processes with external resources is presented in the context of an extended cognition model. The article focuses briefly on the main trends in the extended design cognition theory and in particular on recent trends in information…

  11. Breadth in Design Problem Scoping: Using Insights from Experts to Investigate Student Processes. Research Brief

    ERIC Educational Resources Information Center

    Morozov, Andrew; Kilgore, Deborah; Atman, Cynthia

    2007-01-01

    In this study, the authors used two methods for analyzing expert data: verbal protocol analysis (VPA) and narrative analysis. VPA has been effectively used to describe the design processes employed by engineering students, expert designers, and expert-novice comparative research. VPA involves asking participants to "think aloud" while…

  12. Analogical Reasoning in the Engineering Design Process and Technology Education Applications

    ERIC Educational Resources Information Center

    Daugherty, Jenny; Mentzer, Nathan

    2008-01-01

    This synthesis paper discusses the research exploring analogical reasoning, the role of analogies in the engineering design process, and educational applications for analogical reasoning. Researchers have discovered that analogical reasoning is often a fundamental cognitive tool in design problem solving. Regarding the possible role of analogical…

  13. Learning Effects of a Science Textbook Designed with Adapted Cognitive Process Principles on Grade 5 Students

    ERIC Educational Resources Information Center

    Cheng, Ming-Chang; Chou, Pei-I; Wang, Ya-Ting; Lin, Chih-Ho

    2015-01-01

    This study investigates how the illustrations in a science textbook, with their design modified according to cognitive process principles, affected students' learning performance. The quasi-experimental design recruited two Grade 5 groups (N = 58) as the research participants. The treatment group (n = 30) used the modified version of the textbook,…

  14. A Web Centric Architecture for Deploying Multi-Disciplinary Engineering Design Processes

    NASA Technical Reports Server (NTRS)

    Woyak, Scott; Kim, Hongman; Mullins, James; Sobieszczanski-Sobieski, Jaroslaw

    2004-01-01

    Engineering organizations have a continuing need to improve their design processes. Current state-of-the-art techniques use computational simulations to predict design performance and optimize it through advanced design methods, but these tools have been used mostly by individual engineers. This paper presents an architecture for achieving results at the organizational level, beyond the individual level. The next set of gains in process improvement will come from improving the effective use of computers and software across a whole organization, not just by an individual. The architecture takes advantage of state-of-the-art capabilities to produce a Web-based system that carries engineering design into the future. To illustrate deployment of the architecture, a case study implementing advanced multidisciplinary design optimization processes such as Bi-Level Integrated System Synthesis is discussed. Another example, rolling out a design process for Design for Six Sigma, is also described. Each example explains how an organization can effectively infuse engineering practice with new design methods and retain the knowledge over time.
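
    A concrete way to picture the architecture's central idea, wrapping an engineering analysis so a whole organization can reach it over the Web rather than on one engineer's desktop, is a minimal HTTP service. The endpoint, its port, and the toy "analysis" below are assumptions of this sketch, not the system described in the paper.

    # Minimal sketch (assumed, not the paper's system): expose a design analysis
    # as an HTTP endpoint so it can be shared across an organization.
    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    def evaluate_design(params: dict) -> dict:
        """Toy 'analysis': a placeholder performance metric from two design variables."""
        span = float(params.get("span", 1.0))
        thickness = float(params.get("thickness", 0.1))
        return {"stiffness_index": thickness ** 3 / span}

    class DesignHandler(BaseHTTPRequestHandler):
        def do_POST(self):
            length = int(self.headers.get("Content-Length", 0))
            params = json.loads(self.rfile.read(length) or b"{}")
            body = json.dumps(evaluate_design(params)).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        # e.g. curl -X POST localhost:8000 -d '{"span": 2.0, "thickness": 0.2}'
        HTTPServer(("", 8000), DesignHandler).serve_forever()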

  15. Architectural Design: An American Indian Process. An Interview with Dennis Sun Rhodes.

    ERIC Educational Resources Information Center

    Barreiro, Jose

    1990-01-01

    A Northern Arapaho architect discusses his design process, which uses American Indian cultures, symbols, and attitudes as creative inspiration; his use of space and design elements from aboriginal housing styles; and his experiences with HUD and the Bureau of Indian Affairs Housing Improvement Program. (SV)

  16. Comparing Simple and Advanced Video Tools as Supports for Complex Collaborative Design Processes

    ERIC Educational Resources Information Center

    Zahn, Carmen; Pea, Roy; Hesse, Friedrich W.; Rosen, Joe

    2010-01-01

    Working with digital video technologies, particularly advanced video tools with editing capabilities, offers new prospects for meaningful learning through design. However, it is also possible that the additional complexity of such tools does "not" advance learning. In an experiment, we compared the design processes and learning outcomes of 24…

  17. Toward Understanding the Cognitive Processes of Software Design in Novice Programmers

    ERIC Educational Resources Information Center

    Yeh, Kuo-Chuan

    2009-01-01

    This study provides insights with regard to the types of cognitive processes that are involved in the formation of mental models and the way those models change over the course of a semester in novice programmers doing a design task. Eight novice programmers participated in this study for three distinct software design sessions, using the same…

  18. Design Research with a Focus on Learning Processes: An Overview on Achievements and Challenges

    ERIC Educational Resources Information Center

    Prediger, Susanne; Gravemeijer, Koeno; Confrey, Jere

    2015-01-01

    Design research continues to gain prominence as a significant methodology in the mathematics education research community. This overview summarizes the origins and the current state of design research practices focusing on methodological requirements and processes of theorizing. While recognizing the rich variations in the foci and scale of design…

  19. Try, Try, Try Again: The Process of Designing New History Assessments

    ERIC Educational Resources Information Center

    Breakstone, Joel

    2014-01-01

    This article considers the design process for new formative history assessments. Over the course of 3 years, my colleagues from the Stanford History Education Group and I designed, piloted, and revised dozens of "History Assessments of Thinking" (HATs). As we created HATs, we sought to gather information about their cognitive validity,…

  20. Collaborative Design Processes: An Active and Reflective Learning Course in Multidisciplinary Collaboration.

    ERIC Educational Resources Information Center

    O'Brien, William J.; Soibelman, Lucio; Elvin, George

    2003-01-01

    In a capstone course, graduate students from two universities participated in collaborative design in the architectural, engineering, and construction industries in multidisciplinary teams via the Internet. Students also developed process designs to integrate technology into multidisciplinary teamwork, combining active and reflective learning.…