Science.gov

Sample records for reload design process

  1. Reload design process at Yankee Atomic Electric Company

    SciTech Connect

    Weader, R.J.

    1986-01-01

    Yankee Atomic Electric Company (YAEC) performs reload design and licensing for its nuclear power plants: Yankee Rowe, Maine Yankee, and Vermont Yankee. Significant savings in labor and computer costs have been achieved in the reload design process by using the SIMULATE nodal code, with input from the CASMO assembly burnup code or the LEOPARD pin cell burnup code, in place of the PDQ diffusion theory code for many of the required calculations for the Yankee Rowe and Maine Yankee pressurized water reactors (PWRs). An efficient process has also evolved for the design of reloads for the Vermont Yankee boiling water reactor (BWR). Because of the major differences in the core designs of the three plants, a different reload design process has evolved for each plant.

  2. Modeling and design of a reload PWR core for a 48-month fuel cycle

    SciTech Connect

    McMahon, M.V.; Driscoll, M.J.; Todreas, N.E.

    1997-05-01

    The objective of this research was to use state-of-the-art nuclear and fuel performance packages to evaluate the feasibility and costs of a 48-calendar-month core in existing pressurized water reactor (PWR) designs, considering the full range of practical design and economic considerations. The driving force behind this research is the desire to make nuclear power more economically competitive with fossil fuel options by expanding the scope for achieving higher capacity factors. Using CASMO/SIMULATE, a core design with fuel enriched to 7 w/o U-235 for a single-batch-loaded, 48-month fuel cycle has been developed. This core achieves an ultra-long cycle length without exceeding current fuel burnup limits. The design uses two different types of burnable poisons. Gadolinium in the form of gadolinium oxide (Gd2O3) mixed with the UO2 of selected pins is used to hold down initial reactivity and to control flux peaking throughout the life of the core. A zirconium diboride (ZrB2) integral fuel burnable absorber (IFBA) coating on the Gd2O3-UO2 fuel pellets is added to reduce the critical soluble boron concentration in the reactor coolant to within acceptable limits. Fuel performance issues of concern to this design are also outlined, and areas that will require further research are highlighted.

  3. From Reload to ReCourse: Learning from IMS Learning Design Implementations

    ERIC Educational Resources Information Center

    Griffiths, David; Beauvoir, Phillip; Liber, Oleg; Barrett-Baxendale, Mark

    2009-01-01

    The use of the Web to deliver open, distance, and flexible learning has opened up the potential for social interaction and adaptive learning, but the usability, expressivity, and interoperability of the available tools leave much to be desired. This article explores these issues as they relate to teachers and learning designers through the case of…

  4. Optimal reload strategies for identify-and-destroy missions

    NASA Astrophysics Data System (ADS)

    Hyland, John C.; Smith, Cheryl M.

    2004-09-01

    In this problem an identification vehicle must re-acquire a fixed set of suspected targets and determine whether each suspected target is a mine or a false alarm. If a target is determined to be a mine, the identification vehicle must neutralize it, either by delivering one of a limited number of on-board bombs or by assigning the neutralization task to one of a limited number of single-shot suicide vehicles. The identification vehicle has the option to reload. The single-shot suicide vehicles, however, cannot be replenished. We have developed an optimal path planning and reload strategy for this identify-and-destroy mission that takes into account the probabilities that suspected targets are mines, the costs to move between targets, the costs to return to and from the reload point, and the cost to reload. The mission is modeled as a discrete multi-dimensional Markov process. At each target position the vehicle decides, based on the known costs, the probabilities, the number of bombs on board (r), and the number of remaining one-shot vehicles (s), whether to move directly on to the next target or to reload before continuing, and whether to destroy any mine with an on-board bomb or a one-shot suicide vehicle. The approach recursively calculates the minimum expected overall cost conditioned on all possible values of r and s. The recursion is similar to dynamic programming in that it starts at the last suspected target location and works its way backwards to the starting point, as sketched below. The approach also uses a suboptimal traveling salesman strategy to search over candidate deployment locations to find the best initial deployment point where the reloads will take place.
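
    The backward recursion over the state (target index, bombs on board r, remaining one-shot vehicles s) lends itself to a compact dynamic-programming sketch. The Python fragment below is a minimal illustration under an assumed, simplified cost structure; the probabilities, costs, and capacities are hypothetical stand-ins, not the authors' data, and the deployment-point search mentioned in the abstract is omitted.

```python
# Minimal sketch of the backward value recursion for the identify-and-destroy
# mission (illustrative cost model; all numbers below are assumptions).
from functools import lru_cache

P_MINE    = [0.7, 0.4, 0.9, 0.2]   # probability that each suspected target is a mine
MOVE      = [1.0, 1.5, 1.2, 2.0]   # cost to move on from target i to the next leg
RELOAD_RT = [3.0, 3.5, 2.5, 4.0]   # round-trip cost to the reload point from target i
C_RELOAD, C_BOMB, C_SUICIDE = 1.0, 0.5, 2.0
R_MAX = 2                          # bombs carried after a reload
BIG   = 1e9                        # penalty if a mine cannot be neutralized
N     = len(P_MINE)

@lru_cache(maxsize=None)
def expected_cost(i, r, s):
    """Minimum expected remaining cost at target i with r bombs and s one-shot vehicles."""
    if i == N:
        return 0.0
    # Decision 1: engage target i now, or make a round trip to the reload point first.
    return min(engage(i, r, s),
               RELOAD_RT[i] + C_RELOAD + engage(i, R_MAX, s))

def engage(i, r, s):
    # Decision 2: if target i is a mine, neutralize it with an on-board bomb or a
    # one-shot suicide vehicle, whichever leads to the lower expected future cost.
    p = P_MINE[i]
    options = []
    if r > 0:
        options.append(C_BOMB + MOVE[i] + expected_cost(i + 1, r - 1, s))
    if s > 0:
        options.append(C_SUICIDE + MOVE[i] + expected_cost(i + 1, r, s - 1))
    mine_cost = min(options) if options else BIG
    false_alarm_cost = MOVE[i] + expected_cost(i + 1, r, s)
    return p * mine_cost + (1 - p) * false_alarm_cost

print("expected mission cost:", round(expected_cost(0, R_MAX, 1), 2))
```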

  5. NASA reload program

    NASA Technical Reports Server (NTRS)

    Byington, Marshall

    1993-01-01

    Atlantic Research Corporation (ARC) contracted with NASA to manufacture and deliver thirteen small-scale Solid Rocket Motors (SRMs). These motors, containing five distinct propellant formulations, will be used for plume-induced radiation studies. The information contained herein summarizes and documents the program accomplishments and results. Several modifications were made to the scope of work during the course of the program. The effort was on hold from late 1991 through August 1992 while propellant formulation changes were developed. Modifications to the baseline program were completed in late August, and Modification No. 6 was received by ARC on September 14, 1992. The modifications include changes to the propellant formulation and the nozzle design. The required motor deliveries were completed in late December 1992. However, ARC agreed to perform an additional mix and cast effort at no cost to NASA, and another motor was delivered in March 1993.

  6. The Heliogyro Reloaded

    NASA Technical Reports Server (NTRS)

    Wilkie, William K.; Warren, Jerry E.; Thompson, M. W.; Lisman, P. D.; Walkemeyer, P. E.; Guerrant, D. V.; Lawrence, D. A.

    2011-01-01

    The heliogyro is a high-performance, spinning solar sail architecture that uses long (on the order of kilometers) reflective membrane strips to produce thrust from solar radiation pressure. The heliogyro's membrane blades spin about a central hub and are stiffened by centrifugal forces only, making the design exceedingly lightweight. Blades are also stowed and deployed from rolls, eliminating the deployment and packaging problems associated with handling the extremely large, delicate membrane sheets used in most traditional square-rigged or spinning-disk solar sail designs. The heliogyro solar sail concept was first advanced in the 1960s by MacNeal. A 15 km diameter version was later extensively studied in the 1970s by JPL for an ambitious Comet Halley rendezvous mission, but was ultimately not selected due to the need for a risk-reduction flight demonstration. Demonstrating system-level feasibility of a large, spinning heliogyro solar sail on the ground is impossible; however, recent advances in microsatellite bus technologies, coupled with the successful flight demonstration of reflectance control technologies on the JAXA IKAROS solar sail, now make an affordable, small-scale heliogyro technology flight demonstration potentially feasible. In this paper, we present an overview of the history of the heliogyro solar sail concept, with particular attention paid to the MIT 200-meter-diameter heliogyro study of 1989, followed by a description of our updated, low-cost heliogyro flight demonstration concept. Our preliminary heliogyro concept (HELIOS) should be capable of demonstrating an order-of-magnitude improvement in characteristic acceleration over existing solar sail demonstrators (HELIOS target: 0.5 to 1.0 mm/s^2 at 1.0 AU), placing the heliogyro technology in the range required to enable a variety of science and human-exploration-relevant support missions.

  7. Insulin-like growth factor-1 receptor in mature osteoblasts is required for periosteal bone formation induced by reloading.

    PubMed

    Kubota, Takuo; Elalieh, Hashem Z; Saless, Neema; Fong, Chak; Wang, Yongmei; Babey, Muriel; Cheng, Zhiqiang; Bikle, Daniel D

    2013-11-01

    Skeletal loading and unloading have a pronounced impact on bone remodeling, a process also regulated by insulin-like growth factor 1 (IGF-1) signaling. Skeletal unloading leads to resistance to the anabolic effect of IGF-1, while reloading after unloading restores responsiveness to IGF-1. However, the importance of IGF-1 signaling in the skeletal response to mechanical loading has not been tested directly. In this study, we assessed the skeletal response of osteoblast-specific Igf-1 receptor deficient (Igf-1r(-/-)) mice to unloading and reloading. The mice were hindlimb unloaded for 14 days and then reloaded for 16 days. Igf-1r(-/-) mice displayed smaller cortical bone and diminished periosteal and endosteal bone formation at baseline. Periosteal and endosteal bone formation decreased with unloading in Igf-1r(+/+) mice. However, the recovery of periosteal bone formation with reloading was completely inhibited in Igf-1r(-/-) mice, although reloading-induced endosteal bone formation was not hampered. These changes in bone formation resulted in the abolishment of the expected increase in total cross-sectional area with reloading in Igf-1r(-/-) mice compared to the control mice. These results suggest that the Igf-1r in mature osteoblasts has a critical role in periosteal bone formation in the skeletal response to mechanical loading.

  8. Insulin-like growth factor-1 receptor in mature osteoblasts is required for periosteal bone formation induced by reloading

    NASA Astrophysics Data System (ADS)

    Kubota, Takuo; Elalieh, Hashem Z.; Saless, Neema; Fong, Chak; Wang, Yongmei; Babey, Muriel; Cheng, Zhiqiang; Bikle, Daniel D.

    2013-11-01

    Skeletal loading and unloading have a pronounced impact on bone remodeling, a process also regulated by insulin-like growth factor-1 (IGF-1) signaling. Skeletal unloading leads to resistance to the anabolic effect of IGF-1, while reloading after unloading restores responsiveness to IGF-1. However, the importance of IGF-1 signaling in the skeletal response to mechanical loading has not been tested directly. In this study, we assessed the skeletal response of osteoblast-specific Igf-1 receptor deficient (Igf-1r-/-) mice to unloading and reloading. The mice were hindlimb unloaded for 14 days and then reloaded for 16 days. Igf-1r-/- mice displayed smaller cortical bone and diminished periosteal and endosteal bone formation at baseline. Periosteal and endosteal bone formation decreased with unloading in Igf-1r+/+ mice. However, the recovery of periosteal bone formation with reloading was completely inhibited in Igf-1r-/- mice, although reloading-induced endosteal bone formation was not hampered. These changes in bone formation resulted in the abolishment of the expected increase in total cross-sectional area with reloading in Igf-1r-/- mice compared to the control mice. These results suggest that the Igf-1r in mature osteoblasts has a critical role in periosteal bone formation in the skeletal response to mechanical loading.

  9. Future integrated design process

    NASA Technical Reports Server (NTRS)

    Meyer, D. D.

    1980-01-01

    The design process is one of the sources used to produce requirements for a computer system to integrate and manage product design data, program management information, and the technical computation and engineering data management activities of the aerospace design process. Design activities were grouped chronologically and explored for activity type, activity interface, data quantity, and data flow. The work was based on analysis of the design process of several typical aerospace products, including both conventional and supersonic airplanes and a hydrofoil design. Activities examined included research, preliminary design, detail design, manufacturing interface, product verification, and product support. The design process was then described as it would operate in a future IPAD environment.

  10. Lyophilization process design space.

    PubMed

    Patel, Sajal Manubhai; Pikal, Michael J

    2013-11-01

    The application of key elements of quality by design (QbD), such as risk assessment, process analytical technology, and design space, is discussed widely as it relates to freeze-drying process design and development. However, this commentary focuses on constructing the Design and Control Space, particularly for the primary drying step of the freeze-drying process. Also, practical applications and considerations of claiming a process Design Space under the QbD paradigm have been discussed.

  11. Composite Reliability Enhancement Via Reloading

    DTIC Science & Technology

    1988-09-01

    the design of composite structures, which in turn adds weight and size and causes other related problems that reduce design efficiency. As the...such large structures. This is due to the lower weak tail of the strength distributions of the constituent fibers. This effect has been...tion was run on an IBM Personal Computer using Microsoft Fortran 4.01 for source code and Lotus 1-2-3 for graphing. When the simulation program had

  12. Signal Processing Suite Design

    NASA Technical Reports Server (NTRS)

    Sahr, John D.; Mir, Hasan; Morabito, Andrew; Grossman, Matthew

    2003-01-01

    Our role in this project was to participate in the design of the signal processing suite to analyze plasma density measurements on board a small constellation (3 or 4) of satellites in Low Earth Orbit. As we are new to spacecraft experiments, one of the challenges was simply to gain an understanding of the quantity of data that would flow from the satellites, and possibly to interact with the design teams in generating optimal sampling patterns. For example, as the fleet of satellites was intended to fly through the same volume of space (displaced slightly in time and space), the bulk plasma structure should be common among the spacecraft. Therefore, an optimal, limited-bandwidth data downlink would take advantage of this commonality. Also, motivated by techniques in ionospheric radar, we hoped to investigate the possibility of employing aperiodic sampling in order to gain access to a wider spatial spectrum without suffering aliasing in k-space.
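
    The aperiodic-sampling remark can be made concrete with a small numerical sketch. The snippet below uses assumed, illustrative numbers (a 10 Hz mean sampling rate and a 7.3 Hz tone) rather than any mission parameters: with uniform sampling the tone is indistinguishable from its alias below the Nyquist limit, whereas with randomized sample times most of the spectral energy stays at the true frequency.

```python
# Illustrative comparison of uniform vs. aperiodic (randomized) sampling of a tone
# that lies above the uniform-sampling Nyquist limit. All values are assumptions.
import numpy as np

rng = np.random.default_rng(0)
f_true = 7.3                          # Hz, above the 5 Hz Nyquist limit of uniform sampling
n_samp, t_span = 200, 20.0            # 200 samples over 20 s -> 10 Hz mean rate
t_uniform = np.arange(n_samp) * (t_span / n_samp)       # Nyquist frequency = 5 Hz
t_random  = np.sort(rng.uniform(0.0, t_span, n_samp))   # same mean rate, aperiodic

def periodogram(t, x, freqs):
    """Direct (unnormalized) spectral estimate that accepts arbitrary sample times."""
    e = np.exp(-2j * np.pi * freqs[:, None] * t[None, :])
    return np.abs(e @ x) ** 2 / len(t)

f_alias = 10.0 - f_true               # where the tone folds under uniform 10 Hz sampling
for name, t in [("uniform", t_uniform), ("aperiodic", t_random)]:
    x = np.cos(2 * np.pi * f_true * t)
    p_true, p_alias = periodogram(t, x, np.array([f_true, f_alias]))
    print(f"{name:9s}: power at {f_true:.1f} Hz = {p_true:7.1f}, at alias {f_alias:.1f} Hz = {p_alias:7.1f}")
```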

  13. Bassoon Speeds Vesicle Reloading at a Central Excitatory Synapse

    PubMed Central

    Hallermann, Stefan; Fejtova, Anna; Schmidt, Hartmut; Weyhersmüller, Annika; Silver, R. Angus; Gundelfinger, Eckart D.; Eilers, Jens

    2010-01-01

    Sustained rate-coded signals encode many types of sensory modalities. Some sensory synapses possess specialized ribbon structures, which tether vesicles, to enable high-frequency signaling. However, central synapses lack these structures, yet some can maintain signaling over a wide bandwidth. To analyze the underlying molecular mechanisms, we investigated the function of the active zone core component Bassoon in cerebellar mossy fiber to granule cell synapses. We show that short-term synaptic depression is enhanced in Bassoon knockout mice during sustained high-frequency trains but basal synaptic transmission is unaffected. Fluctuation and quantal analysis as well as quantification with constrained short-term plasticity models revealed that the vesicle reloading rate was halved in the absence of Bassoon. Thus, our data show that the cytomatrix protein Bassoon speeds the reloading of vesicles to release sites at a central excitatory synapse. PMID:21092860
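
    A one-pool model of the kind used to constrain such data makes the central claim easy to visualize: halving the reloading (refilling) rate of release sites deepens depression during a high-frequency train. The sketch below uses generic, assumed parameter values, not the rates fitted in the study.

```python
# One-pool short-term depression sketch: release sites empty with each spike and
# refill ("reload") exponentially between spikes. Parameter values are illustrative.
import numpy as np

def train_response(reload_rate_hz, spike_rate_hz=300.0, n_spikes=50, p_release=0.5):
    """Relative response amplitudes during a regular spike train (first response = 1)."""
    dt = 1.0 / spike_rate_hz          # inter-spike interval in seconds
    occupancy = 1.0                   # fraction of release sites currently filled
    amplitudes = []
    for _ in range(n_spikes):
        released = p_release * occupancy      # response is proportional to the released fraction
        amplitudes.append(released)
        occupancy -= released                 # the sites just used are now empty
        # exponential refilling toward full occupancy before the next spike
        occupancy = 1.0 - (1.0 - occupancy) * np.exp(-reload_rate_hz * dt)
    amplitudes = np.array(amplitudes)
    return amplitudes / amplitudes[0]

control  = train_response(reload_rate_hz=40.0)
knockout = train_response(reload_rate_hz=20.0)   # reloading rate halved
print(f"steady-state relative amplitude, control : {control[-1]:.2f}")
print(f"steady-state relative amplitude, knockout: {knockout[-1]:.2f}")
```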

  14. Introducing the "Decider" Design Process

    ERIC Educational Resources Information Center

    Prasa, Anthony R., Jr.; Del Guercio, Ryan

    2016-01-01

    Engineers are faced with solving important problems every day and must follow a step-by-step design process to arrive at solutions. Students who are taught an effective design process to apply to engineering projects begin to see problems as an engineer would, consider all ideas, and arrive at the best solution. Using an effective design process…

  15. DESIGNING PROCESSES FOR ENVIRONMENTAL PROBLEMS

    EPA Science Inventory

    Designing for the environment requires consideration of environmental impacts. The Generalized WAR Algorithm is the methodology that allows the user to evaluate the potential environmental impact of the design of a chemical process. In this methodology, chemicals are assigned val...

  16. Teaching Process Design through Integrated Process Synthesis

    ERIC Educational Resources Information Center

    Metzger, Matthew J.; Glasser, Benjamin J.; Patel, Bilal; Hildebrandt, Diane; Glasser, David

    2012-01-01

    The design course is an integral part of chemical engineering education. A novel approach to the design course was recently introduced at the University of the Witwatersrand, Johannesburg, South Africa. The course aimed to introduce students to systematic tools and techniques for setting and evaluating performance targets for processes, as well as…

  17. Reengineering the Project Design Process

    NASA Technical Reports Server (NTRS)

    Casani, E.; Metzger, R.

    1994-01-01

    In response to NASA's goal of working faster, better and cheaper, JPL has developed extensive plans to minimize cost, maximize customer and employee satisfaction, and implement small- and moderate-size missions. These plans include improved management structures and processes, enhanced technical design processes, the incorporation of new technology, and the development of more economical space- and ground-system designs. The Laboratory's new Flight Projects Implementation Office has been chartered to oversee these innovations and the reengineering of JPL's project design process, including establishment of the Project Design Center and the Flight System Testbed. Reengineering at JPL implies a cultural change whereby the character of its design process will change from sequential to concurrent and from hierarchical to parallel. The Project Design Center will support missions offering high science return, design to cost, demonstrations of new technology, and rapid development. Its computer-supported environment will foster high-fidelity project life-cycle development and cost estimating.

  18. Myocardial Reloading after Extracorporeal Membrane Oxygenation Alters Substrate Metabolism While Promoting Protein Synthesis

    SciTech Connect

    Kajimoto, Masaki; Priddy, Colleen M.; Ledee, Dolena; Xu, Chun; Isern, Nancy G.; Olson, Aaron; Des Rosiers, Christine; Portman, Michael A.

    2013-08-19

    Extracorporeal membrane oxygenation (ECMO) unloads the heart, providing a bridge to recovery in children after myocardial stunning. Mortality after ECMO remains high. Cardiac substrate and amino acid requirements upon weaning are unknown and may impact recovery. We assessed the hypothesis that ventricular reloading modulates both substrate entry into the citric acid cycle (CAC) and myocardial protein synthesis. Fourteen immature piglets (7.8-15.6 kg) were separated into 2 groups based on ventricular loading status: 8-hour ECMO (UNLOAD) and post-wean from ECMO (RELOAD). We infused [2-13C]-pyruvate as an oxidative substrate and [13C6]-L-leucine, as a tracer of amino acid oxidation and protein synthesis, into the coronary artery. RELOAD showed marked elevations in myocardial oxygen consumption above baseline and UNLOAD. Pyruvate uptake was markedly increased, though RELOAD decreased the pyruvate contribution to oxidative CAC metabolism. RELOAD also increased absolute concentrations of all CAC intermediates, while maintaining or increasing 13C-molar percent enrichment. RELOAD also significantly increased cardiac fractional protein synthesis rates by >70% over UNLOAD. Conclusions: RELOAD produced a high energy metabolic requirement and rebound protein synthesis. Relative pyruvate decarboxylation decreased with RELOAD while promoting anaplerotic pyruvate carboxylation and amino acid incorporation into protein rather than into the CAC for oxidation. These perturbations may serve as therapeutic targets to improve contractile function after ECMO.

  19. The motion after-effect reloaded

    PubMed Central

    Mather, George; Pavan, Andrea; Campana, Gianluca; Casco, Clara

    2011-01-01

    The motion after-effect is a robust illusion of visual motion resulting from exposure to a moving pattern. There is a widely accepted explanation of it in terms of changes in the response of cortical direction-selective neurons. Research has distinguished several variants of the effect. Converging recent evidence from different experimental techniques (psychophysics, single-unit recording, brain imaging, transcranial magnetic stimulation, and evoked potentials) reveals that adaptation is not confined to one or even two cortical areas, but involves up to five different sites, reflecting the multiple levels of processing involved in visual motion analysis. A tentative motion processing framework is described, based on motion after-effect research. Recent ideas on the function of adaptation see it as a form of gain control that maximises the efficiency of information transmission. PMID:18951829

  20. Fully Integrating the Design Process

    SciTech Connect

    T.A. Bjornard; R.S. Bean

    2008-03-01

    The basic approach to designing nuclear facilities in the United States does not currently reflect the routine consideration of proliferation resistance and international safeguards. The fully integrated design process is an approach for bringing consideration of international safeguards and proliferation resistance, together with state safeguards and security, fully into the design process from the very beginning, while integrating them sensibly and synergistically with the other project functions. In view of the recently established GNEP principles agreed to by the United States and at least eighteen other countries, this paper explores such an integrated approach, and its potential to help fulfill the new internationally driven design requirements with improved efficiencies and reduced costs.

  1. Reloading Continuous GPS in Northwest Mexico

    NASA Astrophysics Data System (ADS)

    Gonzalez-Garcia, J. J.; Suarez-Vidal, F.; Gonzalez-Ortega, J. A.

    2007-05-01

    For more than 10 years we have tried to follow the steps of the Southern California Integrated GPS Network (SCIGN) and the Plate Boundary Observatory (PBO) in the USA, which puts us in a position to contribute to the development of a modern GPS network in Mexico. In 1998 and 2001, three stations were deployed in Northwest Mexico in concert with the development of SCIGN: SPMX in north-central Baja California state at the UNAM National Astronomical Observatory in the Sierra San Pedro Martir; CORX on Isla Coronados Sur, offshore San Diego, Ca./Tijuana, Mexico; and GUAX on Guadalupe Island, 150 miles offshore of the Baja California peninsula, which provides a unique site on the Pacific plate in the North America/Pacific boundary zone in Las Californias. The former IGS station at CICESE, Ensenada (CICE), installed in 1995, was replaced by CIC1 in 1999. In 2004 and 2005, with partial support from SCIGN and UNAVCO to the University of Arizona, a volunteer team from UNAVCO, Caltech, the U.S. Geological Survey, Universidad de la Sierra at Moctezuma, Sonora, and CICESE built two new shallow-braced GPS sites in northwest Mexico. The first site, USMX, is located in east-central Sonora, and the second, YESX, is located high in the Sierra Madre Occidental at Yecora, near the southern border of Sonora and Chihuahua. All data are openly available at SOPAC and/or UNAVCO. The existing information has been valuable for resolving the "total" plate motion between the Pacific plate (GUAX) and the North America plate (USMX and YESX) in the north-central Gulf of California. Since last year we have had the capability to process GPS data using GAMIT/GLOBK, and after gaining some practice with survey-mode data processing we can become a GPS processing center in Mexico. Currently only two sites are operational: CIC1 and USMX. With new energy we are ready to contribute to the establishment of a modern GPS network in Mexico for science, hazard monitoring, and infrastructure.

  2. Digital Earth reloaded - Beyond the next generation

    NASA Astrophysics Data System (ADS)

    Ehlers, M.; Woodgate, P.; Annoni, A.; Schade, S.

    2014-02-01

    Digital replicas (or 'mirror worlds') of complex entities and systems are now routine in many fields such as aerospace engineering, archaeology, medicine, and even fashion design. The Digital Earth (DE) concept, a digital replica of the entire planet, appears in Al Gore's 1992 book Earth in the Balance and was popularized in his speech at the California Science Center in January 1998. It played a pivotal role in stimulating the development of a first generation of virtual globes, typified by Google Earth, that achieved many elements of this vision. Almost 15 years after Al Gore's speech, the concept of DE needs to be re-evaluated in the light of the many scientific and technical developments in the fields of information technology, data infrastructures, citizens' participation, and earth observation that have taken place since. This paper looks beyond the next generation, predominantly based on developments in fields outside the spatial sciences, where concepts, software, and hardware with strong relationships to DE are being developed without referring to this term. It also presents a number of guiding criteria for future DE developments.

  3. The Snark was a Boojum - reloaded

    PubMed Central

    2015-01-01

    In this article, we refer to an original opinion paper written by Prof. Frank Beach in 1950 (“The Snark was a Boojum”). In his manuscript, Beach explicitly criticised the field of comparative psychology because of the disparity between the original understanding of comparativeness and its practical overly specialised implementation. Specialisation encompassed both experimental species (rats accounted for 70% of all subjects) and test paradigms (dominated by conditioning/learning experiments). Herein, we attempt to evaluate the extent to which these considerations apply to current behavioural neuroscience. Such evaluation is particularly interesting in the context of “translational research” that has recently gained growing attention. As a community, we believe that preclinical findings are intended to inform clinical practice at the level of therapies and knowledge advancements. Yet, limited reproducibility of experimental results and failures to translate preclinical research into clinical trials indicate that these expectations are not entirely fulfilled. Theoretical considerations suggest that, before concluding that a given phenomenon is of relevance to our species, it should be observed in more than a single experimental model (be it an animal strain or species) and tested in more than a single standardized test battery. Yet, current approaches appear limited in terms of variability and overspecialised in terms of operative procedures. Specifically, as in 1950, rodents (mice instead of rats) still constitute the vast majority of animal species investigated. Additionally, the scientific community strives to homogenise experimental test strategies, thereby not only limiting the generalizability of the findings, but also working against the design of innovative approaches. Finally, we discuss the importance of evolutionary-adaptive considerations within the field of laboratory research. Specifically, resting upon empirical evidence indicating that

  4. The Snark was a Boojum - reloaded.

    PubMed

    Macrì, Simone; Richter, S Helene

    2015-01-01

    In this article, we refer to an original opinion paper written by Prof. Frank Beach in 1950 ("The Snark was a Boojum"). In his manuscript, Beach explicitly criticised the field of comparative psychology because of the disparity between the original understanding of comparativeness and its practical overly specialised implementation. Specialisation encompassed both experimental species (rats accounted for 70% of all subjects) and test paradigms (dominated by conditioning/learning experiments). Herein, we attempt to evaluate the extent to which these considerations apply to current behavioural neuroscience. Such evaluation is particularly interesting in the context of "translational research" that has recently gained growing attention. As a community, we believe that preclinical findings are intended to inform clinical practice at the level of therapies and knowledge advancements. Yet, limited reproducibility of experimental results and failures to translate preclinical research into clinical trials indicate that these expectations are not entirely fulfilled. Theoretical considerations suggest that, before concluding that a given phenomenon is of relevance to our species, it should be observed in more than a single experimental model (be it an animal strain or species) and tested in more than a single standardized test battery. Yet, current approaches appear limited in terms of variability and overspecialised in terms of operative procedures. Specifically, as in 1950, rodents (mice instead of rats) still constitute the vast majority of animal species investigated. Additionally, the scientific community strives to homogenise experimental test strategies, thereby not only limiting the generalizability of the findings, but also working against the design of innovative approaches. Finally, we discuss the importance of evolutionary-adaptive considerations within the field of laboratory research. Specifically, resting upon empirical evidence indicating that developing

  5. Process simulation and design '94

    SciTech Connect

    Not Available

    1994-06-01

    This first-of-a-kind report describes today's process simulation and design technology for specific applications. It includes process names, diagrams, applications, descriptions, objectives, economics, installations, licensors, and a complete list of process submissions. Processes include: alkylation, aromatics extraction, catalytic reforming, cogeneration, dehydration, delayed coking, distillation, energy integration, catalytic cracking, gas sweetening, glycol/methanol injection, hydrocracking, NGL recovery and stabilization, solvent dewaxing, visbreaking. Equipment simulations include: amine plant, ammonia plant, heat exchangers, cooling water network, crude preheat train, crude unit, ethylene furnace, nitrogen rejection unit, refinery, sulfur plant, and VCM furnace. By-product processes include: olefins, polyethylene terephthalate, and styrene.

  6. The Process Design Courses at Pennsylvania: Impact of Process Simulators.

    ERIC Educational Resources Information Center

    Seider, Warren D.

    1984-01-01

    Describes the use and impact of process design simulators in process design courses. Discusses topics covered, texts used, computer design simulations, and how they are integrated into the process survey course as well as in plant design projects. (JM)

  7. Human Integration Design Processes (HIDP)

    NASA Technical Reports Server (NTRS)

    Boyer, Jennifer

    2014-01-01

    The purpose of the Human Integration Design Processes (HIDP) document is to provide human-systems integration design processes, including methodologies and best practices that NASA has used to meet human systems and human rating requirements for developing crewed spacecraft. HIDP content is framed around human-centered design methodologies and processes in support of human-system integration requirements and human rating. NASA-STD-3001, Space Flight Human-System Standard, is a two-volume set of National Aeronautics and Space Administration (NASA) Agency-level standards established by the Office of the Chief Health and Medical Officer, directed at minimizing health and performance risks for flight crews in human space flight programs. Volume 1 of NASA-STD-3001, Crew Health, sets standards for fitness for duty, space flight permissible exposure limits, permissible outcome limits, levels of medical care, medical diagnosis, intervention, treatment and care, and countermeasures. Volume 2 of NASA-STD-3001, Human Factors, Habitability, and Environmental Health, focuses on human physical and cognitive capabilities and limitations and defines standards for spacecraft (including orbiters, habitats, and suits), internal environments, facilities, payloads, and related equipment, hardware, and software with which the crew interfaces during space operations. The NASA Procedural Requirements (NPR) 8705.2B, Human-Rating Requirements for Space Systems, specifies the Agency's human-rating processes, procedures, and requirements. The HIDP was written to share NASA's knowledge of processes directed toward achieving human certification of a spacecraft through implementation of human-systems integration requirements. Although the HIDP speaks directly to implementation of NASA-STD-3001 and NPR 8705.2B requirements, the human-centered design, evaluation, and design processes described in this document can be applied to any set of human-systems requirements and are independent of reference

  8. DESIGNING ENVIRONMENTALLY FRIENDLY CHEMICAL PROCESSES

    EPA Science Inventory

    The design of a chemical process involves many aspects: from profitability, flexibility and reliability to safety to the environment. While each of these is important, in this work, the focus will be on profitability and the environment. Key to the study of these aspects is the ...

  9. ESS Cryogenic System Process Design

    NASA Astrophysics Data System (ADS)

    Arnold, P.; Hees, W.; Jurns, J.; Su, X. T.; Wang, X. L.; Weisend, J. G., II

    2015-12-01

    The European Spallation Source (ESS) is a neutron-scattering facility funded and supported in collaboration with 17 European countries in Lund, Sweden. Cryogenic cooling at ESS is vital particularly for the linear accelerator, the hydrogen target moderators, a test stand for cryomodules, the neutron instruments and their sample environments. The paper will focus on specific process design criteria, design decisions and their motivations for the helium cryoplants and auxiliary equipment. Key issues for all plants and their process concepts are energy efficiency, reliability, smooth turn-down behaviour and flexibility. The accelerator cryoplant (ACCP) and the target moderator cryoplant (TMCP) in particular need to be prepared for a range of refrigeration capacities due to the intrinsic uncertainties regarding heat load definitions. Furthermore the paper addresses questions regarding process arrangement, 2 K cooling methodology, LN2 precooling, helium storage, helium purification and heat recovery.

  10. Development of Innovative Design Processor

    SciTech Connect

    Park, Y.S.; Park, C.O.

    2004-07-01

    The nuclear design analysis requires time-consuming and error-prone model-input preparation, code runs, output analysis, and quality assurance processes. To reduce human effort and improve design quality and productivity, the Innovative Design Processor (IDP) is being developed. The two basic principles of IDP are document-oriented design and web-based design. Document-oriented design means that, if the designer writes a design document called an active document and feeds it to a special program, the final document with the complete analysis, tables, and plots is produced automatically. The active documents can be written with ordinary HTML editors or created automatically on the web, which is the other framework of IDP. Using the proper mix of server-side and client-side programming under the LAMP (Linux/Apache/MySQL/PHP) environment, the design process on the web is modeled in a design-wizard style so that even a novice designer can produce the design document easily. This automation using the IDP is now being implemented for all reload designs of Korea Standard Nuclear Power Plant (KSNP) type PWRs. The introduction of this process will allow a large reduction in all KSNP reload design efforts and provide a platform for design and R&D tasks of KNFC. (authors)
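
    The document-oriented idea can be illustrated with a toy processor that evaluates calculation tags embedded in an "active document" and emits the finished text. This is only a sketch of the concept; the tag syntax and the stand-in analysis function are hypothetical, and the IDP itself is described above as a web application built on LAMP, not this fragment.

```python
# Toy "active document" processor: evaluate embedded {{...}} tags and render the
# final document. Tag syntax and the analysis stand-in are illustrative assumptions.
import re

def run_analysis(enrichment_pct):
    # stand-in for invoking a real design/analysis code
    return {"enrichment": enrichment_pct, "cycle_length_months": 1.8 * enrichment_pct}

def render(active_doc, context):
    """Replace each {{expression}} tag with its evaluated result (trusted input only)."""
    return re.sub(r"\{\{(.+?)\}\}",
                  lambda m: str(eval(m.group(1), {}, context)),
                  active_doc)

active_doc = """Reload Design Summary
Feed enrichment: {{results['enrichment']}} w/o
Predicted cycle length: {{round(results['cycle_length_months'], 1)}} months
"""

results = run_analysis(enrichment_pct=4.5)
print(render(active_doc, {"results": results, "round": round}))
```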

  11. Automation of Design Engineering Processes

    NASA Technical Reports Server (NTRS)

    Torrey, Glenn; Sawasky, Gerald; Courey, Karim

    2004-01-01

    A method, and a computer program that helps to implement the method, have been developed to automate and systematize the retention and retrieval of all the written records generated during the process of designing a complex engineering system. It cannot be emphasized strongly enough that "all the written records" as used here is meant to be taken literally: it signifies not only final drawings and final engineering calculations but also such ancillary documents as minutes of meetings, memoranda, requests for design changes, approval and review documents, and reports of tests. One important purpose served by the method is to make the records readily available to all involved users via their computer workstations from one computer archive, while eliminating the need for voluminous paper files stored in different places. Another important purpose served by the method is to facilitate the work of engineers who are charged with sustaining the system and were not involved in the original design decisions. The method helps the sustaining engineers retrieve information that enables them to retrace the reasoning that led to the original design decisions, thereby helping them to understand the system better and to make informed engineering choices pertaining to maintenance and/or modifications of the system. The software used to implement the method is written in Microsoft Access. All of the documents pertaining to the design of a given system are stored in one relational database in such a manner that they can be related to each other via a single tracking number.
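
    The single-tracking-number scheme is essentially one relational table keyed by that number. The sketch below shows the idea with sqlite3 and a hypothetical schema and example rows; the actual system described above is built in Microsoft Access with a richer set of document types and fields.

```python
# Minimal relational sketch of design-record retention keyed by one tracking number.
# Schema and rows are hypothetical illustrations.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE design_records (
        tracking_no INTEGER,   -- single number that ties related records together
        doc_type    TEXT,      -- drawing, memo, meeting minutes, test report, ...
        title       TEXT,
        body        TEXT
    )""")

records = [
    (1042, "meeting minutes", "Thermal margin review", "Agreed to raise fin count."),
    (1042, "change request",  "ECR-77: fin count",     "Increase from 12 to 16 fins."),
    (1042, "test report",     "Thermal vacuum test",   "New fin layout passed at 60 C."),
]
conn.executemany("INSERT INTO design_records VALUES (?, ?, ?, ?)", records)

# A sustaining engineer retrieves every record behind one design decision:
for doc_type, title in conn.execute(
        "SELECT doc_type, title FROM design_records WHERE tracking_no = ? ORDER BY doc_type",
        (1042,)):
    print(f"{doc_type:16s} {title}")
```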

  12. Reloading partly recovers bone mineral density and mechanical properties in hind limb unloaded rats

    NASA Astrophysics Data System (ADS)

    Zhao, Fan; Li, Dijie; Arfat, Yasir; Chen, Zhihao; Liu, Zonglin; Lin, Yu; Ding, Chong; Sun, Yulong; Hu, Lifang; Shang, Peng; Qian, Airong

    2014-12-01

    Skeletal unloading results in decreased bone formation and bone mass. During long-term space flight, the lost bone mass cannot be fully recovered. Therefore, it is necessary to develop effective countermeasures to prevent spaceflight-induced bone loss. Hindlimb unloading (HLU) simulates effects of weightlessness and is utilized extensively to examine the response of musculoskeletal systems to certain aspects of space flight. The purpose of this study was to investigate the effects of 4 weeks of HLU in rats, and of subsequent reloading, on the bone mineral density (BMD) and mechanical properties of load-bearing bones. After HLU for 4 weeks, the rats were subjected to reloading for 1 week, 2 weeks, or 3 weeks, and the BMD of the femur, tibia, and lumbar spine was assessed by dual energy X-ray absorptiometry (DXA) every week. The mechanical properties of the femur were determined by a three-point bending test. Dry bone and bone ash of the femur were obtained through an oven-drying method and weighed. Serum alkaline phosphatase (ALP) and serum calcium were examined by ELISA and atomic absorption spectrometry, respectively. The results showed that 4 weeks of HLU significantly decreased the body weight of rats, and reloading for 1 week, 2 weeks, or 3 weeks did not recover the weight loss induced by HLU. However, after 2 weeks of reloading, the BMD of the femur and tibia of HLU rats partly recovered (+10.4%, +2.3%). After 3 weeks of reloading, the reductions in BMD, energy absorption, bone mass, and mechanical properties of bone induced by HLU recovered to some extent. The changes in serum ALP and serum calcium induced by HLU were also recovered after reloading. Our results indicate that a short period of reloading cannot completely recover bone after a period of unloading, so interventions such as mechanical vibration or pharmaceuticals are necessary to help bone recovery.

  13. Muscle regeneration during hindlimb unloading results in a reduction in muscle size after reloading

    NASA Technical Reports Server (NTRS)

    Mozdziak, P. E.; Pulvermacher, P. M.; Schultz, E.

    2001-01-01

    The hindlimb-unloading model was used to study the ability of muscle injured in a weightless environment to recover after reloading. Satellite cell mitotic activity and DNA unit size were determined in injured and intact soleus muscles from hindlimb-unloaded and age-matched weight-bearing rats at the conclusion of 28 days of hindlimb unloading, 2 wk after reloading, and 9 wk after reloading. The body weights of hindlimb-unloaded rats were significantly (P < 0.05) less than those of weight-bearing rats at the conclusion of hindlimb unloading, but they were the same (P > 0.05) as those of weight-bearing rats 2 and 9 wk after reloading. The soleus muscle weight, soleus muscle weight-to-body weight ratio, myofiber diameter, number of nuclei per millimeter, and DNA unit size were significantly (P < 0.05) smaller for the injured soleus muscles from hindlimb-unloaded rats than for the soleus muscles from weight-bearing rats at each recovery time. Satellite cell mitotic activity was significantly (P < 0.05) higher in the injured soleus muscles from hindlimb-unloaded rats than from weight-bearing rats 2 wk after reloading, but it was the same (P > 0.05) as in the injured soleus muscles from weight-bearing rats 9 wk after reloading. The injured soleus muscles from hindlimb-unloaded rats failed to achieve weight-bearing muscle size 9 wk after reloading, because incomplete compensation for the decrease in myonuclear accretion and DNA unit size expansion occurred during the unloading period.

  14. One-dimensional kinetics modifications for BWR reload methods

    SciTech Connect

    Chandola, V.; Robichaud, J.D.

    1990-01-01

    Yankee Atomic Electric Company (YAEC) currently uses RETRAN-02 to analyze limiting transients and establish operating minimum critical power ratio (MCPR) limits for Vermont Yankee (VY) boiling water reactor (BWR) reload analysis. The US Nuclear Regulatory Commission-approved analysis methods, used in previous cycles, use the point-kinetics modeling option in RETRAN-02 to represent transient-induced neutronic feedback. RETRAN-02 also contains a one-dimensional (1-D) kinetics neutronic feedback model option that provides a more accurate transient power prediction than the point-kinetics model. In the past few fuel cycles, the thermal or MCPR operating margin at VY has eroded due to increases in fuel cycle length. To offset this decrease, YAEC has developed the capability to use the more accurate 1-D kinetics RETRAN option. This paper reviews the qualification effort for the YAEC BWR methods. This paper also presents a comparison between RETRAN-02 predictions using 1-D and point kinetics for the limiting transient, and demonstrates the typical gain in thermal margin from 1-D kinetics.
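
    For readers unfamiliar with the terminology, the point-kinetics option treats the core as a single point with a fixed power shape, whereas the 1-D option also tracks the axial shape during the transient. The textbook one-delayed-group point-kinetics equations are sketched below with generic illustrative parameters; this is not the RETRAN-02 model or Vermont Yankee data.

```python
# Textbook one-delayed-group point kinetics for a small step reactivity insertion.
# Parameter values are generic illustrations, not plant or code data.
import numpy as np
from scipy.integrate import solve_ivp

BETA, GEN_TIME, DECAY = 0.0065, 5.0e-5, 0.08   # delayed fraction, generation time (s), precursor decay (1/s)
RHO = 0.0010                                    # step reactivity insertion

def point_kinetics(t, y):
    n, c = y                                    # relative power, delayed-neutron precursor concentration
    dn = (RHO - BETA) / GEN_TIME * n + DECAY * c
    dc = BETA / GEN_TIME * n - DECAY * c
    return [dn, dc]

y0 = [1.0, BETA / (GEN_TIME * DECAY)]           # steady state at nominal power
sol = solve_ivp(point_kinetics, (0.0, 5.0), y0, method="LSODA",
                t_eval=np.linspace(0.0, 5.0, 11))
for t, n in zip(sol.t, sol.y[0]):
    print(f"t = {t:4.1f} s   relative power = {n:6.3f}")
```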

  15. 9 CFR 325.18 - Diverting of shipments, breaking of seals, and reloading by carrier in emergency; reporting to...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... seals, and reloading by carrier in emergency; reporting to Regional Director. 325.18 Section 325.18... CERTIFICATION TRANSPORTATION § 325.18 Diverting of shipments, breaking of seals, and reloading by carrier in...) In case of wreck or similar extraordinary emergency, the Department seals on a railroad car or...

  16. 9 CFR 325.18 - Diverting of shipments, breaking of seals, and reloading by carrier in emergency; reporting to...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... seals, and reloading by carrier in emergency; reporting to Regional Director. 325.18 Section 325.18... CERTIFICATION TRANSPORTATION § 325.18 Diverting of shipments, breaking of seals, and reloading by carrier in...) In case of wreck or similar extraordinary emergency, the Department seals on a railroad car or...

  17. The experience of using Endo GIA™ Radial Reload with Tri-Staple™ Technology for various lung surgery.

    PubMed

    Ema, Toshinari

    2014-10-01

    The Endo GIA™ Radial Reload with Tri-Staple™ Technology (RR) is a device designed for colorectal surgery. However, with its rounded staple line, the Radial Reload is also suitable for various lung surgeries. We use the device for lung wedge resection and for cutting the bronchus in lung lobectomy. The total number of uses comes to 56 fires, and all fires came out well.

  18. Design Thinking in Elementary Students' Collaborative Lamp Designing Process

    ERIC Educational Resources Information Center

    Kangas, Kaiju; Seitamaa-Hakkarainen, Pirita; Hakkarainen, Kai

    2013-01-01

    Design and Technology education is potentially a rich environment for successful learning, if the management of the whole design process is emphasised, and students' design thinking is promoted. The aim of the present study was to unfold the collaborative design process of one team of elementary students, in order to understand their multimodal…

  19. Myocardial Reloading After Extracorporeal Membrane Oxygenation Alters Substrate Metabolism While Promoting Protein Synthesis

    PubMed Central

    Kajimoto, Masaki; O'Kelly Priddy, Colleen M.; Ledee, Dolena R.; Xu, Chun; Isern, Nancy; Olson, Aaron K.; Rosiers, Christine Des; Portman, Michael A.

    2013-01-01

    Background Extracorporeal membrane oxygenation (ECMO) unloads the heart, providing a bridge to recovery in children after myocardial stunning. ECMO also induces stress, which can adversely affect the ability to reload or wean the heart from the circuit. Metabolic impairments induced by altered loading and/or stress conditions may impact weaning. However, cardiac substrate and amino acid requirements upon weaning are unknown. We assessed the hypothesis that ventricular reloading with ECMO modulates both substrate entry into the citric acid cycle (CAC) and myocardial protein synthesis. Methods and Results Sixteen immature piglets (7.8 to 15.6 kg) were separated into 2 groups based on ventricular loading status: 8‐hour ECMO (UNLOAD) and postwean from ECMO (RELOAD). We infused into the coronary artery [2‐13C]‐pyruvate as an oxidative substrate and [13C6]‐L‐leucine as an indicator for amino acid oxidation and protein synthesis. Upon RELOAD, all functional parameters, which were decreased substantially by ECMO, recovered to near‐baseline levels with the exception of minimum dP/dt. Accordingly, myocardial oxygen consumption was also increased, indicating that overall mitochondrial metabolism was reestablished. At the metabolic level, when compared to UNLOAD, RELOAD altered the contribution of various substrates/pathways to tissue pyruvate formation, favoring exogenous pyruvate versus glycolysis, and to acetyl‐CoA formation, shifting away from pyruvate decarboxylation toward endogenous substrate, presumably fatty acids. Furthermore, there was also a significant increase in tissue concentrations of all CAC intermediates (≈80%), suggesting enhanced anaplerosis, and in fractional protein synthesis rates (>70%). Conclusions RELOAD alters both cytosolic and mitochondrial energy substrate metabolism, while favoring leucine incorporation into protein synthesis rather than oxidation in the CAC. Improved understanding of factors governing these metabolic perturbations may

  20. Temporal changes in sarcomere lesions of rat adductor longus muscles during hindlimb reloading

    NASA Technical Reports Server (NTRS)

    Krippendorf, B. B.; Riley, D. A.

    1994-01-01

    Focal sarcomere disruptions were previously observed in adductor longus muscles of rats flown approximately two weeks aboard the Cosmos 1887 and 2044 biosatellite flights. These lesions, characterized by breakage and loss of myofilaments and Z-line streaming, resembled damage induced by unaccustomed exercise that includes eccentric contractions, in which muscles lengthen as they develop tension. We hypothesized that the sarcomere lesions in atrophied muscles of spaceflown rats were not produced in microgravity by muscle unloading but resulted from muscle reloading upon re-exposure to terrestrial gravity. To test this hypothesis, we examined temporal changes in sarcomere integrity of adductor longus muscles from rats subjected to 12.5 days of hindlimb suspension unloading and subsequent reloading by return to vivarium cages for 0, 6, 12, or 48 hours of normal weightbearing. Our ultrastructural observations suggested that muscle unloading (0 h reloading) induced myofibril misalignment associated with myofiber atrophy. Muscle reloading for 6 hours induced focal sarcomere lesions in which cross striations were abnormally widened. Such lesions were electron lucent due to extensive myofilament loss. Lesions in reloaded muscles showed rapid restructuring. By 12 hours of reloading, lesions were moderately stained foci and by 48 hours darkly stained foci in which the pattern of cross striations was indistinct at the light and electron microscopic levels. These lesions were spanned by Z-line-like electron-dense filamentous material. Our findings suggest a new role for Z-line streaming in lesion restructuring: rather than being an antecedent to damage, this type of Z-line streaming may be indicative of rapid, early sarcomere repair.

  1. Intracellular Ca2+ transients in mouse soleus muscle after hindlimb unloading and reloading

    NASA Technical Reports Server (NTRS)

    Ingalls, C. P.; Warren, G. L.; Armstrong, R. B.; Hamilton, S. L. (Principal Investigator)

    1999-01-01

    The objective of this study was to determine whether altered intracellular Ca(2+) handling contributes to the specific force loss in the soleus muscle after unloading and/or subsequent reloading of mouse hindlimbs. Three groups of female ICR mice were studied: 1) unloaded mice (n = 11) that were hindlimb suspended for 14 days, 2) reloaded mice (n = 10) that were returned to their cages for 1 day after 14 days of hindlimb suspension, and 3) control mice (n = 10) that had normal cage activity. Maximum isometric tetanic force (P(o)) was determined in the soleus muscle from the left hindlimb, and resting free cytosolic Ca(2+) concentration ([Ca(2+)](i)), tetanic [Ca(2+)](i), and 4-chloro-m-cresol-induced [Ca(2+)](i) were measured in the contralateral soleus muscle by confocal laser scanning microscopy. Unloading and reloading increased resting [Ca(2+)](i) above control by 36% and 24%, respectively. Although unloading reduced P(o) and specific force by 58% and 24%, respectively, compared with control mice, there was no difference in tetanic [Ca(2+)](i). P(o), specific force, and tetanic [Ca(2+)](i) were reduced by 58%, 23%, and 23%, respectively, in the reloaded animals compared with control mice; however, tetanic [Ca(2+)](i) was not different between unloaded and reloaded mice. These data indicate that although hindlimb suspension results in disturbed intracellular Ca(2+) homeostasis, changes in tetanic [Ca(2+)](i) do not contribute to force deficits. Compared with unloading, 24 h of physiological reloading in the mouse do not result in further changes in maximal strength or tetanic [Ca(2+)](i).

  2. ESS Accelerator Cryoplant Process Design

    NASA Astrophysics Data System (ADS)

    Wang, X. L.; Arnold, P.; Hees, W.; Hildenbeutel, J.; Weisend, J. G., II

    2015-12-01

    The European Spallation Source (ESS) is a neutron-scattering facility being built with extensive international collaboration in Lund, Sweden. The ESS accelerator will deliver protons with 5 MW of power to the target at 2.0 GeV, with a nominal current of 62.5 mA. The superconducting part of the accelerator is about 300 meters long and contains 43 cryomodules. The ESS accelerator cryoplant (ACCP) will provide the cooling for the cryomodules and the cryogenic distribution system that delivers the helium to the cryomodules. The ACCP will cover three cryogenic circuits: Bath cooling for the cavities at 2 K, the thermal shields at around 40 K and the power couplers thermalisation with 4.5 K forced helium cooling. The open competitive bid for the ACCP took place in 2014 with Linde Kryotechnik AG being selected as the vendor. This paper summarizes the progress in the ACCP development and engineering. Current status including final cooling requirements, preliminary process design, system configuration, machine concept and layout, main parameters and features, solution for the acceptance tests, exergy analysis and efficiency is presented.

  3. Optimal design of solidification processes

    NASA Technical Reports Server (NTRS)

    Dantzig, Jonathan A.; Tortorelli, Daniel A.

    1991-01-01

    An optimal design algorithm is presented for the analysis of general solidification processes, and is demonstrated for the growth of GaAs crystals in a Bridgman furnace. The system is optimal in the sense that the prespecified temperature distribution in the solidifying materials is obtained to maximize product quality. The optimization uses traditional numerical programming techniques which require the evaluation of cost and constraint functions and their sensitivities. The finite element method is incorporated to analyze the crystal solidification problem, evaluate the cost and constraint functions, and compute the sensitivities. These techniques are demonstrated in the crystal growth application by determining an optimal furnace wall temperature distribution to obtain the desired temperature profile in the crystal, and hence to maximize the crystal's quality. Several numerical optimization algorithms are studied to determine the proper convergence criteria, effective 1-D search strategies, appropriate forms of the cost and constraint functions, etc. In particular, we incorporate the conjugate gradient and quasi-Newton methods for unconstrained problems. The efficiency and effectiveness of each algorithm is presented in the example problem.
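
    The interplay of cost functions, constraints, and sensitivities in such gradient-based searches can be shown with a toy surrogate in place of the finite element furnace model. In the sketch below, the linear surrogate map, the quadratic cost, and the BFGS/CG comparison are assumptions for illustration only, not the paper's formulation.

```python
# Toy gradient-based design optimization: choose wall-temperature controls u so that
# a linear surrogate of the crystal temperature matches a target profile.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
A = rng.normal(size=(30, 8))                   # surrogate map: controls -> crystal temperatures
T_target = np.linspace(1500.0, 1200.0, 30)     # prespecified axial temperature profile (K)

def cost(u):
    r = A @ u - T_target
    return 0.5 * r @ r

def grad(u):                                   # analytic sensitivity of the cost
    return A.T @ (A @ u - T_target)

u0 = np.zeros(8)
for method in ("BFGS", "CG"):                  # quasi-Newton and conjugate gradient
    res = minimize(cost, u0, jac=grad, method=method)
    print(f"{method:4s}: final cost = {res.fun:10.3e} after {res.nit} iterations")
```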

  4. Managing Analysis Models in the Design Process

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2006-01-01

    Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.

  5. Design Expert's Participation in Elementary Students' Collaborative Design Process

    ERIC Educational Resources Information Center

    Kangas, Kaiju; Seitamaa-Hakkarainen, Pirita; Hakkarainen, Kai

    2013-01-01

    The main goal of the present study was to provide insights into how disciplinary expertise might be infused into Design and Technology classrooms and how authentic processes based on professional design practices might be constructed. We describe elementary students' collaborative lamp designing process, where the leadership was provided by a…

  6. Gaps in the Design Process

    SciTech Connect

    Veers, Paul

    2016-10-04

    The design of offshore wind plants is a relatively new field. The move into U.S. waters will bring unique environmental conditions, as well as expectations from the authorities responsible for managing the development. Wind turbine designs are required to have their assumed design conditions checked against the site conditions of the plant. There are still some outstanding issues in how we can assure that the design of both the turbine and the foundation is appropriate for the site and will have an acceptable level of risk associated with the particular installation.

  7. Reengineering the JPL Spacecraft Design Process

    NASA Technical Reports Server (NTRS)

    Briggs, C.

    1995-01-01

    This presentation describes the factors that have emerged in the evolved process of reengineering the unmanned spacecraft design process at the Jet Propulsion Laboratory in Pasadena, California. Topics discussed include: New facilities, new design factors, new system-level tools, complex performance objectives, changing behaviors, design integration, leadership styles, and optimization.

  8. Graphic Design in Libraries: A Conceptual Process

    ERIC Educational Resources Information Center

    Ruiz, Miguel

    2014-01-01

    Providing successful library services requires efficient and effective communication with users; therefore, it is important that content creators who develop visual materials understand key components of design and, specifically, develop a holistic graphic design process. Graphic design, as a form of visual communication, is the process of…

  9. 2D Mesoscale Simulations of Quasielastic Reloading and Unloading in Shock Compressed Aluminum

    NASA Astrophysics Data System (ADS)

    Dwivedi, S. K.

    2007-06-01

    2D mesoscale simulations of planar shock compression, followed by either reloading or unloading, are presented that predict quasi-elastic (QE) response observed experimentally in shocked polycrystalline aluminum. The representative volume element (RVE) of the plate impact experiments included a realistic representation of a grain ensemble with apparent heterogeneities in the polycrystalline sample. Simulations were carried out using a 2D updated Lagrangian finite element code ISP-TROTP incorporating elastic-plastic deformation in grain interior and contact/cohesive methodology to analyze finite strength grain boundaries. Local heterogeneous response was quantified by calculating appropriate material variables along in-situ Lagrangian tracer lines and comparing the temporal variation of their mean values with results from 2D continuum simulations. Simulations were carried out by varying a large number of individual heterogeneities to predict QE response on reloading and unloading from shock state. The heterogeneities important for simulating the QE response identified from these simulations were: hardened grain boundaries, hard inclusions, and micro-porosity. It is shown that the shock-deformed state of polycrystalline aluminum in the presence of these effects is strongly heterogeneous with considerable variations in lateral stresses. This distributed stress state unloads the shear stress from flow stress causing QE response on reloading as well as unloading. The simulated velocity profiles and calculated shear strength and shear stresses for a representative reloading and unloading experimental configuration were found to agree well with the reported experimental data. Work supported by DOE.
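
    As a small numerical aside, the lateral-stress bookkeeping mentioned above is commonly reduced to a resolved shear stress via tau = (sigma_longitudinal - sigma_lateral) / 2; the sketch below only shows that arithmetic on invented tracer-line values, not data from the simulations.

```python
# Toy shear-stress bookkeeping along a tracer line, assuming the usual
# plate-impact relation tau = (sigma_long - sigma_lat) / 2; values are made up.
import numpy as np

sigma_long = np.array([8.0, 8.1, 7.9, 8.2, 8.0])   # longitudinal stress (GPa), invented
sigma_lat = np.array([5.5, 5.9, 5.3, 6.0, 5.7])    # lateral stress (GPa), invented

tau = 0.5 * (sigma_long - sigma_lat)               # resolved shear stress per tracer point
print("mean shear stress (GPa):", round(tau.mean(), 3))
print("spread, std (GPa):      ", round(tau.std(), 3))  # large spread ~ heterogeneous state
```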

  10. NASA System Engineering Design Process

    NASA Technical Reports Server (NTRS)

    Roman, Jose

    2011-01-01

    This slide presentation reviews NASA's use of systems engineering for the complete life cycle of a project. Systems engineering is a methodical, disciplined approach for the design, realization, technical management, operations, and retirement of a system. Each phase of a NASA project is terminated with a Key decision point (KDP), which is supported by major reviews.

  11. The Architectural and Interior Design Planning Process.

    ERIC Educational Resources Information Center

    Cohen, Elaine

    1994-01-01

    Explains the planning process in designing effective library facilities and discusses library building requirements that result from electronic information technologies. Highlights include historical structures; Americans with Disabilities Act; resource allocation; electrical power; interior spaces; lighting; design development; the roles of…

  12. Biomimetic design processes in architecture: morphogenetic and evolutionary computational design.

    PubMed

    Menges, Achim

    2012-03-01

    Design computation has profound impact on architectural design methods. This paper explains how computational design enables the development of biomimetic design processes specific to architecture, and how they need to be significantly different from established biomimetic processes in engineering disciplines. The paper first explains the fundamental difference between computer-aided and computational design in architecture, as the understanding of this distinction is of critical importance for the research presented. Thereafter, the conceptual relation and possible transfer of principles from natural morphogenesis to design computation are introduced and the related developments of generative, feature-based, constraint-based, process-based and feedback-based computational design methods are presented. This morphogenetic design research is then related to exploratory evolutionary computation, followed by the presentation of two case studies focusing on the exemplary development of spatial envelope morphologies and urban block morphologies.
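
    A minimal sketch of the exploratory evolutionary-computation loop referred to above follows; the genome length, truncation selection, Gaussian mutation, and the placeholder fitness function are assumptions standing in for an actual evaluation of envelope or urban-block morphologies.

```python
# Bare-bones evolutionary loop with a made-up fitness function standing in
# for the evaluation of a generated morphology.
import numpy as np

rng = np.random.default_rng(0)
n_genes, pop_size, n_gen = 12, 30, 60

def fitness(genome):
    # Placeholder objective: prefer smooth parameter profiles.
    return -np.sum(np.diff(genome) ** 2)

pop = rng.uniform(0.0, 1.0, size=(pop_size, n_genes))
for _ in range(n_gen):
    scores = np.array([fitness(g) for g in pop])
    parents = pop[np.argsort(scores)][-pop_size // 2:]          # keep the better half
    children = parents + rng.normal(0.0, 0.05, parents.shape)   # Gaussian mutation
    pop = np.vstack([parents, np.clip(children, 0.0, 1.0)])

print("best fitness after evolution:", round(max(fitness(g) for g in pop), 5))
```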

  13. Total Ship Design Process Modeling

    DTIC Science & Technology

    2012-04-30

    Microsoft Project® or Primavera®, and perform process simulations that can investigate risk, cost, and schedule trade-offs. Prior efforts to capture... planning in the face of disruption, delay, and late-changing requirements. ADePT is interfaced with Primavera, the AEC industry favorite program

  14. Practicing universal design to actual hand tool design process.

    PubMed

    Lin, Kai-Chieh; Wu, Chih-Fu

    2015-09-01

    Universal design (UD) evaluation principles are difficult to implement in product design. This study proposes a methodology for implementing UD in the design process through user participation. The original UD principles and user experience are used to develop the evaluation items, and differences between product types are taken into account. Factor analysis and Quantification Theory Type I were used to eliminate inappropriate evaluation items and to examine the relationship between evaluation items and product design factors. Product design specifications were then established for verification. The results showed that converting user evaluations into key design verification factors, via a generalized evaluation scale based on product attributes, and applying those design factors in product design can improve users' UD evaluations. The design process of this study is expected to contribute to user-centered UD application.
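
    Quantification Theory Type I is, in essence, a least-squares regression on dummy-coded categorical design factors. The sketch below illustrates only that step, with hypothetical hand-tool factors (grip type, surface finish) and invented user ratings; it is not the study's data or scale.

```python
# Dummy-coded regression in the spirit of Quantification Theory Type I;
# the factors and ratings below are hypothetical.
import numpy as np
import pandas as pd

data = pd.DataFrame({
    "grip":   ["straight", "pistol", "pistol", "straight", "pistol", "straight"],
    "finish": ["smooth", "smooth", "textured", "textured", "textured", "smooth"],
    "score":  [3.1, 4.2, 4.8, 3.6, 4.5, 2.9],    # invented UD evaluation ratings
})

X = pd.get_dummies(data[["grip", "finish"]], drop_first=True).astype(float)
X.insert(0, "intercept", 1.0)
beta, *_ = np.linalg.lstsq(X.values, data["score"].values, rcond=None)
print(dict(zip(X.columns, np.round(beta, 3))))    # category weights on the rating
```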

  15. Process Design Manual for Nitrogen Control.

    ERIC Educational Resources Information Center

    Parker, Denny S.; And Others

    This manual presents theoretical and process design criteria for the implementation of nitrogen control technology in municipal wastewater treatment facilities. Design concepts are emphasized through examination of data from full-scale and pilot installations. Design data are included on biological nitrification and denitrification, breakpoint…

  16. Reinventing The Design Process: Teams and Models

    NASA Technical Reports Server (NTRS)

    Wall, Stephen D.

    1999-01-01

    The future of space mission designing will be dramatically different from the past. Formerly, performance-driven paradigms emphasized data return with cost and schedule being secondary issues. Now and in the future, costs are capped and schedules are fixed; these two variables must be treated as independent in the design process. Accordingly, JPL has redesigned its design process. At the conceptual level, design times have been reduced by properly defining the required design depth, improving the linkages between tools, and managing team dynamics. In implementation-phase design, system requirements will be held in crosscutting models, linked to subsystem design tools through a central database that captures the design and supplies needed configuration management and control. Mission goals will then be captured in timelining software that drives the models, testing their capability to execute the goals. Metrics are used to measure and control both processes and to ensure that design parameters converge through the design process within schedule constraints. This methodology manages margins controlled by acceptable risk levels. Thus, teams can evolve risk tolerance (and cost) as they would any engineering parameter. This new approach allows more design freedom for a longer time, which tends to encourage revolutionary and unexpected improvements in design.

  17. Affective Norms for 4900 Polish Words Reload (ANPW_R): Assessments for Valence, Arousal, Dominance, Origin, Significance, Concreteness, Imageability, and Age of Acquisition.

    PubMed

    Imbir, Kamil K

    2016-01-01

    In studies that combine understanding of emotions and language, there is growing demand for good-quality experimental materials. To meet this expectation, a set of 4905 Polish words was assessed by 400 participants in order to provide a well-established research method for everyone interested in emotional word processing. The Affective Norms for Polish Words Reloaded (ANPW_R) is designed as an extension to the previously introduced ANPW dataset and provides assessments for eight different affective and psycholinguistic measures of Valence, Arousal, Dominance, Origin, Significance, Concreteness, Imageability, and subjective Age of Acquisition. The ANPW_R is now the largest available dataset of affective words for Polish, including affective scores that have not been measured in any other dataset (concreteness and age of acquisition scales). Additionally, the ANPW_R allows for testing hypotheses concerning dual-mind models of emotion and activation (origin and subjective significance scales). Participants in the current study assessed all 4905 words in the list within 1 week, at their own pace in home sessions, using eight different Self-assessment Manikin (SAM) scales. Each measured dimension was evaluated by 25 women and 25 men. The ANPW_R norms appeared to be reliable in split-half estimation and congruent with previous normative studies in Polish. The quadratic relation between valence and arousal was found to be in line with previous findings. In addition, nine other relations appeared to be better described by a quadratic rather than a linear function. The ANPW_R provides well-established research materials for use in psycholinguistic and affective studies in Polish-speaking samples.
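
    Two of the checks mentioned above, split-half reliability and the comparison of linear versus quadratic valence-arousal fits, can be sketched as follows on simulated ratings; none of the numbers are ANPW_R data.

```python
# Split-half reliability and linear-vs-quadratic fit on simulated ratings.
import numpy as np

rng = np.random.default_rng(1)
n_words, n_raters = 500, 50
true_val = rng.uniform(1.0, 9.0, n_words)
ratings = true_val[:, None] + rng.normal(0.0, 1.2, (n_words, n_raters))

half_a = ratings[:, ::2].mean(axis=1)            # odd-numbered raters
half_b = ratings[:, 1::2].mean(axis=1)           # even-numbered raters
r = np.corrcoef(half_a, half_b)[0, 1]
print("split-half reliability (Spearman-Brown):", round(2 * r / (1 + r), 3))

valence = true_val
arousal = 0.25 * (valence - 5.0) ** 2 + 2.0 + rng.normal(0.0, 0.8, n_words)
sse = lambda coef: np.sum((np.polyval(coef, valence) - arousal) ** 2)
print("SSE linear:   ", round(sse(np.polyfit(valence, arousal, 1)), 1))
print("SSE quadratic:", round(sse(np.polyfit(valence, arousal, 2)), 1))
```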

  19. IMPLEMENTING THE SAFEGUARDS-BY-DESIGN PROCESS

    SciTech Connect

    Whitaker, J Michael; McGinnis, Brent; Laughter, Mark D; Morgan, Jim; Bjornard, Trond; Bean, Robert; Durst, Phillip; Hockert, John; DeMuth, Scott; Lockwood, Dunbar

    2010-01-01

    The Safeguards-by-Design (SBD) approach incorporates safeguards into the design and construction of nuclear facilities at the very beginning of the design process. It is a systematic and structured approach for fully integrating international and national safeguards for material control and accountability (MC&A), physical protection, and other proliferation barriers into the design and construction process for nuclear facilities. Implementing SBD is primarily a project management or project coordination challenge. This paper focuses specifically on the design process; the planning, definition, organization, coordination, scheduling and interaction of the safeguards experts and stakeholders as they participate in the design and construction of a nuclear facility. It delineates the steps in a nuclear facility design and construction project in order to provide the project context within which the safeguards design activities take place, describes the involvement of the safeguards experts in the design process, the nature of their analyses, interactions and decisions, and describes the documents created and how they are used. This report highlights the project context of safeguards activities and identifies what the safeguards community (nuclear facility operator, designer/builder, state regulator, SSAC and IAEA) must accomplish in order to implement SBD within the project.

  20. An Integrated Course and Design Project in Chemical Process Design.

    ERIC Educational Resources Information Center

    Rockstraw, David A.; And Others

    1997-01-01

    Describes a chemical engineering course curriculum on process design, analysis, and simulation. Includes information regarding the sequencing of engineering design classes and the location of the classes within the degree program at New Mexico State University. Details of course content are provided. (DDR)

  1. Engineering design: A cognitive process approach

    NASA Astrophysics Data System (ADS)

    Strimel, Greg Joseph

    The intent of this dissertation was to identify the cognitive processes used by advanced pre-engineering students to solve complex engineering design problems. Students in technology and engineering education classrooms are often taught to use an ideal engineering design process that has been generated mostly by educators and curriculum developers. However, the review of literature showed that it is unclear as to how advanced pre-engineering students cognitively navigate solving a complex and multifaceted problem from beginning to end. Additionally, it was unclear how a student thinks and acts throughout their design process and how this affects the viability of their solution. Therefore, Research Objective 1 was to identify the fundamental cognitive processes students use to design, construct, and evaluate operational solutions to engineering design problems. Research Objective 2 was to determine identifiers within student cognitive processes for monitoring aptitude to successfully design, construct, and evaluate technological solutions. Lastly, Research Objective 3 was to create a conceptual technological and engineering problem-solving model integrating student cognitive processes for the improved development of problem-solving abilities. The methodology of this study included multiple forms of data collection. The participants were first given a survey to determine their prior experience with engineering and to provide a description of the subjects being studied. The participants were then presented an engineering design challenge to solve individually. While they completed the challenge, the participants verbalized their thoughts using an established "think aloud" method. These verbalizations were captured along with participant observational recordings using point-of-view camera technology. Additionally, the participant design journals, design artifacts, solution effectiveness data, and teacher evaluations were collected for analysis to help achieve the

  2. Kinetics and Muscle Activity Patterns during Unweighting and Reloading Transition Phases in Running

    PubMed Central

    Sainton, Patrick; Nicol, Caroline; Cabri, Jan; Barthèlemy-Montfort, Joëlle; Chavet, Pascale

    2016-01-01

    Amongst reduced gravity simulators, the lower body positive pressure (LBPP) treadmill is emerging as an innovative tool for both rehabilitation and fundamental research purposes as it allows running while experiencing reduced vertical ground reaction forces. The appropriate use of such a treadmill requires an improved understanding of the associated neuromechanical changes. This study concentrates on the runner’s adjustments to LBPP-induced unweighting and reloading during running. Nine healthy males performed two running series of nine minutes at natural speed. Each series comprised three sequences of three minutes at: 100% bodyweight (BW), 60 or 80% BW, and 100% BW. The progressive unweighting and reloading transitions lasted 10 to 15 s. The LBPP-induced unweighting level, vertical ground reaction force and center of mass accelerations were analyzed together with surface electromyographic activity from 6 major lower limb muscles. The analyses of stride-to-stride adjustments during each transition established highly linear relationships between the LBPP-induced progressive changes of BW and most mechanical parameters. However, the impact peak force and the loading rate systematically presented an initial 10% increase with unweighting which could result from a passive mechanism of leg retraction. Another major insight lies in the distinct neural adjustments found amongst the recorded lower-limb muscles during the pre- and post-contact phases. The preactivation phase was characterized by an overall EMG stability, the braking phase by decreased quadriceps and soleus muscle activities, and the push-off phase by decreased activities of the shank muscles. These neural changes were mirrored during reloading. These neural adjustments can be attributed in part to the lack of visual cues on the foot touchdown. These findings highlight both the rapidity and the complexity of the neuromechanical changes associated with LBPP-induced unweighting and reloading during running

  3. Action potential duration determines sarcoplasmic reticulum Ca2+ reloading in mammalian ventricular myocytes

    PubMed Central

    Bassani, Rosana A; Altamirano, Julio; Puglisi, José L; Bers, Donald M

    2004-01-01

    After sarcoplasmic reticulum (SR) Ca2+ depletion in intact ventricular myocytes, electrical activity promotes SR Ca2+ reloading and recovery of twitch amplitude. In ferret, recovery of twitch and caffeine-induced contracture required fewer twitches than in rabbit or rat. In rat, there was no difference in action potential duration at 90% repolarization (APD90) at steady state (SS) versus at the first post-depletion (PD) twitch. The SS APD90 was similar in ferret and rabbit (but longer than in rat). However, compared to SS, the PD APD90 was lengthened in ferret, but shortened in rabbit. When rabbit myocytes were subjected to AP-clamp patterns during SR Ca2+ reloading (ferret- or rabbit-type APs), reloading was much faster using the ferret AP templates. We conclude that the faster SR Ca2+ refilling in ferret is due to the increased Ca2+ influx during the longer PD AP. The PD versus SS APD90 difference was suppressed by thapsigargin in ferret (indicating Ca2+ dependence). In rabbit, the PD AP shortening depended on the preceding diastolic interval (rather than Ca2+), because rest produced the same AP shortening, and SS APD90 increased as a function of frequency (in contrast to ferret). Transient outward current (Ito) was larger and recovered from inactivation much faster in ferret than in rabbit. Moreover, slow Ito recovery (τ ∼ 3 s) in rabbit was a much larger fraction of Ito. Our data and a computational model (including two Ito components) suggest that in rabbit the slowly recovering Ito is responsible for short post-rest and PD APs, for the unusual frequency dependence of APD90, and ultimately for the slower post-depletion SR Ca2+ reloading. PMID:15243136

  4. Osteocyte-viability-based simulations of trabecular bone loss and recovery in disuse and reloading.

    PubMed

    Wang, Hong; Ji, Baohua; Liu, X Sherry; van Oers, René F M; Guo, X Edward; Huang, Yonggang; Hwang, Keh-Chih

    2014-01-01

    Osteocyte apoptosis is known to trigger targeted bone resorption. In the present study, we developed an osteocyte-viability-based trabecular bone remodeling (OVBR) model. This novel remodeling model, combined with recent advanced simulation methods and analysis techniques, such as the element-by-element 3D finite element method and the ITS technique, was used to quantitatively study the dynamic evolution of bone mass and trabecular microstructure in response to various loading and unloading conditions. Different levels of unloading simulated the disuse condition of bed rest or microgravity in space. The amount of bone loss and microstructural deterioration correlated with the magnitude of unloading. The restoration of bone mass upon reloading was achieved by thickening the remaining trabecular architecture, while the lost trabecular plates and rods could not be recovered by reloading. Compared to previous models, the predictions of bone resorption of the OVBR model are more consistent with physiological values reported from previous experiments. Whereas osteocytes suffer a lack of loading during disuse, they may suffer overloading during the reloading phase, which hampers recovery. The OVBR model is promising for quantitative studies of trabecular bone loss and microstructural deterioration in patients or astronauts during long-term bed rest or space flight, and of the subsequent bone recovery.

  5. Molecular Thermodynamics for Chemical Process Design

    ERIC Educational Resources Information Center

    Prausnitz, J. M.

    1976-01-01

    Discusses that aspect of thermodynamics which is particularly important in chemical process design: the calculation of the equilibrium properties of fluid mixtures, especially as required in phase-separation operations. (MLH)

  6. Chemical Process Design: An Integrated Teaching Approach.

    ERIC Educational Resources Information Center

    Debelak, Kenneth A.; Roth, John A.

    1982-01-01

    Reviews a one-semester senior plant design/laboratory course, focusing on course structure, student projects, laboratory assignments, and course evaluation. Includes discussion of laboratory exercises related to process waste water and sludge. (SK)

  7. The Engineering Process in Construction & Design

    ERIC Educational Resources Information Center

    Stoner, Melissa A.; Stuby, Kristin T.; Szczepanski, Susan

    2013-01-01

    Recent research suggests that high-impact activities in science and math classes promote positive attitudinal shifts in students. By implementing high-impact activities, such as designing a school and a skate park, mathematical thinking can be linked to the engineering design process. This hands-on approach, when possible, to demonstrate or…

  8. HYNOL PROCESS ENGINEERING: PROCESS CONFIGURATION, SITE PLAN, AND EQUIPMENT DESIGN

    EPA Science Inventory

    The report describes the design of the hydropyrolysis reactor system of the Hynol process. (NOTE: A bench scale methanol production facility is being constructed to demonstrate the technical feasibility of producing methanol from biomass using the Hynol process. The plant is bein...

  9. Process characterization and Design Space definition.

    PubMed

    Hakemeyer, Christian; McKnight, Nathan; St John, Rick; Meier, Steven; Trexler-Schmidt, Melody; Kelley, Brian; Zettl, Frank; Puskeiler, Robert; Kleinjans, Annika; Lim, Fred; Wurth, Christine

    2016-09-01

    Quality by design (QbD) is a global regulatory initiative with the goal of enhancing pharmaceutical development through the proactive design of pharmaceutical manufacturing processes and controls to consistently deliver the intended performance of the product. The principles of pharmaceutical development relevant to QbD are described in the ICH guidance documents (ICH Q8-11). An integrated set of risk assessments and their related elements developed at Roche/Genentech were designed to provide an overview of product and process knowledge for the production of a recombinant monoclonal antibody (MAb). This chapter describes the tools used for the characterization and validation of the MAb manufacturing process under the QbD paradigm. This comprises risk assessments for the identification of potential Critical Process Parameters (pCPPs), statistically designed experimental studies, as well as studies assessing the linkage of the unit operations. The outcome of the studies is the classification of process parameters according to their criticality and the definition of appropriate acceptable ranges of operation. The process and product knowledge gained in these studies can lead to the approval of a Design Space. Additionally, the information gained in these studies is used to define the 'impact' which the manufacturing process can have on the variability of the CQAs, which in turn is used to define the testing and monitoring strategy.
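
    A toy version of the statistically designed study step is sketched below: a two-level full factorial over three hypothetical process parameters with a simulated response, where the magnitude of each fitted effect would be used to flag potential critical process parameters. None of the parameters, levels, or numbers come from the chapter.

```python
# Two-level full-factorial screening study with a simulated response;
# parameter names and effect sizes are invented.
import itertools
import numpy as np

rng = np.random.default_rng(2)
design = np.array(list(itertools.product((-1, 1), repeat=3)), dtype=float)  # 2^3 runs
true_effects = np.array([0.9, 0.1, 2.4])          # hidden "truth" used to simulate data
response = 95.0 + design @ true_effects + rng.normal(0.0, 0.3, len(design))

X = np.column_stack([np.ones(len(design)), design])
coef, *_ = np.linalg.lstsq(X, response, rcond=None)
for name, effect in zip(["pH", "temperature", "feed rate"], coef[1:]):
    print(f"{name:12s} effect per coded unit: {effect:+.2f}")  # large |effect| -> pCPP candidate
```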

  10. Chemical kinetics and oil shale process design

    SciTech Connect

    Burnham, A.K.

    1993-07-01

    Oil shale processes are reviewed with the goal of showing how chemical kinetics influences the design and operation of different processes for different types of oil shale. Reaction kinetics are presented for organic pyrolysis, carbon combustion, carbonate decomposition, and sulfur and nitrogen reactions.

  11. Planar Inlet Design and Analysis Process (PINDAP)

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Gruber, Christopher R.

    2005-01-01

    The Planar Inlet Design and Analysis Process (PINDAP) is a collection of software tools that allow the efficient aerodynamic design and analysis of planar (two-dimensional and axisymmetric) inlets. The aerodynamic analysis is performed using the Wind-US computational fluid dynamics (CFD) program. A major element in PINDAP is a Fortran 90 code named PINDAP that can establish the parametric design of the inlet and efficiently model the geometry and generate the grid for CFD analysis with design changes to those parameters. The use of PINDAP is demonstrated for subsonic, supersonic, and hypersonic inlets.

  12. Postseismic Reloading: A Mechanism for Temporal Clustering of Major Earthquakes on Individual Faults

    NASA Astrophysics Data System (ADS)

    Kenner, S. J.; Simons, M.

    2001-12-01

    On a single fault segment, geologic and paleoseismic evidence from locations such as the Basin and Range [Friedrich et al. JGR, submitted] and Dead Sea Transform [Marco et al., JGR, 1996] indicates that the occurrence of major earthquakes in time is often extremely heterogeneous and may, in fact, exhibit temporal clustering. We consider major earthquake clustering as the occurrence of multiple event sequences with intra-cluster inter-event times much shorter than the average time between clusters. Many factors may contribute to temporal clustering of major earthquakes. Over multiple event time scales, time-dependent postseismic stress transfer may play an important role. After major earthquakes, time-varying deformation transients occur. These transients result from diffusion of stress away from zones of stress concentration generated during the coseismic rupture. As a consequence, the coseismic fault is reloaded at a rate that is initially much higher than the background rate derived from far-field plate motions. On a given fault, earthquake recurrence intervals are moderated by various sources of system noise, including stress perturbations due to neighboring earthquakes, crustal heterogeneity, and fault evolution. Depending on the relative timing and magnitude of earthquakes in a sequence, therefore, the postseismic stress available for transfer to the coseismic fault may be greater or less than average. This may lead to a situation in which postseismic stress transfer becomes a significant factor in controlling the time to the next event. To investigate these longer-term postseismic processes, we develop a spring-dashpot-slider model of time-dependent stress transfer in the earth. With this tool, we gain an understanding of how variations in rheology, fault slip-rate, and system noise affect a fault's behavior. In tectonic environments with a weak lower crust/upper mantle, we find that small random variations in the fault failure criteria generate temporally

  13. Design of penicillin fermentation process simulation system

    NASA Astrophysics Data System (ADS)

    Qi, Xiaoyu; Yuan, Zhonghu; Qi, Xiaoxuan; Zhang, Wenqi

    2011-10-01

    Real-time monitoring of batch processes attracts increasing attention, as it can ensure safety and provide products with consistent quality. The design of a simulation system for batch-process fault diagnosis is therefore of great significance. In this paper, penicillin fermentation, a typical non-linear, dynamic, multi-stage batch production process, is taken as the research object. A visual human-machine interactive simulation software system based on the Windows operating system is developed. The simulation system can provide an effective platform for research on batch-process fault diagnosis.
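
    A heavily simplified fed-batch fermentation model of the kind such a simulator is built around is sketched below (Monod growth, constant feed, growth-independent specific production). All parameter values and initial conditions are illustrative assumptions, not the standard penicillin benchmark model.

```python
# Simplified fed-batch fermentation: biomass X, substrate S, product P, volume V.
import numpy as np
from scipy.integrate import solve_ivp

mu_max, Ks, Yxs = 0.11, 0.15, 0.45        # 1/h, g/L, g biomass per g substrate (illustrative)
q_p, F, S_feed = 0.004, 0.05, 400.0       # g product per g biomass per h, L/h, g/L (illustrative)

def rhs(t, y):
    X, S, P, V = y                        # g/L, g/L, g/L, L
    S = max(S, 0.0)                       # guard against tiny negative overshoot
    mu = mu_max * S / (Ks + S)            # Monod specific growth rate
    D = F / V                             # dilution rate from the feed
    dX = mu * X - D * X
    dS = -mu * X / Yxs - q_p * X + D * (S_feed - S)
    dP = q_p * X - D * P
    dV = F
    return [dX, dS, dP, dV]

sol = solve_ivp(rhs, (0.0, 150.0), [0.1, 15.0, 0.0, 100.0], max_step=0.5)
print("final biomass (g/L):   ", round(sol.y[0, -1], 2))
print("final penicillin (g/L):", round(sol.y[2, -1], 2))
```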

  14. Conceptual design of industrial process displays.

    PubMed

    Pedersen, C R; Lind, M

    1999-11-01

    Today, process displays used in industry are often designed on the basis of piping and instrumentation diagrams without any method of ensuring that the needs of the operators are fulfilled. Therefore, a method for a systematic approach to the design of process displays is needed. This paper discusses aspects of process display design taking into account both the designer's and the operator's points of view. Three aspects are emphasized: the operator tasks, the display content and the display form. The distinction between these three aspects is the basis for proposing an outline for a display design method that matches the industrial practice of modular plant design and satisfies the needs of reusability of display design solutions. The main considerations in display design in the industry are to specify the operator's activities in detail, to extract the information the operators need from the plant design specification and documentation, and finally to present this information. The form of the display is selected from existing standardized display elements such as trend curves, mimic diagrams, ecological interfaces, etc. Further knowledge is required to invent new display elements. That is, knowledge about basic visual means of presenting information and how humans perceive and interpret these means and combinations. This knowledge is required in the systematic selection of graphical items for a given display content. The industrial part of the method is first illustrated in the paper by a simple example from a plant with batch processes. Later the method is applied to develop a supervisory display for a condenser system in a nuclear power plant. The differences between the continuous plant domain of power production and the batch processes from the example are analysed and broad categories of display types are proposed. The problems involved in specification and invention of a supervisory display are analysed and conclusions from these problems are made. It is

  15. Process design for Al backside contacts

    SciTech Connect

    Chalfoun, L.L.; Kimerling, L.C.

    1995-08-01

    It is known that properly alloyed aluminum backside contacts can improve silicon solar cell efficiency. To use this knowledge to the fullest advantage, we have studied the gettering process that occurs during contact formation and the microstructure of the contact and backside junction region. With an understanding of the alloying step, optimized fabrication processes can be designed. To study gettering, single crystal silicon wafers were coated with aluminum on both sides and subjected to heat treatments. Results are described.

  16. Flexible Processing and the Design of Grammar

    ERIC Educational Resources Information Center

    Sag, Ivan A.; Wasow, Thomas

    2015-01-01

    We explore the consequences of letting the incremental and integrative nature of language processing inform the design of competence grammar. What emerges is a view of grammar as a system of local monotonic constraints that provide a direct characterization of the signs (the form-meaning correspondences) of a given language. This…

  17. Molecular thermodynamics for chemical process design.

    PubMed

    Prausnitz, J M

    1979-08-24

    Chemical process design requires quantitative information on the equilibrium properties of a variety of fluid mixtures. Since the experimental effort needed to provide this information is often prohibitive in cost and time, chemical engineers must utilize rational estimation techniques based on limited experimental data. The basis for such techniques is molecular thermodynamics, a synthesis of classical and statistical thermodynamics, molecular physics, and physical chemistry.
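
    A small worked example of the kind of estimate meant here is an ideal-solution (Raoult's law) bubble-point calculation for a benzene/toluene mixture; the Antoine constants are standard textbook values (pressure in mmHg, temperature in degrees C), while the composition and temperature are arbitrary choices for illustration.

```python
# Raoult's-law bubble-point estimate for a benzene/toluene mixture.
# Antoine form: log10(Psat [mmHg]) = A - B / (T [degC] + C); textbook constants.
antoine = {
    "benzene": (6.90565, 1211.033, 220.790),
    "toluene": (6.95464, 1344.800, 219.482),
}

def p_sat(component, T_c):
    A, B, C = antoine[component]
    return 10.0 ** (A - B / (T_c + C))

x = {"benzene": 0.4, "toluene": 0.6}       # liquid mole fractions (arbitrary)
T_c = 90.0                                 # temperature in degrees C (arbitrary)

P_bubble = sum(xi * p_sat(i, T_c) for i, xi in x.items())
y = {i: round(xi * p_sat(i, T_c) / P_bubble, 3) for i, xi in x.items()}
print(f"bubble pressure at {T_c} C: {P_bubble:.0f} mmHg")
print("vapour composition:", y)
```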

  18. An Exploration of Design Students' Inspiration Process

    ERIC Educational Resources Information Center

    Dazkir, Sibel S.; Mower, Jennifer M.; Reddy-Best, Kelly L.; Pedersen, Elaine L.

    2013-01-01

    Our purpose was to explore how different sources of inspiration influenced two groups of students' inspiration process and their attitudes toward their design projects. Assigned sources of inspiration and instructor's assistance in the search for inspiration varied for two groups of students completing a small culture inspired product design…

  19. Interface design in the process industries

    NASA Technical Reports Server (NTRS)

    Beaverstock, M. C.; Stassen, H. G.; Williamson, R. A.

    1977-01-01

    Every operator runs his plant in accord with his own mental model of the process. In this sense, one characteristic of an ideal man-machine interface is that it be in harmony with that model. With this theme in mind, the paper first reviews the functions of the process operator and compares them with human operators involved in control situations previously studied outside the industrial environment (pilots, air traffic controllers, helmsmen, etc.). A brief history of the operator interface in the process industry and the traditional methodology employed in its design is then presented. Finally, a much more fundamental approach utilizing a model definition of the human operator's behavior is presented.

  20. A reload and startup plan for conversion of the NIST research reactor

    SciTech Connect

    D. J. Diamond

    2016-03-31

    The National Institute of Standards and Technology operates a 20 MW research reactor for neutron-based research. The heavy-water moderated and cooled reactor is fueled with high-enriched uranium (HEU) but a program to convert the reactor to low-enriched uranium (LEU) fuel is underway. Among other requirements, a reload and startup test plan must be submitted to the U.S. Nuclear Regulatory Commission (NRC) for their approval. The NRC provides guidance for what should be in the plan to ensure that the licensee has sufficient information to operate the reactor safely. Hence, a plan has been generated consisting of two parts. The reload portion of the plan specifies the fuel management whereby initially only two LEU fuel elements are in the core for eight fuel cycles. This is repeated until a point when the optimum approach is to place four fresh LEU elements into the reactor each cycle. This final transition is repeated and after eight cycles the reactor is completely fueled with LEU. By only adding two LEU fuel elements initially, the plan allows for the consumption of HEU fuel elements that are expected to be in storage at the time of conversion and provides additional qualification of production LEU fuel under actual operating conditions. Because the reload is to take place over many fuel cycles, startup tests will be done at different stages of the conversion. The tests, to be compared with calculations to show that the reactor will operate as planned, are the measurement of critical shim arm position and shim arm and regulating rod reactivity worths. An acceptance criterion for each test is specified based on technical specifications that relate to safe operation. Additional tests are being considered that have less safety significance but may be of interest to bolster the validation of analysis tools.
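
    The staged fuel management described above can be illustrated with a toy cycle-by-cycle tally. The core size, the number of cycles in each phase, and the oldest-first discharge rule below are invented for illustration and are not the NIST reactor's actual loading pattern.

```python
# Toy tally of a staged HEU-to-LEU transition: two fresh LEU elements per cycle
# at first, then four per cycle. Core size and cycle counts are illustrative.
CORE_SIZE = 30
core = ["HEU"] * CORE_SIZE                 # hypothetical starting core, all HEU

def reload(core, n_fresh):
    # Discharge the oldest elements (front of the list) and insert fresh LEU.
    return core[n_fresh:] + ["LEU"] * n_fresh

for cycle in range(1, 16):
    n_fresh = 2 if cycle <= 8 else 4       # two-element phase, then four-element phase
    core = reload(core, n_fresh)
    print(f"cycle {cycle:2d}: {core.count('LEU'):2d} LEU of {CORE_SIZE} elements")
```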

  1. BWR Reload Strategy Based on Fixing Once-Burnt Fuel Between Cycles

    SciTech Connect

    Maag, Elizebeth M.; Knott, Dave

    2001-12-15

    The feasibility of a reload strategy based on fixing the locations of once-burnt fuel between cycles has been evaluated for the Perry nuclear power plant (Perry). This strategy can reduce refueling shuffle critical path time by 3 days without penalty in fuel cycle economics. The scheme works well for Perry because of the extreme cycle energy requirements and the large feed batch size needed to meet those requirements. Cores requiring less energy and a smaller feed batch size have not been investigated.

  2. DESIGNING ENVIRONMENTAL, ECONOMIC AND ENERGY EFFICIENT CHEMICAL PROCESSES

    EPA Science Inventory

    The design and improvement of chemical processes can be very challenging. The earlier energy conservation, process economics and environmental aspects are incorporated into the process development, the easier and less expensive it is to alter the process design. Process emissio...

  3. Design of forging process variables under uncertainties

    NASA Astrophysics Data System (ADS)

    Repalle, Jalaja; Grandhi, Ramana V.

    2005-02-01

    Forging is a complex nonlinear process that is vulnerable to various manufacturing anomalies, such as variations in billet geometry, billet/die temperatures, material properties, and workpiece and forging equipment positional errors. A combination of these uncertainties could induce heavy manufacturing losses through premature die failure, final part geometric distortion, and reduced productivity. Identifying, quantifying, and controlling the uncertainties will reduce variability risk in a manufacturing environment, which will minimize the overall production cost. In this article, various uncertainties that affect the forging process are identified, and their cumulative effect on the forging tool life is evaluated. Because the forging process simulation is time-consuming, a response surface model is used to reduce computation time by establishing a relationship between the process performance and the critical process variables. A robust design methodology is developed by incorporating reliability-based optimization techniques to obtain sound forging components. A case study of an automotive-component forging-process design is presented to demonstrate the applicability of the method.
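
    A compact sketch of the surrogate-plus-uncertainty idea follows: a quadratic response surface is fitted to a handful of "simulation" runs in coded variables, and the input scatter is then Monte Carlo sampled through the surrogate to estimate a failure probability. The stand-in stress function, tolerances, and stress limit are all assumptions, not values from the article.

```python
# Response-surface surrogate plus Monte Carlo reliability estimate (illustrative).
import numpy as np

rng = np.random.default_rng(3)

def forging_sim(t, s):
    # Stand-in for an expensive forging simulation: die stress (MPa) as a
    # function of coded billet temperature t and press speed s (invented).
    return 475.0 - 35.0 * t + 20.0 * s + 6.0 * t * s + 4.0 * t * t + rng.normal(0.0, 2.0)

# Three-level full-factorial DOE in coded units (-1, 0, +1).
pts = np.array([(t, s) for t in (-1.0, 0.0, 1.0) for s in (-1.0, 0.0, 1.0)])
y = np.array([forging_sim(t, s) for t, s in pts])

# Quadratic response surface: 1, t, s, t^2, s^2, t*s.
X = np.column_stack([np.ones(len(pts)), pts[:, 0], pts[:, 1],
                     pts[:, 0] ** 2, pts[:, 1] ** 2, pts[:, 0] * pts[:, 1]])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# Monte Carlo the uncertain inputs around the nominal point (0, 0).
t_mc = rng.normal(0.0, 0.3, 100_000)
s_mc = rng.normal(0.0, 0.3, 100_000)
stress = (coef[0] + coef[1] * t_mc + coef[2] * s_mc
          + coef[3] * t_mc ** 2 + coef[4] * s_mc ** 2 + coef[5] * t_mc * s_mc)
print("estimated P(die stress > 510 MPa):", np.mean(stress > 510.0))
```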

  4. Design of intelligent controllers for exothermal processes

    NASA Astrophysics Data System (ADS)

    Nagarajan, Ramachandran; Yaacob, Sazali

    2001-10-01

    Chemical industries, such as resin or soap manufacturing, operate reaction systems that work with at least two chemicals. Mixing of chemicals, even at room temperature, can initiate an exothermic reaction, which produces a sudden increase of heat energy within the mixture. The quantity of heat and the dynamics of heat generation are unknown, unpredictable, and time varying. Proper control of this heat has to be accomplished in order to achieve a high quality of product. Uncontrolled or poorly controlled heat yields an unusable product, may damage materials and systems, and can even harm people. The heat due to an exothermic reaction cannot be controlled using conventional methods such as PID control or identification-based control, since these require at least an approximate mathematical model of the exothermic process, and modeling such a process is yet to be properly conceived. This paper discusses a design methodology for controlling such a process. A pilot plant of a reaction system has been constructed and utilized for designing and incorporating the proposed fuzzy-logic-based intelligent controller. Both a conventional and an adaptive form of fuzzy logic control were used in testing the performance. The test results confirm the effectiveness of the controllers in controlling exothermic heat.

  5. The design of a nanolithographic process

    NASA Astrophysics Data System (ADS)

    Johannes, Matthew Steven

    This research delineates the design of a nanolithographic process for nanometer scale surface patterning. The process involves the combination of serial atomic force microscope (AFM) based nanolithography with the parallel patterning capabilities of soft lithography. The union of these two techniques provides for a unique approach to nanoscale patterning that establishes a research knowledge base and tools for future research and prototyping. To successfully design this process a number of separate research investigations were undertaken. A custom 3-axis AFM with feedback control on three positioning axes of nanometer precision was designed in order to execute nanolithographic research. This AFM system integrates a computer aided design/computer aided manufacturing (CAD/CAM) environment to allow for the direct synthesis of nanostructures and patterns using a virtual design interface. This AFM instrument was leveraged primarily to study anodization nanolithography (ANL), a nanoscale patterning technique used to generate local surface oxide layers on metals and semiconductors. Defining research focused on the automated generation of complex oxide nanoscale patterns as directed by CAD/CAM design as well as the implementation of tip-sample current feedback control during ANL to increase oxide uniformity. Concurrently, research was conducted concerning soft lithography, primarily in microcontact printing (muCP), and pertinent experimental and analytic techniques and procedures were investigated. Due to the masking abilities of the resulting oxide patterns from ANL, the results of AFM based patterning experiments are coupled with micromachining techniques to create higher aspect ratio structures at the nanoscale. These relief structures are used as master pattern molds for polymeric stamp formation to reproduce the original in a parallel fashion using muCP stamp formation and patterning. This new method of master fabrication provides for a useful alternative to

  6. A survey of the Oyster Creek reload licensing model

    SciTech Connect

    Alammar, M.A. )

    1991-01-01

    The Oyster Creek RETRAN licensing model was submitted for approval by the U.S. Nuclear Regulatory Commission in September 1987. This paper discusses the technical issues and concerns that were raised during the review process and how they were resolved. The technical issues are grouped into three major categories: the adequacy of the model benchmark against plant data; uncertainty analysis and model convergence with respect to various critical parameters (code correlations, nodalization, time step, etc.); and model application and usage.

  7. Composting process design criteria. II. Detention time

    SciTech Connect

    Haug, R.T.

    1986-09-01

    Attention has always been directed to detention time as a criterion for design and operation of composting systems. Perhaps this is a logical outgrowth of work on liquid phase systems, where detention time is a fundamental parameter of design. Unlike liquid phase systems, however, the interpretation of detention time and the actual values required for design have not been universally accepted in the case of composting. As a case in point, most compost systems incorporate facilities for curing the compost product. However, curing is often considered after the fact or as an add-on with little relationship to the first-stage, high-rate phase, whether reactor (in-vessel), static pile, or windrow. Design criteria for curing and the relationships between the first-stage, high-rate and second-stage, curing phases of a composting system have been unclear. In Part 2 of this paper, the concepts of hydraulic retention time (HRT) and solids residence time (SRT) are applied to the composting process. Definitions and design criteria for each are proposed. Based on these criteria, the first and second stages can be designed and integrated into a complete composting system.
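
    As a worked illustration of the two criteria: HRT is the vessel volume divided by the volumetric feed rate, while SRT is the solids inventory held in the whole system (reactor plus curing) divided by the solids leaving per day. The numbers below are invented solely to show the arithmetic.

```python
# Worked example of HRT and SRT for a two-stage composting system (made-up numbers).
reactor_volume_m3 = 600.0     # active composting vessel volume
feed_rate_m3_per_d = 40.0     # mixed feedstock flow into the vessel
hrt_days = reactor_volume_m3 / feed_rate_m3_per_d       # hydraulic retention time

solids_in_system_t = 450.0    # dry solids held in reactor plus curing pile (tonnes)
solids_wasted_t_d = 15.0      # dry solids leaving as finished compost per day (tonnes/d)
srt_days = solids_in_system_t / solids_wasted_t_d       # solids residence time

print(f"HRT = {hrt_days:.1f} d (first-stage vessel)")
print(f"SRT = {srt_days:.1f} d (reactor plus curing, total system)")
```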

  8. Mimicry of natural material designs and processes

    SciTech Connect

    Bond, G.M.; Richman, R.H.; McNaughton, W.P.

    1995-06-01

    Biological structural materials, although composed of unremarkable substances synthesized at low temperatures, often exhibit superior mechanical properties. In particular, the quality in which nearly all biologically derived materials excel is toughness. The advantageous mechanical properties are attributable to the hierarchical, composite, structural arrangements common to biological systems. Materials scientists and engineers have increasingly recognized that biological designs or processing approaches applied to man-made materials (biomimesis) may offer improvements in performance over conventional designs and fabrication methods. In this survey, the structures and processing routes of marine shells, avian eggshells, wood, bone, and insect cuticle are briefly reviewed, and biomimesis research inspired by these materials is discussed. In addition, this paper describes and summarizes the applications of biomineralization, self-assembly, and templating with proteins to the fabrication of thin ceramic films and nanostructure devices.

  9. Proxima Centauri reloaded: Unravelling the stellar noise in radial velocities

    NASA Astrophysics Data System (ADS)

    Damasso, M.; Del Sordo, F.

    2017-03-01

    Context. The detection and characterisation of Earth-like planets with Doppler signals of the order of 1 m s-1 currently represent one of the greatest challenges for extrasolar-planet hunters. As results for such findings are often controversial, it is desirable to provide independent confirmations of the discoveries. Testing different models for the suppression of non-Keplerian stellar signals usually plaguing radial velocity data is essential to ensuring findings are robust and reproducible. Aims: Using an alternative treatment of the stellar noise to that discussed in the discovery paper, we re-analyse the radial velocity dataset that led to the detection of a candidate terrestrial planet orbiting the star Proxima Centauri. We aim to confirm the existence of this outstanding planet, and test the existence of a second planetary signal. Methods: Our technique jointly modelled Keplerian signals and residual correlated signals in radial velocities using Gaussian processes. We analysed only radial velocity measurements without including other ancillary data in the fitting procedure. In a second step, we have compared our outputs with results coming from photometry, to provide a consistent physical interpretation. Our analysis was performed in a Bayesian framework to quantify the robustness of our findings. Results: We show that the correlated noise can be successfully modelled as a Gaussian process regression, and contains a periodic term modulated on the stellar rotation period and characterised by an evolutionary timescale of the order of one year. Both findings appear to be robust when compared with results obtained from archival photometry, thus providing a reliable description of the noise properties. We confirm the existence of a coherent signal described by a Keplerian orbit equation that can be attributed to the planet Proxima b, and provide an independent estimate of the planetary parameters. Our Bayesian analysis dismisses the existence of a second planetary
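
    The correlated-noise idea can be sketched with an off-the-shelf Gaussian process: a periodic kernel (rotation term) multiplied by a squared-exponential decay (evolutionary timescale) plus a white-noise term, fitted here to synthetic radial-velocity residuals. The 83-day period, one-year decay, and every data value are placeholders, not the published Proxima Centauri analysis.

```python
# Quasi-periodic Gaussian-process fit to synthetic radial-velocity residuals.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ExpSineSquared, WhiteKernel

rng = np.random.default_rng(4)
t = np.sort(rng.uniform(0.0, 600.0, 80))[:, None]              # observation epochs (days)
activity = 1.5 * np.sin(2.0 * np.pi * t.ravel() / 83.0) * np.exp(-t.ravel() / 365.0)
rv = activity + rng.normal(0.0, 0.4, t.shape[0])               # synthetic residuals (m/s)

kernel = (ExpSineSquared(length_scale=1.0, periodicity=83.0)   # stellar-rotation term
          * RBF(length_scale=365.0)                            # ~1 yr evolutionary timescale
          + WhiteKernel(noise_level=0.2))                      # uncorrelated jitter
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(t, rv)
print(gp.kernel_)                                              # optimized hyperparameters
```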

  10. Chip Design Process Optimization Based on Design Quality Assessment

    NASA Astrophysics Data System (ADS)

    Häusler, Stefan; Blaschke, Jana; Sebeke, Christian; Rosenstiel, Wolfgang; Hahn, Axel

    2010-06-01

    Nowadays, the managing of product development projects is increasingly challenging. Especially the IC design of ASICs with both analog and digital components (mixed-signal design) is becoming more and more complex, while the time-to-market window narrows at the same time. Still, high quality standards must be fulfilled. Projects and their status are becoming less transparent due to this complexity. This makes the planning and execution of projects rather difficult. Therefore, there is a need for efficient project control. A main challenge is the objective evaluation of the current development status. Are all requirements successfully verified? Are all intermediate goals achieved? Companies often develop special solutions that are not reusable in other projects. This makes the quality measurement process itself less efficient and produces too much overhead. The method proposed in this paper is a contribution to solve these issues. It is applied at a German design house for analog mixed-signal IC design. This paper presents the results of a case study and introduces an optimized project scheduling on the basis of quality assessment results.

  11. Moral judgment reloaded: a moral dilemma validation study.

    PubMed

    Christensen, Julia F; Flexas, Albert; Calabrese, Margareta; Gut, Nadine K; Gomila, Antoni

    2014-01-01

    We propose a revised set of moral dilemmas for studies on moral judgment. We selected a total of 46 moral dilemmas available in the literature and fine-tuned them in terms of four conceptual factors (Personal Force, Benefit Recipient, Evitability, and Intention) and methodological aspects of the dilemma formulation (word count, expression style, question formats) that have been shown to influence moral judgment. Second, we obtained normative codings of arousal and valence for each dilemma showing that emotional arousal in response to moral dilemmas depends crucially on the factors Personal Force, Benefit Recipient, and Intentionality. Third, we validated the dilemma set confirming that people's moral judgment is sensitive to all four conceptual factors, and to their interactions. Results are discussed in the context of this field of research, outlining also the relevance of our RT effects for the Dual Process account of moral judgment. Finally, we suggest tentative theoretical avenues for future testing, particularly stressing the importance of the factor Intentionality in moral judgment. Additionally, due to the importance of cross-cultural studies in the quest for universals in human moral cognition, we provide the new set dilemmas in six languages (English, French, German, Spanish, Catalan, and Danish). The norming values provided here refer to the Spanish dilemma set.

  13. Computer-aided software development process design

    NASA Technical Reports Server (NTRS)

    Lin, Chi Y.; Levary, Reuven R.

    1989-01-01

    The authors describe an intelligent tool designed to aid managers of software development projects in planning, managing, and controlling the development process of medium- to large-scale software projects. Its purpose is to reduce uncertainties in the budget, personnel, and schedule planning of software development projects. It is based on a dynamic model of the software development and maintenance life-cycle process. This dynamic process is composed of a number of time-varying, interacting developmental phases, each characterized by its intended functions and requirements. System dynamics is used as a modeling methodology. The resulting Software Life-Cycle Simulator (SLICS) and the hybrid expert simulation system of which it is a subsystem are described.

  14. SETI reloaded: Next generation radio telescopes, transients and cognitive computing

    NASA Astrophysics Data System (ADS)

    Garrett, Michael A.

    2015-08-01

    The Search for Extra-terrestrial Intelligence (SETI) using radio telescopes is an area of research that is now more than 50 years old. Thus far, both targeted and wide-area surveys have yet to detect artificial signals from intelligent civilisations. In this paper, I argue that the incidence of co-existing intelligent and communicating civilisations is probably small in the Milky Way. While this makes successful SETI searches a very difficult pursuit indeed, the huge impact of even a single detection requires us to continue the search. A substantial increase in the overall performance of radio telescopes (and in particular future wide-field instruments such as the Square Kilometre Array - SKA), provide renewed optimism in the field. Evidence for this is already to be seen in the success of SETI researchers in acquiring observations on some of the world's most sensitive radio telescope facilities via open, peer-reviewed processes. The increasing interest in the dynamic radio sky, and our ability to detect new and rapid transient phenomena such as Fast Radio Bursts (FRB) is also greatly encouraging. While the nature of FRBs is not yet fully understood, I argue they are unlikely to be the signature of distant extra-terrestrial civilisations. As astronomers face a data avalanche on all sides, advances made in related areas such as advanced Big Data analytics, and cognitive computing are crucial to enable serendipitous discoveries to be made. In any case, as the era of the SKA fast approaches, the prospects of a SETI detection have never been better.

  15. Reliability Methods for Shield Design Process

    NASA Technical Reports Server (NTRS)

    Tripathi, R. K.; Wilson, J. W.

    2002-01-01

    Providing protection against the hazards of space radiation is a major challenge to the exploration and development of space. The great cost of added radiation shielding is a potential limiting factor in deep space operations. In this enabling technology, we have developed methods for optimized shield design over multi-segmented missions involving multiple work and living areas in the transport and duty phase of space missions. The total shield mass over all pieces of equipment and habitats is optimized subject to career dose and dose rate constraints. An important component of this technology is the estimation of two most commonly identified uncertainties in radiation shield design, the shielding properties of materials used and the understanding of the biological response of the astronaut to the radiation leaking through the materials into the living space. The largest uncertainty, of course, is in the biological response to especially high charge and energy (HZE) ions of the galactic cosmic rays. These uncertainties are blended with the optimization design procedure to formulate reliability-based methods for shield design processes. The details of the methods will be discussed.
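
    The multi-segment optimization described above can be sketched as a small constrained problem: minimize total shield mass over a few occupied segments subject to a career-dose cap, using a single-exponential attenuation model. The areas, occupancy fractions, attenuation coefficient, and dose numbers are illustrative assumptions, not mission or materials data.

```python
# Constrained shield-mass minimization with an assumed exponential dose model.
import numpy as np
from scipy.optimize import minimize

areas = np.array([40.0, 25.0, 10.0])       # shielded wall area per segment (m^2), invented
occupancy = np.array([0.5, 0.3, 0.2])      # fraction of mission time in each segment
dose_unshielded = 0.8                      # mission dose with no shielding (Sv), invented
mu = 0.08                                  # effective attenuation per g/cm^2, invented
dose_limit = 0.25                          # career dose constraint (Sv), invented

def total_mass(x):                         # x = areal density per segment (g/cm^2)
    return float(np.sum(areas * x) * 10.0) # 1 g/cm^2 over 1 m^2 is 10 kg

def mission_dose(x):
    return dose_unshielded * float(np.sum(occupancy * np.exp(-mu * x)))

res = minimize(total_mass, x0=np.full(3, 5.0), method="SLSQP",
               bounds=[(0.0, 50.0)] * 3,
               constraints=[{"type": "ineq", "fun": lambda x: dose_limit - mission_dose(x)}])
print("areal densities (g/cm^2):", np.round(res.x, 2))
print("shield mass (kg):", round(total_mass(res.x), 1), " dose (Sv):", round(mission_dose(res.x), 3))
```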

  16. Saving Material with Systematic Process Designs

    NASA Astrophysics Data System (ADS)

    Kerausch, M.

    2011-08-01

    Global competition is forcing the stamping industry to further increase quality, to shorten time-to-market and to reduce total cost. Continuous balancing between these classical time-cost-quality targets throughout the product development cycle is required to ensure future economic success. In today's industrial practice, die layout standards are typically assumed to implicitly ensure the balancing of company-specific time-cost-quality targets. Although die layout standards are a very successful approach, there are two methodical disadvantages. First, the capabilities for tool design have to be continuously adapted to technological innovations, e.g. to take advantage of the full forming capability of new materials. Secondly, the great variety of die design aspects has to be reduced to a generic rule or guideline, e.g. binder shape, draw-in conditions or the use of drawbeads. Therefore, it is important not to overlook cost or quality opportunities when applying die design standards. This paper describes a systematic workflow with a focus on minimizing material consumption. The starting point of the investigation is a full process plan for a typical structural part, for which all requirements defined by a predefined set of die design standards with industrial relevance are fulfilled. In a first step, binder and addendum geometry are systematically checked for material saving potentials. In a second step, blank shape and draw-in are adjusted to meet thinning, wrinkling and springback targets for a minimum blank solution. Finally, the identified die layout is validated with respect to production robustness versus splits, wrinkles and springback. For all three steps the applied methodology is based on finite element simulation combined with a stochastic variation of input variables. With the proposed workflow a well-balanced (time-cost-quality) production process assuring minimal material consumption can be achieved.
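
    The finite element solver itself cannot be reproduced here. As a minimal, hedged sketch of only the final stochastic-variation step, the Python example below scatters the noise variables around nominal values and checks assumed thinning and wrinkling limits against a made-up surrogate that stands in for the FE forming simulation.

        # Illustrative robustness check around a forming simulation. fe_surrogate()
        # is a made-up stand-in for a real FE run; all distributions and limits
        # are assumptions, not values from the paper.
        import numpy as np

        rng = np.random.default_rng(0)

        def fe_surrogate(blank_holder_force, friction, blank_width):
            thinning = 10.0 + 0.015 * blank_holder_force + 25.0 * friction      # %, toy model
            wrinkling = 5.0 - 0.01 * blank_holder_force + 0.0015 * blank_width  # mm, toy model
            return thinning, wrinkling

        n = 2000
        bhf = rng.normal(400.0, 25.0, n)      # blank holder force, kN
        mu = rng.normal(0.12, 0.015, n)       # friction coefficient
        width = rng.normal(900.0, 5.0, n)     # blank width, mm

        thin, wrink = np.vectorize(fe_surrogate)(bhf, mu, width)
        ok = (thin < 20.0) & (wrink < 2.5)    # assumed acceptance limits
        print(f"fraction of robust process realizations: {ok.mean():.1%}")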

  17. Detrimental effects of reloading recovery on force, shortening velocity, and power of soleus muscles from hindlimb-unloaded rats.

    PubMed

    Widrick, J J; Maddalozzo, G F; Hu, H; Herron, J C; Iwaniec, U T; Turner, R T

    2008-11-01

    To better understand how atrophied muscles recover from prolonged nonweight-bearing, we studied soleus muscles (in vitro at optimal length) from female rats subjected to normal weight bearing (WB), 15 days of hindlimb unloading (HU), or 15 days HU followed by 9 days of weight bearing reloading (HU-R). HU reduced peak tetanic force (P(o)), increased maximal shortening velocity (V(max)), and lowered peak power/muscle volume. Nine days of reloading failed to improve P(o), while depressing V(max) and intrinsic power below WB levels. These functional changes appeared intracellular in origin as HU-induced reductions in soleus mass, fiber cross-sectional area, and physiological cross-sectional area were partially or completely restored by reloading. We calculated that HU-induced reductions in soleus fiber length were of sufficient magnitude to overextend sarcomeres onto the descending limb of their length-tension relationship upon the resumption of WB activity. In conclusion, the force, shortening velocity, and power deficits observed after 9 days of reloading are consistent with contraction-induced damage to the soleus. HU-induced reductions in fiber length indicate that sarcomere hyperextension upon the resumption of weight-bearing activity may be an important mechanism underlying this response.

  18. Flexible processing and the design of grammar.

    PubMed

    Sag, Ivan A; Wasow, Thomas

    2015-02-01

    We explore the consequences of letting the incremental and integrative nature of language processing inform the design of competence grammar. What emerges is a view of grammar as a system of local monotonic constraints that provide a direct characterization of the signs (the form-meaning correspondences) of a given language. This "sign-based" conception of grammar has provided precise solutions to the key problems long thought to motivate movement-based analyses, has supported three decades of computational research developing large-scale grammar implementations, and is now beginning to play a role in computational psycholinguistics research that explores the use of underspecification in the incremental computation of partial meanings.

  19. KPZ Reloaded

    NASA Astrophysics Data System (ADS)

    Gubinelli, Massimiliano; Perkowski, Nicolas

    2017-01-01

    We analyze the one-dimensional periodic Kardar-Parisi-Zhang equation in the language of paracontrolled distributions, giving an alternative viewpoint on the seminal results of Hairer. Apart from deriving a basic existence and uniqueness result for paracontrolled solutions to the KPZ equation we perform a thorough study of some related problems. We rigorously prove the links between the KPZ equation, stochastic Burgers equation, and (linear) stochastic heat equation and also the existence of solutions starting from quite irregular initial conditions. We also show that there is a natural approximation scheme for the nonlinearity in the stochastic Burgers equation. Interpreting the KPZ equation as the value function of an optimal control problem, we give a pathwise proof for the global existence of solutions and thus for the strict positivity of solutions to the stochastic heat equation. Moreover, we study Sasamoto-Spohn type discretizations of the stochastic Burgers equation and show that their limit solves the continuous Burgers equation possibly with an additional linear transport term. As an application, we give a proof of the invariance of the white noise for the stochastic Burgers equation that does not rely on the Cole-Hopf transform.
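
    For orientation, the standard (1+1)-dimensional forms of the three equations named above, and the formal Cole-Hopf change of variables linking them, can be written in LaTeX notation as follows; these are the textbook statements, with the renormalisation subtleties that the paper treats rigorously suppressed:

        \partial_t h = \nu\,\partial_x^2 h + \tfrac{\lambda}{2}(\partial_x h)^2 + \xi
          \qquad \text{(KPZ equation for the height } h\text{)}

        u = \partial_x h: \qquad
        \partial_t u = \nu\,\partial_x^2 u + \tfrac{\lambda}{2}\,\partial_x(u^2) + \partial_x \xi
          \qquad \text{(stochastic Burgers equation)}

        w = e^{\frac{\lambda}{2\nu} h}: \qquad
        \partial_t w = \nu\,\partial_x^2 w + \tfrac{\lambda}{2\nu}\, w\,\xi
          \qquad \text{(linear stochastic heat equation, via Cole-Hopf)}

    Here \xi denotes space-time white noise; the paper's contribution is to give these formal identities, and the approximation and discretization statements quoted above, a rigorous paracontrolled meaning.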

  20. Designer cell signal processing circuits for biotechnology.

    PubMed

    Bradley, Robert W; Wang, Baojun

    2015-12-25

    Microorganisms are able to respond effectively to diverse signals from their environment and internal metabolism owing to their inherent sophisticated information processing capacity. A central aim of synthetic biology is to control and reprogramme the signal processing pathways within living cells so as to realise repurposed, beneficial applications ranging from disease diagnosis and environmental sensing to chemical bioproduction. To date most examples of synthetic biological signal processing have been built based on digital information flow, though analogue computing is being developed to cope with more complex operations and larger sets of variables. Great progress has been made in expanding the categories of characterised biological components that can be used for cellular signal manipulation, thereby allowing synthetic biologists to more rationally programme increasingly complex behaviours into living cells. Here we present a current overview of the components and strategies that exist for designer cell signal processing and decision making, discuss how these have been implemented in prototype systems for therapeutic, environmental, and industrial biotechnological applications, and examine emerging challenges in this promising field.

  1. Designer cell signal processing circuits for biotechnology

    PubMed Central

    Bradley, Robert W.; Wang, Baojun

    2015-01-01

    Microorganisms are able to respond effectively to diverse signals from their environment and internal metabolism owing to their inherent sophisticated information processing capacity. A central aim of synthetic biology is to control and reprogramme the signal processing pathways within living cells so as to realise repurposed, beneficial applications ranging from disease diagnosis and environmental sensing to chemical bioproduction. To date most examples of synthetic biological signal processing have been built based on digital information flow, though analogue computing is being developed to cope with more complex operations and larger sets of variables. Great progress has been made in expanding the categories of characterised biological components that can be used for cellular signal manipulation, thereby allowing synthetic biologists to more rationally programme increasingly complex behaviours into living cells. Here we present a current overview of the components and strategies that exist for designer cell signal processing and decision making, discuss how these have been implemented in prototype systems for therapeutic, environmental, and industrial biotechnological applications, and examine emerging challenges in this promising field. PMID:25579192

  2. Short- and Long-Term Hindlimb Immobilization and Reloading: Profile of Epigenetic Events in Gastrocnemius.

    PubMed

    Chacon-Cabrera, Alba; Gea, Joaquim; Barreiro, Esther

    2017-06-01

    Skeletal muscle dysfunction and atrophy are characteristic features accompanying chronic conditions. Epigenetic events regulate muscle mass and function maintenance. We hypothesized that the pattern of epigenetic events (muscle-enriched microRNAs and histone acetylation) and acetylation of transcription factors known to signal muscle wasting may differ between early- and late-time points in skeletal muscles of mice exposed to hindlimb immobilization (I) and recovery following I. Body and muscle weights, grip strength, muscle-enriched microRNAs, histone deacetylases (HDACs), acetylation of proteins, histones, and transcription factors (TF), myogenic TF factors, and muscle phenotype were assessed in gastrocnemius of mice exposed to periods (1, 2, 3, 7, 15, and 30 days, I groups) of hindlimb immobilization, and in those exposed to reloading for different periods of time (1, 3, 7, 15, and 30 days, R groups) following 7-day immobilization. Compared to non-immobilized controls, muscle weight, limb strength, microRNAs, especially miR-486, SIRT1 levels, and slow- and fast-twitch cross-sectional areas were decreased in mice of I groups, whereas Pax7 and acetylated FoxO1 and FoxO3 levels were increased. Muscle reloading following splint removal improved muscle mass loss, strength, and fiber atrophy, by increasing microRNAs, particularly miR-486, and SIRT1 content, while decreasing acetylated FoxO1 and FoxO3 levels. In this mouse model of disuse muscle atrophy, muscle-enriched microRNAs, especially miR-486, through Pax7 regulation delayed muscle cell differentiation following unloading of gastrocnemius muscle. Acetylation of FoxO1 and 3 seemed to drive muscle mass loss and atrophy, while deacetylation of these factors through SIRT1 would enable the muscle fibers to regenerate. J. Cell. Physiol. 232: 1415-1427, 2017. © 2016 Wiley Periodicals, Inc.

  3. Design of Nanomaterial Synthesis by Aerosol Processes

    PubMed Central

    Buesser, Beat; Pratsinis, Sotiris E.

    2013-01-01

    Aerosol synthesis of materials is a vibrant field of particle technology and chemical reaction engineering. Examples include the manufacture of carbon blacks, fumed SiO2, pigmentary TiO2, ZnO vulcanizing catalysts, filamentary Ni, and optical fibers, materials that impact transportation, construction, pharmaceuticals, energy, and communications. Parallel to this, development of novel, scalable aerosol processes has enabled synthesis of new functional nanomaterials (e.g., catalysts, biomaterials, electroceramics) and devices (e.g., gas sensors). This review provides an access point for engineers to the multiscale design of aerosol reactors for the synthesis of nanomaterials using continuum, mesoscale, molecular dynamics, and quantum mechanics models spanning 10 and 15 orders of magnitude in length and time, respectively. Key design features are the rapid chemistry; the high particle concentrations but low volume fractions; the attainment of a self-preserving particle size distribution by coagulation; the ratio of the characteristic times of coagulation and sintering, which controls the extent of particle aggregation; and the narrowing of the aggregate primary particle size distribution by sintering. PMID:22468598
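
    The aggregation criterion mentioned above (the ratio of the characteristic coagulation and sintering times) is usually summarized as follows; this is the standard qualitative picture rather than a formula quoted from the review (LaTeX notation):

        \frac{\tau_s}{\tau_c} \ll 1 \;\Rightarrow\; \text{coalescence outruns collisions: compact, near-spherical particles}

        \frac{\tau_s}{\tau_c} \gg 1 \;\Rightarrow\; \text{collisions outrun sintering: fractal-like aggregates of primary particles}

    where \tau_s is the characteristic sintering (coalescence) time and \tau_c the characteristic collision (coagulation) time at the local temperature and particle concentration.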

  4. High Lifetime Solar Cell Processing and Design

    NASA Technical Reports Server (NTRS)

    Swanson, R. M.

    1985-01-01

    In order to maximize efficiency a solar cell must: (1) absorb as much light as possible in electron-hole production, (2) transport as large a fraction as possible of the electrons to the n-type terminal and of the holes to the p-type terminal without their first recombining, and (3) produce as high a terminal voltage as possible. Step (1) is largely fixed by the spectrum of sunlight and the fundamental absorption characteristics of silicon, although some improvements are possible through texturizing-induced light trapping and back surface reflectors. Steps (2) and (3) are, however, dependent on the recombination mechanisms of the cell. Recombination, in turn, is strongly influenced by cell processing and design. Some of the lessons learned during the development of the point-contact cell are discussed. The cell's dependence on recombination, surface recombination, and contact recombination is discussed. Results show the overwhelming influence of contact recombination on the operation of the cell when the other sources of recombination are reduced by careful processing.
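
    The coupling between recombination and terminal voltage asserted in steps (2) and (3) can be made concrete with the standard ideal-diode relation (a textbook expression, not one quoted from this report), written here in LaTeX notation:

        V_{oc} \approx \frac{kT}{q}\,\ln\!\left(\frac{J_{L}}{J_{0}} + 1\right)

    where J_L is the light-generated current density and J_0 is the saturation current density set by the sum of bulk, surface, and contact recombination; every reduction in recombination lowers J_0 and therefore raises the attainable open-circuit voltage V_oc.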

  5. Parametric Design within an Atomic Design Process (ADP) applied to Spacecraft Design

    NASA Astrophysics Data System (ADS)

    Ramos Alarcon, Rafael

    This thesis describes research investigating the development of a model for the initial design of complex systems, with application to spacecraft design. The design model is called an atomic design process (ADP) and contains four fundamental stages (specifications, configurations, trade studies and drivers) that constitute the minimum steps of an iterative process that helps designers find a feasible solution. Representative design models from the aerospace industry are reviewed and are compared with the proposed model. The design model's relevance, adaptability and scalability features are evaluated through a focused design task exercise with two undergraduate teams and a long-term design exercise performed by a spacecraft payload team. The implementation of the design model is explained in the context in which the model has been researched. This context includes the organization (a student-run research laboratory at the University of Michigan), its culture (academically oriented), members who have used the design model and the description of the information technology elements meant to provide support while using the model. This support includes a custom-built information management system that consolidates relevant information that is currently being used in the organization. The information is divided into three domains: personnel development history, technical knowledge base and laboratory operations. The focused study with teams making use of the design model to complete an engineering design exercise consists of the conceptual design of an autonomous system, including a carrier and a deployable lander that form the payload of a rocket with an altitude range of over 1000 meters. Detailed results from each of the stages of the design process while implementing the model are presented, and an increase in awareness of good design practices in the teams while using the model is explained. A long-term investigation using the design model consisting of the

  6. Conceptual design of clean processes: Tools and methods

    SciTech Connect

    Hurme, M.

    1996-12-31

    Design tools available for implementing clean design in practice are discussed. The application areas, together with methods for comparing clean process alternatives, are presented. Environmental principles are becoming increasingly important throughout the whole life cycle of products, from design, manufacturing and marketing to disposal. The obstacle to implementing clean technology in design has been the necessity of applying it in all phases of design, starting from the beginning, since it deals with the major selections made in conceptual process design. Therefore, both a modified design approach and new tools are needed for process design to make the application of clean technology practical. The first topic, extended process design methodologies, has been presented by Hurme, Douglas, Rossiter and Klee, and Hilaly and Sikdar. The aim of this paper is to discuss the latter topic: the process design tools which assist in implementing clean principles in process design. 22 refs., 2 tabs.

  7. Mechanical Design Support System Based on Thinking Process Development Diagram

    NASA Astrophysics Data System (ADS)

    Mase, Hisao; Kinukawa, Hiroshi; Morii, Hiroshi; Nakao, Masayuki; Hatamura, Yotaro

    This paper describes a system that directly supports the design process in the mechanical domain. The system is based on a thinking process development diagram that draws distinctions between requirements, tasks, solutions, and implementation, enabling designers to expand and deepen their design thinking. The system provides five main functions that designers require in each phase of the proposed design process: (1) thinking process description support, which enables designers to record their thoughts; (2) creativity support through term association with thesauri; (3) timely display of design knowledge, including know-how obtained through earlier failures, general design theories, standard-parts data, and past designs; (4) design problem-solving support using 46 kinds of thinking operations; and (5) technology transfer support that accumulates not only design conclusions but also the design process. Though the system is applied to mechanical engineering as the first target domain, it can easily be extended to many other domains such as architecture and electrical engineering.

  8. Universal Design: Process, Principles, and Applications

    ERIC Educational Resources Information Center

    Burgstahler, Sheryl

    2009-01-01

    Designing any product or environment involves the consideration of many factors, including aesthetics, engineering options, environmental issues, safety concerns, industry standards, and cost. Typically, designers focus their attention on the average user. In contrast, universal design (UD), according to the Center for Universal Design," is…

  9. Feasibility study of an Integrated Program for Aerospace vehicle Design (IPAD). Volume 2: The design process

    NASA Technical Reports Server (NTRS)

    Gillette, W. B.; Turner, M. J.; Southall, J. W.; Whitener, P. C.; Kowalik, J. S.

    1973-01-01

    The extent to which IPAD is to support the design process is identified. Case studies of representative aerospace products were developed as models to characterize the design process and to provide design requirements for the IPAD computing system.

  10. Creativity Processes of Students in the Design Studio

    ERIC Educational Resources Information Center

    Huber, Amy Mattingly; Leigh, Katharine E.; Tremblay, Kenneth R., Jr.

    2012-01-01

    The creative process is a multifaceted and dynamic path of thinking required to execute a project in design-based disciplines. The goal of this research was to test a model outlining the creative design process by investigating student experiences in a design project assignment. The study used an exploratory design to collect data from student…

  11. POLLUTION PREVENTION IN THE EARLY STAGES OF HIERARCHICAL PROCESS DESIGN

    EPA Science Inventory

    Hierarchical methods are often used in the conceptual stages of process design to synthesize and evaluate process alternatives. In this work, the methods of hierarchical process design will be focused on environmental aspects. In particular, the design methods will be coupled to ...

  12. Review of primary spaceflight-induced and secondary reloading-induced changes in slow antigravity muscles of rats

    NASA Astrophysics Data System (ADS)

    Riley, D. A.

    We have examined the light and electron microscopic properties of hindlimb muscles of rats flown in space for 1-2 weeks on Cosmos biosatellite flights 1887 and 2044 and Space Shuttle missions Spacelab-3, Spacelab Life Sciences-1 and Spacelab Life Sciences-2. Tissues were obtained both inflight and postflight permitting definition of primary microgravity-induced changes and secondary reentry and gravity reloading-induced alterations. Spaceflight causes atrophy and expression of fast fiber characteristics in slow antigravity muscles. The stresses of reentry and reloading reveal that atrophic muscles show increased susceptibility to interstitial edema and ischemic-anoxic necrosis as well as muscle fiber tearing with disruption of contractile proteins. These results demonstrate that the effects of spaceflight on skeletal muscle are multifaceted, and major changes occur both inflight and following return to Earth's gravity.

  13. Review of primary spaceflight-induced and secondary reloading-induced changes in slow antigravity muscles of rats.

    PubMed

    Riley, D A

    1998-01-01

    We have examined the light and electron microscopic properties of hindlimb muscles of rats flown in space for 1-2 weeks on Cosmos biosatellite flights 1887 and 2044 and Space Shuttle missions Spacelab-3, Spacelab Life Sciences-1 and Spacelab Life Sciences-2. Tissues were obtained both inflight and postflight permitting definition of primary microgravity-induced changes and secondary reentry and gravity reloading-induced alterations. Spaceflight causes atrophy and expression of fast fiber characteristics in slow antigravity muscles. The stresses of reentry and reloading reveal that atrophic muscles show increased susceptibility to interstitial edema and ischemic-anoxic necrosis as well as muscle fiber tearing with disruption of contractile proteins. These results demonstrate that the effects of spaceflight on skeletal muscle are multifaceted, and major changes occur both inflight and following return to Earth's gravity.

  14. Launch Vehicle Design Process: Characterization, Technical Integration, and Lessons Learned

    NASA Technical Reports Server (NTRS)

    Blair, J. C.; Ryan, R. S.; Schutzenhofer, L. A.; Humphries, W. R.

    2001-01-01

    Engineering design is a challenging activity for any product. Since launch vehicles are highly complex and interconnected and have extreme energy densities, their design represents a challenge of the highest order. The purpose of this document is to delineate and clarify the design process associated with the launch vehicle for space flight transportation. The goal is to define and characterize a baseline for the space transportation design process. This baseline can be used as a basis for improving effectiveness and efficiency of the design process. The baseline characterization is achieved via compartmentalization and technical integration of subsystems, design functions, and discipline functions. First, a global design process overview is provided in order to show responsibility, interactions, and connectivity of overall aspects of the design process. Then design essentials are delineated in order to emphasize necessary features of the design process that are sometimes overlooked. Finally the design process characterization is presented. This is accomplished by considering project technical framework, technical integration, process description (technical integration model, subsystem tree, design/discipline planes, decision gates, and tasks), and the design sequence. Also included in the document are a snapshot relating to process improvements, illustrations of the process, a survey of recommendations from experienced practitioners in aerospace, lessons learned, references, and a bibliography.

  15. Wireless Participant Incentives Using Reloadable Bank Cards to Increase Clinical Trial Retention With Abused Women Drinkers: A Natural Experiment.

    PubMed

    Rodgers, Melissa; Meisel, Zachary; Wiebe, Douglas; Crits-Christoph, Paul; Rhodes, Karin V

    2016-08-07

    Retaining participants in longitudinal studies is a unique methodological challenge in many areas of investigation, and specifically for researchers aiming to identify effective interventions for women experiencing intimate partner violence (IPV). Individuals in abusive relationships are often transient and have logistical, confidentiality, and safety concerns that limit future contact. A natural experiment occurred during a large randomized clinical trial enrolling women in abusive relationships who were also heavy drinkers, which allowed for the comparison of two incentive methods to promote longitudinal retention: cash payment versus reloadable wireless bank cards. In all, 600 patients were enrolled in the overall trial, which aimed to incentivize participants using a reloadable bank card system to promote the completion of 11 weekly interactive voice response system (IVRS) phone surveys and 3-, 6-, and 12-month follow-up phone or in-person interviews. The first 145 participants were paid with cash as a result of logistical delays in setting up the bank card system. At 12 weeks, participants receiving the bank card incentive completed significantly more IVRS phone surveys, odds ratio (OR) = 2.4, 95% confidence interval (CI) = [0.01, 1.69]. There were no significant differences between the two groups related to satisfaction or safety and/or privacy. The bank card system also lowered the administrative burden of tracking payments for study staff. Based on these and other results, our large medical research university is implementing reloadable bank cards as the preferred method of participant incentive payments.
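
    For readers unfamiliar with the statistic reported above, the sketch below shows how an odds ratio and a Wald-type 95% confidence interval are computed from a 2x2 completion table; the cell counts are hypothetical and are not the trial's data.

        # Odds ratio and Wald 95% CI for survey completion by incentive arm.
        # The cell counts are made up for illustration only.
        import math

        completed_card, not_completed_card = 300, 155   # bank-card arm (hypothetical)
        completed_cash, not_completed_cash = 70, 75     # cash arm (hypothetical)

        or_hat = (completed_card * not_completed_cash) / (not_completed_card * completed_cash)
        se_log = math.sqrt(1/completed_card + 1/not_completed_card +
                           1/completed_cash + 1/not_completed_cash)
        lo = math.exp(math.log(or_hat) - 1.96 * se_log)
        hi = math.exp(math.log(or_hat) + 1.96 * se_log)
        print(f"OR = {or_hat:.2f}, 95% CI = [{lo:.2f}, {hi:.2f}]")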

  16. Hynol Process Engineering: Process Configuration, Site Plan, and Equipment Design

    DTIC Science & Technology

    1996-02-01

    ... wood, and natural gas is used as a co-feedstock. Compared with other methanol production processes, direct emissions of carbon dioxide (CO2) can be substantially reduced by using the ... gas provides for reduced CO2 emissions per unit of fossil fuel carbon processed compared with separate natural gas and biomass processes. In accordance ...

  17. Lunar fiberglass: Properties and process design

    NASA Technical Reports Server (NTRS)

    Dalton, Robert; Nichols, Todd

    1987-01-01

    A Clemson University ceramic engineering design for a lunar fiberglass plant is presented. The properties of glass fibers and metal-matrix composites are examined. Lunar geology is also discussed. A raw material and site are selected based on this information. A detailed plant design is presented, and summer experiments to be carried out at Johnson Space Center are reviewed.

  18. Computer Applications in the Design Process.

    ERIC Educational Resources Information Center

    Winchip, Susan

    Computer Assisted Design (CAD) and Computer Assisted Manufacturing (CAM) are emerging technologies now being used in home economics and interior design applications. A microcomputer in a computer network system is capable of executing computer graphic functions such as three-dimensional modeling, as well as utilizing office automation packages to…

  19. Application of Process Modeling Tools to Ship Design

    DTIC Science & Technology

    2011-05-01

    NAVSEA (Frank Waldman); LATTIX; May 2011. Application of Process Modeling Tools to Ship Design. ... design teams, long design schedules, and complicated acquisition procedures; commercial process modeling techniques are being applied for better ...

  20. Instructional Design and Directed Cognitive Processing.

    ERIC Educational Resources Information Center

    Bovy, Ruth Colvin

    This paper argues that the information processing model provides a promising basis on which to build a comprehensive theory of instruction. Characteristics of the major information processing constructs are outlined including attention, encoding and rehearsal, working memory, long term memory, retrieval, and metacognitive processes, and a unifying…

  1. Study on Product Innovative Design Process Driven by Ideal Solution

    NASA Astrophysics Data System (ADS)

    Zhang, Fuying; Lu, Ximei; Wang, Ping; Liu, Hui

    Product innovative design in companies today relies heavily on individual members' experience and creative ideation, as well as their skill in agilely integrating creativity and innovation tools with design methods. Creative ideation and inventive idea generation are two crucial stages in the product innovative design process. The ideal solution is the desired final idea for a given problem and the target toward which product design strives. In this paper, a product innovative design process driven by the ideal solution is proposed. This design process encourages designers to overcome psychological inertia and to foster creativity in a systematic way, acquiring breakthrough creative and innovative solutions within a shrinking sphere of solution-seeking, and thus enables effective product innovative design to be carried out rapidly. A case study is also presented to illustrate the effectiveness of the proposed design process.

  2. Designing Educative Curriculum Materials: A Theoretically and Empirically Driven Process

    ERIC Educational Resources Information Center

    Davis, Elizabeth A.; Palincsar, Annemarie Sullivan; Arias, Anna Maria; Bismack, Amber Schultz; Marulis, Loren M.; Iwashyna, Stefanie K.

    2014-01-01

    In this article, the authors argue for a design process in the development of educative curriculum materials that is theoretically and empirically driven. Using a design-based research approach, they describe their design process for incorporating educative features intended to promote teacher learning into existing, high-quality curriculum…

  3. VCM Process Design: An ABET 2000 Fully Compliant Project

    ERIC Educational Resources Information Center

    Benyahia, Farid

    2005-01-01

    A long experience in undergraduate vinyl chloride monomer (VCM) process design projects is shared in this paper. The VCM process design is shown to be fully compliant with ABET 2000 criteria by virtue of its abundance in chemical engineering principles, integration of interpersonal and interdisciplinary skills in design, safety, economics, and…

  4. Learning Objects: A User-Centered Design Process

    ERIC Educational Resources Information Center

    Branon, Rovy F., III

    2011-01-01

    Design research systematically creates or improves processes, products, and programs through an iterative progression connecting practice and theory (Reinking, 2008; van den Akker, 2006). Developing new instructional systems design (ISD) processes through design research is necessary when new technologies emerge that challenge existing practices…

  5. Glucose uptake in rat soleus - Effect of acute unloading and subsequent reloading

    NASA Technical Reports Server (NTRS)

    Henriksen, Eric J.; Tischler, Marc E.

    1988-01-01

    The effect of acutely reduced weight bearing (unloading) on the in vitro uptake of 2-[1,2-³H]deoxy-D-glucose was studied in the soleus muscle by tail casting and suspending rats. After just 4 h, the uptake of 2-deoxy-D-glucose fell (-19 percent) and declined further after an additional 20 h of unloading. This diminution at 24 h was associated with slower oxidation of [¹⁴C]glucose and incorporation of [¹⁴C]glucose into glycogen. At 3 days of unloading, basal uptake of 2-deoxy-D-glucose did not differ from control. Reloading of the soleus after 1 or 3 days of unloading increased uptake of 2-deoxy-D-glucose above control and returned it to normal within 6 h and 4 days, respectively. These effects of unloading and recovery were caused by local changes in the soleus, because the extensor digitorum longus from the same hindlimbs did not display any alterations in uptake of 2-deoxy-D-glucose or metabolism of glucose.

  6. Understanding the Processes behind Student Designing: Cases from Singapore

    ERIC Educational Resources Information Center

    Lim, Susan Siok Hiang; Lim-Ratnam, Christina; Atencio, Matthew

    2013-01-01

    A common perception of designing is that it represents a highly complex activity that is manageable by only a few. However it has also been argued that all individuals are innately capable of designing. Taking up this latter view, we explored the processes behind student designing in the context of Design and Technology (D&T), a subject taught…

  7. Processes and Knowledge in Designing Instruction.

    ERIC Educational Resources Information Center

    Greeno, James G.; And Others

    Results from a study of problem solving in the domain of instructional design are presented. Subjects were eight teacher trainees who were recent graduates of or were enrolled in the Stanford Teacher Education Program at Stanford University (California). Subjects studied a computer-based tutorial--the VST2000--about a fictitious vehicle. The…

  8. User-Centered Design (UCD) Process Description

    DTIC Science & Technology

    2014-12-01

    ... mockups and prototypes. Conclusions and recommendations: UCD provides guidance for improving total system performance by considering the real-world ... against essential story scenarios, eventually leading to the development of high-fidelity mockups and prototypes. [Figure 1: User-centered design (UCD) ...]

  9. Reducing Design Cycle Time and Cost Through Process Resequencing

    NASA Technical Reports Server (NTRS)

    Rogers, James L.

    2004-01-01

    In today's competitive environment, companies are under enormous pressure to reduce the time and cost of their design cycle. One method for reducing both time and cost is to develop an understanding of the flow of the design processes and the effects of the iterative subcycles that are found in complex design projects. Once these aspects are understood, the design manager can make decisions that take advantage of decomposition, concurrent engineering, and parallel processing techniques to reduce the total time and the total cost of the design cycle. One software tool that can aid in this decision-making process is the Design Manager's Aid for Intelligent Decomposition (DeMAID). The DeMAID software minimizes the feedback couplings that create iterative subcycles, groups processes into iterative subcycles, and decomposes the subcycles into a hierarchical structure. The real benefits of producing the best design in the least time and at a minimum cost are obtained from sequencing the processes in the subcycles.
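
    DeMAID itself is not described in detail in this record. The Python sketch below only illustrates the underlying idea of resequencing a design structure matrix so that feedback couplings (dependencies on tasks that appear later in the order) are minimized; the task names and couplings are hypothetical, and the brute-force search stands in for DeMAID's heuristics.

        # Toy design-structure-matrix resequencing: count feedback couplings for an
        # ordering and search permutations for the one with the fewest feedbacks.
        # (Brute force is only viable for small task sets.)
        from itertools import permutations

        # deps[t] = set of tasks whose output task t consumes (hypothetical example)
        deps = {"loads": set(), "sizing": {"loads", "stress"}, "stress": {"sizing"},
                "aero": {"loads"}, "cost": {"sizing", "aero"}}

        def feedback_count(order):
            pos = {t: i for i, t in enumerate(order)}
            return sum(1 for t in order for d in deps[t] if pos[d] > pos[t])

        best = min(permutations(deps), key=feedback_count)
        print("best ordering:", best, "feedback couplings:", feedback_count(best))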

  10. NASA Now: Engineering Design Process: Hubble Space Telescope

    NASA Video Gallery

    In this episode of NASA Now, NASA engineer Russ Werneth discusses the continuous nature of the engineering design process and shares what it was like to design and plan the spacewalks that were key...

  11. The Processes Involved in Designing Software.

    DTIC Science & Technology

    1980-08-01

    compiler's task was to derive the sequence of IPL instructions that brought about that transformation. The other form of definitions was in terms of ... used suitably generalized forms of means-ends analysis to generate sequences of IPL instructions that would meet the input specifications. One branch ... processes called critics are used to reorganize this more detailed plan into an internally consistent and efficient sequence of actions. The process

  12. Integrating Thermal Tools Into the Mechanical Design Process

    NASA Technical Reports Server (NTRS)

    Tsuyuki, Glenn T.; Siebes, Georg; Novak, Keith S.; Kinsella, Gary M.

    1999-01-01

    The intent of mechanical design is to deliver a hardware product that meets or exceeds customer expectations, while reducing cycle time and cost. To this end, an integrated mechanical design process enables the idea of parallel development (concurrent engineering). This represents a shift from the traditional mechanical design process. With such a concurrent process, there are significant issues that have to be identified and addressed before re-engineering the mechanical design process to facilitate concurrent engineering. These issues also assist in the integration and re-engineering of the thermal design sub-process since it resides within the entire mechanical design process. With these issues in mind, a thermal design sub-process can be re-defined in a manner that has a higher probability of acceptance, thus enabling an integrated mechanical design process. However, the actual implementation is not always problem-free. Experience in applying the thermal design sub-process to actual situations provides the evidence for improvement, but more importantly, for judging the viability and feasibility of the sub-process.

  13. Conceptual design of distillation-based hybrid separation processes.

    PubMed

    Skiborowski, Mirko; Harwardt, Andreas; Marquardt, Wolfgang

    2013-01-01

    Hybrid separation processes combine different separation principles and constitute a promising design option for the separation of complex mixtures. Particularly, the integration of distillation with other unit operations can significantly improve the separation of close-boiling or azeotropic mixtures. Although the design of single-unit operations is well understood and supported by computational methods, the optimal design of flowsheets of hybrid separation processes is still a challenging task. The large number of operational and design degrees of freedom requires a systematic and optimization-based design approach. To this end, a structured approach, the so-called process synthesis framework, is proposed. This article reviews available computational methods for the conceptual design of distillation-based hybrid processes for the separation of liquid mixtures. Open problems are identified that must be addressed to finally establish a structured process synthesis framework for such processes.

  14. The application of image processing software: Photoshop in environmental design

    NASA Astrophysics Data System (ADS)

    Dong, Baohua; Zhang, Chunmi; Zhuo, Chen

    2011-02-01

    In the process of environmental design and creation, the design sketch holds a very important position: it not only illuminates the design's idea and concept but also shows the design's visual effect to the client. In the field of environmental design, computer-aided design has brought significant improvements. Many types of specialized software for rendering environmental design drawings and for artistic post-processing have been implemented. Additionally, with the use of this software, working efficiency has greatly increased and drawings have become more specific and more specialized. By analyzing the application of Photoshop image processing software in environmental design, and by comparing and contrasting traditional hand drawing with drawing supported by modern technology, this essay explores how computer technology can play a bigger role in environmental design.

  15. A design optimization process for Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Chamberlain, Robert G.; Fox, George; Duquette, William H.

    1990-01-01

    The Space Station Freedom Program is used to develop and implement a process for design optimization. Because the relative worth of arbitrary design concepts cannot be assessed directly, comparisons must be based on designs that provide the same performance from the point of view of station users; such designs can be compared in terms of life cycle cost. Since the technology required to produce a space station is widely dispersed, a decentralized optimization process is essential. A formulation of the optimization process is provided and the mathematical models designed to facilitate its implementation are described.

  16. Design and Processing of Electret Structures

    DTIC Science & Technology

    2009-10-31

    corrosion rate measurements in specially designed model systems (Fig. 1). The spatial resolution is determined by the optical properties of the setup and ... and optical responses. Such molecules usually contain a chain of atoms forming a conjugated π-electron system with electron donor and acceptor ... electrets from monodisperse polystyrene microspheres (electrets whose charge comes from an imbalance of ions, rather than from the transfer of electrons

  17. The Process of Soviet Weapons Design

    DTIC Science & Technology

    1978-03-01

    system on the BMP from an early 1940s German design. But the validity and usefulness of a theory, especially one that makes predictions about the future ... when the 1940 publication of a highly significant Soviet discovery of spontaneous fission resulted in a complete lack of an American response, the ... taken from I. N. Golovin, I. V. Kurchatov, Atomizdat, Moscow, 1973, and from Herbert York, The Advisors: Oppenheimer, Teller, and the Superbomb, W. H

  18. Launch Vehicle Design Process Description and Training Formulation

    NASA Technical Reports Server (NTRS)

    Atherton, James; Morris, Charles; Settle, Gray; Teal, Marion; Schuerer, Paul; Blair, James; Ryan, Robert; Schutzenhofer, Luke

    1999-01-01

    A primary NASA priority is to reduce the cost and improve the effectiveness of launching payloads into space. As a consequence, significant improvements are being sought in the effectiveness, cost, and schedule of the launch vehicle design process. In order to provide a basis for understanding and improving the current design process, a model has been developed for this complex, interactive process, as reported in the references. This model requires further expansion in some specific design functions. Also, a training course for less-experienced engineers is needed to provide understanding of the process, to provide guidance for its effective implementation, and to provide a basis for major improvements in launch vehicle design process technology. The objective of this activity is to expand the description of the design process to include all pertinent design functions, and to develop a detailed outline of a training course on the design process for launch vehicles for use in educating engineers whose experience with the process has been minimal. Building on a previously-developed partial design process description, parallel sections have been written for the Avionics Design Function, the Materials Design Function, and the Manufacturing Design Function. Upon inclusion of these results, the total process description will be released as a NASA TP. The design function sections herein include descriptions of the design function responsibilities, interfaces, interactive processes, decisions (gates), and tasks. Associated figures include design function planes, gates, and tasks, along with other pertinent graphics. Also included is an expanded discussion of how the design process is divided, or compartmentalized, into manageable parts to achieve efficient and effective design. A detailed outline for an intensive two-day course on the launch vehicle design process has been developed herein, and is available for further expansion. The course is in an interactive lecture

  19. Adding Users to the Website Design Process

    ERIC Educational Resources Information Center

    Tomeo, Megan L.

    2012-01-01

    Alden Library began redesigning its website over a year ago. Throughout the redesign process the students, faculty, and staff that make up the user base were added to the conversation by utilizing several usability test methods. This article focuses on the usability testing conducted at Alden Library and delves into future usability testing, which…

  20. Fuel and Core Design Experiences in Cofrentes NPP

    SciTech Connect

    Garcia-Delgado, L.; Lopez-Carbonell, M.T.; Gomez-Bernal, I.

    2002-07-01

    The electricity market deregulation in Spain is increasing the need for innovations in nuclear power generation, which can be achieved in the fuel area by improving fuel and core designs and by introducing vendors competition. Iberdrola has developed the GIRALDA methodology for design and licensing of Cofrentes reloads, and has introduced mixed cores with fuel from different vendors. The application of GIRALDA is giving satisfactory results, and is showing its capability to adequately reproduce the core behaviour. The nuclear design team is acquiring an invaluable experience and a deep knowledge of the core, very useful to support cycle operation. Continuous improvements are expected for the future in design strategies as well as in the application of new technologies to redesign the methodology processes. (authors)

  1. A Design Methodology for Medical Processes

    PubMed Central

    Bonacina, Stefano; Pozzi, Giuseppe; Pinciroli, Francesco; Marceglia, Sara

    2016-01-01

    Background: Healthcare processes, especially those belonging to the clinical domain, are acknowledged as complex and characterized by the dynamic nature of the diagnosis, the variability of the decisions made by experts driven by their experiences, the local constraints, the patient's needs, the uncertainty of the patient's response, and the indeterminacy of the patient's compliance to treatment. Also, the multiple actors involved in a patient's care need clear and transparent communication to ensure care coordination. Objectives: In this paper, we propose a methodology to model healthcare processes in order to break down complexity and provide transparency. Methods: The model is grounded on a set of requirements that make the healthcare domain unique with respect to other knowledge domains. The modeling methodology is based on three main phases: the study of the environmental context, the conceptual modeling, and the logical modeling. Results: The proposed methodology was validated by applying it to the case study of the rehabilitation process of stroke patients in the specific setting of a specialized rehabilitation center. The resulting model was used to define the specifications of a software artifact for the digital administration and collection of assessment tests, which was also implemented. Conclusions: Despite being only an example, our case study showed the ability of process modeling to answer the actual needs in healthcare practices. Independently from the medical domain in which the modeling effort is done, the proposed methodology is useful to create high-quality models, and to detect and take into account relevant and tricky situations that can occur during process execution. PMID:27081415

  2. DESIGNING ENVIRONMENTALLY FRIENDLY CHEMICAL PROCESSES WITH FUGITIVE AND OPEN EMISSIONS

    EPA Science Inventory

    Designing a chemical process normally includes aspects of economic and environmental disciplines. In this work we describe methods to quickly and easily evaluate the economics and potential environmental impacts of a process, with the hydrodealkylation of toluene as an example. ...

  3. In-design process hotspot repair using pattern matching

    NASA Astrophysics Data System (ADS)

    Jang, Daehyun; Ha, Naya; Jeon, Junsu; Kang, Jae-Hyun; Paek, Seung Weon; Choi, Hungbok; Kim, Kee Sup; Lai, Ya-Chieh; Hurat, Philippe; Luo, Wilbur

    2012-03-01

    As patterning for advanced processes becomes more challenging, designs must become more process-aware. The conventional approach of running lithography simulation on designs to detect process hotspots is prohibitive in terms of runtime for designers, and also requires the release of highly confidential process information. Therefore, a more practical approach is required to make the In-Design process-aware methodology more affordable in terms of maintenance, confidentiality, and runtime. In this study, a pattern-based approach is chosen for Process Hotspot Repair (PHR) because it accurately captures the manufacturability challenges without releasing sensitive process information. Moreover, the pattern-based approach is fast and well integrated in the design flow. Further, this type of approach is very easy to maintain and extend. Once a new process weak pattern has been discovered (caused by Chemical Mechanical Polishing (CMP), etch, lithography, and other process steps), the pattern library can be quickly and easily updated and released to check and fix subsequent designs. This paper presents the pattern matching flow and discusses its advantages. It explains how a pattern library is created from the process weak patterns found on silicon wafers. The paper also discusses the PHR flow that fixes process hotspots in a design, specifically through the use of pattern matching and routing repair.
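
    The paper's pattern library format is proprietary and is not reproduced here. As a minimal sketch of the matching idea only, the Python example below scans a rasterized binary layout window for occurrences of a small "weak pattern" template; both arrays are made-up toy data, not foundry patterns.

        # Toy 2D pattern match: report every location where a known process-weak
        # pattern occurs in a rasterized layout window. Arrays are illustrative only.
        import numpy as np

        layout = np.array([[1, 0, 1, 0, 1],
                           [1, 0, 1, 0, 0],
                           [1, 0, 1, 0, 1],
                           [0, 0, 0, 0, 1]])

        weak = np.array([[1, 0, 1],
                         [1, 0, 1]])          # e.g. two minimum-width lines at minimum space

        hits = []
        H, W = weak.shape
        for r in range(layout.shape[0] - H + 1):
            for c in range(layout.shape[1] - W + 1):
                if np.array_equal(layout[r:r+H, c:c+W], weak):
                    hits.append((r, c))
        print("hotspot candidates at:", hits)  # each hit would be handed to the repair/reroute step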

  4. Ceramic processing: Experimental design and optimization

    NASA Technical Reports Server (NTRS)

    Weiser, Martin W.; Lauben, David N.; Madrid, Philip

    1992-01-01

    The objectives of this paper are to: (1) gain insight into the processing of ceramics and how green processing can affect the properties of ceramics; (2) investigate the technique of slip casting; (3) learn how heat treatment and temperature contribute to density, strength, and effects of under and over firing to ceramic properties; (4) experience some of the problems inherent in testing brittle materials and learn about the statistical nature of the strength of ceramics; (5) investigate orthogonal arrays as tools to examine the effect of many experimental parameters using a minimum number of experiments; (6) recognize appropriate uses for clay based ceramics; and (7) measure several different properties important to ceramic use and optimize them for a given application.
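
    Objective (5) can be made concrete with a small example. The sketch below uses the standard L4(2^3) two-level orthogonal array for three factors and computes main effects; the factor names, levels, and strength responses are invented for illustration and are not data from the exercise.

        # L4(2^3) orthogonal array: three two-level factors studied in four runs.
        # Factor names and responses are made up for illustration.
        import numpy as np

        l4 = np.array([[0, 0, 0],
                       [0, 1, 1],
                       [1, 0, 1],
                       [1, 1, 0]])                       # rows = runs, cols = factors

        factors = ["solids loading", "sinter temperature", "hold time"]
        strength = np.array([182.0, 205.0, 198.0, 231.0])  # MPa, hypothetical responses

        for j, name in enumerate(factors):
            high = strength[l4[:, j] == 1].mean()
            low = strength[l4[:, j] == 0].mean()
            print(f"{name}: main effect = {high - low:+.1f} MPa")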

  5. H-Coal process and plant design

    DOEpatents

    Kydd, Paul H.; Chervenak, Michael C.; DeVaux, George R.

    1983-01-01

    A process for converting coal and other hydrocarbonaceous materials into useful and more valuable liquid products. The process comprises: feeding coal and/or other hydrocarbonaceous materials with a hydrogen-containing gas into an ebullated catalyst bed reactor; passing the reaction products from the reactor to a hot separator where the vaporous and distillate products are separated from the residuals; introducing the vaporous and distillate products from the separator directly into a hydrotreater where they are further hydrogenated; passing the residuals from the separator successively through flash vessels at reduced pressures where distillates are flashed off and combined with the vaporous and distillate products to be hydrogenated; transferring the unseparated residuals to a solids concentrating and removal means to remove a substantial portion of solids therefrom and recycling the remaining residual oil to the reactor; and passing the hydrogenated vaporous and distillate products to an atmospheric fractionator where the combined products are fractionated into separate valuable liquid products. The hydrogen-containing gas is generated from sources within the process.

  6. Clutter suppression interferometry system design and processing

    NASA Astrophysics Data System (ADS)

    Knight, Chad; Deming, Ross; Gunther, Jake

    2015-05-01

    Clutter suppression interferometry (CSI) has received extensive attention due to its multi-modal capability to detect slow-moving targets and concurrently form high-resolution synthetic aperture radar (SAR) images from the same data. The ability to continuously augment SAR images with geo-located ground moving target indicators (GMTI) provides valuable real-time situational awareness that is important for many applications. CSI can be accomplished with minimal hardware and processing resources. This makes CSI a natural candidate for applications where size, weight and power (SWaP) are constrained, such as unmanned aerial vehicles (UAVs) and small satellites. This paper will discuss the theory for optimal CSI system configuration, focusing on sparse time-varying transmit and receive array manifolds arising from SWaP considerations. The underlying signal model will be presented and discussed, as well as the potential benefits that a sparse time-varying transmit/receive manifold provides. The high-level processing objectives will be detailed and examined on simulated data. Then actual SAR data collected with the Space Dynamics Laboratory (SDL) FlexSAR radar system will be analyzed. Contrasting the simulated data with actual SAR data helps illustrate the challenges and limitations found in practice versus theory. A novel approach incorporating sparse signal processing is discussed that has the potential to reduce false-alarm rates and improve detections.
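
    As a heavily simplified, hedged illustration of the clutter-cancellation principle behind CSI (not the FlexSAR processing chain), the toy Python example below subtracts two co-registered complex channels: the stationary scene appears identically in both and cancels, while a mover's along-track motion leaves a phase-shifted residual. All numbers are assumptions.

        # Two-channel clutter cancellation, toy version: after co-registration the
        # stationary scene is identical in both channels and subtracts away, while
        # a moving target acquires an inter-channel phase offset and survives.
        import numpy as np

        rng = np.random.default_rng(1)
        n = 256
        clutter = rng.normal(size=n) + 1j * rng.normal(size=n)        # strong stationary scene
        noise1 = 0.05 * (rng.normal(size=n) + 1j * rng.normal(size=n))
        noise2 = 0.05 * (rng.normal(size=n) + 1j * rng.normal(size=n))

        mover = np.zeros(n, complex)
        mover[100] = 0.5                        # weak moving target in range bin 100
        phase = np.exp(1j * 2 * np.pi * 0.2)    # inter-channel phase from radial velocity (assumed)

        ch1 = clutter + mover + noise1
        ch2 = clutter + mover * phase + noise2
        residual = ch1 - ch2                    # clutter cancels; mover leaves |1 - phase| * 0.5

        # Typically a clutter cell dominates before cancellation, the mover's bin after.
        print("strongest bin before:", int(np.argmax(np.abs(ch1))),
              " after:", int(np.argmax(np.abs(residual))))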

  7. Bioreactor and process design for biohydrogen production.

    PubMed

    Show, Kuan-Yeow; Lee, Duu-Jong; Chang, Jo-Shu

    2011-09-01

    Biohydrogen is regarded as an attractive future clean energy carrier due to its high energy content and environmentally friendly conversion. As a renewable biofuel, it has the potential to replace current hydrogen production, which relies heavily on fossil fuels. While biohydrogen production is still at an early stage of development, a variety of laboratory- and pilot-scale systems with promising potential have been developed. This work presents a review of advances in bioreactor and bioprocess design for biohydrogen production. The state of the art of biohydrogen production is discussed, with emphasis on production pathways, factors affecting biohydrogen production, and bioreactor configuration and operation. Challenges and prospects of biohydrogen production are also outlined.

  8. Development of Integrated Programs for Aerospace-vehicle design (IPAD): Reference design process

    NASA Technical Reports Server (NTRS)

    Meyer, D. D.

    1979-01-01

    The airplane design process and its interfaces with manufacturing and customer operations are documented to be used as criteria for the development of integrated programs for the analysis, design, and testing of aerospace vehicles. Topics cover: design process management, general purpose support requirements, design networks, and technical program elements. Design activity sequences are given for both supersonic and subsonic commercial transports, naval hydrofoils, and military aircraft.

  9. New Vistas in Chemical Product and Process Design.

    PubMed

    Zhang, Lei; Babi, Deenesh K; Gani, Rafiqul

    2016-06-07

    Design of chemicals-based products is broadly classified into those that are process centered and those that are product centered. In this article, the designs of both classes of products are reviewed from a process systems point of view; developments related to the design of the chemical product, its corresponding process, and its integration are highlighted. Although significant advances have been made in the development of systematic model-based techniques for process design (also for optimization, operation, and control), much work is needed to reach the same level for product design. Timeline diagrams illustrating key contributions in product design, process design, and integrated product-process design are presented. The search for novel, innovative, and sustainable solutions must be matched by consideration of issues related to the multidisciplinary nature of problems, the lack of data needed for model development, solution strategies that incorporate multiscale options, and reliability versus predictive power. The need for an integrated model-experiment-based design approach is discussed together with benefits of employing a systematic computer-aided framework with built-in design templates.

  10. Debating Professional Designations for Evaluators: Reflections on the Canadian Process

    ERIC Educational Resources Information Center

    Cousins, J. Bradley; Cullen, Jim; Malik, Sumbal; Maicher, Brigitte

    2009-01-01

    This paper provides a reflective account of a consultation process on professional designations for evaluators initiated and coordinated by the Canadian Evaluation Society (CES). Described are: (1) the forces leading CES to generate discussion and debate about professional designations for Canadian evaluators, (2) the process of developing and…

  11. An Analysis of Algorithmic Processes and Instructional Design.

    ERIC Educational Resources Information Center

    Schmid, Richard F.; Gerlach, Vernon S.

    1986-01-01

    Describes algorithms and shows how they can be applied to the design of instructional systems by relating them to a standard information processing model. Two studies are briefly described which tested serial and parallel processing in learning and offered guidelines for designers. Future research needs are also discussed. (LRW)

  12. Solid propellant processing factor in rocket motor design

    NASA Technical Reports Server (NTRS)

    1971-01-01

    The ways in which propellant processing is affected by choices made in designing rocket engines are described. Tradeoff studies, design proof or scaleup studies, and special design features required to obtain high product quality and optimum processing costs are presented. Processing is considered to include the operational steps involved with the lining and preparation of the motor case for the grain; the procurement of propellant raw materials; and propellant mixing, casting or extrusion, curing, machining, and finishing. The design criteria, recommended practices, and propellant formulations are included.

  13. Programming-Free Form Conversion, Design, and Processing

    PubMed Central

    Fan, Ting-Jun; Machlin, Rona S.; Wang, Christopher P.; Chang, Ifay F.

    1990-01-01

    In this paper, we present the requirements and design considerations for programming-free form conversion, design, and processing. A set of object-oriented software tools is also presented to help users convert a paper form into an electronic form, design an electronic form, and fill in an electronic form directly on screen.

  14. The Use of Computer Graphics in the Design Process.

    ERIC Educational Resources Information Center

    Palazzi, Maria

    This master's thesis examines applications of computer technology to the field of industrial design and ways in which technology can transform the traditional process. Following a statement of the problem, the history and applications of the fields of computer graphics and industrial design are reviewed. The traditional industrial design process…

  15. The concepts of energy, environment, and cost for process design

    SciTech Connect

    Abu-Khader, M.M.; Speight, J.G.

    2004-05-01

    The process industries (specifically, energy and chemicals) are characterized by a variety of reactors and reactions to bring about successful process operations. The design of energy-related and chemical processes and their evolution is a complex activity that determines the competitiveness of these industries, as well as their environmental impact. Thus, we have developed an Enviro-Energy Concept designed to facilitate sustainable industrial development. The Complete Onion Model represents a complete methodology for chemical process design and illustrates all of the requirements to achieve the best possible design within the accepted environmental standards. Currently, NOx emissions from industrial processes continue to receive maximum attention; therefore, the problem of NOx emissions from industrial sources such as power stations and nitric acid plants is considered. Selective Catalytic Reduction (SCR) is one of the most promising and effective commercial technologies and is considered the Best Available Control Technology (BACT) for NOx reduction. The solution of the NOx emissions problem is either through modifying the chemical process design and/or installing an end-of-pipe technology. The degree of integration between the process design and the installed technology plays a critical role in the capital cost evaluation. Therefore, integrating process units and then optimizing the design has a vital effect on the total cost. Both the environmental regulations and the cost evaluation are the boundary constraints of the optimum solution.

  16. XML-based product information processing method for product design

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen Yu

    2012-01-01

    Design knowledge for modern mechatronic products centers on information processing within knowledge-intensive engineering; product design innovation is therefore essentially innovation in knowledge and information processing. Based on an analysis of the role of mechatronic product design knowledge and of its information management features, a unified XML-based product information processing model is proposed. The information processing model of product design includes functional knowledge, structural knowledge and their relationships. XML-based models are proposed for expressing product function elements, product structure elements, and the mapping relationships between function and structure. The information processing of a parallel friction roller is given as an example, demonstrating that this method is clearly helpful for knowledge-based design systems and product innovation.
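
    As an illustration of the kind of function-structure mapping the abstract describes, the following minimal Python sketch builds and reads back a product model with the standard xml.etree.ElementTree module; all tag and attribute names are invented for this example and are not taken from the paper.

      # Minimal sketch of an XML-based function/structure mapping (element and
      # attribute names are hypothetical, not taken from the paper).
      import xml.etree.ElementTree as ET

      product = ET.Element("product", name="parallel_friction_roller")

      functions = ET.SubElement(product, "functions")
      ET.SubElement(functions, "function", id="F1", description="transmit torque")

      structures = ET.SubElement(product, "structures")
      ET.SubElement(structures, "structure", id="S1", description="roller shaft")

      # Mapping relationships between function and structure elements
      mappings = ET.SubElement(product, "mappings")
      ET.SubElement(mappings, "map", function="F1", structure="S1")

      xml_text = ET.tostring(product, encoding="unicode")

      # Read the model back and resolve which structure realizes each function
      root = ET.fromstring(xml_text)
      for m in root.find("mappings"):
          print(m.get("function"), "is realized by", m.get("structure"))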

  17. XML-based product information processing method for product design

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen Yu

    2011-12-01

    Design knowledge for modern mechatronic products centers on information processing within knowledge-intensive engineering; product design innovation is therefore essentially innovation in knowledge and information processing. Based on an analysis of the role of mechatronic product design knowledge and of its information management features, a unified XML-based product information processing model is proposed. The information processing model of product design includes functional knowledge, structural knowledge and their relationships. XML-based models are proposed for expressing product function elements, product structure elements, and the mapping relationships between function and structure. The information processing of a parallel friction roller is given as an example, demonstrating that this method is clearly helpful for knowledge-based design systems and product innovation.

  18. Laser processing with specially designed laser beam

    NASA Astrophysics Data System (ADS)

    Asratyan, A. A.; Bulychev, N. A.; Feofanov, I. N.; Kazaryan, M. A.; Krasovskii, V. I.; Lyabin, N. A.; Pogosyan, L. A.; Sachkov, V. I.; Zakharyan, R. A.

    2016-04-01

    The possibility of using laser systems to form beams with special spatial configurations has been studied. The laser systems applied had a self-conjugate cavity based on the elements of copper vapor lasers (LT-5Cu, LT-10Cu, LT-30Cu) with an average power of 5, 10, or 30 W. The active elements were pumped by current pulses of duration 80-100 ns. The duration of the laser generation pulses was up to 25 ns. The generator unit included an unstable cavity, where one reflector was a special mirror with a reflecting coating. Various original optical schemes were used to explore the spatial configurations and energy characteristics of the output laser beams in their interaction with micro- and nanoparticles fabricated from various materials. In these experiments, the beam dimensions of the obtained zones varied from 0.3 to 5 µm, which is comparable with the minimum permissible dimensions determined by the optical elements applied. This method is useful for transforming a large amount of information at the laser pulse repetition rate of 10-30 kHz. It was possible to realize high-precision micromachining and microfabrication of microscale details by direct writing, cutting and drilling (with cutting widths and through-hole diameters ranging from 3 to 100 µm) and to produce microscale, deep, intricate and narrow grooves on substrate surfaces of metals and nonmetallic materials. This system is used for producing high-quality microscale details without moving the object under treatment. It can also be used for microcutting and microdrilling in a variety of metals such as molybdenum, copper and stainless steel, with a thickness of up to 300 µm, and in nonmetals such as silicon, sapphire and diamond with a thickness ranging from 10 µm to 1 mm with different thermal parameters, using the specially designed laser beam.

  19. Defining process design space for monoclonal antibody cell culture.

    PubMed

    Abu-Absi, Susan Fugett; Yang, LiYing; Thompson, Patrick; Jiang, Canping; Kandula, Sunitha; Schilling, Bernhard; Shukla, Abhinav A

    2010-08-15

    The concept of design space has been taking root as a foundation of in-process control strategies for biopharmaceutical manufacturing processes. During mapping of the process design space, the multidimensional combination of operational variables is studied to quantify the impact on process performance in terms of productivity and product quality. An efficient methodology to map the design space for a monoclonal antibody cell culture process is described. A failure modes and effects analysis (FMEA) was used as the basis for the process characterization exercise. This was followed by an integrated study of the inoculum stage of the process which includes progressive shake flask and seed bioreactor steps. The operating conditions for the seed bioreactor were studied in an integrated fashion with the production bioreactor using a two stage design of experiments (DOE) methodology to enable optimization of operating conditions. A two level Resolution IV design was followed by a central composite design (CCD). These experiments enabled identification of the edge of failure and classification of the operational parameters as non-key, key or critical. In addition, the models generated from the data provide further insight into balancing productivity of the cell culture process with product quality considerations. Finally, process and product-related impurity clearance was evaluated by studies linking the upstream process with downstream purification. Production bioreactor parameters that directly influence antibody charge variants and glycosylation in CHO systems were identified.
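
    For readers unfamiliar with the designs mentioned, the following minimal Python/NumPy sketch shows how a rotatable central composite design for two coded factors can be constructed and mapped onto operating ranges; the factor names, ranges, and center-point count are placeholders, not the study's actual cell culture parameters.

      # Sketch: build a rotatable central composite design (CCD) for two coded
      # factors, then map coded levels onto placeholder operating ranges.
      import itertools
      import numpy as np

      def central_composite(n_factors: int, n_center: int = 3) -> np.ndarray:
          # Two-level factorial corner points at coded -1/+1
          corners = np.array(list(itertools.product([-1.0, 1.0], repeat=n_factors)))
          # Axial (star) points at +/- alpha for a rotatable design
          alpha = (2 ** n_factors) ** 0.25
          axial = []
          for i in range(n_factors):
              for sign in (-alpha, alpha):
                  pt = np.zeros(n_factors)
                  pt[i] = sign
                  axial.append(pt)
          center = np.zeros((n_center, n_factors))
          return np.vstack([corners, np.array(axial), center])

      coded = central_composite(2)

      # Placeholder ranges for coded -1..+1, e.g. temperature [deg C] and pH;
      # the axial points intentionally fall slightly outside these ranges.
      lows = np.array([35.0, 6.8])
      highs = np.array([37.0, 7.2])
      actual = lows + (coded + 1.0) / 2.0 * (highs - lows)
      print(actual)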

  20. Process-based design of dynamical biological systems

    PubMed Central

    Tanevski, Jovan; Todorovski, Ljupčo; Džeroski, Sašo

    2016-01-01

    The computational design of dynamical systems is an important emerging task in synthetic biology. Given desired properties of the behaviour of a dynamical system, the task of design is to build an in-silico model of a system whose simulated behaviour meets these properties. We introduce a new, process-based, design methodology for addressing this task. The new methodology combines a flexible process-based formalism for specifying the space of candidate designs with multi-objective optimization approaches for selecting the most appropriate among these candidates. We demonstrate that the methodology is general enough to both formulate and solve tasks of designing deterministic and stochastic systems, successfully reproducing plausible designs reported in previous studies and proposing new designs that meet the design criteria, but have not been previously considered. PMID:27686219

  1. Process-based design of dynamical biological systems

    NASA Astrophysics Data System (ADS)

    Tanevski, Jovan; Todorovski, Ljupčo; Džeroski, Sašo

    2016-09-01

    The computational design of dynamical systems is an important emerging task in synthetic biology. Given desired properties of the behaviour of a dynamical system, the task of design is to build an in-silico model of a system whose simulated behaviour meets these properties. We introduce a new, process-based, design methodology for addressing this task. The new methodology combines a flexible process-based formalism for specifying the space of candidate designs with multi-objective optimization approaches for selecting the most appropriate among these candidates. We demonstrate that the methodology is general enough to both formulate and solve tasks of designing deterministic and stochastic systems, successfully reproducing plausible designs reported in previous studies and proposing new designs that meet the design criteria, but have not been previously considered.

  2. Perspectives on the design of safer nanomaterials and manufacturing processes

    PubMed Central

    Geraci, Charles; Heidel, Donna; Sayes, Christie; Hodson, Laura; Schulte, Paul; Eastlake, Adrienne; Brenner, Sara

    2015-01-01

    A concerted effort is being made to insert Prevention through Design principles into discussions of sustainability, occupational safety and health, and green chemistry related to nanotechnology. Prevention through Design is a set of principles that includes solutions to design out potential hazards in nanomanufacturing including the design of nanomaterials, and strategies to eliminate exposures and minimize risks that may be related to the manufacturing processes and equipment at various stages of the lifecycle of an engineered nanomaterial. PMID:26435688

  3. Perspectives on the design of safer nanomaterials and manufacturing processes

    NASA Astrophysics Data System (ADS)

    Geraci, Charles; Heidel, Donna; Sayes, Christie; Hodson, Laura; Schulte, Paul; Eastlake, Adrienne; Brenner, Sara

    2015-09-01

    A concerted effort is being made to insert Prevention through Design principles into discussions of sustainability, occupational safety and health, and green chemistry related to nanotechnology. Prevention through Design is a set of principles, which includes solutions to design out potential hazards in nanomanufacturing including the design of nanomaterials, and strategies to eliminate exposures and minimize risks that may be related to the manufacturing processes and equipment at various stages of the lifecycle of an engineered nanomaterial.

  4. Process Model Construction and Optimization Using Statistical Experimental Design,

    DTIC Science & Technology

    1988-04-01

    Memo No. 88-442, March 1988. Process Model Construction and Optimization Using Statistical Experimental Design, by Emanuel Sachs (Assistant Professor) and George Prueger. Abstract: A methodology is presented for the construction of process models by the combination of physically based mechanistic...

  5. Design and Implementation of a Multimedia DBMS: Complex Query Processing

    DTIC Science & Technology

    1991-09-01

    Design and Implementation of a Multimedia DBMS: Complex Query Processing, by Huseyin Aygun, September 1991. Thesis Advisor: Vincent Y. Lum. Approved for public release. Chapter IV of the thesis addresses the design of complex query processing; Chapter II describes the general architecture of the MDBMS.

  6. Cognitive Design for Learning: Cognition and Emotion in the Design Process

    ERIC Educational Resources Information Center

    Hasebrook, Joachim

    2016-01-01

    We are so used to accepting new technologies as the driver of change and innovation in human-computer interfaces (HCI). In our research we focus on the development of innovations as a design process--or design, for short. We also refer to the entire process of creating innovations and putting them to use as "cognitive processes"--or…

  7. Evaluating two process scale chromatography column header designs using CFD.

    PubMed

    Johnson, Chris; Natarajan, Venkatesh; Antoniou, Chris

    2014-01-01

    Chromatography is an indispensable unit operation in the downstream processing of biomolecules. Scaling of chromatographic operations typically involves a significant increase in the column diameter. At this scale, the flow distribution within a packed bed could be severely affected by the distributor design in process scale columns. Different vendors offer process scale columns with varying design features. The effect of these design features on the flow distribution in packed beds and the resultant effect on column efficiency and cleanability needs to be properly understood in order to prevent unpleasant surprises on scale-up. Computational Fluid Dynamics (CFD) provides a cost-effective means to explore the effect of various distributor designs on process scale performance. In this work, we present a CFD tool that was developed and validated against experimental dye traces and tracer injections. Subsequently, the tool was employed to compare and contrast two commercially available header designs.
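
    As a hedged aside, the kind of tracer-injection data mentioned is often summarized by a moment analysis of the outlet curve; the following Python sketch, which is not taken from the paper, computes a mean residence time and a plate-count style band-broadening metric from a synthetic pulse response.

      # Sketch: moment analysis of a pulse tracer injection (synthetic data).
      import numpy as np

      dt = 1.0                                        # sampling interval, s (assumed uniform)
      t = np.arange(0.0, 600.0 + dt, dt)
      c = np.exp(-0.5 * ((t - 240.0) / 40.0) ** 2)    # synthetic outlet tracer signal

      area = np.sum(c) * dt                           # zeroth moment
      t_mean = np.sum(t * c) * dt / area              # mean residence time (first moment)
      variance = np.sum((t - t_mean) ** 2 * c) * dt / area  # second central moment

      sigma2_theta = variance / t_mean ** 2           # dimensionless variance
      n_plates = 1.0 / sigma2_theta                   # common band-broadening metric
      print(f"mean residence time = {t_mean:.1f} s, plate count ~ {n_plates:.0f}")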

  8. Ensuring competitive advantage with semantic design process management

    SciTech Connect

    Quazzani, A.; Bernard, A.; Bocquet, J.C.

    1996-12-31

    In the field of design assistance, it is important to improve records of design history and the management of the design process. To this end, we propose a modelling approach for the design process that focuses on the representation of semantic actions. We have identified two types of actions: physical design actions focusing on the product (e.g., parameter creation, shaft dimensioning) and management actions that allow management of the process from the planning and control viewpoint (e.g., synchronization actions, resource allocation for a task). A taxonomy of these actions has been established according to several criteria (granularity, fields of action ... ) selected in consideration of our process management interests. Linkage with objectives and rationale is also discussed.

  9. Designing a process for executing projects under an international agreement

    NASA Technical Reports Server (NTRS)

    Mohan, S. N.

    2003-01-01

    Projects executed under an international agreement require special arrangements in order to operate within confines of regulations issued by the State Department and the Commerce Department. In order to communicate enterprise-level guidance and procedural information uniformly to projects based on interpretations that carry the weight of institutional authority, a process was developed. This paper provides a script for designing processes in general, using this particular process for context. While the context is incidental, the method described is applicable to any process in general. The paper will expound on novel features utilized for dissemination of the procedural details over the Internet following such process design.

  10. Context-Aware Design for Process Flexibility and Adaptation

    ERIC Educational Resources Information Center

    Yao, Wen

    2012-01-01

    Today's organizations face continuous and unprecedented changes in their business environment. Traditional process design tools tend to be inflexible and can only support rigidly defined processes (e.g., order processing in the supply chain). This considerably restricts their real-world applications value, especially in the dynamic and…

  11. Design, processing, and testing of LSI arrays for space station

    NASA Technical Reports Server (NTRS)

    Ipri, A. C.

    1976-01-01

    The applicability of a particular process for the fabrication of large scale integrated circuits is described. Test arrays were designed, built, and tested, and then utilized. A set of optimum dimensions for LSI arrays was generated. The arrays were applied to yield improvement through process innovation, and additional applications were suggested in the areas of yield prediction, yield modeling, and process reliability.

  12. Integrating rock mechanics issues with repository design through design process principles and methodology

    SciTech Connect

    Bieniawski, Z.T.

    1996-04-01

    A good designer needs not only knowledge for designing (technical know-how that is used to generate alternative design solutions) but also must have knowledge about designing (appropriate principles and systematic methodology to follow). Concepts such as "design for manufacture" or "concurrent engineering" are widely used in industry. In the field of rock engineering, only limited attention has been paid to the design process, because the design of structures in rock masses presents unique challenges to designers as a result of the uncertainties inherent in the characterization of geologic media. However, a stage has now been reached where we are able to sufficiently characterize rock masses for engineering purposes and identify the rock mechanics issues involved, but are still lacking the engineering design principles and methodology to maximize our design performance. This paper discusses the principles and methodology of the engineering design process directed to integrating site characterization activities with the design, construction and performance of an underground repository. Using the latest information from the Yucca Mountain Project on geology, rock mechanics and starter tunnel design, the current lack of integration is pointed out, and it is shown how rock mechanics issues can be effectively interwoven with repository design through a systematic design process methodology leading to improved repository performance. In essence, the design process is seen as the use of design principles within an integrating design methodology, leading to innovative problem solving. In particular, a new concept of "Design for Constructibility and Performance" is introduced. This is discussed with respect to ten rock mechanics issues identified for repository design and performance.

  13. Fuel ethanol production: process design trends and integration opportunities.

    PubMed

    Cardona, Carlos A; Sánchez, Oscar J

    2007-09-01

    Current fuel ethanol research and development deals with process engineering trends for improving biotechnological production of ethanol. In this work, the key role that process design plays during the development of cost-effective technologies is recognized through the analysis of major trends in process synthesis, modeling, simulation and optimization related to ethanol production. Main directions in techno-economical evaluation of fuel ethanol processes are described as well as some prospecting configurations. The most promising alternatives for compensating ethanol production costs by the generation of valuable co-products are analyzed. Opportunities for integration of fuel ethanol production processes and their implications are underlined. Main ways of process intensification through reaction-reaction, reaction-separation and separation-separation processes are analyzed in the case of bioethanol production. Some examples of energy integration during ethanol production are also highlighted. Finally, some concluding considerations on current and future research tendencies in fuel ethanol production regarding process design and integration are presented.

  14. Using GREENSCOPE for Sustainable Process Design: An Educational Opportunity

    EPA Science Inventory

    Increasing sustainability can be approached through the education of those who design, construct, and operate facilities. As chemical engineers learn elements of process systems engineering, they can be introduced to sustainability concepts. The EPA’s GREENSCOPE methodology and...

  15. Case study: Lockheed-Georgia Company integrated design process

    NASA Technical Reports Server (NTRS)

    Waldrop, C. T.

    1980-01-01

    A case study of the development of an Integrated Design Process is presented. The approach taken in preparing for the development of an integrated design process includes some of the IPAD approaches such as developing a Design Process Model, cataloging Technical Program Elements (TPE's), and examining data characteristics and interfaces between contiguous TPE's. The implementation plan is based on an incremental development of capabilities over a period of time with each step directed toward, and consistent with, the final architecture of a total integrated system. Because of time schedules and different computer hardware, this system will not be the same as the final IPAD release; however, many IPAD concepts will no doubt prove applicable as the best approach. Full advantage will be taken of the IPAD development experience. A scenario that could be typical for many companies, even outside the aerospace industry, in developing an integrated design process for an IPAD-type environment is represented.

  16. SYSTEMATIC PROCEDURE FOR DESIGNING PROCESSES WITH MULTIPLE ENVIRONMENTAL OBJECTIVES

    EPA Science Inventory

    Evaluation of multiple objectives is very important in designing environmentally benign processes. It requires a systematic procedure for solving multiobjective decision-making problems, due to the complex nature of the problems, the need for complex assessments, and complicated ...

  17. Semantic Service Design for Collaborative Business Processes in Internetworked Enterprises

    NASA Astrophysics Data System (ADS)

    Bianchini, Devis; Cappiello, Cinzia; de Antonellis, Valeria; Pernici, Barbara

    Modern collaborating enterprises can be seen as borderless organizations whose processes are dynamically transformed and integrated with the ones of their partners (Internetworked Enterprises, IE), thus enabling the design of collaborative business processes. The adoption of Semantic Web and service-oriented technologies for implementing collaboration in such distributed and heterogeneous environments promises significant benefits. IE can model their own processes independently by using the Software as a Service paradigm (SaaS). Each enterprise maintains a catalog of available services and these can be shared across IE and reused to build up complex collaborative processes. Moreover, each enterprise can adopt its own terminology and concepts to describe business processes and component services. This brings requirements to manage semantic heterogeneity in process descriptions which are distributed across different enterprise systems. To enable effective service-based collaboration, IEs have to standardize their process descriptions and model them through component services using the same approach and principles. For enabling collaborative business processes across IE, services should be designed following an homogeneous approach, possibly maintaining a uniform level of granularity. In the paper we propose an ontology-based semantic modeling approach apt to enrich and reconcile semantics of process descriptions to facilitate process knowledge management and to enable semantic service design (by discovery, reuse and integration of process elements/constructs). The approach brings together Semantic Web technologies, techniques in process modeling, ontology building and semantic matching in order to provide a comprehensive semantic modeling framework.

  18. Rates of reaction and process design data for the Hydrocarb Process

    SciTech Connect

    Steinberg, M.; Kobayashi, Atsushi; Tung, Yuanki

    1992-08-01

    In support of studies for developing the coprocessing of fossil fuels with biomass by the Hydrocarb Process, experimental and process design data are reported. The experimental work includes the hydropyrolysis of biomass and the thermal decomposition of methane in a tubular reactor. The rates of reaction and conversion were obtained at temperature and pressure conditions pertaining to a Hydrocarb Process design. A Process Simulation Computer Model was used to design the process and obtain complete energy and mass balances. Multiple feedstocks, including biomass with natural gas and biomass with coal, were evaluated. Additional feedstocks, including green waste, sewage sludge and digester gas, were also evaluated for a pilot plant unit.

  19. An investigation of radiometer design using digital processing techniques

    NASA Technical Reports Server (NTRS)

    Lawrence, R. W.

    1981-01-01

    The use of digital signal processing techniques in Dicke-switching radiometer design was investigated. The general approach was to develop an analytical model of the existing analog radiometer and identify factors which adversely affect its performance. A digital processor was then proposed to verify the feasibility of using digital techniques to minimize these adverse effects and improve radiometer performance. Analysis and preliminary test results comparing the digital and analog processing approaches to radiometer design are presented.
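
    A purely illustrative Python sketch of digital synchronous detection for a Dicke-switched signal follows; the sample rate, switch rate, noise level, and signal model are assumptions made for this example and are not drawn from the report.

      # Sketch: digital synchronous detection of a Dicke-switched radiometer signal.
      import numpy as np

      fs = 10_000.0        # sample rate, Hz (assumed)
      f_switch = 50.0      # Dicke switch rate, Hz (assumed)
      t = np.arange(0.0, 1.0, 1.0 / fs)

      antenna_temp, reference_temp = 3.0, 1.0           # arbitrary units
      phase = (np.floor(t * 2 * f_switch) % 2 == 0)     # True = antenna half-cycle
      signal = np.where(phase, antenna_temp, reference_temp)
      signal = signal + 0.5 * np.random.default_rng(0).standard_normal(t.size)  # receiver noise

      # Demodulate: the difference of the means over the two switch phases
      # estimates (antenna - reference), largely cancelling gain fluctuations.
      estimate = signal[phase].mean() - signal[~phase].mean()
      print(f"estimated differential temperature ~ {estimate:.2f}")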

  20. METHODS FOR INTEGRATING ENVIRONMENTAL CONSIDERATIONS INTO CHEMICAL PROCESS DESIGN DECISIONS

    EPA Science Inventory

    The objective of this cooperative agreement was to postulate a means by which an engineer could routinely include environmental considerations in day-to-day conceptual design problems; a means that could easily integrate with existing design processes, and thus avoid massive retr...

  1. Designing School Accountability Systems: Towards a Framework and Process.

    ERIC Educational Resources Information Center

    Gong, Brian

    This document presents three different views of accountability to address state needs as their departments of education design, improve, or review their state accountability and reporting systems. The first of three sections presents the system-design decision process as a linear sequence of ten steps from defining the purposes of the…

  2. Risk Informed Design as Part of the Systems Engineering Process

    NASA Technical Reports Server (NTRS)

    Deckert, George

    2010-01-01

    This slide presentation reviews the importance of Risk Informed Design (RID) as an important feature of the systems engineering process. RID is based on the principle that risk is a design commodity such as mass, volume, cost or power. It also reviews Probabilistic Risk Assessment (PRA) as it is used in the product life cycle in the development of NASA's Constellation Program.

  3. Relating Right Brain Studies to the Design Process.

    ERIC Educational Resources Information Center

    Hofland, John

    Intended for teachers of theatrical design who need to describe a design process for their students, this paper begins by giving a brief overview of recent research that has described the different functions of the right and left cerebral hemispheres. It then notes that although the left hemisphere tends to dominate the right hemisphere, it is the…

  4. Applying the ID Process to the Guided Design Teaching Strategy.

    ERIC Educational Resources Information Center

    Coscarelli, William C.; White, Gregory P.

    1982-01-01

    Describes the application of the instructional development process to a teaching technique called Guided Design in a Production-Operations Management course. In Guided Design, students are self-instructed in course content and use class time to apply this knowledge to self-instruction; in-class problem-solving is stressed. (JJD)

  5. PROCESS DESIGN MANUAL: LAND TREATMENT OF MUNICIPAL WASTEWATER

    EPA Science Inventory

    The manual presents a rational procedure for the design of land treatment systems. Slow rate, rapid infiltration, and overland flow processes for the treatment of municipal wastewaters are discussed in detail, and the design concepts and criteria are presented. A two-phased plann...

  6. Design, Control and in Situ Visualization of Gas Nitriding Processes

    PubMed Central

    Ratajski, Jerzy; Olik, Roman; Suszko, Tomasz; Dobrodziej, Jerzy; Michalski, Jerzy

    2010-01-01

    The article presents a complex system for the design, in situ visualization and control of a commonly used surface treatment process: gas nitriding. The computer-aided design concept uses analytical mathematical models and artificial intelligence methods. As a result, it is possible to perform poly-optimization and poly-parametric simulations of the course of the process, combined with visualization of changes in the process parameter values as a function of time, as well as to predict the properties of nitrided layers. For in situ visualization of the growth of the nitrided layer, computer procedures were developed which correlate the direct and differential voltage-time runs of the process result sensor (a magnetic sensor) with the corresponding layer growth stage. These procedures make it possible to combine, during the process, the registered voltage-time runs with the models of the process. PMID:22315536

  7. Sketching in Design Journals: An Analysis of Visual Representations in the Product Design Process

    ERIC Educational Resources Information Center

    Lau, Kimberly; Oehlberg, Lora; Agogino, Alice

    2009-01-01

    This paper explores the sketching behavior of designers and the role of sketching in the design process. Observations from a descriptive study of sketches provided in design journals, characterized by a protocol measuring sketching activities, are presented. A distinction is made between journals that are entirely tangible and those that contain…

  8. Reducing the complexity of the software design process with object-oriented design

    NASA Technical Reports Server (NTRS)

    Schuler, M. P.

    1991-01-01

    Designing software is a complex process. How object-oriented design (OOD), coupled with formalized documentation and tailored object diagramming techniques, can reduce the complexity of the software design process is described and illustrated. The described OOD methodology uses a hierarchical decomposition approach in which parent objects are decomposed into layers of lower-level child objects. A method of tracking the assignment of requirements to design components is also included. Increases in the reusability, portability, and maintainability of the resulting products are also discussed. This method was built on a combination of existing technology, teaching experience, consulting experience, and feedback from design method users. The discussed concepts are applicable to hierarchical OOD processes in general. Emphasis is placed on improving the design process by documenting the details of the procedures involved and incorporating improvements into those procedures as they are developed.
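
    The following short Python sketch illustrates the general idea of hierarchical parent/child decomposition with requirement tracking; the class and requirement names are hypothetical and the sketch is not the methodology described in the paper.

      # Sketch: hierarchical object decomposition with simple requirement traceability.
      from dataclasses import dataclass, field

      @dataclass
      class DesignObject:
          name: str
          requirements: list[str] = field(default_factory=list)   # assigned requirement IDs
          children: list["DesignObject"] = field(default_factory=list)

          def decompose(self, child: "DesignObject") -> "DesignObject":
              # Add a lower-level child object and return it for chaining
              self.children.append(child)
              return child

          def trace(self, indent: int = 0) -> None:
              # Print the decomposition tree and the requirements each object satisfies
              print("  " * indent + f"{self.name}: {', '.join(self.requirements) or '-'}")
              for c in self.children:
                  c.trace(indent + 1)

      system = DesignObject("FlightSoftware", ["REQ-001"])      # hypothetical names
      nav = system.decompose(DesignObject("Navigation", ["REQ-010"]))
      nav.decompose(DesignObject("StateEstimator", ["REQ-011"]))
      system.trace()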

  9. Theory and Practice Meets in Industrial Process Design -Educational Perspective-

    NASA Astrophysics Data System (ADS)

    Aramo-Immonen, Heli; Toikka, Tarja

    A software engineer should see himself as a business process designer in an enterprise resource planning (ERP) system re-engineering project, and software engineers and managers should be in design dialogue. The objective of this paper is to discuss the motives for studying design research in connection with management education, in order to envision and understand the soft human issues in the management context. A second goal is to develop means of practicing social skills between designers and managers. This article explores the affective components of design thinking in the industrial management domain. The conceptual part of this paper discusses the concepts of the network and project economy, creativity, communication, the use of metaphors, and design thinking. Finally, an empirical research plan and first empirical results from design-method experiments among multidisciplinary groups of master-level students of industrial engineering and management and of software engineering are introduced.

  10. Development of the multichannel data processing ASIC design flow

    NASA Astrophysics Data System (ADS)

    Ivanov, P. Y.; Atkin, E. V.; Normanov, D. D.; Shumkin, O. V.

    2017-01-01

    In modern multichannel digital data processing systems, the number of channels ranges from hundreds of thousands to millions. ASICs form the element base of these systems; their most important characteristics are performance, power consumption and occupied area. ASIC design is a time- and labor-consuming process. In order to improve performance and reduce design time, it is proposed to supplement the standard design flow with an optimization stage for the channel parameters, based on the most efficient use of chip area and power consumption.

  11. DESIGNING CHEMICAL PROCESSES WITH OPEN AND FUGITIVE EMISSIONS

    EPA Science Inventory

    Designing a chemical process normally includes aspects of economic and environmental disciplines. In this work we describe methods to quickly and easily evaluate the conomics and potential environmental impacts of a process, with the hydrodealkylation of toluene as an example. Th...

  12. Design requirements for operational earth resources ground data processing

    NASA Technical Reports Server (NTRS)

    Baldwin, C. J.; Bradford, L. H.; Burnett, E. S.; Hutson, D. E.; Kinsler, B. A.; Kugle, D. R.; Webber, D. S.

    1972-01-01

    Realistic tradeoff data and evaluation techniques were studied that permit conceptual design of operational earth resources ground processing systems. Methodology for determining user requirements that utilize the limited information available from users is presented along with definitions of sensor capabilities projected into the shuttle/station era. A tentative method is presented for synthesizing candidate ground processing concepts.

  13. The shielding design process--new plants to decommissioning.

    PubMed

    Jeffries, Graham; Cooper, Andrew; Hobson, John

    2005-01-01

    BNFL has over 25 years' experience of designing nuclear plant for the whole fuel cycle. In the UK, a Nuclear Decommissioning Authority (NDA) is to be set up to ensure that Britain's nuclear legacy is cleaned up safely, securely and cost effectively. The resulting challenges and opportunities for shielding design will be substantial, as the shielding design process was originally devised for the design of new plants. Although its underlying principles are equally applicable to the decommissioning and remediation of old plants, there are many aspects of detailed application that need to adapt to this radically different operating environment. The paper describes both the common issues and the different challenges of shielding design at different operational phases. Sample applications are presented from both new plant and decommissioning projects that illustrate not only the robust nature of the processes being used, but also how they lead to cost-effective solutions making a substantive and appropriate contribution to radiological protection goals.

  14. System Design Support by Optimization Method Using Stochastic Process

    NASA Astrophysics Data System (ADS)

    Yoshida, Hiroaki; Yamaguchi, Katsuhito; Ishikawa, Yoshio

    We propose a new optimization method based on a stochastic process. The characteristic of this method is that it obtains an approximation of the optimum solution as an expected value. In the numerical calculation, a kind of Monte Carlo method is used to obtain the solution because of the stochastic process. The method can also obtain the probability distribution of the design variables, because samples are generated with probability proportional to the evaluation function value. This probability distribution shows the influence of the design variables on the evaluation function value and is very useful information for system design. In this paper, it is shown that the proposed method is useful not only for optimization but also for system design. The flight trajectory optimization problem for a hang-glider is shown as an example of the numerical calculation.
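
    The following Python sketch illustrates the general idea described above: design variables are sampled in proportion to an evaluation function value and the approximate optimum is read off as an expected value. The objective, bounds, and sample sizes are placeholders rather than the paper's hang-glider trajectory problem.

      # Sketch: estimate an "optimum as an expected value" by sampling design
      # variables in proportion to an evaluation-function value (placeholder problem).
      import numpy as np

      rng = np.random.default_rng(1)

      def evaluation(x: np.ndarray) -> np.ndarray:
          # Placeholder objective to be maximized (peak at x = 2.0)
          return np.exp(-(x - 2.0) ** 2)

      samples = rng.uniform(-5.0, 5.0, size=100_000)   # candidate design variables
      weights = evaluation(samples)
      weights /= weights.sum()

      # Resample in proportion to the evaluation value: the resulting distribution
      # shows how strongly each region of the design variable influences the objective.
      accepted = rng.choice(samples, size=10_000, p=weights)
      print("approximate optimum (expected value):", accepted.mean())
      print("spread of influential designs (std):", accepted.std())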

  15. Computer-aided design tools for economical MEMS fabrication processes

    NASA Astrophysics Data System (ADS)

    Schneider, Christian; Priebe, Andreas; Brueck, Rainer; Hahn, Kai

    1999-03-01

    Since the early 1970s, when microsystem technology (MST) was first introduced, an enormous market for MST products has developed: airbag sensors, micro pumps, ink jet nozzles, etc., and the market is just about to take off. Establishing these products at a reasonable price requires mass production. Meanwhile, computer-based design tools have also been developed in order to reduce the expense of MST design. In contrast to other physical design processes, e.g. in microelectronics, MEMS physical design is characterized by the fact that each product requires a tailored sequence of fabrication steps, usually selected from a variety of processing alternatives. The selection from these alternatives is based on economical constraints. Therefore, the design has a strong influence on the money and time spent to take an MST product to market.

  16. Information Flow in the Launch Vehicle Design/Analysis Process

    NASA Technical Reports Server (NTRS)

    Humphries, W. R., Sr.; Holland, W.; Bishop, R.

    1999-01-01

    This paper describes the results of a team effort aimed at defining the information flow between disciplines at the Marshall Space Flight Center (MSFC) engaged in the design of space launch vehicles. The information flow is modeled at a first level and is described using three types of templates: an N x N diagram, discipline flow diagrams, and discipline task descriptions. It is intended to provide engineers with an understanding of the connections between what they do and where it fits in the overall design process of the project. It is also intended to provide design managers with a better understanding of information flow in the launch vehicle design cycle.

  17. A new design concept for an automated peanut processing facility

    SciTech Connect

    Ertas, A.; Tanju, B.T.; Fair, W.T.; Butts, C.

    1996-12-31

    Peanut quality is a major concern in all phases of the peanut industry, from production to manufacturing. Postharvest processing of peanuts can have profound effects on the quality and safety of peanut food products. Curing is a key step in postharvest processing; curing peanuts improperly can significantly reduce quality and result in significant losses to both farmers and processors. The conventional drying system designed in the 1960s is still being used in the processing of peanuts today. The objective of this paper is to design and develop a new automated peanut drying system for dry climates, capable of handling approximately 20 million lbm of peanuts per harvest season.

  18. Design of a distributed CORBA based image processing server.

    PubMed

    Giess, C; Evers, H; Heid, V; Meinzer, H P

    2000-01-01

    This paper presents the design and implementation of a distributed image processing server based on CORBA. Existing image processing tools were encapsulated in a common way within this server. Data exchange and conversion are done automatically inside the server, hiding these tasks from the user. The different image processing tools are visible as one large collection of algorithms and, due to the use of CORBA, are accessible via intranet/Internet.

  19. Design flow for implementing image processing in FPGAs

    NASA Astrophysics Data System (ADS)

    Trakalo, M.; Giles, G.

    2007-04-01

    A design flow for implementing a dynamic gamma algorithm in an FPGA is described. Real-time video processing makes enormous demands on processing resources. An FPGA solution offers some advantages over commercial video chip and DSP implementation alternatives. The traditional approach to FPGA development involves a system engineer designing, modeling and verifying an algorithm and writing a specification. A hardware engineer uses the specification as a basis for coding in VHDL and testing the algorithm in the FPGA with supporting electronics. This process is work intensive and the verification of the image processing algorithm executing on the FPGA does not occur until late in the program. The described design process allows the system engineer to design and verify a true VHDL version of the algorithm, executing in an FPGA. This process yields reduced risk and development time. The process is achieved by using Xilinx System Generator in conjunction with Simulink® from The MathWorks. System Generator is a tool that bridges the gap between the high level modeling environment and the digital world of the FPGA. System Generator is used to develop the dynamic gamma algorithm for the contrast enhancement of a candidate display product. The results of this effort are to increase the dynamic range of the displayed video, resulting in a more useful image for the user.
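
    As a software-level illustration only, the following Python sketch shows an 8-bit gamma-correction lookup table of the kind a contrast-enhancement flow might target; it demonstrates the arithmetic, not the paper's dynamic gamma algorithm or its VHDL/System Generator implementation, and the gamma value and frame data are arbitrary.

      # Sketch: 8-bit gamma-correction lookup table for contrast enhancement.
      import numpy as np

      def gamma_lut(gamma: float, bits: int = 8) -> np.ndarray:
          # Precompute the output level for every possible input level
          levels = 2 ** bits
          x = np.arange(levels) / (levels - 1)
          return np.round((x ** gamma) * (levels - 1)).astype(np.uint8)

      # A "dynamic" variant would recompute gamma per frame from image statistics;
      # here gamma is fixed purely for illustration.
      lut = gamma_lut(0.6)

      frame = np.random.default_rng(2).integers(0, 256, size=(480, 640), dtype=np.uint8)
      enhanced = lut[frame]          # LUT applied per pixel, as hardware would do
      print(frame.mean(), enhanced.mean())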

  20. Space Station Freedom Program preliminary design review process

    NASA Technical Reports Server (NTRS)

    Carlise, R. F.; Adair, Billy

    1989-01-01

    To conduct the Program Requirements Review of the Space Station Freedom, a Preliminary Design Review Board (PDR) has been established. The PDR will assess the preliminary design of the assembled manned base including the assembly process, the launch, and on-orbit stage configuration, the design approach, the on-orbit verification plans, supportability, reliability, safety, interfaces with the NASA infrastructure (the NSTS, TDRSS, and Ground operations) and international partners. Issues such as the coordination of a common interpretation of design requirements, coordination of interfaces, and convergence of design perspectives vs. proper allocation of resources are discussed. The impact of the resolution of the secondary ripple effect of design decisions which may cause programmatic difficulties is also addressed.

  1. POLLUTION PREVENTION IN THE DESIGN OF CHEMICAL PROCESSES USING HIERARCHICAL DESIGN AND SIMULATION

    EPA Science Inventory

    The design of chemical processes is normally an interactive process of synthesis and analysis. When one also desires or needs to limit the amount of pollution generated by the process the difficulty of the task can increase substantially. In this work, we show how combining hier...

  2. Expectation changes and team characteristics in a participatory design process.

    PubMed

    Bazley, Conne Mara; De Jong, Annelise; Vink, Peter

    2012-01-01

    A human factors specialist researched the expectations of a culturally and professionally diverse team throughout a year-long participatory design process for a large processing facility. For a deeper understanding of high-level team expectations and characteristics, the specialist collected data and information through in-situ ethnography and traditional case study methods, personal interviews, and a questionnaire that included a Likert-scale rating of expectation levels. Results showed that expectation levels were rated extremely satisfied for individual team members and for the overall team, both before and during the participatory process. In contrast, the team's expectations of upper management were satisfied before the participatory process but changed to uncertain, then unsatisfied, then extremely unsatisfied during the process. Additionally, the participatory design team exhibited high-level team characteristics including honesty, competence, commitment, communication, creativity, and clear expectations.

  3. Design of freeze-drying processes for pharmaceuticals: practical advice.

    PubMed

    Tang, Xiaolin; Pikal, Michael J

    2004-02-01

    Design of freeze-drying processes is often approached with a "trial and error" experimental plan or, worse yet, the protocol used in the first laboratory run is adopted without further attempts at optimization. Consequently, commercial freeze-drying processes are often neither robust nor efficient. It is our thesis that design of an "optimized" freeze-drying process is not particularly difficult for most products, as long as some simple rules based on well-accepted scientific principles are followed. It is the purpose of this review to discuss the scientific foundations of the freeze-drying process design and then to consolidate these principles into a set of guidelines for rational process design and optimization. General advice is given concerning common stability issues with proteins, but unusual and difficult stability issues are beyond the scope of this review. Control of ice nucleation and crystallization during the freezing step is discussed, and the impact of freezing on the rest of the process and final product quality is reviewed. Representative freezing protocols are presented. The significance of the collapse temperature and the thermal transition, denoted Tg', are discussed, and procedures for the selection of the "target product temperature" for primary drying are presented. Furthermore, guidelines are given for selection of the optimal shelf temperature and chamber pressure settings required to achieve the target product temperature without thermal and/or mass transfer overload of the freeze dryer. Finally, guidelines and "rules" for optimization of secondary drying and representative secondary drying protocols are presented.

  4. Conceptual process designs: Lurgi-Ruhrgas and superior circular grate

    SciTech Connect

    Not Available

    1980-09-01

    Based on preliminary data on retort yields and previous conceptual designs, a Design Basis has been prepared, and an upgrading scheme developed against which all five retorting processes can be evaluated. Licensors for retorting technologies (American Lurgi Corporation for the Lurgi retort, Davy-McKee for the Superior Circular Grate retort, and Union Oil Company for the Union B retort); hydrotreaters (Gulf Oil and Chevron); wastewater treatment units (Chevron); and sulfur plants (Parsons) have been contacted for data related to their processes. Preliminary balances for the Lurgi and Superior processes have been developed and will be compared against the vendor information when received. A preliminary design basis is presented. This presents design assumptions and conditions to be used in developing the process designs, heat and material balances, and process flow diagrams for all cases. The shale oil upgrading scheme selected to be used in all evaluations consists of delayed coking the 850°F plus fraction from the shale oil, and hydrotreating all virgin and coker naphthas and gas oils in separate hydrotreaters. This scheme was selected because it is simple and each of the units has proven to be reliable in refining conventional crude oils. Also, this upgrading scheme is not expected to penalize any specific retort system. The material and utility balances, along with process flow diagrams for Case I, the Lurgi-Ruhrgas process are given. In this case, 46,500 bpsd of 29.4°API upgraded shale oil are produced. The Superior Circular Grate material and utility balances and process flow diagrams are also given. The liquid product from this case is 40,500 bpsd of 27.4°API upgraded shale oil.

  5. Prodrugs design based on inter- and intramolecular chemical processes.

    PubMed

    Karaman, Rafik

    2013-12-01

    This review provides the reader with a concise overview of the majority of prodrug approaches, with emphasis on modern approaches to prodrug design. The chemical approach catalyzed by metabolic enzymes, which is considered the most widely used of all approaches for minimizing undesirable drug physicochemical properties, is discussed. Part of this review sheds light on the use of molecular orbital methods such as DFT, semiempirical and ab initio methods for the design of novel prodrugs. This novel prodrug approach implies prodrug design based on enzyme models that were utilized for mimicking enzyme catalysis. The computational approach exploited for prodrug design involves molecular orbital and molecular mechanics (DFT, ab initio, and MM2) calculations and correlations between experimental and calculated values of intramolecular processes that were experimentally studied, in order to assign the factors determining the reaction rates in certain processes and thus better understand how enzymes might exert their extraordinary catalysis.

  6. Product and Process Improvement Using Mixture-Process Variable Designs and Robust Optimization Techniques

    SciTech Connect

    Sahni, Narinder S.; Piepel, Gregory F.; Naes, Tormod

    2009-04-01

    The quality of an industrial product depends on the raw material proportions and the process variable levels, both of which need to be taken into account in designing a product. This article presents a case study from the food industry in which both kinds of variables were studied by combining a constrained mixture experiment design and a central composite process variable design. Based on the natural structure of the situation, a split-plot experiment was designed and models involving the raw material proportions and process variable levels (separately and combined) were fitted. Combined models were used to study: (i) the robustness of the process to variations in raw material proportions, and (ii) the robustness of the raw material recipes with respect to fluctuations in the process variable levels. Further, the expected variability in the robust settings was studied using the bootstrap.
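
    The following minimal Python sketch illustrates the general structure of such a combined design by crossing a few constrained mixture blends with a two-level process-variable factorial; the blends, constraints, and levels are placeholders and do not reproduce the article's split-plot design.

      # Sketch: cross constrained mixture blends with a two-level process factorial.
      import itertools
      import numpy as np

      # Hypothetical 3-component mixture blends (proportions sum to 1, each >= 0.1)
      blends = np.array([
          [0.8, 0.1, 0.1],
          [0.1, 0.8, 0.1],
          [0.1, 0.1, 0.8],
          [1 / 3, 1 / 3, 1 / 3],
      ])

      # Two process variables at coded low/high levels
      process_levels = list(itertools.product([-1, 1], repeat=2))

      # Enumerate every candidate run: each blend at each process setting
      runs = [(tuple(b), pv) for b in blends for pv in process_levels]
      for blend, pv in runs:
          print("mixture:", blend, "process settings:", pv)
      print(len(runs), "candidate runs")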

  7. Process-induced bias: a study of resist design, device node, illumination conditions, and process implications

    NASA Astrophysics Data System (ADS)

    Carcasi, Michael; Scheer, Steven; Fonseca, Carlos; Shibata, Tsuyoshi; Kosugi, Hitoshi; Kondo, Yoshihiro; Saito, Takashi

    2009-03-01

    Critical dimension uniformity (CDU) has both across-field and across-wafer components. CD error generated by across-wafer etching non-uniformity and other process variations can have a significant impact on CDU. To correct these across-wafer systematic variations, compensation by exposure dose and/or post exposure bake (PEB) temperature has been proposed. These compensation strategies often focus on a specific structure without evaluating how process compensation impacts the CDU of all structures to be printed in a given design. In one previous study, limited to a single resist and minimal coater/developer and scanner variations, the authors evaluated the relative merits of across-wafer dose and PEB temperature compensation on process-induced CD bias and CDU. For the process studied, it was found that using PEB temperature to control CD across the wafer was preferable to using dose compensation. In another previous study, the impact of resist design was explored to understand how resist design, as well as coater/developer and scanner processing, impacts process-induced bias (PIB). The previous PIB studies were limited to a single illumination case and explored the effect of PIB only on line/space structures. The goal of this work is to additionally understand how illumination design and mask design, as well as resist design and coater/developer and scanner processing, impact PIB and OPC integrity.

  8. The design of bearing processing technology and fixture

    NASA Astrophysics Data System (ADS)

    Liu, Sibo

    2017-03-01

    This paper presents the design of a bearing's processing technology and fixtures. The main task is to work out the processing procedure for the half-fine milling under 36 mm and the Φ18 holes of the bearing, and to write the process cards. The part is a casting, small and of simple structure. Moreover, the accuracy requirements of the hole machining are higher than those of the surfaces, so a surface-first processing order is adopted. A special fixture is used in each working procedure; for drilling the Φ18 holes, hydraulic clamping is adopted, which is simple, convenient and able to meet the requirements.

  9. Design of launch systems using continuous improvement process

    NASA Technical Reports Server (NTRS)

    Brown, Richard W.

    1995-01-01

    The purpose of this paper is to identify a systematic process for improving ground operations for future launch systems. This approach is based on the Total Quality Management (TQM) continuous improvement process. While the continuous improvement process is normally identified with making incremental changes to an existing system, it can be used on new systems if they use past experience as a knowledge base. In the case of the Reusable Launch Vehicle (RLV), Space Shuttle operations provide many lessons. The TQM methodology used for this paper is borrowed from the United States Air Force 'Quality Air Force' program. A general overview of the continuous improvement process is given, with concentration on the formulation phase. During this phase, critical analyses are conducted to determine the strategy and goals for the remaining development process. These analyses include analyzing the mission from the customer's point of view, developing an operations concept for the future, assessing current capabilities, and determining the gap to be closed between current capabilities and future needs and requirements. A brief analysis of the RLV, relative to the Space Shuttle, is used to illustrate the concept. Using the continuous improvement design concept has many advantages, including a customer-oriented process that will develop a more marketable product, and better integration of operations and systems during the design phase. However, the use of TQM techniques will require changes, including more discipline in the design process and more emphasis on data gathering for operational systems. The benefits will far outweigh the additional effort.

  10. Bates solar industrial process-steam application: preliminary design review

    SciTech Connect

    Not Available

    1980-01-07

    The design is analyzed for a parabolic trough solar process heat system for a cardboard corrugation fabrication facility in Texas. The program is briefly reviewed, including an analysis of the plant and process. The performance modeling for the system is discussed, and the solar system structural design, collector subsystem, heat transport and distribution subsystem are analyzed. The selection of the heat transfer fluid, and ullage and fluid maintenance are discussed, and the master control system and data acquisition system are described. Testing of environmental degradation of materials is briefly discussed. A brief preliminary cost analysis is included. (LEW)

  11. Waste receiving and processing facility module 1, detailed design report

    SciTech Connect

    Not Available

    1993-10-01

    WRAP 1 baseline documents which guided the technical development of the Title design included: (a) A/E Statement of Work (SOW) Revision 4C: This DOE-RL contractual document specified the workscope, deliverables, schedule, method of performance, and reference criteria for the Title design preparation. (b) Functional Design Criteria (FDC) Revision 1: This DOE-RL technical criteria document specified the overall operational criteria for the facility. The document was at Revision 0 at the beginning of the design and advanced to Revision 1 during the course of the Title design. (c) Supplemental Design Requirements Document (SDRD) Revision 3: This baseline criteria document prepared by WHC for DOE-RL augments the FDC by providing further definition of the process, operational safety, and facility requirements to the A/E for guidance in preparing the design. The document was at a very preliminary stage at the onset of Title design and was revised in concert with the results of the engineering studies performed to resolve the numerous technical issues the project faced when Title I was initiated, as well as by requirements established during the course of the Title II design.

  12. Time-Course of Muscle Mass Loss, Damage, and Proteolysis in Gastrocnemius following Unloading and Reloading: Implications in Chronic Diseases

    PubMed Central

    Chacon-Cabrera, Alba; Lund-Palau, Helena; Gea, Joaquim; Barreiro, Esther

    2016-01-01

    Background Disuse muscle atrophy is a major comorbidity in patients with chronic diseases including cancer. We sought to explore the kinetics of molecular mechanisms shown to be involved in muscle mass loss throughout time in a mouse model of disuse muscle atrophy and recovery following immobilization. Methods Body and muscle weights, grip strength, muscle phenotype (fiber type composition and morphometry and muscle structural alterations), proteolysis, contractile proteins, systemic troponin I, and mitochondrial content were assessed in gastrocnemius of mice exposed to periods (1, 2, 3, 7, 15 and 30 days) of non-invasive hindlimb immobilization (plastic splint, I cohorts) and in those exposed to reloading for different time-points (1, 3, 7, 15, and 30 days, R cohorts) following a seven-day period of immobilization. Groups of control animals were also used. Results Compared to non-exposed controls, muscle weight, limb strength, slow- and fast-twitch cross-sectional areas, mtDNA/nDNA, and myosin content were decreased in mice of I cohorts, whereas tyrosine release, ubiquitin-proteasome activity, muscle injury and systemic troponin I levels were increased. Gastrocnemius reloading following splint removal attenuated the loss of muscle mass, the fiber atrophy, and the injury, improved strength, myosin content, and mtDNA/nDNA, and reduced ubiquitin-proteasome activity and proteolysis. Conclusions A consistent program of molecular and cellular events leading to reduced gastrocnemius muscle mass and mitochondrial content, reduced strength, enhanced proteolysis, and injury was seen in this non-invasive mouse model of disuse muscle atrophy. Reloading of the muscle following removal of the splint significantly attenuated the alterations seen during unloading, characterized by a specific kinetic profile of molecular events involved in muscle regeneration. These findings have implications in patients with chronic diseases including cancer in whom physical activity may be severely compromised. PMID

  13. Process Improvement Through Tool Integration in Aero-Mechanical Design

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2010-01-01

    Emerging capabilities in commercial design tools promise to significantly improve the multi-disciplinary and inter-disciplinary design and analysis coverage for aerospace mechanical engineers. This paper explores the analysis process for two example problems of a wing and flap mechanical drive system and an aircraft landing gear door panel. The examples begin with the design solid models and include various analysis disciplines such as structural stress and aerodynamic loads. Analytical methods include CFD, multi-body dynamics with flexible bodies and structural analysis. Elements of analysis data management, data visualization and collaboration are also included.

  14. Advanced computational research in materials processing for design and manufacturing

    SciTech Connect

    Zacharia, T.

    1994-12-31

    The computational requirements for the design and manufacture of automotive components have increased dramatically as manufacturers pursue automobiles with three times the fuel mileage. Automotive component design systems are becoming increasingly reliant on structural analysis, requiring larger and more complex analyses, more three-dimensional analyses, larger model sizes, and routine consideration of transient and non-linear effects. Such analyses must be performed rapidly to minimize delays in the design and development process, which drives the need for parallel computing. This paper briefly describes advanced computational research in superplastic forming and automotive crashworthiness.

  15. Rethinking behavioral health processes by using design for six sigma.

    PubMed

    Lucas, Anthony G; Primus, Kelly; Kovach, Jamison V; Fredendall, Lawrence D

    2015-02-01

    Clinical evidence-based practices are strongly encouraged and commonly utilized in the behavioral health community. However, evidence-based practices that are related to quality improvement processes, such as Design for Six Sigma, are often not used in behavioral health care. This column describes the unique partnership formed between a behavioral health care provider in the greater Pittsburgh area, a nonprofit oversight and monitoring agency for behavioral health services, and academic researchers. The authors detail how the partnership used the multistep process outlined in Design for Six Sigma to completely redesign the provider's intake process. Implementation of the redesigned process increased access to care, decreased bad debt and uncollected funds, and improved cash flow--while consumer satisfaction remained high.

  16. Visualization System Requirements for Data Processing Pipeline Design and Optimization.

    PubMed

    von Landesberger, Tatiana; Fellner, Dieter; Ruddle, Roy

    2016-08-25

    The rising quantity and complexity of data creates a need to design and optimize data processing pipelines - the set of data processing steps, parameters and algorithms that perform operations on the data. Visualization can support this process but, although there are many examples of systems for visual parameter analysis, there remains a need to systematically assess users' requirements and match those requirements to exemplar visualization methods. This article presents a new characterization of the requirements for pipeline design and optimization. This characterization is based on both a review of the literature and first-hand assessment of eight application case studies. We also match these requirements with exemplar functionality provided by existing visualization tools. Thus, we provide end-users and visualization developers with a way of identifying functionality that addresses data processing problems in an application. We also identify seven future challenges for visualization research that are not met by the capabilities of today's systems.

  17. The engineering design process as a model for STEM curriculum design

    NASA Astrophysics Data System (ADS)

    Corbett, Krystal Sno

    Engaging pedagogies have proven effective in promoting deep learning for science, technology, engineering, and mathematics (STEM) students. In many cases, academic institutions have shown a desire to improve education by implementing more engaging techniques in the classroom. The research framework established in this dissertation has been governed by the axiom that students should obtain a deep understanding of fundamental topics while being motivated to learn through engaging techniques. This research lays a foundation for future analysis and modeling of the curriculum design process where specific educational research questions can be considered using standard techniques. Further, a clear curriculum design process is a key step towards establishing an axiomatic approach for engineering education. A danger is that poor implementation of engaging techniques will counteract the intended effects. Poor implementation might provide students with a "fun" project, but not the desired deep understanding of the fundamental STEM content. Knowing that proper implementation is essential, this dissertation establishes a model for STEM curriculum design based on the well-established engineering design process. Using this process as a perspective to model curriculum design allows for a structured approach. Thus, the framework for STEM curriculum design established here provides a guided approach for seamless integration of fundamental topics and engaging pedagogies. The main steps, or phases, in engineering design are: Problem Formulation, Solution Generation, Solution Analysis, and Solution Implementation. Layering engineering design with education curriculum theory, this dissertation establishes a clear framework for curriculum design. Through ethnographic engagement by this researcher, several overarching themes emerged from the creation of curricula using the design process. The application of the framework to specific curricula was part of this

  18. Design of self-processing antimicrobial peptides for plant protection.

    PubMed

    Powell, W A; Catranis, C M; Maynard, C A

    2000-08-01

    Small antimicrobial peptides are excellent candidates for inclusion in self-processing proteins that could be used to confer pathogen resistance in transgenic plants. Antimicrobial peptides as small as 22 amino acids in length have been designed to incorporate the residual amino acids left from protein processing by the tobacco etch virus (TEV) NIa protease. Also, by minimizing the length of these peptides and the number of highly hydrophobic residues, haemolytic activity was reduced without affecting the peptide's antimicrobial activity.

  19. Motivating the Notion of Generic Design within Information Processing Theory: The Design Problem Space.

    ERIC Educational Resources Information Center

    Goel, Vinod; Pirolli, Peter

    The notion of generic design, while it has been around for 25 years, is not often articulated, especially within Newell and Simon's (1972) Information Processing Theory framework. Design is merely lumped in with other forms of problem solving activity. Intuitively it is felt that there should be a level of description of the phenomenon which…

  20. Which Events Can Cause Iteration in Instructional Design? An Empirical Study of the Design Process

    ERIC Educational Resources Information Center

    Verstegen, D. M. L.; Barnard, Y. F.; Pilot, A.

    2006-01-01

    Instructional design is not a linear process: designers have to weigh the advantages and disadvantages of alternative solutions, taking into account different kinds of conflicting and changing constraints. To make sure that they eventually choose the most optimal one, they have to keep on collecting information, reconsidering continuously whether…

  1. Design Considerations for the Construction and Operation of Flour Milling Facilities. Part II: Process Design Considerations

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Flour milling facilities have been the cornerstone of agricultural processing for centuries. Like most agri-industrial production facilities, flour milling facilities have a number of unique design requirements. Design information, to date, has been limited. In an effort to summarize state of the ...

  2. Optimization of Forming Processes in Microstructure Sensitive Design

    NASA Astrophysics Data System (ADS)

    Garmestani, H.; Li, D. S.

    2004-06-01

    Optimization of the forming processes from initial microstructures of raw materials to desired microstructures of final products is an important topic in materials design. The processing path model proposed in this study gives an explicit mathematical description of how the microstructure evolves during thermomechanical processing. Based on a conservation principle in the orientation space (originally proposed by Bunge), this methodology is independent of the underlying deformation mechanisms. The evolution of the texture coefficients is modeled using a texture evolution matrix calculated from the experimental results. For the same material and the same processing method, the texture evolution matrix is the same; it does not change with the initial texture. This processing path model provides functions of processing paths and streamlines.
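
    As a purely notational sketch (symbols assumed here, not reproduced from the paper), the linear processing-path relation described above can be written as

        \[
        \mathbf{c}^{(n+1)} = \mathbf{M}\,\mathbf{c}^{(n)}
        \quad\Longrightarrow\quad
        \mathbf{c}^{(n)} = \mathbf{M}^{\,n}\,\mathbf{c}^{(0)},
        \]

    where \(\mathbf{c}^{(n)}\) collects the texture coefficients after the n-th processing increment and \(\mathbf{M}\) is the texture evolution matrix fitted from experiment; because \(\mathbf{M}\) depends only on the material and the processing method, the same matrix propagates any initial texture \(\mathbf{c}^{(0)}\) along its processing path.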

  3. A FRAMEWORK TO DESIGN AND OPTIMIZE CHEMICAL FLOODING PROCESSES

    SciTech Connect

    Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori

    2005-07-01

    The goal of this proposed research is to provide an efficient and user-friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user-friendly interface to identify the variables that have the most impact on oil recovery using the concept of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objective of Task 1 is to develop three primary modules representing reservoir, chemical, and well data. The modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables and to develop the response surface maps that identify the significant variables from each module. The objective of Task 3 is to develop an economic model designed specifically for the chemical processes targeted in this proposal and to interface it with the UTCHEM production output. Task 4 covers the validation of the framework and simulations of oil reservoirs to screen, design, and optimize the chemical processes.
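
    As a loose illustration of the screening step described above (Python; this is not the cited framework), the sketch below pushes a two-level factorial design through a stand-in for the UTCHEM-plus-economics evaluation and fits a simple response-surface model; the variable names, coded ranges, and stand-in surface are all hypothetical.

        # Minimal sketch: two-level factorial screening with a least-squares
        # response-surface fit. The stand-in function marks where a UTCHEM run
        # plus the economic model would be invoked; all values are hypothetical.
        import numpy as np
        from itertools import product

        names = ["surfactant_conc", "polymer_conc", "slug_size_pv"]
        runs = np.array(list(product([-1.0, 1.0], repeat=3)))     # coded factor levels

        def stand_in_value(x):
            s, p, v = x
            return 5.0 + 1.8 * s + 0.6 * p + 1.1 * v - 0.9 * s * v   # arbitrary surface

        y = np.array([stand_in_value(x) for x in runs])

        # Design matrix: intercept, main effects, and two-factor interactions.
        cols = [np.ones(len(runs))] + [runs[:, i] for i in range(3)] + \
               [runs[:, i] * runs[:, j] for i in range(3) for j in range(i + 1, 3)]
        X = np.column_stack(cols)
        coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)

        labels = ["intercept"] + names + [f"{names[i]} x {names[j]}" for i in range(3) for j in range(i + 1, 3)]
        for lab, c in zip(labels, coeffs):
            print(f"{lab:32s} {c:+.3f}")   # large |coefficient| -> influential variable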

  4. A Framework to Design and Optimize Chemical Flooding Processes

    SciTech Connect

    Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori

    2006-08-31

    The goal of this proposed research is to provide an efficient and user-friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user-friendly interface to identify the variables that have the most impact on oil recovery using the concept of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objective of Task 1 is to develop three primary modules representing reservoir, chemical, and well data. The modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables and to develop the response surface maps that identify the significant variables from each module. The objective of Task 3 is to develop an economic model designed specifically for the chemical processes targeted in this proposal and to interface it with the UTCHEM production output. Task 4 covers the validation of the framework and simulations of oil reservoirs to screen, design, and optimize the chemical processes.

  5. A FRAMEWORK TO DESIGN AND OPTIMIZE CHEMICAL FLOODING PROCESSES

    SciTech Connect

    Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori

    2004-11-01

    The goal of this proposed research is to provide an efficient and user-friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user-friendly interface to identify the variables that have the most impact on oil recovery using the concept of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objective of Task 1 is to develop three primary modules representing reservoir, chemical, and well data. The modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables and to develop the response surface maps that identify the significant variables from each module. The objective of Task 3 is to develop an economic model designed specifically for the chemical processes targeted in this proposal and to interface it with the UTCHEM production output. Task 4 covers the validation of the framework and simulations of oil reservoirs to screen, design, and optimize the chemical processes.

  6. Noise control, sound, and the vehicle design process

    NASA Astrophysics Data System (ADS)

    Donavan, Paul

    2005-09-01

    For many products, noise and sound are viewed as necessary evils that need to be dealt with in order to bring the product successfully to market. They are generally not product ``exciters'' although some vehicle manufacturers do tune and advertise specific sounds to enhance the perception of their products. In this paper, influencing the design process for the ``evils,'' such as wind noise and road noise, is considered in more detail. There are three ingredients to successfully dealing with the evils in the design process. The first of these is knowing how excess noise affects the end customer in a tangible manner and how that affects customer satisfaction and, ultimately, sales. The second is having and delivering the knowledge of what is required of the design to achieve a satisfactory or even better level of noise performance. The third ingredient is having the commitment of the designers to incorporate the knowledge into their part, subsystem or system. In this paper, the elements of each of these ingredients are discussed in some detail and the attributes of a successful design process are enumerated.

  7. The Role of Dialogic Processes in Designing Career Expectations

    ERIC Educational Resources Information Center

    Bangali, Marcelline; Guichard, Jean

    2012-01-01

    This article examines the role played by dialogic processes in the designing or redesigning of future expectations during a career guidance intervention. It discusses a specific method ("Giving instruction to a double") developed and used during career counseling sessions with two recent doctoral graduates. It intends both to help them outline or…

  8. Ingenuity in Action: Connecting Tinkering to Engineering Design Processes

    ERIC Educational Resources Information Center

    Wang, Jennifer; Werner-Avidon, Maia; Newton, Lisa; Randol, Scott; Smith, Brooke; Walker, Gretchen

    2013-01-01

    The Lawrence Hall of Science, a science center, seeks to replicate real-world engineering at the "Ingenuity in Action" exhibit, which consists of three open-ended challenges. These problems encourage children to engage in engineering design processes and problem-solving techniques through tinkering. We observed and interviewed 112…

  9. USING GENETIC ALGORITHMS TO DESIGN ENVIRONMENTALLY FRIENDLY PROCESSES

    EPA Science Inventory

    Genetic algorithm calculations are applied to the design of chemical processes to achieve improvements in environmental and economic performance. By finding the set of Pareto (i.e., non-dominated) solutions one can see how different objectives, such as environmental and economic ...
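
    The central operation named above, extracting the Pareto (non-dominated) set, can be sketched minimally as follows (Python; the objective values are hypothetical and both objectives, e.g. cost and an environmental impact score, are taken as minimized).

        # Minimal sketch: filter candidate designs down to the Pareto (non-dominated) set.
        def dominates(a, b):
            # a dominates b if it is no worse in every objective and better in at least one.
            return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

        def pareto_front(points):
            return [p for p in points if not any(dominates(q, p) for q in points if q is not p)]

        candidates = [(3.2, 40.0), (2.8, 55.0), (3.5, 38.0), (4.0, 60.0), (2.9, 44.0)]
        print(pareto_front(candidates))   # the trade-off designs a GA would retain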

  10. Developing 21st Century Process Skills through Project Design

    ERIC Educational Resources Information Center

    Yoo, Jeong-Ju; MacDonald, Nora M.

    2014-01-01

    The goal of this paper is to illustrate how the promotion of 21st Century process skills can be used to enhance student learning and workplace skill development: thinking, problem solving, collaboration, communication, leadership, and management. As an illustrative case, fashion merchandising and design students conducted research for a…

  11. INCORPORATING INDUSTRIAL ECOLOGY INTO HIERARCHICAL CHEMICAL PROCESS DESIGN

    EPA Science Inventory

    Incorporating Industrial Ecology into Hierarchical Chemical Process Design: Determining Targets for the Exchange of Waste

    The exchange of waste to be used as a recycled feed has long been encouraged by practitioners of industrial ecology. Industrial ecology is a field t...

  12. GREENING OF OXIDATION CATALYSIS THROUGH IMPROVED CATALYST AND PROCESS DESIGN

    EPA Science Inventory


    Greening of Oxidation Catalysis Through Improved Catalysts and Process Design
    Michael A. Gonzalez*, Thomas Becker, and Raymond Smith

    United State Environmental Protection Agency, Office of Research and Development, National Risk Management Research Laboratory, 26 W...

  13. Examining Teacher Thinking: Constructing a Process to Design Curricular Adaptations.

    ERIC Educational Resources Information Center

    Udvari-Solner, Alice

    1996-01-01

    This description of a curricular adaptation decision-making process focuses on tenets of reflective practice as teachers design instruction for students in heterogeneous classrooms. A case example illustrates how an elementary teaching team transformed lessons to accommodate a wide range of learners in a multiage first- and second-grade classroom.…

  14. A SYSTEMATIC PROCEDURE FOR DESIGNING PROCESSES WITH MULTIPLE ENVIRONMENTAL OBJECTIVES

    EPA Science Inventory

    Evaluation and analysis of multiple objectives are very important in designing environmentally benign processes. They require a systematic procedure for solving multi-objective decision-making problems due to the complex nature of the problems and the need for complex assessment....

  15. Portfolio Assessment on Chemical Reactor Analysis and Process Design Courses

    ERIC Educational Resources Information Center

    Alha, Katariina

    2004-01-01

    Assessment determines what students regard as important: if a teacher wants to change students' learning, he/she should change the methods of assessment. This article describes the use of portfolio assessment on five courses dealing with chemical reactor and process design during the years 1999-2001. Although the use of portfolio was a new…

  16. Experiential Learning: A Course Design Process for Critical Thinking

    ERIC Educational Resources Information Center

    Hamilton, Janet G.; Klebba, Joanne M.

    2011-01-01

    This article describes a course design process to improve the effectiveness of using experiential learning techniques to foster critical thinking skills. The authors examine prior research to identify essential dimensions of experiential learning in relation to higher order thinking. These dimensions provide key insights for the selection of…

  17. A Process Chart to Design Experiential Learning Projects

    ERIC Educational Resources Information Center

    Zhu, Suning; Wu, Yun; Sankar, Chetan S.

    2016-01-01

    A high-impact practice is to incorporate experiential learning projects when teaching difficulty subject matters so as to enhance students' understanding and interest in the course content. But, there is limited research on how to design and execute such projects. Therefore, we propose a framework based on the processes described by the Project…

  18. DESIGNING EFFICIENT, ECONOMIC AND ENVIRONMENTALLY FRIENDLY CHEMICAL PROCESSES

    EPA Science Inventory

    A catalytic reforming process has been studied using hierarchical design and simulation calculations. Approximations for the fugitive emissions indicate which streams allow the most value to be lost and which have the highest potential environmental impact. One can use this inform...

  19. Quality Control through Design and Process: Gambrel Roof Truss Challenge

    ERIC Educational Resources Information Center

    Ward, Dell; Jones, James

    2011-01-01

    Customers determine whether a product fulfills their needs or satisfies them. "Quality control", then, is the process of finding out what the customer wants, along with designing, producing, delivering, and servicing the product--and ultimately satisfying the customer's expectations. For many years, people considered a product to be of good…

  20. DESIGNING EFFICIENT, ECONOMIC AND ENVIRONMENTALLY FRIENDLY CHEMICAL PROCESSES

    EPA Science Inventory

    A catalytic reforming process has been studied using hierarchical design and simulation calculations. Approximations for the fugitive emissions indicate which streams allow the most value to be lost and which have the highest potential environmental impact. One can use this infor...

  1. Design Exploration of Engineered Materials, Products, and Associated Manufacturing Processes

    NASA Astrophysics Data System (ADS)

    Shukla, Rishabh; Kulkarni, Nagesh H.; Gautham, B. P.; Singh, Amarendra K.; Mistree, Farrokh; Allen, Janet K.; Panchal, Jitesh H.

    2015-01-01

    In the past few years, ICME-related research has been directed towards the study of multi-scale materials design. However, relatively little has been reported on model-based methods that are of relevance to industry for the realization of engineered materials, products, and associated industrial manufacturing processes. Computational models used in the realization of engineered materials and products are fraught with uncertainty, have different levels of fidelity, are incomplete and are even likely to be inaccurate. In light of this, we adopt a robust design strategy that facilitates the exploration of the solution space thereby providing decision support to a design engineer. In this paper, we describe a foundational construct embodied in our method for design exploration, namely, the compromise Decision Support Problem. We introduce a problem that we are using to establish the efficacy of our method. It involves the integrated design of steel and gears, traversing the chain of steel making, mill production, and evolution of the material during these processes, and linking this to the mechanical design and manufacture of the gear. We provide an overview of our method to determine the operating set points for the ladle, tundish and caster operations necessary to manufacture steel of a desired set of properties. Finally, we highlight the efficacy of our method.
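
    For orientation, the compromise Decision Support Problem named above is, in its generic goal-programming form (notation assumed here, not reproduced from the paper), roughly

        \[
        \begin{aligned}
        \text{Find}\quad & \mathbf{x},\; d_i^-,\, d_i^+ \\
        \text{Satisfy}\quad & A_i(\mathbf{x}) + d_i^- - d_i^+ = G_i,\quad d_i^- d_i^+ = 0,\quad d_i^-, d_i^+ \ge 0, \quad i = 1,\dots,m, \\
        & g_j(\mathbf{x}) \ge 0,\quad j = 1,\dots,p, \qquad \mathbf{x}^{\min} \le \mathbf{x} \le \mathbf{x}^{\max}, \\
        \text{Minimize}\quad & Z = \sum_{i=1}^{m} w_i \,(d_i^- + d_i^+),
        \end{aligned}
        \]

    where \(\mathbf{x}\) are the design variables (for example, ladle, tundish, and caster set points), \(A_i\) and \(G_i\) are the achieved value and target of goal \(i\), \(d_i^\pm\) are under- and over-achievement deviation variables, and \(g_j\) are the constraints; a robust variant typically adds goals on both the mean and the spread of the achieved properties.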

  2. Design characteristics for facilities which process hazardous particulate

    SciTech Connect

    Abeln, S.P.; Creek, K.; Salisbury, S.

    1998-12-01

    Los Alamos National Laboratory is establishing a research and processing capability for beryllium. The unique properties of beryllium, including light weight, rigidity, thermal conductivity, heat capacity, and nuclear properties make it critical to a number of US defense and aerospace programs. Concomitant with the unique engineering properties are the health hazards associated with processing beryllium in a particulate form and the potential for worker inhalation of aerosolized beryllium. Beryllium has the lowest airborne standard for worker protection compared to all other nonradioactive metals by more than an order of magnitude. This paper describes the design characteristics of the new beryllium facility at Los Alamos as they relate to protection of the workforce. Design characteristics to be reviewed include: facility layout, support systems to minimize aerosol exposure and spread, and a detailed review of the ventilation system design for general room air cleanliness and extraction of particulate at the source.

  3. Tunable photonic filters: a digital signal processing design approach.

    PubMed

    Binh, Le Nguyen

    2009-05-20

    Digital signal processing techniques are used for synthesizing tunable optical filters with variable bandwidth and center reference frequency, including the tunability of low-pass, high-pass, bandpass, and bandstop optical filters. Potential applications of such filters are discussed, and the design techniques and properties of recursive digital filters are outlined. The basic filter structures, namely, the first-order all-pole optical filter (FOAPOF) and the first-order all-zero optical filter (FOAZOF), are described, and finally the design process of tunable optical filters and the designs of the second-order Butterworth low-pass, high-pass, bandpass, and bandstop tunable optical filters are presented. Indeed, we identify that the all-zero and all-pole networks correspond to the well-known optical principles of interference and resonance, respectively. It is thus very straightforward to implement tunable optical filters, which is a unique feature.
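
    As a generic digital-signal-processing aside (Python; this is not the paper's photonic implementation), the first-order all-zero and all-pole sections it builds on have the difference equations y[n] = x[n] + b*x[n-1] and y[n] = x[n] + a*y[n-1]; the sketch below evaluates their frequency responses with assumed coefficients.

        # Minimal sketch: frequency response of a first-order all-zero (FIR,
        # interference-like) and all-pole (IIR, resonance-like) section on the
        # unit circle. Coefficients are hypothetical; |a| < 1 for stability.
        import numpy as np

        b, a = 0.9, 0.9
        w = np.linspace(0.0, np.pi, 512)        # normalized angular frequency
        z = np.exp(1j * w)

        H_all_zero = 1 + b * z**-1              # H(z) = 1 + b z^-1
        H_all_pole = 1 / (1 - a * z**-1)        # H(z) = 1 / (1 - a z^-1)

        print("all-zero |H| min/max:", round(float(abs(H_all_zero).min()), 3),
              round(float(abs(H_all_zero).max()), 3))
        print("all-pole |H| min/max:", round(float(abs(H_all_pole).min()), 3),
              round(float(abs(H_all_pole).max()), 3))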

  4. Architectural design of heterogeneous metallic nanocrystals--principles and processes.

    PubMed

    Yu, Yue; Zhang, Qingbo; Yao, Qiaofeng; Xie, Jianping; Lee, Jim Yang

    2014-12-16

    CONSPECTUS: Heterogeneous metal nanocrystals (HMNCs) are a natural extension of simple metal nanocrystals (NCs), but as a research topic, they have been much less explored until recently. HMNCs are formed by integrating metal NCs of different compositions into a common entity, similar to the way atoms are bonded to form molecules. HMNCs can be built to exhibit an unprecedented architectural diversity and complexity by programming the arrangement of the NC building blocks ("unit NCs"). The architectural engineering of HMNCs involves the design and fabrication of the architecture-determining elements (ADEs), i.e., unit NCs with precise control of shape and size, and their relative positions in the design. Similar to molecular engineering, where structural diversity is used to create more property variations for application explorations, the architectural engineering of HMNCs can similarly increase the utility of metal NCs by offering a suite of properties to support multifunctionality in applications. The architectural engineering of HMNCs calls for processes and operations that can execute the design. Some enabling technologies already exist in the form of classical micro- and macroscale fabrication techniques, such as masking and etching. These processes, when used singly or in combination, are fully capable of fabricating nanoscopic objects. What is needed is a detailed understanding of the engineering control of ADEs and the translation of these principles into actual processes. For simplicity of execution, these processes should be integrated into a common reaction system and yet retain independence of control. The key to architectural diversity is therefore the independent controllability of each ADE in the design blueprint. The right chemical tools must be applied under the right circumstances in order to achieve the desired outcome. In this Account, after a short illustration of the infinite possibility of combining different ADEs to create HMNC design

  5. Analog integrated circuits design for processing physiological signals.

    PubMed

    Li, Yan; Poon, Carmen C Y; Zhang, Yuan-Ting

    2010-01-01

    Analog integrated circuits (ICs) designed for processing physiological signals are important building blocks of wearable and implantable medical devices used for health monitoring or restoring lost body functions. Due to the nature of physiological signals and the corresponding application scenarios, the ICs designed for these applications should have low power consumption, low cutoff frequency, and low input-referred noise. In this paper, techniques for designing the analog front-end circuits with these three characteristics will be reviewed, including subthreshold circuits, bulk-driven MOSFETs, floating gate MOSFETs, and log-domain circuits to reduce power consumption; methods for designing fully integrated low cutoff frequency circuits; as well as chopper stabilization (CHS) and other techniques that can be used to achieve a high signal-to-noise performance. Novel applications using these techniques will also be discussed.

  6. Optimal design of upstream processes in biotransformation technologies.

    PubMed

    Dheskali, Endrit; Michailidi, Katerina; de Castro, Aline Machado; Koutinas, Apostolis A; Kookos, Ioannis K

    2017-01-01

    In this work a mathematical programming model for the optimal design of the bioreaction section of biotechnological processes is presented. Equations for the estimation of the equipment cost derived from a recent publication by the US National Renewable Energy Laboratory (NREL) are also summarized. The cost-optimal design of process units and the optimal scheduling of their operation can be obtained using the proposed formulation, which has been implemented in software available from the journal web page or the corresponding author. The proposed optimization model can be used to quantify the effects of decisions taken at a lab scale on the industrial scale process economics. It is of paramount importance to note that this can be achieved at the early stage of the development of a biotechnological project. Two case studies are presented that demonstrate the usefulness and potential of the proposed methodology.

  7. Information processing systems, reasoning modules, and reasoning system design methods

    DOEpatents

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2016-08-23

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.
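
    A minimal structural sketch of the arrangement described above, with class and type names invented for illustration (Python; this is not code from the patents):

        # Minimal structural sketch: reasoning modules registered per ontology
        # classification type, each processing only the abstractions of its type
        # from the working-memory semantic graph. All names are illustrative.
        from collections import defaultdict

        class Abstraction:
            def __init__(self, individual, classification_type):
                self.individual = individual
                self.classification_type = classification_type

        class ReasoningSystem:
            def __init__(self):
                self.modules = defaultdict(list)          # classification type -> modules

            def register(self, classification_type, module):
                self.modules[classification_type].append(module)

            def process(self, semantic_graph):
                for abstraction in semantic_graph:        # working memory
                    for module in self.modules[abstraction.classification_type]:
                        module(abstraction)

        system = ReasoningSystem()
        system.register("Person", lambda a: print("person module:", a.individual))
        system.register("Account", lambda a: print("account module:", a.individual))
        system.process([Abstraction("alice", "Person"), Abstraction("acct-42", "Account")])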

  8. Information processing systems, reasoning modules, and reasoning system design methods

    DOEpatents

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2015-08-18

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  9. Information processing systems, reasoning modules, and reasoning system design methods

    SciTech Connect

    Hohimer, Ryan E; Greitzer, Frank L; Hampton, Shawn D

    2014-03-04

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  10. The Gains Design Process: How to do Structured Design of User Interfaces in Any Software Environment

    NASA Astrophysics Data System (ADS)

    Lindeman, Martha J.

    This paper describes a user-interaction design process created and used by a consultant to solve two challenges: (1) how to decrease the need for changes in the user interface by subsequent system releases without doing big design up-front and (2) how to apply a structured user-interaction design process no matter when brought into a project or what software methodology was being used. The four design levels in the process parallel Beck and Fowler's four planning levels described in their book Planning Extreme Programming. The design process is called "GAINS" because the user-interaction designer has only Attraction, Information and Navigation to connect users' Goals with the project sponsors' criteria for Success. Thus there are five questions, one for each letter of the acronym GAINS, asked at each of four levels of design. The first two design levels, Rough Plan and Big Plan, focus on business-process actions and objects that define users' goals. The next two levels, Release Planning and Iteration Planning, focus on the user interface objects that support the tasks necessary to achieve those goals. Release Planning identifies the displays the user sees for each goal included in that release, and also the across-display navigation for the proposed functionality. Iteration Planning focuses at a lower level of interaction, such as the within-display navigation among controls. For a voice system, the word "sees" would be changed to "hears," but the design process and the levels of focus are the same for user interfaces that are vision output (e.g., GUIs), voice output (e.g., VRs), or multimodal.

  11. RATES OF REACTION AND PROCESS DESIGN DATA FOR THE HYDROCARB PROCESS

    EPA Science Inventory

    The report provides experimental and process design data in support of studies for developing the coprocessing of fossil fuels with biomass by the Hydrocarb process. The experimental work includes the hydropyrolysis of biomass and the thermal decomposition of methane in a 2.44 m ...

  12. Process-induced bias: a study of resist design and process implications

    NASA Astrophysics Data System (ADS)

    Fonseca, Carlos; Scheer, Steven; Carcasi, Michael; Shibata, Tsuyoshi; Otsuka, Takahisa

    2008-03-01

    Critical dimension uniformity (CDU) has both across field and across wafer components. CD error generated by across wafer etching non-uniformity and other process variations can have a significant impact on CDU. To correct these across wafer variations, compensation by exposure dose and/or PEB temperature has been proposed. These compensation strategies often focus on a specific structure without evaluating how process compensation impacts the CDU of all structures to be printed in a given design. In a previous study, the authors evaluated the relative merits of across wafer dose and PEB temperature compensation on the process induced CD bias and CDU. For the process studied, both metrics demonstrated that using PEB temperature to control across wafer CD variation was preferable to using dose compensation. The previous study was limited to a single resist, and variations to track and scanner processing were kept to a minimum. Further examination of additional resist materials has indicated that significant variation in dose and PEB temperature induced CD biases exists from material to material. It is the goal of this work to understand how resist design, as well as track and scanner processing, impacts process induced bias (PIB). This is accomplished by analyzing full resist models for a range of resists that exhibit different dose and PEB temperature PIB behavior. From these models, the primary resist design contributors to PIB are isolated. A sensitivity analysis of the primary resist design as well as track and scanner processing effects will also be simulated and presented.

  13. Penetrator reliability investigation and design exploration : from conventional design processes to innovative uncertainty-capturing algorithms.

    SciTech Connect

    Martinez-Canales, Monica L.; Heaphy, Robert; Gramacy, Robert B.; Taddy, Matt; Chiesa, Michael L.; Thomas, Stephen W.; Swiler, Laura Painton; Hough, Patricia Diane; Lee, Herbert K. H.; Trucano, Timothy Guy; Gray, Genetha Anne

    2006-11-01

    This project focused on research and algorithmic development in optimization under uncertainty (OUU) problems driven by earth penetrator (EP) designs. While taking into account uncertainty, we addressed three challenges in current simulation-based engineering design and analysis processes. The first challenge required leveraging small local samples, already constructed by optimization algorithms, to build effective surrogate models. We used Gaussian Process (GP) models to construct these surrogates. We developed two OUU algorithms using 'local' GPs (OUU-LGP) and one OUU algorithm using 'global' GPs (OUU-GGP) that appear competitive with or better than current methods. The second challenge was to develop a methodical design process based on multi-resolution, multi-fidelity models. We developed a Multi-Fidelity Bayesian Auto-regressive process (MF-BAP). The third challenge involved the development of tools that are computationally feasible and accessible. We created MATLAB® and initial DAKOTA implementations of our algorithms.
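
    As a rough illustration of the surrogate-building step (Python; this is not the OUU-LGP/OUU-GGP code), the sketch below fits a Gaussian-process surrogate with a squared-exponential kernel to a handful of "local" samples and predicts at a new point; the kernel length scale, noise level, and sampled function are assumed.

        # Minimal sketch: Gaussian-process surrogate over a few local samples.
        import numpy as np

        def sqexp(xa, xb, length=0.4):
            # Squared-exponential (RBF) kernel matrix between two 1-D sample sets.
            d = xa[:, None] - xb[None, :]
            return np.exp(-0.5 * (d / length) ** 2)

        rng = np.random.default_rng(0)
        x_train = np.array([0.1, 0.3, 0.5, 0.7, 0.9])              # small local sample
        y_train = np.sin(3.0 * x_train) + 0.01 * rng.standard_normal(x_train.size)

        K = sqexp(x_train, x_train) + 1e-6 * np.eye(x_train.size)  # jitter for stability
        alpha = np.linalg.solve(K, y_train)

        x_new = np.array([0.6])
        k_star = sqexp(x_new, x_train)
        mean = k_star @ alpha                                      # posterior mean
        var = sqexp(x_new, x_new) - k_star @ np.linalg.solve(K, k_star.T)
        print("prediction at x=0.6:", round(float(mean[0]), 3),
              "+/-", round(float(np.sqrt(var[0, 0])), 3))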

  14. A Review of the Design Process for Implantable Orthopedic Medical Devices

    PubMed Central

    Aitchison, G.A; Hukins, D.W.L; Parry, J.J; Shepherd, D.E.T; Trotman, S.G

    2009-01-01

    The design process for medical devices is highly regulated to ensure the safety of patients. This paper will present a review of the design process for implantable orthopedic medical devices. It will cover the main stages of feasibility, design reviews, design, design verification, manufacture, design validation, design transfer and design changes. PMID:19662153

  15. CHO gene expression profiling in biopharmaceutical process analysis and design.

    PubMed

    Schaub, Jochen; Clemens, Christoph; Schorn, Peter; Hildebrandt, Tobias; Rust, Werner; Mennerich, Detlev; Kaufmann, Hitto; Schulz, Torsten W

    2010-02-01

    Increase in both productivity and product yields in biopharmaceutical process development with recombinant protein producing mammalian cells can be mainly attributed to the advancements in cell line development, media, and process optimization. Only recently, genome-scale technologies enable a system-level analysis to elucidate the complex biomolecular basis of protein production in mammalian cells promising an increased process understanding and the deduction of knowledge-based approaches for further process optimization. Here, the use of gene expression profiling for the analysis of a low titer (LT) and high titer (HT) fed batch process using the same IgG producing CHO cell line was investigated. We found that gene expression (i) significantly differed in HT versus LT process conditions due to differences in applied chemically defined, serum-free media, (ii) changed over the time course of the fed batch processes, and that (iii) both metabolic pathways and 14 biological functions such as cellular growth or cell death were affected. Furthermore, detailed analysis of metabolism in a standard process format revealed the potential use of transcriptomics for rational media design as is shown for the case of lipid metabolism where the product titer could be increased by about 20% based on a lipid modified basal medium. The results demonstrate that gene expression profiling can be an important tool for mammalian biopharmaceutical process analysis and optimization.

  16. A symbolic methodology to improve disassembly process design.

    PubMed

    Rios, Pedro; Blyler, Leslie; Tieman, Lisa; Stuart, Julie Ann; Grant, Ed

    2003-12-01

    Millions of end-of-life electronic components are retired annually due to the proliferation of new models and their rapid obsolescence. The recovery of resources such as plastics from these goods requires their disassembly. The time required for each disassembly and its associated cost is defined by the operator's familiarity with the product design and its complexity. Since model proliferation serves to complicate an operator's learning curve, it is worthwhile to investigate the benefits to be gained in a disassembly operator's preplanning process. Effective disassembly process design demands the application of green engineering principles, such as those developed by Anastas and Zimmerman (Environ. Sci. Technol. 2003, 37, 94A-101A), which include regard for product complexity, structural commonality, separation energy, material value, and waste prevention. This paper introduces the concept of design symbols to help the operator more efficiently survey product complexity with respect to location and number of fasteners to remove a structure that is common to all electronics: the housing. With a sample of 71 different computers, printers, and monitors, we demonstrate that appropriate symbols reduce the total disassembly planning time by 13.2 min. Such an improvement could well make efficient the separation of plastic that would otherwise be destined for waste-to-energy or landfill. The symbolic methodology presented may also improve Design for Recycling and Design for Maintenance and Support.

  17. A Taguchi study of the aeroelastic tailoring design process

    NASA Technical Reports Server (NTRS)

    Bohlmann, Jonathan D.; Scott, Robert C.

    1991-01-01

    A Taguchi study was performed to determine the important players in the aeroelastic tailoring design process and to find the best composition of the optimization's objective function. The Wing Aeroelastic Synthesis Procedure (TSO) was used to ascertain the effects that factors such as composite laminate constraints, roll effectiveness constraints, and built-in wing twist and camber have on the optimum, aeroelastically tailored wing skin design. The results show the Taguchi method to be a viable engineering tool for computational inquiries, and provide some valuable lessons about the practice of aeroelastic tailoring.
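
    For readers unfamiliar with the approach, the sketch below (Python) shows the basic factor-effect calculation behind a Taguchi-style study on a two-level, three-factor array with a stand-in response; the factor names and numbers are hypothetical and are not the TSO results.

        # Minimal sketch: main-effect ranking of design factors from a two-level
        # designed experiment (factors, levels, and responses are hypothetical).
        import itertools

        factors = ["laminate_constraint", "roll_effectiveness_constraint", "built_in_twist"]
        runs = list(itertools.product([-1, +1], repeat=3))      # full 2^3 array

        def stand_in_response(levels):
            a, b, c = levels                                    # placeholder for a tailoring run
            return 10.0 + 2.0 * a - 0.5 * b + 1.2 * c + 0.3 * a * c

        responses = [stand_in_response(r) for r in runs]

        for i, name in enumerate(factors):
            hi = sum(y for r, y in zip(runs, responses) if r[i] == +1) / 4
            lo = sum(y for r, y in zip(runs, responses) if r[i] == -1) / 4
            print(f"{name}: main effect = {hi - lo:+.2f}")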

  18. Operation and design of selected industrial process heat field tests

    SciTech Connect

    Kearney, D. W.

    1981-02-01

    The DOE program of solar industrial process heat field tests has shown solar energy to be compatible with numerous industrial needs. Both the operational projects and the detailed designs of systems that are not yet operational have resulted in valuable insights into design and hardware practice. Typical of these insights are the experiences discussed for the four projects reviewed. Future solar IPH systems should benefit greatly not only from the availability of present information, but also from the wealth of operating experience from projects due to start up in 1981.

  19. Remote Maintenance Design Guide for Compact Processing Units

    SciTech Connect

    Draper, J.V.

    2000-07-13

    Oak Ridge National Laboratory (ORNL) Robotics and Process Systems Division (RPSD) personnel have extensive experience working with remotely operated and maintained systems. These systems require expert knowledge in teleoperation, human factors, telerobotics, and other robotic devices so that remote equipment may be manipulated, operated, serviced, surveyed, and moved about in a hazardous environment. The RPSD staff has a wealth of experience in this area, including knowledge in the broad topics of human factors, modular electronics, modular mechanical systems, hardware design, and specialized tooling. Examples of projects that illustrate and highlight RPSD's unique experience in remote systems design and application include the following: (1) design of remote shear and remote dissolver systems in support of U.S. Department of Energy (DOE) fuel recycling research and nuclear power missions; (2) building remotely operated mobile systems for metrology and characterizing hazardous facilities in support of remote operations within those facilities; (3) construction of modular robotic arms, including the Laboratory Telerobotic Manipulator, which was designed for the National Aeronautics and Space Administration (NASA), and the Advanced ServoManipulator, which was designed for the DOE; (4) design of remotely operated laboratories, including chemical analysis and biochemical processing laboratories; (5) construction of remote systems for environmental cleanup and characterization, including underwater, buried waste, underground storage tank (UST) and decontamination and dismantlement (D&D) applications. Remote maintenance has played a significant role in fuel reprocessing because of combined chemical and radiological contamination. Furthermore, remote maintenance is expected to play a strong role in future waste remediation. The compact processing units (CPUs) being designed for use in underground waste storage tank remediation are examples of improvements in systems processing

  20. Automating the packing heuristic design process with genetic programming.

    PubMed

    Burke, Edmund K; Hyde, Matthew R; Kendall, Graham; Woodward, John

    2012-01-01

    The literature shows that one-, two-, and three-dimensional bin packing and knapsack packing are difficult problems in operational research. Many techniques, including exact, heuristic, and metaheuristic approaches, have been investigated to solve these problems and it is often not clear which method to use when presented with a new instance. This paper presents an approach which is motivated by the goal of building computer systems which can design heuristic methods. The overall aim is to explore the possibilities for automating the heuristic design process. We present a genetic programming system to automatically generate a good quality heuristic for each instance. It is not necessary to change the methodology depending on the problem type (one-, two-, or three-dimensional knapsack and bin packing problems), and it therefore has a level of generality unmatched by other systems in the literature. We carry out an extensive suite of experiments and compare with the best human designed heuristics in the literature. Note that our heuristic design methodology uses the same parameters for all the experiments. The contribution of this paper is to present a more general packing methodology than those currently available, and to show that, by using this methodology, it is possible for a computer system to design heuristics which are competitive with the human designed heuristics from the literature. This represents the first packing algorithm in the literature able to claim human competitive results in such a wide variety of packing domains.
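
    In hyper-heuristic systems of this kind, genetic programming evolves a scoring expression that decides where each item goes; the sketch below (Python) shows only the application side for one-dimensional bin packing, with a hand-written score function standing in for an evolved expression (item sizes are hypothetical).

        # Minimal sketch: online 1-D bin packing driven by a scoring heuristic.
        # The score function is a hand-written stand-in for a GP-evolved expression.
        CAPACITY = 1.0

        def score(free_space, item):
            # Stand-in heuristic: prefer the feasible bin the item fills most tightly.
            leftover = free_space - item
            return -leftover if leftover >= 0 else float("-inf")   # -inf = infeasible

        def pack(items):
            bins = []                                              # free space per open bin
            for item in items:
                scores = [score(fs, item) for fs in bins]
                if scores and max(scores) > float("-inf"):
                    best = scores.index(max(scores))
                    bins[best] -= item
                else:
                    bins.append(CAPACITY - item)                   # open a new bin
            return len(bins)

        items = [0.42, 0.25, 0.61, 0.33, 0.57, 0.18, 0.71, 0.29]
        print("bins used:", pack(items))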

  1. Design and implementation of a distributed Complex Event Processing system

    NASA Astrophysics Data System (ADS)

    Li, Yan; Shang, Yanlei

    2017-01-01

    Making use of the massive streams of events from sources such as sensors and bank transactions and extracting valuable information from them is of significant importance. Complex Event Processing (CEP), a method of detecting complex events from streams of simple events, provides a solution for processing data in real time fast and efficiently. However, a single-node CEP system cannot satisfy the requirements of processing massive event streams from multitudinous event sources. Therefore, this article designs a distributed CEP system, which combines Siddhi, a CEP engine, and Storm, a distributed real-time computation architecture. This system can construct topologies automatically based on the event streams and execution plans provided by users and process the event streams in parallel. Compared with a single-node complex event processing system, the distributed system achieves better performance.
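
    Independently of the Siddhi/Storm implementation described above, the essence of CEP is matching a pattern over a stream of simple events; the single-node sketch below (Python) flags a complex event when a large withdrawal follows a login from a new location within a time window. The event fields, pattern, and thresholds are invented for illustration.

        # Minimal single-node sketch of complex event detection (not the
        # distributed Siddhi/Storm system from the article).
        WINDOW_S = 60

        def detect(events):
            pending = {}                                   # account -> time of suspicious login
            alerts = []
            for e in sorted(events, key=lambda e: e["t"]):
                if e["type"] == "login_new_location":
                    pending[e["account"]] = e["t"]
                elif e["type"] == "withdrawal" and e["amount"] >= 1000:
                    t0 = pending.get(e["account"])
                    if t0 is not None and e["t"] - t0 <= WINDOW_S:
                        alerts.append((e["account"], e["t"]))
            return alerts

        stream = [
            {"t": 10, "type": "login_new_location", "account": "A1"},
            {"t": 35, "type": "withdrawal", "account": "A1", "amount": 2500},
            {"t": 40, "type": "withdrawal", "account": "B7", "amount": 3000},
        ]
        print(detect(stream))                              # -> [('A1', 35)]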

  2. Virtual Welded - Joint Design Integrating Advanced Materials and Processing Technology

    SciTech Connect

    Yang, Zhishang; Ludewig, Howard W.; Babu, S. Suresh

    2005-06-30

    Virtual Welded-Joint Design, a systematic modeling approach, has been developed in this project to predict the relationship of welding process, microstructure, properties, residual stress, and the ultimate weld fatigue strength. This systematic modeling approach was applied to the welding of high strength steel. A special welding wire was developed in this project to introduce compressive residual stress at the weld toe. The results from both modeling and experiments demonstrated that a more than 10x fatigue life improvement can be achieved in high strength steel welds by the combination of compressive residual stress from the special welding wire and the desired weld bead shape from a unique welding process. The results indicate a technology breakthrough in the design of lightweight and high fatigue performance welded structures using high strength steels.

  3. Applications of fault tree analysis to the design process

    NASA Astrophysics Data System (ADS)

    Youngblood, R. W.

    1988-07-01

    Fault tree analysis of a system can provide a complete characterization of system failure modes, i.e., what combinations of component failures can give rise to system failure. This can be applied to the design process at several levels: (1) confirmatory analysis, in which a fault tree development is used to verify design adequacy, (2) importance analysis, in which fault tree analysis is used to highlight system vulnerabilities, and (3) design optimization, in which fault tree analysis is used to pick the least expensive configuration from a collection of possibilities satisfying a given constraint. Experience shows that the complexity of real systems warrants the systematic and structured development of fault trees for systems whose failure can have severe consequences.
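
    The "complete characterization of system failure modes" mentioned above amounts to computing minimal cut sets; the sketch below (Python) expands a small, invented AND/OR fault tree into its minimal cut sets.

        # Minimal sketch: minimal cut sets of a small AND/OR fault tree.
        # A cut set is a set of basic events whose joint failure causes the top event.
        def cut_sets(node):
            kind = node[0]
            if kind == "basic":
                return [frozenset([node[1]])]
            child_sets = [cut_sets(child) for child in node[1]]
            if kind == "OR":                        # union of the children's cut sets
                sets = [cs for group in child_sets for cs in group]
            else:                                   # AND: combine one cut set from each child
                sets = [frozenset()]
                for group in child_sets:
                    sets = [s | cs for s in sets for cs in group]
            return [s for s in sets if not any(other < s for other in sets)]   # keep minimal sets

        pump = ("basic", "pump_fails")
        valve = ("basic", "valve_stuck")
        power = ("basic", "power_loss")
        top = ("OR", [("AND", [pump, valve]), power])   # fails if (pump AND valve) or power

        for cs in cut_sets(top):
            print(sorted(cs))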

  4. IFR fuel cycle process equipment design environment and objectives

    SciTech Connect

    Rigg, R.H.

    1993-03-01

    Argonne National Laboratory (ANL) is refurbishing the hot cell facility originally constructed with the EBR-II reactor. When refurbishment is complete, the facility will demonstrate the complete fuel cycle for current generation high burnup metallic fuel elements. These are sodium bonded, stainless steel clad fuel pins of U-Zr or U-Pu-Zr composition typical of the fuel type proposed for a future Integral Fast Reactor (IFR) design. To the extent possible, the process equipment is being built at full commercial scale, and the facility is being modified to incorporate current DOE facility design requirements and modern remote maintenance principles. The current regulatory and safety environment has affected the design of the fuel fabrication equipment, most of which will be described in greater detail in subsequent papers in this session.

  5. IFR fuel cycle process equipment design environment and objectives

    SciTech Connect

    Rigg, R.H.

    1993-01-01

    Argonne National Laboratory (ANL) is refurbishing the hot cell facility originally constructed with the EBR-II reactor. When refurbishment is complete, the facility will demonstrate the complete fuel cycle for current generation high burnup metallic fuel elements. These are sodium bonded, stainless steel clad fuel pins of U-Zr or U-Pu-Zr composition typical of the fuel type proposed for a future Integral Fast Reactor (IFR) design. To the extent possible, the process equipment is being built at full commercial scale, and the facility is being modified to incorporate current DOE facility design requirements and modern remote maintenance principles. The current regulatory and safety environment has affected the design of the fuel fabrication equipment, most of which will be described in greater detail in subsequent papers in this session.

  6. Epigallocatechin-3-gallate increases autophagy signaling in resting and unloaded plantaris muscles but selectively suppresses autophagy protein abundance in reloaded muscles of aged rats.

    PubMed

    Takahashi, Hideyuki; Suzuki, Yutaka; Mohamed, Junaith S; Gotoh, Takafumi; Pereira, Suzette L; Alway, Stephen E

    2017-03-07

    We have previously found that Epigallocatechin-3-gallate (EGCg), an abundant catechin in green tea, reduced apoptotic signaling and improved muscle recovery in response to reloading after hindlimb suspension (HS). In this study, we investigated if EGCg altered autophagy signaling in skeletal muscle of old rats in response to HS or reloading after HS. Fischer 344×Brown Norway inbred rats (age 34 months) were given 1 ml/day of purified EGCg (50 mg/kg body weight), or the same sample volume of the vehicle by gavage. One group of animals received HS for 14 days and the second group of rats received 14 days of HS, then the HS was removed and they were allowed to recover by ambulating normally around the cage for two weeks. EGCg decreased the expression of a small number of autophagy genes in control muscles, but it increased the expression of other autophagy genes (e.g., ATG16L2, SNCA, TM9SF1, Pink1, PIM-2) and HS did not attenuate these increases. HS increased Beclin1, ATG7 and LC3-II/I protein abundance in hindlimb muscles. Relative to vehicle treatment, EGCg treatment had greater ATG12 protein abundance (35.8%, P<0.05), but decreased Beclin1 protein levels (-101.1%, P<0.05) after HS. However, in reloaded muscles, EGCg suppressed Beclin1 and LC3-II/I protein abundance as compared to vehicle treated muscles. EGCg appeared to "prime" autophagy signaling before and enhance autophagy gene expression and protein levels during unloading in muscles of aged rats, perhaps to improve the clearance of damaged organelles. However, EGCg suppressed autophagy signaling after reloading, potentially to increase the recovery of hindlimb muscle mass and function after loading is restored.

  7. Improving Tools and Processes in Mechanical Design Collaboration

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2009-01-01

    Cooperative product development projects in the aerospace and defense industry are held hostage to high cost and risk due to poor alignment of collaborative design tools and processes. This impasse can be broken if companies will jointly develop implementation approaches and practices in support of high value working arrangements. The current tools can be used to better advantage in many situations and there is reason for optimism that tool vendors will provide significant support.

  8. Energy codes and the building design process: Opportunities for improvement

    SciTech Connect

    Sandahl, L.J.; Shankle, D.L.; Rigler, E.J.

    1994-05-01

    The Energy Policy Act (EPAct), passed by Congress in 1992, requires states to adopt building energy codes for new commercial buildings that meet or exceed the American Society of Heating, Refrigerating, and Air Conditioning Engineers (ASHRAE) and Illuminating Engineers Society of North America (IES) Standard 90.1-1989 by October 24, 1994. In response to EPAct many states will be adopting a state-wide energy code for the first time. Understanding the role of stakeholders in the building design process is key to the successful implementation of these codes. In 1993, the Pacific Northwest Laboratory (PNL) conducted a survey of architects and designers to determine how much they know about energy codes, to what extent energy-efficiency concerns influence the design process, and how they convey information about energy-efficient designs and products to their clients. Findings of the PNL survey, together with related information from a survey by the American Institute of Architects (AIA) and other reports, are presented in this report. This information may be helpful for state and utility energy program managers and others who will be involved in promoting the adoption and implementation of state energy codes that meet the requirements of EPAct.

  9. High performance cluster system design for remote sensing data processing

    NASA Astrophysics Data System (ADS)

    Shi, Yuanli; Shen, Wenming; Xiong, Wencheng; Fu, Zhuo; Xiao, Rulin

    2012-10-01

    During recent years, cluster systems have played an increasingly important role in high-performance computing architecture design; they are cost-effective and efficient parallel computing systems able to satisfy specific computational requirements of the earth and space sciences communities. This paper presents a powerful cluster system built by the Satellite Environment Center, Ministry of Environmental Protection of China, designed to process massive remote sensing data from the HJ-1 satellites automatically every day. The architecture of this cluster system, including the hardware device layer, network layer, OS/FS layer, middleware layer and application layer, is described. To verify the performance of the cluster system, image registration was tested on one scene from the HJ-1 CCD sensor. The registration experiments show that the system effectively improves the efficiency of data processing and can respond rapidly in applications that demand it, such as wildland fire monitoring and tracking, oil spill monitoring, military target detection, etc. Further work will focus on the comprehensive parallel design and implementation of remote sensing data processing.

  10. Space Station Freedom pressurized element interior design process

    NASA Technical Reports Server (NTRS)

    Hopson, George D.; Aaron, John; Grant, Richard L.

    1990-01-01

    The process used to develop the on-orbit working and living environment of the Space Station Freedom has some unique constraints and conditions to satisfy. The goal is to provide maximum efficiency and utilization of the available space, in on-orbit, zero-G conditions, that establish a comfortable, productive, and safe working environment for the crew. The Space Station Freedom on-orbit living and working space can be divided into support for three major functions: (1) operations, maintenance, and management of the station; (2) conduct of experiments, both directly in the laboratories and remotely for experiments outside the pressurized environment; and (3) crew related functions for food preparation, housekeeping, storage, personal hygiene, health maintenance, zero-G environment conditioning, individual privacy, and rest. The process used to implement these functions, the major requirements driving the design, unique considerations and constraints that influence the design, and summaries of the analysis performed to establish the current configurations are described. Sketches and pictures showing the layout and internal arrangement of the Nodes, U.S. Laboratory and Habitation modules identify the current design relationships of the common and unique station housekeeping subsystems. The crew facilities, work stations, food preparation and eating areas (galley and wardroom), exercise/health maintenance configurations, and waste management and personal hygiene area configurations are shown. U.S. Laboratory experiment facilities and maintenance work areas planned to support the wide variety and mixtures of life science and materials processing payloads are described.

  11. Process Cost Modeling for Multi-Disciplinary Design Optimization

    NASA Technical Reports Server (NTRS)

    Bao, Han P.; Freeman, William (Technical Monitor)

    2002-01-01

    For early design concepts, the conventional approach to cost is normally some kind of parametric weight-based cost model. There is now ample evidence that this approach can be misleading and inaccurate. By the nature of its development, a parametric cost model requires historical data and is valid only if the new design is analogous to those for which the model was derived. Advanced aerospace vehicles have no historical production data and are nowhere near the vehicles of the past. Using an existing weight-based cost model would only lead to errors and distortions of the true production cost. This report outlines the development of a process-based cost model in which the physical elements of the vehicle are costed according to a first-order dynamics model. This theoretical cost model, first advocated by early work at MIT, has been expanded to cover the basic structures of an advanced aerospace vehicle. Elemental costs based on the geometry of the design can be summed up to provide an overall estimation of the total production cost for a design configuration. This capability to directly link any design configuration to realistic cost estimation is a key requirement for high payoff MDO problems. Another important consideration in this report is the handling of part or product complexity. Here the concept of cost modulus is introduced to take into account variability due to different materials, sizes, shapes, precision of fabrication, and equipment requirements. The most important implication of the development of the proposed process-based cost model is that different design configurations can now be quickly related to their cost estimates in a seamless calculation process easily implemented on any spreadsheet tool. In successive sections, the report addresses the issues of cost modeling as follows. First, an introduction is presented to provide the background for the research work. Next, a quick review of cost estimation techniques is made with the intention to
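
    As a hedged sketch of the bookkeeping implied by a process-based cost model, the example below sums elemental costs computed from each element's size and a fabrication rate, scaled by a cost modulus for material, shape and precision effects; every element, rate and modulus value is an assumption invented for illustration, not a coefficient from this report.

      # Sketch of a process-based cost roll-up: each structural element gets a
      # first-order cost from its size and a fabrication rate, scaled by a cost
      # modulus capturing material, shape and precision effects. All numbers
      # below are illustrative assumptions, not values from the report.
      def element_cost(area_m2, rate_per_m2, cost_modulus):
          return area_m2 * rate_per_m2 * cost_modulus

      if __name__ == "__main__":
          elements = [
              # (name, area m^2, base rate $/m^2, cost modulus)
              ("wing skin panel", 40.0, 1200.0, 1.8),   # composite, curved
              ("fuselage frame", 12.0, 900.0, 1.2),     # aluminum, machined
              ("thermal tile bay", 25.0, 2500.0, 3.0),  # high precision
          ]
          total = sum(element_cost(a, r, m) for _, a, r, m in elements)
          for name, a, r, m in elements:
              print(f"{name:18s} ${element_cost(a, r, m):>10,.0f}")
          print(f"{'total':18s} ${total:>10,.0f}")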

  12. Development of the Planar Inlet Design and Analysis Process (PINDAP)

    NASA Technical Reports Server (NTRS)

    Gruber, Christopher R.

    2004-01-01

    The aerodynamic development of an engine inlet requires a comprehensive program of both wind tunnel testing and Computational Fluid Dynamics (CFD) simulations. To save time and resources, much "testing" is done using CFD before any design ever enters a wind tunnel. The focus of my project this summer is on CFD analysis tool development. In particular, I am working to further develop the capabilities of the Planar Inlet Design and Analysis Process (PINDAP). "PINDAP" is a collection of computational tools that allow for efficient and accurate design and analysis of the aerodynamics about and through inlets that can make use of a planar (two-dimensional or axisymmetric) geometric and flow assumption. PINDAP utilizes the WIND CFD flow solver, which is capable of simulating the turbulent, compressible flow field. My project this summer is a continuation of work that I performed for two previous summers. Two years ago, I used basic features of the PINDAP to design a Mach 5 hypersonic scramjet engine inlet and to demonstrate the feasibility of the PINDAP. The following summer, I worked to develop its geometry and grid generation capabilities to include subsonic and supersonic inlets, complete bodies and cowls, conic leading and trailing edges, as well as airfoils. These additions allowed for much more design flexibility when using the program.

  13. System Design For A Dental Image Processing System

    NASA Astrophysics Data System (ADS)

    Cady, Fredrick M.; Stover, John C.; Senecal, William J.

    1988-12-01

    An image processing system for a large clinic dental practice has been designed and tested. An analysis of spatial resolution requirements and field tests by dentists show that a system built with presently available, PC-based, image processing equipment can provide diagnostic quality images without special digital image processing. By giving the dentist a tool to digitally enhance x-ray images, increased diagnostic capabilities can be achieved. Very simple image processing procedures such as linear and non-linear contrast expansion, edge enhancement, and image zooming can be shown to be very effective. In addition to providing enhanced imagery in the dentist's treatment room, the system is designed to be a fully automated, dental records management system. It is envisioned that a patient's record, including x-rays and tooth charts, may be retrieved from optical disk storage as the patient enters the office. Dental procedures undertaken during the visit may be entered into the record via the imaging workstation by the dentist or the dental assistant. Patient billing and records keeping may be generated automatically.
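
    A hedged sketch of one of the "very simple image processing procedures" named above, linear contrast expansion, is given below using NumPy; the percentile clipping limits and the synthetic low-contrast image are assumptions made for the example.

      import numpy as np

      # Linear contrast expansion of a grayscale radiograph, one of the simple
      # enhancements mentioned in the abstract. Percentile limits are assumed.
      def stretch_contrast(image, low_pct=2.0, high_pct=98.0):
          """Map the [low_pct, high_pct] percentile range of the input to 0..255."""
          lo, hi = np.percentile(image, [low_pct, high_pct])
          stretched = (image.astype(float) - lo) / max(hi - lo, 1e-9)
          return np.clip(stretched * 255.0, 0, 255).astype(np.uint8)

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          # synthetic low-contrast stand-in for an x-ray image
          xray = rng.integers(90, 140, size=(64, 64)).astype(np.uint8)
          enhanced = stretch_contrast(xray)
          print(xray.min(), xray.max(), "->", enhanced.min(), enhanced.max())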

  14. Using process mining for automatic support of clinical pathways design.

    PubMed

    Fernandez-Llatas, Carlos; Valdivieso, Bernardo; Traver, Vicente; Benedi, Jose Miguel

    2015-01-01

    The creation of tools supporting the automation of the standardization and continuous control of healthcare processes can become a significant aid for clinical experts and healthcare systems willing to reduce variability in clinical practice. Reducing the complexity of designing and deploying standard Clinical Pathways can enhance the possibilities for effective usage of computer assisted guidance systems for professionals and assure the quality of the provided care. Several technologies have been used in the past to try to support these activities, but they have not been able to generate the disruptive change required to foster the general adoption of standardization in this domain, due to the high volume of work, resources, and knowledge required to create practical protocols that can be used in practice. This chapter proposes the use of the PALIA algorithm, based on Activity-Based process mining techniques, as a new technology to infer the actual processes from real execution logs, to be used in the design and quality control of healthcare processes.
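
    The PALIA algorithm itself is not reproduced in the abstract; as a generic, hedged illustration of inferring process structure from execution logs, the sketch below counts directly-follows relations in a toy clinical event log (the log contents are invented for the example, and the method shown is not PALIA).

      from collections import Counter

      # Sketch of log-based process discovery: count directly-follows relations
      # in execution traces. A generic illustration, not the PALIA algorithm;
      # the toy log below is an assumption.
      def directly_follows(traces):
          counts = Counter()
          for trace in traces:
              for a, b in zip(trace, trace[1:]):
                  counts[(a, b)] += 1
          return counts

      if __name__ == "__main__":
          log = [
              ["admission", "triage", "lab test", "discharge"],
              ["admission", "triage", "imaging", "lab test", "discharge"],
              ["admission", "triage", "lab test", "discharge"],
          ]
          for (a, b), n in sorted(directly_follows(log).items(), key=lambda kv: -kv[1]):
              print(f"{a} -> {b}: {n}")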

  15. Moving bed biofilm reactor technology: process applications, design, and performance.

    PubMed

    McQuarrie, James P; Boltz, Joshua P

    2011-06-01

    The moving bed biofilm reactor (MBBR) can operate as a 2-phase (anoxic) or 3-phase (aerobic) system with buoyant, free-moving plastic biofilm carriers. These systems can be used for municipal and industrial wastewater treatment, aquaculture, and potable water denitrification, in roughing, secondary, tertiary, and sidestream applications. The system includes a submerged biofilm reactor and a liquid-solids separation unit. The MBBR process benefits include the following: (1) capacity to meet treatment objectives similar to activated sludge systems with respect to carbon oxidation and nitrogen removal, but with a smaller tank volume than a clarifier-coupled activated sludge system; (2) biomass retention is clarifier-independent and solids loading to the liquid-solids separation unit is reduced significantly when compared with activated sludge systems; (3) the MBBR is a continuous-flow process that does not require a special operational cycle for biofilm thickness, L(F), control (e.g., biologically active filter backwashing); and (4) liquid-solids separation can be achieved with a variety of processes, including conventional and compact high-rate processes. Information related to system design is fragmented and poorly documented. This paper seeks to address this issue by summarizing state-of-the-art MBBR design procedures and providing the reader with an overview of some commercially available systems and their components.
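
    One common sizing step in MBBR design procedures of the kind summarized above (a textbook-style calculation, not necessarily the authors' exact procedure) is to determine the reactor volume from a surface area loading rate (SALR), sketched below with assumed loading, media and fill-fraction values.

      # Sketch of a common MBBR sizing step: required reactor volume from the
      # applied load and a surface area loading rate (SALR). Loading, media
      # properties and fill fraction are illustrative assumptions.
      def mbbr_volume_m3(load_kg_per_d, salr_g_per_m2_d, media_area_m2_per_m3, fill_fraction):
          """Reactor volume needed so the biofilm surface can carry the applied load."""
          required_area_m2 = (load_kg_per_d * 1000.0) / salr_g_per_m2_d
          net_specific_area = media_area_m2_per_m3 * fill_fraction
          return required_area_m2 / net_specific_area

      if __name__ == "__main__":
          volume = mbbr_volume_m3(
              load_kg_per_d=250.0,        # BOD load, kg/d (assumed)
              salr_g_per_m2_d=7.5,        # design SALR, g BOD/m^2/d (assumed)
              media_area_m2_per_m3=500.0, # carrier specific surface area (assumed)
              fill_fraction=0.5,          # carrier fill of reactor volume (assumed)
          )
          print(f"required reactor volume ~ {volume:.0f} m^3")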

  16. Quality-by-Design (QbD): An integrated process analytical technology (PAT) approach for a dynamic pharmaceutical co-precipitation process characterization and process design space development.

    PubMed

    Wu, Huiquan; White, Maury; Khan, Mansoor A

    2011-02-28

    The aim of this work was to develop an integrated process analytical technology (PAT) approach for dynamic pharmaceutical co-precipitation process characterization and design space development. A dynamic co-precipitation process, induced by gradually introducing water into the ternary system of naproxen-Eudragit L100-alcohol, was monitored in real time in situ via Lasentec FBRM and PVM. A 3D map of count-time-chord length revealed three distinguishable process stages: incubation, transition, and steady state. The effects of high-risk process variables (slurry temperature, stirring rate, and water addition rate) on both the derived co-precipitation process rates and the final chord length distribution were evaluated systematically using a 3(3) full factorial design. Critical process variables were identified via ANOVA for both the transition and steady states. General linear models (GLM) were then used for parameter estimation for each critical variable. Clear trends in the effects of each critical variable during transition and steady state were found by GLM and were interpreted using fundamental process principles and Nyvlt's transfer model. Neural network models were able to link process variables with response variables at transition and steady state with R(2) of 0.88-0.98. PVM images evidenced nucleation and crystal growth. Contour plots illustrated the design space via the critical process variables' ranges. This work demonstrated the utility of an integrated PAT approach for QbD development.
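
    As a hedged sketch of the experimental layout named above, the snippet below enumerates the 27 runs of a 3(3) full factorial design over the three high-risk process variables (slurry temperature, stirring rate, water addition rate) using the usual coded levels -1/0/+1; the actual physical level values of the study are not reproduced here.

      from itertools import product

      # Sketch of the 3^3 full factorial design over the three high-risk process
      # variables named in the abstract. Coded levels -1/0/+1 are the usual DoE
      # convention; actual physical level values are not taken from the study.
      FACTORS = ["slurry_temperature", "stirring_rate", "water_addition_rate"]
      LEVELS = (-1, 0, 1)

      def full_factorial(factors, levels):
          return [dict(zip(factors, combo)) for combo in product(levels, repeat=len(factors))]

      if __name__ == "__main__":
          runs = full_factorial(FACTORS, LEVELS)
          print(f"{len(runs)} runs")          # 27 runs for a 3^3 design
          for i, run in enumerate(runs[:3], start=1):
              print(i, run)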

  17. Lessons from nature: computational design of biomimetic compounds and processes.

    PubMed

    Bozkurt, Esra; Ashari, Negar; Browning, Nicholas; Brunk, Elizabeth; Campomanesa, Pablo; Perez, Marta A S; Rothlisberger, Ursula

    2014-09-01

    Through millions of years of evolution, Nature has accomplished the development of highly efficient and sustainable processes, and the idea of understanding and copying natural strategies is therefore very appealing. However, in spite of intense experimental and computational research, it has turned out to be a difficult task to design efficient biomimetic systems. Here we discuss a novel strategy for the computational design of biomimetic compounds and processes that consists of i) target selection; ii) atomistic and electronic characterization of the wild type system and the biomimetic compounds; iii) identification of key descriptors through feature selection; iv) choice of a biomimetic template; and v) efficient search of chemical and sequence space for optimization of the biomimetic system. As a proof-of-principle study, this general approach is illustrated for the computational design of a 'green' catalyst mimicking the action of the zinc metalloenzyme Human Carbonic Anhydrase (HCA). HCA is a natural model for CO2 fixation since the enzyme is able to convert CO2 into bicarbonate. Very recently, a weakly active HCA mimic based on a trihelical peptide bundle was synthesized. We have used quantum mechanical/molecular mechanical (QM/MM) Car-Parrinello simulations to study the mechanisms of action of HCA and its peptidic mimic and employed the obtained information to guide the design of improved biomimetic analogues. Applying a genetic algorithm based optimization procedure, we were able to re-engineer and optimize the biomimetic system towards its natural counterpart. In a second example, we discuss a similar strategy for the design of biomimetic sensitizers for use in dye-sensitized solar cells.
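
    Step v) relies on a genetic-algorithm search of sequence space. The sketch below is a deliberately generic illustration of such a search, evolving a short peptide-like string toward a hypothetical target motif by mutation and selection; the alphabet, target and fitness function are toy assumptions and do not represent the authors' QM/MM-derived scoring.

      import random

      # Generic genetic-algorithm sketch for searching sequence space, in the
      # spirit of step (v) above. The alphabet, target sequence and fitness
      # function are toy assumptions, not the authors' scoring procedure.
      ALPHABET = "ACDEFGHIKLMNPQRSTVWY"   # one-letter amino acid codes
      TARGET = "HEAHCW"                   # hypothetical optimal motif

      def fitness(seq):
          return sum(a == b for a, b in zip(seq, TARGET))

      def mutate(seq, rate=0.2):
          return "".join(random.choice(ALPHABET) if random.random() < rate else c for c in seq)

      def evolve(pop_size=50, generations=200):
          random.seed(1)
          population = ["".join(random.choice(ALPHABET) for _ in TARGET) for _ in range(pop_size)]
          for _ in range(generations):
              population.sort(key=fitness, reverse=True)
              if fitness(population[0]) == len(TARGET):
                  break
              parents = population[: pop_size // 5]
              population = parents + [mutate(random.choice(parents)) for _ in range(pop_size - len(parents))]
          return max(population, key=fitness)

      if __name__ == "__main__":
          best = evolve()
          print(best, fitness(best))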

  18. System design and performances of ASTER Level-1 data processing

    NASA Astrophysics Data System (ADS)

    Nishida, Sumiyuki; Hachiya, Jun; Matsumoto, Ken; Fujisada, Hiroyuki; Kato, Masatane

    1998-12-01

    ASTER is a multispectral imager that covers a wide spectral region from the visible to the thermal infrared with 14 spectral bands, and will fly on EOS-AM1 in 1999. To achieve this wide spectral coverage, ASTER has three optical sensing subsystems (a multi-telescope system): VNIR, SWIR and TIR. This multi-telescope configuration requires highly refined ground processing for the generation of Level-1 data products that are radiometrically calibrated and geometrically corrected. A prototype Level-1 processing software system has been developed to satisfy these requirements. The system design concepts adopted include: (1) 'Automatic Processing'; (2) an 'ALL-IN-ONE-CONCEPT', in which processing is carried out using only the information included in the Level-0 data product; (3) 'MODULE INDEPENDENCE', in which only the process control module controls the other modules, so that operational conditions can be changed independently; and (4) 'FLEXIBILITY', in which important operation parameters are set from an external component to make changing the processing conditions easier. The adaptability and performance of the developed software system are evaluated using simulation data.

  19. Mechanical design and design processes for the Telescope Optical Assembly of the Optical Communications Demonstrator

    NASA Astrophysics Data System (ADS)

    von Lossberg, Bryan R.

    1994-08-01

    A mechanical design has been developed for the Telescope Optical Assembly (TOA) of the Optical Communications Demonstrator (OCD). The TOA is the portion of the OCD instrument that integrates all the optical elements of the system with the exception of the Laser Transmitter Assembly (LXA), which is fiber coupled to the TOA. The TOA structure is composed primarily of aluminum components with some use of steel and invar. The assembly is contained within a 16 cm x 20 cm x 33 cm envelope and has an estimated mass of 5.5 kg. The mechanical design was developed using Computervision's CADDS 5 computer aided design software. Code V optical design data was used as a primary input and was efficiently and accurately transferred from the optical designer to the mechanical designer through the use of IGES files. In addition to enabling rapid transfer of the initial optical design as well as subsequent optical design refinements, the IGES transfer process was also used to expedite preliminary thermal and dynamic analyses.

  20. Risk-based process safety assessment and control measures design for offshore process facilities.

    PubMed

    Khan, Faisal I; Sadiq, Rehan; Husain, Tahir

    2002-09-02

    Process operation is the most hazardous activity next to transportation and drilling operations on an offshore oil and gas (OOG) platform. Past experience of onshore and offshore oil and gas activities has revealed that a small mishap in process operation can escalate to a catastrophe. This is of special concern on an OOG platform due to the limited space and compact geometry of the process area, limited ventilation, and difficult escape routes. On an OOG platform, each extra control measure that is implemented not only occupies space on the platform and increases congestion but also adds extra load to the platform. Eventualities in OOG platform process operation can be avoided by incorporating the appropriate control measures at the early design stage. In this paper, the authors describe a methodology for risk-based process safety decision making for OOG activities. The methodology is applied to various offshore process units, that is, the compressors, separators, flash drums and driers of an OOG platform. Based on the risk potential, appropriate safety measures are designed for each unit. This paper also illustrates that implementation of the designed safety measures reduces high fatal accident rate (FAR) values to an acceptable level.
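
    The fatal accident rate (FAR) mentioned above is conventionally expressed as expected fatalities per 10^8 exposed person-hours. The sketch below shows that bookkeeping with assumed crew size, exposure hours and fatality frequencies; none of the numbers come from the paper.

      # Fatal accident rate (FAR) as expected fatalities per 1e8 exposed
      # person-hours. The crew size, hours and frequencies below are
      # illustrative assumptions, not figures from the paper.
      def far(expected_fatalities_per_year, exposed_persons, hours_per_person_year):
          exposed_hours_per_year = exposed_persons * hours_per_person_year
          return expected_fatalities_per_year / exposed_hours_per_year * 1e8

      if __name__ == "__main__":
          base = far(expected_fatalities_per_year=2e-3, exposed_persons=40,
                     hours_per_person_year=4000)
          with_controls = far(expected_fatalities_per_year=2e-4, exposed_persons=40,
                              hours_per_person_year=4000)
          print(f"FAR before controls ~ {base:.2f}, after controls ~ {with_controls:.3f}")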

  1. Concurrent Image Processing Executive (CIPE). Volume 1: Design overview

    NASA Technical Reports Server (NTRS)

    Lee, Meemong; Groom, Steven L.; Mazer, Alan S.; Williams, Winifred I.

    1990-01-01

    The design and implementation of a Concurrent Image Processing Executive (CIPE), which is intended to become the support system software for a prototype high performance science analysis workstation, are described. The target machine for this software is a JPL/Caltech Mark 3fp Hypercube hosted by either a MASSCOMP 5600 or a Sun-3 or Sun-4 workstation; however, the design will accommodate other concurrent machines of similar architecture, i.e., local memory, multiple-instruction-multiple-data (MIMD) machines. The CIPE system provides both a multimode user interface and an applications programmer interface, and has been designed around four loosely coupled modules: user interface, host-resident executive, hypercube-resident executive, and application functions. The loose coupling between modules allows modification of a particular module without significantly affecting the other modules in the system. In order to enhance hypercube memory utilization and to allow expansion of image processing capabilities, a specialized program management method, incremental loading, was devised. To minimize data transfer between host and hypercube, a data management method which distributes, redistributes, and tracks data set information was implemented. The data management also allows data sharing among application programs. The CIPE software architecture provides a flexible environment for scientific analysis of complex remote sensing image data, such as planetary data and imaging spectrometry, utilizing state-of-the-art concurrent computation capabilities.

  2. The FEYNMAN tools for quantum information processing: Design and implementation

    NASA Astrophysics Data System (ADS)

    Fritzsche, S.

    2014-06-01

    The FEYNMAN tools have been re-designed with the goal of establishing and implementing a high-level (computer) language that is capable of dealing with the physics of finite, n-qubit systems, from frequently required computations to mathematically advanced tasks in quantum information processing. In particular, emphasis has been placed on introducing a small but powerful set of keystring-driven commands in order to support both symbolic and numerical computations. Though the current design is again implemented within the framework of MAPLE, it is general and flexible enough to be utilized and combined with other languages and computational environments. The present implementation facilitates a large number of computational tasks, including the definition, manipulation and parametrization of quantum states, the evaluation of quantum measures and quantum operations, the evolution of quantum noise in discrete models, quantum measurements and state estimation, and several others. The design is based on a few high-level commands, with a syntax close to the mathematical notation and its use in the literature, which can be generalized quite readily in order to solve computational tasks of even higher complexity. In this work, I present and discuss the (re-)design of the FEYNMAN tools and make major parts of the code available for public use. Moreover, a few selected examples are shown to demonstrate possible applications of this toolbox. The FEYNMAN tools are provided as a MAPLE library and can hence be used on all platforms on which this computer-algebra system is accessible.

  3. Design of educational artifacts as support to learning process.

    PubMed

    Resende, Adson Eduardo; Vasconcelos, Flávio Henrique

    2012-01-01

    The aim of this paper is to identify utilization schemes developed by students and teachers in their interaction with educational workstations in the electronic measurement and instrumentation laboratory at the Department of Electrical Engineering in the Federal University of Minas Gerais (UFMG), Brazil. After that, these schemes were used to design a new workstation. For this, it was important to bear in mind that the mentioned artifacts contain two key characteristics: (1) one from the designers themselves, resulting from their experience and their technical knowledge of what they are designing and (2) the experience from users and the means through which they take advantage of and develop these artifacts, in turn rendering them appropriate to perform the proposed task - the utilization schemes developed in the process of mediation between the user and the artifact. The satisfactory fusion of these two points makes these artifacts a functional unit - the instruments. This research aims to demonstrate that identifying the utilization schemes by taking advantage of user experience and incorporating this within the design, facilitates its appropriation and, consequently, its efficiency as an instrument of learning.

  4. Safeguards design strategies: designing and constructing new uranium and plutonium processing facilities in the United States

    SciTech Connect

    Scherer, Carolynn P; Long, Jon D

    2010-09-28

    In the United States, the Department of Energy (DOE) is transforming its outdated and oversized complex of aging nuclear material facilities into a smaller, safer, and more secure National Security Enterprise (NSE). Environmental concerns, worker health and safety risks, material security, and reducing the role of nuclear weapons in our national security strategy while maintaining the capability for effective nuclear deterrence by the United States are influencing this transformation. As part of the nation's Uranium Center of Excellence (UCE), the Uranium Processing Facility (UPF) at the Y-12 National Security Complex in Oak Ridge, Tennessee, will advance the U.S. capability to meet all concerns when processing uranium and is located adjacent to the Highly Enriched Uranium Materials Facility (HEUMF), designed for consolidated storage of enriched uranium. The HEUMF became operational in March 2010, and the UPF is currently entering its final design phase. The designs of both facilities are intended to meet anticipated security challenges for the 21st century. For plutonium research, development, and manufacturing, the Chemistry and Metallurgy Research Replacement (CMRR) building at the Los Alamos National Laboratory (LANL) in Los Alamos, New Mexico is now under construction. The first phase of the CMRR Project is the design and construction of a Radiological Laboratory/Utility/Office Building. The second phase consists of the design and construction of the Nuclear Facility (NF). The National Nuclear Security Administration (NNSA) selected these two sites as part of the national plan to consolidate nuclear materials and provide for nuclear deterrence and nonproliferation mission requirements. This work examines these two projects' independent approaches to design requirements and objectives for safeguards, security, and safety (3S) systems, as well as the subsequent construction of these modern processing facilities. Emphasis is on the use of Safeguards-by-Design (SBD

  5. A process for free-space laser communications system design

    NASA Astrophysics Data System (ADS)

    Walther, Frederick G.; Moores, John D.; Murphy, Robert J.; Michael, Steven; Nowak, George A.

    2009-08-01

    We present a design methodology for free-space laser communications systems. The first phase includes a characterization through numerical simulations of the channel to evaluate the range of extinction and scintillation. The second phase is the selection of fade mitigation schemes, which would incorporate pointing, acquisition, tracking, and communication system parameters specifically tailored to the channel. Ideally, the process would include sufficient flexibility to adapt to a wide range of channel conditions. We provide an example of the successful application of this design approach to a recent set of field experiments. This work was sponsored by the Department of Defense, RRCO DDR&E, under Air Force Contract FA8721-05-C-0002. Opinions, interpretations, conclusions and recommendations are those of the authors and are not necessarily endorsed by the United States Government.

  6. Demo III processing architecture trades and preliminary design

    NASA Astrophysics Data System (ADS)

    Gothard, Benny M.; Cory, Phil; Peterman, Pete

    1999-01-01

    This paper provides a summary of the methodology, metrics, analysis, and trade study efforts for the preliminary design of the Vetronics Processing Architecture (PA) system, based on the Demo III Experimental Unmanned Ground Vehicle (XUV) program requirements. We document and describe both the provided and the analytically derived system requirements expressed by the proposal. Our experience from previous mobility and Reconnaissance, Surveillance, Targeting, and Acquisition systems designed and implemented for the Demo II Semi-Autonomous Surrogate Vehicle and the Mobile Detection, Assessment and Response System is used to describe lessons learned as applied to the XUV in the areas of PA architecture, single board computers, card cage buses, real-time and non-real-time processors, card cage to card cage communications, and imaging and radar pre-processor selection. We have selected an initial architecture methodology.

  7. Waste receiving and processing plant control system; system design description

    SciTech Connect

    LANE, M.P.

    1999-02-24

    The Plant Control System (PCS) is a heterogeneous computer system composed of numerous sub-systems. The PCS represents every major computer system that is used to support operation of the Waste Receiving and Processing (WRAP) facility. This document, the System Design Description (PCS SDD), includes several chapters and appendices. Each chapter is devoted to a separate PCS sub-system. Typically, each chapter includes an overview description of the system, a list of associated documents related to operation of that system, and a detailed description of relevant system features. Each appendix provides configuration information for selected PCS sub-systems. The appendices are organized as separate sections to assist in maintaining this document, since system configurations change frequently. This document is intended to serve as the primary reference for configuration of PCS computer systems. The use of this document is further described in the WRAP System Configuration Management Plan, WMH-350, Section 4.1.

  8. Risk-based decision making for staggered bioterrorist attacks : resource allocation and risk reduction in "reload" scenarios.

    SciTech Connect

    Lemaster, Michelle Nicole; Gay, David M.; Ehlen, Mark Andrew; Boggs, Paul T.; Ray, Jaideep

    2009-10-01

    Staggered bioterrorist attacks with aerosolized pathogens on population centers present a formidable challenge to resource allocation and response planning. Response and planning will commence immediately after the detection of the first attack, with little or no information about the second attack. In this report, we outline a method by which resource allocation may be performed. It involves probabilistic reconstruction of the bioterrorist attack from partial observations of the outbreak, followed by an optimization-under-uncertainty approach to perform resource allocation. We consider both single-site and time-staggered multi-site attacks (i.e., a reload scenario) under conditions when resources (personnel and equipment which are difficult to gather and transport) are insufficient. Both communicable (plague) and non-communicable diseases (anthrax) are addressed, and we also consider cases when the data, the time series of people reporting with symptoms, are confounded by a reporting delay. We demonstrate how our approach develops allocation profiles that have the potential to reduce the probability of an extremely adverse outcome in exchange for a more certain, but less adverse outcome. We explore the effect of placing limits on daily allocations. Further, since our method is data-driven, the resource allocation progressively improves as more data become available.

  9. Process design of press hardening with gradient material property influence

    SciTech Connect

    Neugebauer, R.; Schieck, F.; Rautenstrauch, A.

    2011-05-04

    Press hardening is currently used in the production of automotive structures that require very high strength and controlled deformation during crash tests. Press hardening can achieve significant reductions of sheet thickness at constant strength and is therefore a promising technology for the production of lightweight and energy-efficient automobiles. The manganese-boron steel 22MnB5 has been implemented in sheet press hardening owing to its excellent hot formability, high hardenability, and good temperability even at low cooling rates. However, press-hardened components have shown poor ductility and cracking at relatively small strains. A possible solution to this problem is a selective increase of steel sheet ductility by press hardening process design in areas where the component is required to deform plastically during crash tests. To this end, process designers require information about microstructure and mechanical properties as a function of the wide spectrum of cooling rates and sequences and austenitizing treatment conditions that can be encountered in production environments. In the present work, a Continuous Cooling Transformation (CCT) diagram with corresponding material properties of sheet steel 22MnB5 was determined for a wide spectrum of cooling rates. Heating and cooling programs were conducted in a quenching dilatometer. Motivated by the importance of residual elasticity in crash test performance, this property was measured using a micro-bending test and the results were integrated into the CCT diagrams to complement the hardness testing results. This information is essential for the process design of press hardening of sheet components with gradient material properties.

  10. Design of experiments for thermal protection system process optimization

    NASA Astrophysics Data System (ADS)

    Longani, Hans R.

    2000-01-01

    Solid Rocket Booster (SRB) structures were protected from heating due to aeroshear, radiation and plume impingement by a Thermal Protection System (TPS) known as Marshall Sprayable Ablative (MSA-2). MSA-2 contains Volatile Organic Compounds (VOCs), which due to strict environmental legislation had to be eliminated. MSA-2 was also classified as hazardous waste, which makes disposal very costly. Marshall Convergent Coating (MCC-1) replaced MSA-2 and eliminated the use of solvents by delivering the dry filler materials and the fluid resin system to a patented spray gun which utilizes the Convergent Spray Technologies spray process. The selection of TPS material was based on risk assessment, performance comparisons, processing, application and cost. A Design of Experiments technique was used to optimize the spraying parameters.

  11. Calderon coal gasification Process Development Unit design and test program

    SciTech Connect

    Calderon, A.; Madison, E.; Probert, P.

    1992-01-01

    The Process Development Unit (PDU) was designed and constructed to demonstrate the novel Calderon gasification/hot gas cleanup process. In the process, run-of-mine high sulfur coal is first pyrolyzed to recover a rich gas (medium Btu gas), after which the resulting char is subjected to airblown gasification to yield a lean gas (low Btu gas). The process incorporates a proprietary integrated system for the conversion of coal to gases and for the hot cleanup of the gases which removes both particulate and sulfur components of the gaseous products. The yields are: a syngas (CO and H{sub 2} mix) suitable for further conversion to liquid fuel (e.g. methanol/gasoline), and a lean gas suitable to fuel the combustion turbine of a combined cycle power generation plant with very low levels of NO{sub x} (15 ppmv). The fused slag (from the gasified char ash content) and the sulfur recovered during the hot gas cleanup will be sold as by-products. The small quantity of spent sorbent generated will be combined with the coal feed as a fluxing agent for the slag. The small quantity of wastewater from slag drainings and steam generation blowdown will be mixed with the coal feed for disposal. The Calderon gasification/hot gas cleanup, which is a completely closed system, operates at a pressure suitable for combined cycle power generation.

  12. Calderon coal gasification Process Development Unit design and test program

    SciTech Connect

    Calderon, A.; Madison, E.; Probert, P.

    1992-11-01

    The Process Development Unit (PDU) was designed and constructed to demonstrate the novel Calderon gasification/hot gas cleanup process. In the process, run-of-mine high sulfur coal is first pyrolyzed to recover a rich gas (medium Btu gas), after which the resulting char is subjected to airblown gasification to yield a lean gas (low Btu gas). The process incorporates a proprietary integrated system for the conversion of coal to gases and for the hot cleanup of the gases which removes both particulate and sulfur components of the gaseous products. The yields are: a syngas (CO and H{sub 2} mix) suitable for further conversion to liquid fuel (e.g. methanol/gasoline), and a lean gas suitable to fuel the combustion turbine of a combined cycle power generation plant with very low levels of NO{sub x} (15 ppmv). The fused slag (from the gasified char ash content) and the sulfur recovered during the hot gas cleanup will be sold as by-products. The small quantity of spent sorbent generated will be combined with the coal feed as a fluxing agent for the slag. The small quantity of wastewater from slag drainings and steam generation blowdown will be mixed with the coal feed for disposal. The Calderon gasification/hot gas cleanup, which is a completely closed system, operates at a pressure suitable for combined cycle power generation.

  13. [Design of an HACCP program for a cocoa processing facility].

    PubMed

    López D'Sola, Patrizia; Sandia, María Gabriela; Bou Rached, Lizet; Hernández Serrano, Pilar

    2012-12-01

    The HACCP plan is a food safety management tool used to control physical, chemical and biological hazards associated with food processing throughout the processing chain. The aim of this work is to design an HACCP plan for a Venezuelan cocoa processing facility. The production of safe food products requires that the HACCP system be built upon a solid foundation of prerequisite programs such as Good Manufacturing Practices (GMP) and Sanitation Standard Operating Procedures (SSOP). The existence and effectiveness of these prerequisite programs were previously assessed. Good Agricultural Practices (GAP) audits of cocoa nib suppliers were also performed. To develop the HACCP plan, the five preliminary tasks and the seven HACCP principles were accomplished according to Codex Alimentarius procedures. Three Critical Control Points (CCP) were identified using a decision tree: winnowing (control of ochratoxin A), roasting (Salmonella control) and metallic particle detection. For each CCP, critical limits, monitoring procedures, corrective actions, verification procedures, and documentation of all procedures and records appropriate to these principles and their application were established. Implementation and maintenance of a HACCP plan for this processing plant is suggested. Recently, ochratoxin A (OTA) has been associated with cocoa beans. Although separation of the shell from the nib has been reported as an effective measure to control this chemical hazard, a study of ochratoxin prevalence in cocoa beans produced in the country is recommended, as is validation of the winnowing step.

  14. Using Instructional Design Process to Improve Design and Development of Internet Interventions

    PubMed Central

    Hilgart, Michelle M; Thorndike, Frances P; Kinzie, Mable B

    2012-01-01

    Given the wide reach and extensive capabilities of the Internet, it is increasingly being used to deliver comprehensive behavioral and mental health intervention and prevention programs. Their goals are to change user behavior, reduce unwanted complications or symptoms, and improve health status and health-related quality of life. Internet interventions have been found efficacious in addressing a wide range of behavioral and mental health problems, including insomnia, nicotine dependence, obesity, diabetes, depression, and anxiety. Despite the existence of many Internet-based interventions, there is little research to inform their design and development. A model for behavior change in Internet interventions has been published to help guide future Internet intervention development and to help predict and explain behavior changes and symptom improvement outcomes through the use of Internet interventions. An argument is made for grounding the development of Internet interventions within a scientific framework. To that end, the model highlights a multitude of design-related components, areas, and elements, including user characteristics, environment, intervention content, level of intervention support, and targeted outcomes. However, more discussion is needed regarding how the design of the program should be developed to address these issues. While there is little research on the design and development of Internet interventions, there is a rich, related literature in the field of instructional design (ID) that can be used to inform Internet intervention development. ID models are prescriptive models that describe a set of activities involved in the planning, implementation, and evaluation of instructional programs. Using ID process models has been shown to increase the effectiveness of learning programs in a broad range of contexts. ID models specify a systematic method for assessing the needs of learners (intervention users) to determine the gaps between current

  15. Optimizing product life cycle processes in design phase

    NASA Astrophysics Data System (ADS)

    Faneye, Ola. B.; Anderl, Reiner

    2002-02-01

    Life cycle concepts not only serve as a basis for helping product developers understand the dependencies between products and their life cycles; they also help in identifying potential opportunities for improvement in products. Common traditional concepts focus mainly on energy and material flow across life phases, necessitating the availability of metrics derived from a reference product. Knowledge of life cycle processes gained from an existing product is directly reused in its redesign. Nevertheless, depending on sales volume, the environmental impact before product optimization can be substantial. With modern information technologies, computer-aided life cycle methodologies can today be applied well before product use. On the basis of a virtual prototype, life cycle processes are analyzed and optimized using simulation techniques. This preventive approach not only helps in minimizing (or even eliminating) environmental burdens caused by the product; costs incurred due to changes in the real product can also be avoided. The paper highlights the relationship between product and life cycle and presents a computer-based methodology for optimizing the product life cycle during design, as presented by SFB 392: Design for Environment - Methods and Tools at Technical University, Darmstadt.

  16. Preliminary Process Design of ITER ELM Coil Bracket Brazing

    NASA Astrophysics Data System (ADS)

    LI, Xiangbin; SHI, Yi

    2015-03-01

    To meet the technical requirements of the International Thermonuclear Experimental Reactor (ITER) project, the manufacture and assembly technology of the mid Edge Localized Modes (ELM) coil was developed by the Institute of Plasma Physics, Chinese Academy of Sciences (ASIPP). As the gap between the bracket and the Stainless Steel jacketed and Mineral Insulated Conductor (SSMIC) can be larger than 0.5 mm, instead of the 0.01 mm to 0.1 mm typical of normal industrial cases, the process of brazing the mid ELM coil bracket to the SSMICT becomes quite challenging from a technical viewpoint. This paper describes the preliminary design of the ELM coil bracket brazing process, the optimal bracket brazing curve, and the thermal simulation of the bracket furnace brazing method developed in ANSYS. BAg-6 foil (Bag50Cu34Zn16) plus BAg-1a paste (Bag45CuZnCd) solders were chosen as the brazing filler. By testing an SSMICT prototype, it is shown that the average gap between the bracket and the SSMIC could be controlled to 0.2-0.3 mm, and that there were few voids in the brazed surface. The results also verified that the preliminary design had favorable heat conducting performance in the bracket.

  17. Design, processing, and testing of LSI arrays for space station

    NASA Technical Reports Server (NTRS)

    Lile, W. R.; Hollingsworth, R. J.

    1972-01-01

    The design of a MOS 256-bit Random Access Memory (RAM) is discussed. Technological achievements comprise computer simulations that accurately predict performance; aluminum-gate COS/MOS devices including a 256-bit RAM with current sensing; and a silicon-gate process that is being used in the construction of a 256-bit RAM with voltage sensing. The Si-gate process increases speed by reducing the overlap capacitance between gate and source-drain, thus reducing the crossover capacitance and allowing shorter interconnections. The design of a Si-gate RAM, which is pin-for-pin compatible with an RCA bulk silicon COS/MOS memory (type TA 5974), is discussed in full. The Integrated Circuit Tester (ICT) is limited to dc evaluation, but the diagnostics and data collecting are under computer control. The Silicon-on-Sapphire Memory Evaluator (SOS-ME, previously called SOS Memory Exerciser) measures power supply drain and performs a minimum number of tests to establish operation of the memory devices. The Macrodata MD-100 is a microprogrammable tester which has capabilities of extensive testing at speeds up to 5 MHz. Beam-lead technology was successfully integrated with SOS technology to make a simple device with beam leads. This device and the scribing are discussed.

  18. Simulative design and process optimization of the two-stage stretch-blow molding process

    SciTech Connect

    Hopmann, Ch.; Rasche, S.; Windeck, C.

    2015-05-22

    The total production costs of PET bottles are significantly affected by the costs of raw material. Approximately 70 % of the total costs are spent for the raw material. Therefore, stretch-blow molding industry intends to reduce the total production costs by an optimized material efficiency. However, there is often a trade-off between an optimized material efficiency and required product properties. Due to a multitude of complex boundary conditions, the design process of new stretch-blow molded products is still a challenging task and is often based on empirical knowledge. Application of current CAE-tools supports the design process by reducing development time and costs. This paper describes an approach to determine optimized preform geometry and corresponding process parameters iteratively. The wall thickness distribution and the local stretch ratios of the blown bottle are calculated in a three-dimensional process simulation. Thereby, the wall thickness distribution is correlated with an objective function and preform geometry as well as process parameters are varied by an optimization algorithm. Taking into account the correlation between material usage, process history and resulting product properties, integrative coupled simulation steps, e.g. structural analyses or barrier simulations, are performed. The approach is applied on a 0.5 liter PET bottle of Krones AG, Neutraubling, Germany. The investigations point out that the design process can be supported by applying this simulative optimization approach. In an optimization study the total bottle weight is reduced from 18.5 g to 15.5 g. The validation of the computed results is in progress.

  19. Simulative design and process optimization of the two-stage stretch-blow molding process

    NASA Astrophysics Data System (ADS)

    Hopmann, Ch.; Rasche, S.; Windeck, C.

    2015-05-01

    The total production costs of PET bottles are significantly affected by the costs of raw material. Approximately 70 % of the total costs are spent for the raw material. Therefore, stretch-blow molding industry intends to reduce the total production costs by an optimized material efficiency. However, there is often a trade-off between an optimized material efficiency and required product properties. Due to a multitude of complex boundary conditions, the design process of new stretch-blow molded products is still a challenging task and is often based on empirical knowledge. Application of current CAE-tools supports the design process by reducing development time and costs. This paper describes an approach to determine optimized preform geometry and corresponding process parameters iteratively. The wall thickness distribution and the local stretch ratios of the blown bottle are calculated in a three-dimensional process simulation. Thereby, the wall thickness distribution is correlated with an objective function and preform geometry as well as process parameters are varied by an optimization algorithm. Taking into account the correlation between material usage, process history and resulting product properties, integrative coupled simulation steps, e.g. structural analyses or barrier simulations, are performed. The approach is applied on a 0.5 liter PET bottle of Krones AG, Neutraubling, Germany. The investigations point out that the design process can be supported by applying this simulative optimization approach. In an optimization study the total bottle weight is reduced from 18.5 g to 15.5 g. The validation of the computed results is in progress.
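
    The approach described above couples a process simulation with an optimization algorithm that varies the preform geometry until the computed wall thickness distribution meets an objective. The toy sketch below mimics that loop in one dimension: the "simulation" is a crude stretch-ratio surrogate, and the target thickness profile, stretch ratios and search grid are assumptions for illustration, not the authors' three-dimensional process model.

      # Toy illustration of the iterative loop described above: vary a preform
      # wall thickness parameter, "simulate" the resulting bottle wall
      # thickness, and minimize the deviation from a target distribution.
      # Surrogate model, target profile and search grid are assumptions.
      TARGET = [0.30, 0.25, 0.25, 0.35]          # desired bottle wall thickness, mm

      def simulate_bottle(preform_thickness_mm):
          stretch = [2.8, 3.3, 3.3, 2.4]          # assumed local stretch ratios
          return [preform_thickness_mm / s for s in stretch]

      def objective(preform_thickness_mm):
          wall = simulate_bottle(preform_thickness_mm)
          return sum((w - t) ** 2 for w, t in zip(wall, TARGET))

      if __name__ == "__main__":
          # grid search over 0.50..2.00 mm preform wall thickness
          best_t = min((t / 100.0 for t in range(50, 201)), key=objective)
          print(f"best preform thickness ~ {best_t:.2f} mm, objective {objective(best_t):.4f}")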

  20. Design process of an area-efficient photobioreactor.

    PubMed

    Zijffers, Jan-Willem F; Janssen, Marcel; Tramper, Johannes; Wijffels, René H

    2008-01-01

    This article describes the design process of the Green Solar Collector (GSC), an area-efficient photobioreactor for the outdoor cultivation of microalgae. The overall goal has been to design a system in which all sunlight incident on the area covered by the reactor is delivered to the algae at such intensities that the light energy can be efficiently used for biomass formation. A statement of goals is formulated and constraints are specified with which the GSC needs to comply. Specifications are generated for a prototype whose form and function achieve the stated goals and satisfy the specified constraints. This results in a design in which sunlight is captured into vertical plastic light guides. Sunlight reflects internally in the guide and eventually scatters out of the light guide into flat-panel photobioreactor compartments. Sunlight is focused on top of the light guides by dual-axis positioning of linear Fresnel lenses. The shape and material of the light guide are such that light is maintained in the guides when surrounded by air. The bottom part of a light guide is sandblasted to obtain a more uniform distribution of light inside the bioreactor compartment and is triangular shaped to ensure the efflux of all light out of the guide. Dimensions of the guide are such that light enters the flat-panel photobioreactor compartment at intensities that can be efficiently used by the biomass present. The integration of light capturing, transportation, distribution and usage is such that high biomass productivities per area can be achieved.

  1. Integrating optical fabrication and metrology into the optical design process.

    PubMed

    Harvey, James E

    2015-03-20

    The recent validation of a generalized linear systems formulation of surface scatter theory and an analysis of image degradation due to surface scatter in the presence of aberrations has provided credence to the development of a systems engineering analysis of image quality as degraded not only by diffraction effects and geometrical aberrations, but to scattering effects due to residual optical fabrication errors as well. This generalized surface scatter theory provides insight and understanding by characterizing surface scatter behavior with a surface transfer function closely related to the modulation transfer function of classical image formation theory. Incorporating the inherently band-limited relevant surface roughness into the surface scatter theory provides mathematical rigor into surface scatter analysis, and implementing a fast Fourier transform algorithm with logarithmically spaced data points facilitates the practical calculation of scatter behavior from surfaces with a large dynamic range of relevant spatial frequencies. These advances, combined with the continuing increase in computer speed, leave the optical design community in a position to routinely derive the optical fabrication tolerances necessary to satisfy specific image quality requirements during the design phase of a project; i.e., to integrate optical metrology and fabrication into the optical design process.
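
    A simple scalar that connects residual fabrication roughness to lost image-forming energy, standard in the scatter literature though not specific to this paper, is the total integrated scatter (TIS) of a smooth surface, TIS = 1 - exp[-(4*pi*sigma*cos(theta)/lambda)^2]. The sketch below evaluates it for a few assumed RMS roughness values at an assumed wavelength.

      import math

      # Total integrated scatter (TIS) of a smooth reflecting surface as a
      # function of RMS roughness, a standard scalar link between fabrication
      # error and lost specular energy. Roughness and wavelength are assumed.
      def total_integrated_scatter(rms_roughness_nm, wavelength_nm, incidence_deg=0.0):
          phase = 4.0 * math.pi * rms_roughness_nm * math.cos(math.radians(incidence_deg)) / wavelength_nm
          return 1.0 - math.exp(-phase ** 2)

      if __name__ == "__main__":
          for sigma in (1.0, 2.0, 5.0):   # nm RMS roughness (assumed)
              tis = total_integrated_scatter(sigma, wavelength_nm=632.8)
              print(f"sigma = {sigma:.0f} nm -> TIS ~ {100 * tis:.2f} %")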

  2. Sampling design for spatially distributed hydrogeologic and environmental processes

    USGS Publications Warehouse

    Christakos, G.; Olea, R.A.

    1992-01-01

    A methodology for the design of sampling networks over space is proposed. The methodology is based on spatial random field representations of nonhomogeneous natural processes, and on optimal spatial estimation techniques. One of the most important results of random field theory for physical sciences is its rationalization of correlations in spatial variability of natural processes. This correlation is extremely important both for interpreting spatially distributed observations and for predictive performance. The extent of site sampling and the types of data to be collected will depend on the relationship of subsurface variability to predictive uncertainty. While hypothesis formulation and initial identification of spatial variability characteristics are based on scientific understanding (such as knowledge of the physics of the underlying phenomena, geological interpretations, intuition and experience), the support offered by field data is statistically modelled. This model is not limited by the geometric nature of sampling and covers a wide range in subsurface uncertainties. A factorization scheme of the sampling error variance is derived, which possesses certain attractive properties allowing significant savings in computations. By means of this scheme, a practical sampling design procedure providing suitable indices of the sampling error variance is established. These indices can be used by way of multiobjective decision criteria to obtain the best sampling strategy. Neither the actual implementation of the in-situ sampling nor the solution of the large spatial estimation systems of equations are necessary. The required values of the accuracy parameters involved in the network design are derived using reference charts (readily available for various combinations of data configurations and spatial variability parameters) and certain simple yet accurate analytical formulas. Insight is gained by applying the proposed sampling procedure to realistic examples related
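
    A key practical point above is that the estimation (sampling error) variance depends only on the data configuration and the spatial covariance model, not on the measured values, so candidate networks can be compared before any field work. The following minimal sketch illustrates that idea with a simple-kriging error variance under an assumed exponential covariance model; the factorization scheme, reference charts and multiobjective criteria of the paper are not reproduced.

      # Rank candidate sampling configurations by the simple-kriging error variance
      # at a prediction point, under an assumed exponential covariance model.
      import numpy as np

      def cov(h, sill=1.0, corr_range=50.0):
          """Exponential covariance C(h) = sill * exp(-h / range)."""
          return sill * np.exp(-h / corr_range)

      def kriging_variance(samples, target, sill=1.0, corr_range=50.0):
          d_ss = np.linalg.norm(samples[:, None, :] - samples[None, :, :], axis=-1)
          d_st = np.linalg.norm(samples - target, axis=-1)
          C = cov(d_ss, sill, corr_range)   # sample-to-sample covariances
          c = cov(d_st, sill, corr_range)   # sample-to-target covariances
          return sill - c @ np.linalg.solve(C, c)   # error variance, independent of data values

      target  = np.array([50.0, 50.0])
      grid    = np.array([[25.0, 25.0], [25.0, 75.0], [75.0, 25.0], [75.0, 75.0]])
      cluster = np.array([[45.0, 45.0], [46.0, 44.0], [44.0, 46.0], [45.0, 47.0]])

      print("grid design    :", kriging_variance(grid, target))
      print("cluster design :", kriging_variance(cluster, target))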

  3. 23 CFR 636.109 - How does the NEPA process relate to the design-build procurement process?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... procurement process? 636.109 Section 636.109 Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF... process relate to the design-build procurement process? The purpose of this section is to ensure that... design-build procurement process: (a) The contracting agency may: (1) Issue an RFQ prior to...

  4. Space Shuttle Ascent Flight Design Process: Evolution and Lessons Learned

    NASA Technical Reports Server (NTRS)

    Picka, Bret A.; Glenn, Christopher B.

    2011-01-01

    The Space Shuttle Ascent Flight Design team is responsible for defining a launch to orbit trajectory profile that satisfies all programmatic mission objectives and defines the ground and onboard reconfiguration requirements for this high-speed and demanding flight phase. This design, verification and reconfiguration process ensures that all applicable mission scenarios are enveloped within integrated vehicle and spacecraft certification constraints and criteria, and includes the design of the nominal ascent profile and trajectory profiles for both uphill and ground-to-ground aborts. The team also develops a wide array of associated training, avionics flight software verification, onboard crew and operations facility products. These key ground and onboard products provide the ultimate users and operators the necessary insight and situational awareness for trajectory dynamics, performance and event sequences, abort mode boundaries and moding, flight performance and impact predictions for launch vehicle stages for use in range safety, and flight software performance. These products also provide the necessary insight to or reconfiguration of communications and tracking systems, launch collision avoidance requirements, and day of launch crew targeting and onboard guidance, navigation and flight control updates that incorporate the final vehicle configuration and environment conditions for the mission. Over the course of the Space Shuttle Program, ascent trajectory design and mission planning has evolved in order to improve program flexibility and reduce cost, while maintaining outstanding data quality. Along the way, the team has implemented innovative solutions and technologies in order to overcome significant challenges. A number of these solutions may have applicability to future human spaceflight programs.

  5. Climate Monitoring Satellite Designed in a Concurrent Engineering Process

    NASA Astrophysics Data System (ADS)

    Bauer, Waldemar; Braukhane, A.; Quantius, D.; Dumont, E.; Grundmann, J. T.; Romberg, O.

    An effective method of detecting greenhouse gases (GHGs; CO2 and CH4) is the use of satellites operating in Low Earth Orbit (LEO). Satellite-based greenhouse gas emission monitoring is challenging and imposes an ambitious level of requirements. Until now it has been common to fly such scientific payloads on a purpose-built satellite bus, or to install the payload on board a larger conventional satellite. These approaches fulfil all customer requirements but can be critical from a financial point of view. Between 2014 and 2020, no space-based CH4 detection capabilities and, if at all, only limited CO2 detection capabilities are planned internationally. In order to fill this gap the Institute for Environmental Physics (IUP) of the University of Bremen plans a GHG satellite mission with near-surface sensitivity called "CarbonSat". It shall perform synchronous global atmospheric CO2 and CH4 observations with the accuracy, precision and coverage needed to significantly advance our knowledge about the sources and sinks of greenhouse gases. In order to assess the technical and financial feasibility of a small satellite, a Concurrent Engineering Study (CE-study) was performed at DLR Bremen, Germany. To reuse knowledge in compact satellite design, the Compact/SSB (Standard Satellite Bus) was chosen as the baseline design. The SSB was developed by DLR and has already been used for the BIRD (Bispectral Infra-Red Detection) mission, and has been adapted to ongoing missions such as TET (Technologie-Erprobungs-Träger) and AsteroidFinder. This paper deals with the highly effective design process within the DLR CE-Facility and with the outcomes of the CE-study. It gives an overview of the design status as well as an outlook for comparable missions.

  6. Preconceptual design of a salt splitting process using ceramic membranes

    SciTech Connect

    Kurath, D.E.; Brooks, K.P.; Hollenberg, G.W.; Clemmer, R.; Balagopal, S.; Landro, T.; Sutija, D.P.

    1997-01-01

    Inorganic ceramic membranes for salt splitting of radioactively contaminated sodium salt solutions are being developed for treating U. S. Department of Energy tank wastes. The process consists of electrochemical separation of sodium ions from the salt solution using sodium (Na) Super Ion Conductors (NaSICON) membranes. The primary NaSICON compositions being investigated are based on rare-earth ions (RE-NaSICON). Potential applications include: caustic recycling for sludge leaching, regenerating ion exchange resins, inhibiting corrosion in carbon-steel tanks, or retrieving tank wastes; reducing the volume of low-level wastes to be disposed of; adjusting pH and reducing competing cations to enhance cesium ion exchange processes; reducing sodium in high-level-waste sludges; and removing sodium from acidic wastes to facilitate calcining. These applications encompass wastes stored at the Hanford, Savannah River, and Idaho National Engineering Laboratory sites. The overall project objective is to supply a salt splitting process unit that impacts the waste treatment and disposal flowsheets and meets user requirements. The potential flowsheet impacts include improving the efficiency of the waste pretreatment processes, reducing volume, and increasing the quality of the final waste disposal forms. Meeting user requirements implies developing the technology to the point where it is available as standard equipment with predictable and reliable performance. This report presents two preconceptual designs for a full-scale salt splitting process based on the RE-NaSICON membranes to distinguish critical items for testing and to provide a vision that site users can evaluate.

  7. Heat and power networks in process design, part II, design procedure for equipment selection and process matching

    SciTech Connect

    Townsend, D.W.; Linnhoff, B.

    1983-09-01

    In Part I, criteria for heat engine and heat pump placement in chemical process networks were derived, based on the "temperature interval" (T.I.) analysis of the heat exchanger network problem. Using these criteria, this paper gives a method for identifying the best outline design for any combined system of chemical process, heat engines, and heat pumps. The method eliminates inferior alternatives early, and positively leads on to the most appropriate solution. A graphical procedure based on the T.I. analysis forms the heart of the approach, and the calculations involved are simple enough to be carried out on, say, a programmable calculator. Application to a case study is demonstrated. Optimization methods based on this procedure are currently under research.
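
    The temperature interval (T.I.) analysis underlying the method is the standard problem-table cascade: stream temperatures are shifted by half the minimum approach temperature, interval heat balances are cascaded from the hottest interval down, and the most negative cumulative value fixes the minimum hot-utility target (the pinch is where the corrected cascade reaches zero). A minimal sketch with illustrative stream data, not the paper's case study, is given below.

      # Problem-table (temperature-interval) cascade for minimum utility targets.
      # Stream data (supply T in C, target T in C, heat-capacity flowrate CP in kW/K)
      # are illustrative only, not taken from the paper's case study.
      DT_MIN = 10.0
      hot_streams  = [(180.0, 60.0, 3.0), (150.0, 30.0, 1.5)]
      cold_streams = [(30.0, 135.0, 2.0), (80.0, 140.0, 5.0)]

      def shifted(streams, shift):
          return [(ts + shift, tt + shift, cp) for ts, tt, cp in streams]

      hot  = shifted(hot_streams, -DT_MIN / 2)   # hot streams shifted down
      cold = shifted(cold_streams, +DT_MIN / 2)  # cold streams shifted up

      bounds = sorted({t for ts, tt, _ in hot + cold for t in (ts, tt)}, reverse=True)

      surplus = []
      for hi, lo in zip(bounds[:-1], bounds[1:]):
          q = 0.0
          for ts, tt, cp in hot:    # a hot stream spans an interval entirely or not at all
              if ts >= hi and tt <= lo:
                  q += cp * (hi - lo)
          for ts, tt, cp in cold:   # cold streams absorb heat in intervals they span
              if tt >= hi and ts <= lo:
                  q -= cp * (hi - lo)
          surplus.append(q)

      # Cascade the interval surpluses; the most negative value sets the hot-utility target.
      cascade, running = [], 0.0
      for q in surplus:
          running += q
          cascade.append(running)
      q_hot_min  = max(0.0, -min(cascade))
      q_cold_min = cascade[-1] + q_hot_min
      print(f"Q_hot,min = {q_hot_min:.1f} kW, Q_cold,min = {q_cold_min:.1f} kW")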

  8. Software Design Improvements. Part 2; Software Quality and the Design and Inspection Process

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R.; Packard, Michael H.; Ziemianski, Tom

    1997-01-01

    The application of assurance engineering techniques improves the duration of failure-free performance of software. The totality of features and characteristics of a software product are what determine its ability to satisfy customer needs. Software in safety-critical systems is very important to NASA. We follow the System Safety Working Group's definition of system safety software as: 'The optimization of system safety in the design, development, use and maintenance of software and its integration with safety-critical systems in an operational environment.' 'If it is not safe, say so' has become our motto. This paper goes over methods that have been used by NASA to make software design improvements by focusing on software quality and the design and inspection process.

  9. Materials, design and processing of air encapsulated MEMS packaging

    NASA Astrophysics Data System (ADS)

    Fritz, Nathan T.

    This work uses a three-dimensional air cavity technology to improve the fabrication and functionality of microelectronic devices, the performance of on-board transmission lines, and the packaging of micro-electromechanical systems (MEMS). The air cavity process makes use of the decomposition of a patterned sacrificial polymer followed by the diffusion of its by-products through a curing polymer overcoat to obtain the embedded air structure. Applications and research of air cavities have focused on simple designs that concentrate on the size and functionality of the particular device. However, a lack of guidelines for fabrication, materials used, and structural design has led to mechanical stability issues and processing refinements. This work investigates improved air gap cavities for use in MEMS packaging processes, resulting in fewer fabrication flaws and lower cost. The identification of new materials, such as novel photo-definable organic/inorganic hybrid polymers, was studied for increased strength and rigidity due to their glass-like structure. A novel epoxy polyhedral oligomeric silsesquioxane (POSS) material was investigated and characterized for use as a photodefinable, permanent dielectric with improved mechanical properties. The POSS material improved the air gap fabrication because it served as a high-selectivity etch mask for patterning sacrificial materials as well as a cavity overcoat material with improved rigidity. An investigation of overcoat thickness and decomposition kinetics provided a fundamental understanding of the properties that impart mechanical stability to cavities of different shape and volume. Metallization of the cavities was investigated so as to provide hermetic sealing and improved cavity strength. The improved air cavity, wafer-level packages were tested using resonator-type devices and chip-level lead frame packaging. The air cavity package was molded under traditional lead frame molding pressures and tested for mechanical

  10. From Safe Nanomanufacturing to Nanosafe-by-Design processes

    NASA Astrophysics Data System (ADS)

    Schuster, F.; Lomello, F.

    2013-04-01

    Industrial needs in terms of multifunctional components are increasing. Many sectors are concerned, from integrated direct nanoparticle production to emerging combinations which include metal matrix composites (MMC), ductile ceramics and ceramic matrix composites, and polymer matrix composites (PMC) for bulk applications and advanced surface coatings in the fields of automotive, aerospace, energy production and building applications. Moreover, domains with a planetary impact, such as environmental issues, as well as aspects such as health (toxicity) and hazard assessment (ignition and explosion severity), were also taken into account. Nanotechnologies play an important role in promoting innovation in the design and realization of multifunctional products for the future, either by improving existing products or by creating new functions and/or new products. Nevertheless, this evolution in materials can only be promoted by increasing social acceptance, by addressing the main technological and economic challenges, and by developing safety-oriented processes. Nowadays, a large number of nanoparticle developments are potentially scalable to industrial levels. However, doubts exist about the handling safety of the current technologies. For these reasons, the main purpose was to develop self-monitored automation of the production line, coupling different techniques in order to simplify processes such as in-situ growth of nanoparticles into a nanostructured matrix over different substrates and/or nanopowder synthesis, functionalization, safe dry or wet recovery, granulation and single-step consolidation, by monitoring in real time processing parameters such as powder stoichiometry. With the aim of assuring product traceability over its whole life, aspects from conception through R&D, distribution and use were also considered. The optimization in terms of processing, recovery and conditioning

  11. High Throughput Atomic Layer Deposition Processes: High Pressure Operations, New Reactor Designs, and Novel Metal Processing

    NASA Astrophysics Data System (ADS)

    Mousa, MoatazBellah Mahmoud

    Atomic Layer Deposition (ALD) is a vapor phase nano-coating process that deposits very uniform and conformal thin film materials with sub-angstrom level thickness control on various substrates. These unique properties made ALD a platform technology for numerous products and applications. However, most of these applications are limited to the lab scale due to the low process throughput relative to the other deposition techniques, which hinders its industrial adoption. In addition to the low throughput, the process development for certain applications usually faces other obstacles, such as: a required new processing mode (e.g., batch vs continuous) or process conditions (e.g., low temperature), absence of an appropriate reactor design for a specific substrate and sometimes the lack of a suitable chemistry. This dissertation studies different aspects of ALD process development for prospective applications in the semiconductor, textiles, and battery industries, as well as novel organic-inorganic hybrid materials. The investigation of a high pressure, low temperature ALD process for metal oxide deposition using multiple process chemistries revealed the vital importance of the gas velocity over the substrate to achieve fast depositions at these challenging processing conditions. Also in this work, two unique high throughput ALD reactor designs are reported. The first is a continuous roll-to-roll ALD reactor for ultra-fast coatings on porous, flexible substrates with very high surface area. The second is an ALD delivery head that allows for in loco ALD coatings that can be executed under ambient conditions (even outdoors) on large surfaces while still maintaining very high deposition rates. As a proof of concept, part of a parked automobile window was coated using the ALD delivery head. Another process development shown herein is the improvement achieved in the selective synthesis of organic-inorganic materials using an ALD based process called sequential vapor

  12. On the optimal design of the disassembly and recovery processes

    SciTech Connect

    Xanthopoulos, A.; Iakovou, E.

    2009-05-15

    This paper tackles the problem of the optimal design of the recovery processes of the end-of-life (EOL) electric and electronic products, with a special focus on the disassembly issues. The objective is to recover as much ecological and economic value as possible, and to reduce the overall produced quantities of waste. In this context, a medium-range tactical problem is defined and a novel two-phased algorithm is presented for a remanufacturing-driven reverse supply chain. In the first phase, we propose a multicriteria/goal-programming analysis for the identification and the optimal selection of the most 'desirable' subassemblies and components to be disassembled for recovery, from a set of different types of EOL products. In the second phase, a multi-product, multi-period mixed-integer linear programming (MILP) model is presented, which addresses the optimization of the recovery processes, while taking into account explicitly the lead times of the disassembly and recovery processes. Moreover, a simulation-based solution approach is proposed for capturing the uncertainties in reverse logistics. The overall approach leads to an easy-to-use methodology that could support effectively middle level management decisions. Finally, the applicability of the developed methodology is illustrated by its application on a specific case study.
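
    For the second phase, a much-reduced illustration of a multi-period disassembly-planning MILP can be written in a few lines with an off-the-shelf solver. The sketch below uses PuLP with invented data (component values, disassembly times, capacity and return volumes); the paper's full multi-product formulation with lead times, the goal-programming first phase and the simulation-based treatment of uncertainty are not reproduced.

      # Minimal multi-period disassembly-planning MILP (PuLP), with invented data.
      # x[c, t] = units of component c disassembled for recovery in period t.
      from pulp import LpProblem, LpMaximize, LpVariable, lpSum, value

      periods    = range(4)
      components = ["PCB", "housing", "motor"]
      recovery_value   = {"PCB": 8.0, "housing": 2.0, "motor": 5.0}   # value per unit
      disassembly_time = {"PCB": 0.5, "housing": 0.2, "motor": 0.4}   # hours per unit
      capacity_hours     = 60.0    # disassembly hours available per period
      returns_per_period = 100     # EOL products arriving each period

      prob = LpProblem("disassembly_plan", LpMaximize)
      x = {(c, t): LpVariable(f"x_{c}_{t}", lowBound=0, cat="Integer")
           for c in components for t in periods}

      # Objective: total recovered value over the planning horizon.
      prob += lpSum(recovery_value[c] * x[c, t] for c in components for t in periods)

      for t in periods:
          # Capacity of the disassembly line in each period.
          prob += lpSum(disassembly_time[c] * x[c, t] for c in components) <= capacity_hours
          # Cannot recover more components than products are returned.
          for c in components:
              prob += x[c, t] <= returns_per_period

      prob.solve()
      for t in periods:
          print(t, {c: int(value(x[c, t])) for c in components})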

  13. Superior metallic alloys through rapid solidification processing (RSP) by design

    SciTech Connect

    Flinn, J.E.

    1995-05-01

    Rapid solidification processing using powder atomization methods and the control of minor elements such as oxygen, nitrogen, and carbon can provide metallic alloys with superior properties and performance compared to conventionally processed alloys. Previous studies on nickel- and iron-base superalloys have provided the baseline information to properly couple RSP with alloy composition, and, therefore, enable alloys to be designed for performance improvements. The RSP approach produces powders, which need to be consolidated into suitable monolithic forms. This normally involves canning, consolidation, and decanning of the powders. Canning/decanning is expensive and raises the fabrication cost significantly above that of conventional, ingot metallurgy production methods. The cost differential can be offset by the superior performance of the RSP metallic alloys. However, without the performance database, it is difficult to convince potential users to adopt the RSP approach. Spray casting of the atomized molten droplets into suitable preforms for subsequent fabrication can be cost competitive with conventional processing. If the fine and stable microstructural features observed for the RSP approach are preserved during spray casting, a cost competitive product can be obtained that has superior properties and performance that cannot be obtained by conventional methods.

  14. On the optimal design of the disassembly and recovery processes.

    PubMed

    Xanthopoulos, A; Iakovou, E

    2009-05-01

    This paper tackles the problem of the optimal design of the recovery processes of the end-of-life (EOL) electric and electronic products, with a special focus on the disassembly issues. The objective is to recover as much ecological and economic value as possible, and to reduce the overall produced quantities of waste. In this context, a medium-range tactical problem is defined and a novel two-phased algorithm is presented for a remanufacturing-driven reverse supply chain. In the first phase, we propose a multicriteria/goal-programming analysis for the identification and the optimal selection of the most 'desirable' subassemblies and components to be disassembled for recovery, from a set of different types of EOL products. In the second phase, a multi-product, multi-period mixed-integer linear programming (MILP) model is presented, which addresses the optimization of the recovery processes, while taking into account explicitly the lead times of the disassembly and recovery processes. Moreover, a simulation-based solution approach is proposed for capturing the uncertainties in reverse logistics. The overall approach leads to an easy-to-use methodology that could support effectively middle level management decisions. Finally, the applicability of the developed methodology is illustrated by its application on a specific case study.

  15. Conceptual Design for the Pilot-Scale Plutonium Oxide Processing Unit in the Radiochemical Processing Laboratory

    SciTech Connect

    Lumetta, Gregg J.; Meier, David E.; Tingey, Joel M.; Casella, Amanda J.; Delegard, Calvin H.; Edwards, Matthew K.; Jones, Susan A.; Rapko, Brian M.

    2014-08-05

    This report describes a conceptual design for a pilot-scale capability to produce plutonium oxide for use as exercise and reference materials, and for use in identifying and validating nuclear forensics signatures associated with plutonium production. This capability is referred to as the Pilot-scale Plutonium oxide Processing Unit (P3U), and it will be located in the Radiochemical Processing Laboratory at the Pacific Northwest National Laboratory. The key unit operations are described, including plutonium dioxide (PuO2) dissolution, purification of the Pu by ion exchange, precipitation, and conversion to oxide by calcination.

  16. Process design and evaluation of production of bioethanol and β-lactam antibiotic from lignocellulosic biomass.

    PubMed

    Kim, Sung Bong; Park, Chulhwan; Kim, Seung Wook

    2014-11-01

    To design biorefinery processes producing bioethanol from lignocellulosic biomass with dilute acid pretreatment, biorefinery processes were simulated using the SuperPro Designer program. To improve the efficiency of biomass use and the economics of biorefinery, additional pretreatment processes were designed and evaluated, in which a combined process of dilute acid and aqueous ammonia pretreatments, and a process of waste media containing xylose were used, for the production of 7-aminocephalosporanic acid. Finally, the productivity and economics of the designed processes were compared.

  17. Tools for efficient design of multicomponent separation processes

    NASA Astrophysics Data System (ADS)

    Huff, Joshua Lee

    formulation and the relative effect of capital and operating cost is weighed for an example feed. Previous methods based on Underwood's equations do not account for the temperature at which utilities are required. To account for this, a thermodynamic efficiency function is developed which allows the complete search space to be rank-listed in order of the exergy loss occurring within the configuration. Examining these results shows that this objective function favors configurations which move their reboiler and condenser duties to milder temperature exchangers. A graphical interface is presented which allows interpretation of any of the above results in a quick and intuitive fashion, complete with system flow and composition data and the ability to filter the complete search space based on numerical and structural criteria. This provides a unique way to compare and contrast configurations as well as allowing considerations such as column retrofit and maximum controllability to be taken into account. Using all five of these screening techniques, the traditional intuition-based methods of separations process design can be augmented with analytical and algorithmic tools which enable selection of a process design with low cost and high efficiency.
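
    The screening methods referred to above build on Underwood's classical equations for the minimum vapor flow of a multicomponent split. A minimal sketch for a single sharp split follows, assuming constant relative volatilities and a saturated-liquid feed; the volatilities, compositions and feed rate are illustrative, and the rank-listing and exergy-based objective of the thesis are not reproduced.

      # Underwood's equations for the minimum vapor flow of a sharp A/BC split,
      # assuming constant relative volatilities and a saturated-liquid feed (q = 1).
      # All numbers are illustrative.
      from scipy.optimize import brentq

      alpha = {"A": 4.0, "B": 2.0, "C": 1.0}   # relative volatilities
      z     = {"A": 0.3, "B": 0.3, "C": 0.4}   # feed mole fractions
      F, q  = 100.0, 1.0                        # feed rate (kmol/h), feed quality

      # First Underwood equation: find the root theta between alpha_HK and alpha_LK.
      def underwood(theta):
          return sum(alpha[i] * z[i] / (alpha[i] - theta) for i in alpha) - (1.0 - q)

      theta = brentq(underwood, alpha["B"] + 1e-6, alpha["A"] - 1e-6)

      # Sharp split: all of A goes to the distillate, B and C to the bottoms.
      d = {"A": z["A"] * F, "B": 0.0, "C": 0.0}

      # Second Underwood equation: minimum vapor flow in the rectifying section.
      V_min = sum(alpha[i] * d[i] / (alpha[i] - theta) for i in alpha)
      print(f"theta = {theta:.3f}, V_min = {V_min:.1f} kmol/h")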

  18. Process and Prospects for the Designed Hydrograph, Lower Missouri River

    NASA Astrophysics Data System (ADS)

    Jacobson, R. B.; Galat, D. L.; Hay, C. H.

    2005-05-01

    The flow regime of the Lower Missouri River (LMOR, Gavins Point, SD to St. Louis, MO) is being redesigned to restore elements of natural variability while maintaining project purposes such as power production, flood control, water supply, and navigation. Presently, an experimental hydrograph alteration is planned for Spring, 2006. Similar to many large, multi-purpose rivers, the ongoing design process involves negotiation among many management and stakeholder groups. The negotiated process has simplified the hydrograph into two key elements -- the spring rise and the summer low -- with emphasis on the influence of these elements on three threatened or endangered species. The spring rise has been hypothesized to perform three functions: build sandbars for nesting of the interior least tern and piping plover, provide episodic connectivity with low-lying flood plain, and provide a behavioral spawning cue for the pallid sturgeon. Among these, most emphasis has been placed on the spawning cue because concerns about downstream flood hazards have limited flow magnitudes to those that are thought to be geomorphically ineffective, and channelization and incision provide little opportunity for moderate flows to connect to the flood plain. Our analysis of the natural hydrologic regime provides some insight into possible spring rise design elements, including timing, rate of rise and fall, and length of spring flow pulses. The summer low has been hypothesized to expose sandbars for nesting and to maximize area of shallow, slow water for rearing of larval and juvenile fish. Re-engineering of the navigation channel to provide greater diversity of habitat during navigation flows has been offered as an alternative to the summer low. Our analysis indicates that re-engineering has potential to increase habitat availability substantially, but the ecological results are so-far unknown. The designed hydrograph that emerges from the multi-objective process will likely represent a

  19. Working on the Boundaries: Philosophies and Practices of the Design Process

    NASA Technical Reports Server (NTRS)

    Ryan, R.; Blair, J.; Townsend, J.; Verderaime, V.

    1996-01-01

    While the systems engineering process is a formal program management technique and is contractually binding, the design process is the informal practice of achieving the design project requirements throughout all design phases of the systems engineering process. The design process and organization are system and component dependent. Informal reviews include technical information meetings and concurrent engineering sessions, and formal technical discipline reviews are conducted through the systems engineering process. This paper discusses and references major philosophical principles of the design process, identifies its role in the analyses and integration of interacting systems and disciplines, and illustrates the application of the process in experienced aerostructural designs.

  20. Lignocellulosic ethanol: Technology design and its impact on process efficiency.

    PubMed

    Paulova, Leona; Patakova, Petra; Branska, Barbora; Rychtera, Mojmir; Melzoch, Karel

    2015-11-01

    This review provides current information on the production of ethanol from lignocellulosic biomass, with the main focus on relationships between process design and efficiency, expressed as ethanol concentration, yield and productivity. In spite of unquestionable advantages of lignocellulosic biomass as a feedstock for ethanol production (availability, price, non-competitiveness with food, waste material), many technological bottlenecks hinder its wide industrial application and competitiveness with 1st generation ethanol production. Among the main technological challenges are the recalcitrant structure of the material, and thus the need for extensive pretreatment (usually physico-chemical followed by enzymatic hydrolysis) to yield fermentable sugars, and a relatively low concentration of monosaccharides in the medium that hinder the achievement of ethanol concentrations comparable with those obtained using 1st generation feedstocks (e.g. corn or molasses). The presence of both pentose and hexose sugars in the fermentation broth, the price of cellulolytic enzymes, and the presence of toxic compounds that can inhibit cellulolytic enzymes and microbial producers of ethanol are major issues. In this review, different process configurations of the main technological steps (enzymatic hydrolysis, fermentation of hexose/and or pentose sugars) are discussed and their efficiencies are compared. The main features, benefits and drawbacks of simultaneous saccharification and fermentation (SSF), simultaneous saccharification and fermentation with delayed inoculation (dSSF), consolidated bioprocesses (CBP) combining production of cellulolytic enzymes, hydrolysis of biomass and fermentation into one step, together with an approach combining utilization of both pentose and hexose sugars are discussed and compared with separate hydrolysis and fermentation (SHF) processes. The impact of individual technological steps on final process efficiency is emphasized and the potential for use

  1. Identifying User Needs and the Participative Design Process

    NASA Astrophysics Data System (ADS)

    Meiland, Franka; Dröes, Rose-Marie; Sävenstedt, Stefan; Bergvall-Kåreborn, Birgitta; Andersson, Anna-Lena

    As the number of persons with dementia increases and also the demands on care and support at home, additional solutions to support persons with dementia are needed. The COGKNOW project aims to develop an integrated, user-driven cognitive prosthetic device to help persons with dementia. The project focuses on support in the areas of memory, social contact, daily living activities and feelings of safety. The design process is user-participatory and consists of iterative cycles at three test sites across Europe. In the first cycle persons with dementia and their carers (n = 17) actively participated in the developmental process. Based on their priorities of needs and solutions, on their disabilities and after discussion between the team, a top four list of Information and Communication Technology (ICT) solutions was made and now serves as the basis for development: in the area of remembering - day and time orientation support, find mobile service and reminding service, in the area of social contact - telephone support by picture dialling, in the area of daily activities - media control support through a music playback and radio function, and finally, in the area of safety - a warning service to indicate when the front door is open and an emergency contact service to enhance feelings of safety. The results of this first project phase show that, in general, the people with mild dementia as well as their carers were able to express and prioritize their (unmet) needs, and the kind of technological assistance they preferred in the selected areas. In next phases it will be tested if the user-participatory design and multidisciplinary approach employed in the COGKNOW project result in a user-friendly, useful device that positively impacts the autonomy and quality of life of persons with dementia and their carers.

  2. Design Process of an Area-Efficient Photobioreactor

    PubMed Central

    Janssen, Marcel; Tramper, Johannes; Wijffels, René H.

    2008-01-01

    This article describes the design process of the Green Solar Collector (GSC), an area-efficient photobioreactor for the outdoor cultivation of microalgae. The overall goal has been to design a system in which all incident sunlight on the area covered by the reactor is delivered to the algae at such intensities that the light energy can be efficiently used for biomass formation. A statement of goals is formulated and constraints are specified to which the GSC needs to comply. Specifications are generated for a prototype whose form and function achieve the stated goals and satisfy the specified constraints. This results in a design in which sunlight is captured into vertical plastic light guides. Sunlight reflects internally in the guide and eventually scatters out of the light guide into flat-panel photobioreactor compartments. Sunlight is focused on top of the light guides by dual-axis positioning of linear Fresnel lenses. The shape and material of the light guide are such that light is maintained in the guides when surrounded by air. The bottom part of a light guide is sandblasted to obtain a more uniform distribution of light inside the bioreactor compartment and is triangular shaped to ensure the efflux of all light out of the guide. Dimensions of the guide are such that light enters the flat-panel photobioreactor compartment at intensities that can be efficiently used by the biomass present. The integration of light capturing, transportation, distribution and usage is such that high biomass productivities per area can be achieved. PMID:18266033

  3. Using experimental design modules for process characterization in manufacturing/materials processes laboratories

    NASA Technical Reports Server (NTRS)

    Ankenman, Bruce; Ermer, Donald; Clum, James A.

    1994-01-01

    Modules dealing with statistical experimental design (SED), process modeling and improvement, and response surface methods have been developed and tested in two laboratory courses. One course was a manufacturing processes course in Mechanical Engineering and the other course was a materials processing course in Materials Science and Engineering. Each module is used as an 'experiment' in the course with the intent that subsequent course experiments will use SED methods for analysis and interpretation of data. Evaluation of the modules' effectiveness has been done by both survey questionnaires and inclusion of the module methodology in course examination questions. Results of the evaluation have been very positive. Those evaluation results and details of the modules' content and implementation are presented. The modules represent an important component for updating laboratory instruction and to provide training in quality for improved engineering practice.

  4. Experimental design for dynamics identification of cellular processes.

    PubMed

    Dinh, Vu; Rundell, Ann E; Buzzard, Gregery T

    2014-03-01

    We address the problem of using nonlinear models to design experiments to characterize the dynamics of cellular processes by using the approach of the Maximally Informative Next Experiment (MINE), which was introduced in W. Dong et al. (PLoS ONE 3(8):e3105, 2008) and independently in M.M. Donahue et al. (IET Syst. Biol. 4:249-262, 2010). In this approach, existing data is used to define a probability distribution on the parameters; the next measurement point is the one that yields the largest model output variance with this distribution. Building upon this approach, we introduce the Expected Dynamics Estimator (EDE), which is the expected value using this distribution of the output as a function of time. We prove the consistency of this estimator (uniform convergence to true dynamics) even when the chosen experiments cluster in a finite set of points. We extend this proof of consistency to various practical assumptions on noisy data and moderate levels of model mismatch. Through the derivation and proof, we develop a relaxed version of MINE that is more computationally tractable and robust than the original formulation. The results are illustrated with numerical examples on two nonlinear ordinary differential equation models of biomolecular and cellular processes.
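
    The MINE selection rule itself is simple to illustrate: draw parameters from the distribution conditioned on existing data, propagate them through the model over candidate measurement points, and pick the point where the predicted output variance is largest; the mean trajectory over the draws corresponds to the EDE. The sketch below uses an assumed one-parameter-pair decay model as a stand-in for a cellular process model; it does not reproduce the relaxed formulation or the consistency analysis of the paper.

      # Maximally Informative Next Experiment (MINE) selection, minimal sketch.
      # Model: y(t) = A * exp(-k * t), an assumed stand-in for a cellular process model.
      import numpy as np

      rng = np.random.default_rng(0)

      # Parameter distribution conditioned on existing data (assumed lognormal here).
      A = rng.lognormal(mean=np.log(2.0), sigma=0.2, size=5000)
      k = rng.lognormal(mean=np.log(0.5), sigma=0.3, size=5000)

      candidate_times = np.linspace(0.1, 10.0, 100)

      # Output ensemble: one trajectory per parameter draw (rows), one column per time.
      outputs = A[:, None] * np.exp(-k[:, None] * candidate_times[None, :])

      # Expected Dynamics Estimator (EDE): the mean trajectory over the parameter draws.
      ede = outputs.mean(axis=0)

      # MINE criterion: measure next where the predicted output variance is largest.
      idx = np.argmax(outputs.var(axis=0))
      print(f"next measurement time: {candidate_times[idx]:.2f}, EDE there: {ede[idx]:.3f}")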

  5. HYBRID SULFUR PROCESS REFERENCE DESIGN AND COST ANALYSIS

    SciTech Connect

    Gorensek, M.; Summers, W.; Boltrunis, C.; Lahoda, E.; Allen, D.; Greyvenstein, R.

    2009-05-12

    This report documents a detailed study to determine the expected efficiency and product costs for producing hydrogen via water-splitting using energy from an advanced nuclear reactor. It was determined that the overall efficiency from nuclear heat to hydrogen is high, and the cost of hydrogen is competitive under a high energy cost scenario. It would require over 40% more nuclear energy to generate an equivalent amount of hydrogen using conventional water-cooled nuclear reactors combined with water electrolysis compared to the proposed plant design described herein. There is a great deal of interest worldwide in reducing dependence on fossil fuels, while also minimizing the impact of the energy sector on global climate change. One potential opportunity to contribute to this effort is to replace the use of fossil fuels for hydrogen production by the use of water-splitting powered by nuclear energy. Hydrogen production is required for fertilizer (e.g. ammonia) production, oil refining, synfuels production, and other important industrial applications. It is typically produced by reacting natural gas, naphtha or coal with steam, which consumes significant amounts of energy and produces carbon dioxide as a byproduct. In the future, hydrogen could also be used as a transportation fuel, replacing petroleum. New processes are being developed that would permit hydrogen to be produced from water using only heat or a combination of heat and electricity produced by advanced, high temperature nuclear reactors. The U.S. Department of Energy (DOE) is developing these processes under a program known as the Nuclear Hydrogen Initiative (NHI). The Republic of South Africa (RSA) also is interested in developing advanced high temperature nuclear reactors and related chemical processes that could produce hydrogen fuel via water-splitting. This report focuses on the analysis of a nuclear hydrogen production system that combines the Pebble Bed Modular Reactor (PBMR), under development by

  6. Singlet oxygen sensitizing materials based on porous silicone: photochemical characterization, effect of dye reloading and application to water disinfection with solar reactors.

    PubMed

    Manjón, Francisco; Santana-Magaña, Montserrat; García-Fresnadillo, David; Orellana, Guillermo

    2010-06-01

    Photogeneration of singlet molecular oxygen ((1)O(2)) is applied to organic synthesis (photooxidations), atmosphere/water treatment (disinfection), antibiofouling materials and in photodynamic therapy of cancer. In this paper, (1)O(2) photosensitizing materials containing the dyes tris(4,4'-diphenyl-2,2'-bipyridine)ruthenium(II) (1, RDB(2+)) or tris(4,7-diphenyl-1,10-phenanthroline)ruthenium(II) (2, RDP(2+)), immobilized on porous silicone (abbreviated RDB/pSil and RDP/pSil), have been produced and tested for waterborne Enterococcus faecalis inactivation using a laboratory solar simulator and a compound parabolic collector (CPC)-based solar photoreactor. In order to investigate the feasibility of its reuse, the sunlight-exposed RDP/pSil sensitizing material (RDP/pSil-a) has been reloaded with RDP(2+) (RDP/pSil-r). Surprisingly, results for bacteria inactivation with the reloaded material have demonstrated a 4-fold higher efficiency compared to those of either RDP/pSil-a, unused RDB/pSil and the original RDP/pSil. Surface and bulk photochemical characterization of the new material (RDP/pSil-r) has shown that the bactericidal efficiency enhancement is due to aggregation of the silicone-supported photosensitizer on the surface of the polymer, as evidenced by confocal fluorescence lifetime imaging microscopy (FLIM). Photogenerated (1)O(2) lifetimes in the wet sensitizer-doped silicone have been determined to be ten times longer than in water. These facts, together with the water rheology in the solar reactor and the interfacial production of the biocidal species, account for the more effective disinfection observed with the reloaded photosensitizing material. These results extend and improve the operational lifetime of photocatalytic materials for point-of-use (1)O(2)-mediated solar water disinfection.

  7. 23 CFR 636.109 - How does the NEPA process relate to the design-build procurement process?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... TRANSPORTATION ENGINEERING AND TRAFFIC OPERATIONS DESIGN-BUILD CONTRACTING General § 636.109 How does the NEPA process relate to the design-build procurement process? The purpose of this section is to ensure that... 23 Highways 1 2011-04-01 2011-04-01 false How does the NEPA process relate to the...

  8. Universal Design in Postsecondary Education: Process, Principles, and Applications

    ERIC Educational Resources Information Center

    Burgstahler, Sheryl

    2009-01-01

    Designing any product or environment involves the consideration of many factors, including aesthetics, engineering options, environmental issues, safety concerns, industry standards, and cost. Typically, designers focus their attention on the average user. In contrast, universal design (UD), according to the Center for Universal Design, "is…

  9. A Concept Transformation Learning Model for Architectural Design Learning Process

    ERIC Educational Resources Information Center

    Wu, Yun-Wu; Weng, Kuo-Hua; Young, Li-Ming

    2016-01-01

    Generally, in the foundation course of architectural design, much emphasis is placed on teaching of the basic design skills without focusing on teaching students to apply the basic design concepts in their architectural designs or promoting students' own creativity. Therefore, this study aims to propose a concept transformation learning model to…

  10. Design of a tomato packing system by image processing and optimization processing

    NASA Astrophysics Data System (ADS)

    Li, K.; Kumazaki, T.; Saigusa, M.

    2016-02-01

    In recent years, with the development of environmental control systems in plant factories, tomato production has rapidly increased in Japan. However, with the decline in the availability of agricultural labor, there is a need to automate grading, sorting and packing operations. In this research, we designed an automatic packing program with which tomato weight can be estimated by image processing and the tomatoes packed in an optimized configuration. The weight was estimated by using the pixel area properties after an L*a*b* color model conversion, noise rejection, filling holes and boundary preprocessing. The packing optimization program was designed using a 0-1 knapsack algorithm for dynamic combinatorial optimization.
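
    The packing step is essentially a 0-1 knapsack problem: given the image-estimated weights, choose a subset of tomatoes whose total weight comes as close as possible to the box's rated weight without exceeding it. A minimal dynamic-programming sketch with made-up gram weights follows; the image-processing front end and the paper's exact formulation are not shown.

      # 0-1 knapsack selection of tomatoes for one box, weights in grams.
      def pack_box(weights, target):
          """Return (total, indices) of the subset with maximal total weight <= target."""
          best = {0: []}                        # achievable total -> chosen indices
          for i, w in enumerate(weights):
              for total, picked in list(best.items()):
                  new_total = total + w
                  if new_total <= target and new_total not in best:
                      best[new_total] = picked + [i]
          fill = max(best)
          return fill, best[fill]

      estimated_weights = [152, 147, 160, 138, 149, 155, 143]   # g, from the image step
      fill, chosen = pack_box(estimated_weights, target=600)
      print(f"packed {fill} g using tomatoes {chosen}")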

  11. Design of the Laboratory-Scale Plutonium Oxide Processing Unit in the Radiochemical Processing Laboratory

    SciTech Connect

    Lumetta, Gregg J.; Meier, David E.; Tingey, Joel M.; Casella, Amanda J.; Delegard, Calvin H.; Edwards, Matthew K.; Orton, Robert D.; Rapko, Brian M.; Smart, John E.

    2015-05-01

    This report describes a design for a laboratory-scale capability to produce plutonium oxide (PuO2) for use in identifying and validating nuclear forensics signatures associated with plutonium production, as well as for use as exercise and reference materials. This capability will be located in the Radiochemical Processing Laboratory at the Pacific Northwest National Laboratory. The key unit operations are described, including PuO2 dissolution, purification of the Pu by ion exchange, precipitation, and re-conversion to PuO2 by calcination.

  12. Critical Zone Experimental Design to Assess Soil Processes and Function

    NASA Astrophysics Data System (ADS)

    Banwart, Steve

    2010-05-01

    experimental design studies soil processes across the temporal evolution of the soil profile, from its formation on bare bedrock, through managed use as productive land to its degradation under longstanding pressures from intensive land use. To understand this conceptual life cycle of soil, we have selected 4 European field sites as Critical Zone Observatories. These are to provide data sets of soil parameters, processes and functions which will be incorporated into the mathematical models. The field sites are 1) the BigLink field station which is located in the chronosequence of the Damma Glacier forefield in alpine Switzerland and is established to study the initial stages of soil development on bedrock; 2) the Lysina Catchment in the Czech Republic which is representative of productive soils managed for intensive forestry, 3) the Fuchsenbigl Field Station in Austria which is an agricultural research site that is representative of productive soils managed as arable land and 4) the Koiliaris Catchment in Crete, Greece which represents degraded Mediterranean region soils, heavily impacted by centuries of intensive grazing and farming, under severe risk of desertification.

  13. DECISION SUPPORT SYSTEM TO ENHANCE AND ENCOURAGE SUSTAINABLE CHEMICAL PROCESS DESIGN

    EPA Science Inventory

    There is an opportunity to minimize the potential environmental impacts (PEIs) of industrial chemical processes by providing process designers with timely data and models elucidating environmentally favorable design options. The second generation of the Waste Reduction (WAR) algo...

  14. Multidisciplinary Design Optimization (MDO) Methods: Their Synergy with Computer Technology in Design Process

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw

    1998-01-01

    The paper identifies speed, agility, human interface, generation of sensitivity information, task decomposition, and data transmission (including storage) as important attributes for a computer environment to have in order to support engineering design effectively. It is argued that when examined in terms of these attributes the presently available environment can be shown to be inadequate. A radical improvement is needed, and it may be achieved by combining new methods that have recently emerged from multidisciplinary design optimization (MDO) with massively parallel processing computer technology. The caveat is that, for successful use of that technology in engineering computing, new paradigms for computing will have to be developed - specifically, innovative algorithms that are intrinsically parallel so that their performance scales up linearly with the number of processors. It may be speculated that the idea of simulating a complex behavior by interaction of a large number of very simple models may be an inspiration for the above algorithms; the cellular automata are an example. Because of the long lead time needed to develop and mature new paradigms, development should begin now, even though the widespread availability of massively parallel processing is still a few years away.

  15. Multidisciplinary Design Optimisation (MDO) Methods: Their Synergy with Computer Technology in the Design Process

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw

    1999-01-01

    The paper identifies speed, agility, human interface, generation of sensitivity information, task decomposition, and data transmission (including storage) as important attributes for a computer environment to have in order to support engineering design effectively. It is argued that when examined in terms of these attributes the presently available environment can be shown to be inadequate. A radical improvement is needed, and it may be achieved by combining new methods that have recently emerged from multidisciplinary design optimisation (MDO) with massively parallel processing computer technology. The caveat is that, for successful use of that technology in engineering computing, new paradigms for computing will have to be developed - specifically, innovative algorithms that are intrinsically parallel so that their performance scales up linearly with the number of processors. It may be speculated that the idea of simulating a complex behaviour by interaction of a large number of very simple models may be an inspiration for the above algorithms; the cellular automata are an example. Because of the long lead time needed to develop and mature new paradigms, development should begin now, even though the widespread availability of massively parallel processing is still a few years away.

  16. Optimal design of the satellite constellation arrangement reconfiguration process

    NASA Astrophysics Data System (ADS)

    Fakoor, Mahdi; Bakhtiari, Majid; Soleymani, Mahshid

    2016-08-01

    In this article, a novel approach is introduced for satellite constellation reconfiguration based on Lambert's theorem. Some critical problems arise in the reconfiguration phase, such as overall fuel cost minimization, collision avoidance between the satellites on the final orbital pattern, and the maneuvers necessary for the satellites to be deployed in the desired positions on the target constellation. To implement the reconfiguration phase of the satellite constellation arrangement at minimal cost, the hybrid Invasive Weed Optimization/Particle Swarm Optimization (IWO/PSO) algorithm is used to design sub-optimal transfer orbits for the satellites existing in the constellation. Also, the problem is modeled in such a way that optimal assignment of the satellites to the initial and target orbits and optimal orbital transfer are combined in one step. Finally, we claim that our presented idea, i.e. coupled non-simultaneous flight of satellites from the initial orbital pattern, leads to minimal cost. The obtained results show that by employing the presented method, the cost of the reconfiguration process is clearly reduced.
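
    As an illustration of the optimization layer only, a plain particle swarm optimizer over a two-variable transfer (departure time and time of flight) is sketched below. The cost function is an assumed smooth stand-in for a Lambert-based delta-v evaluation; the paper's hybrid IWO/PSO, the orbital dynamics and the assignment step are not reproduced.

      # Plain PSO over (departure time, time of flight) for one transfer; the cost
      # function is an invented placeholder, not a Lambert solver.
      import numpy as np

      rng = np.random.default_rng(1)

      def transfer_cost(x):
          """Hypothetical smooth cost surface over (departure time, time of flight)."""
          t_dep, tof = x[..., 0], x[..., 1]
          return (np.sin(0.3 * t_dep) + 1.2) * (1.0 + (tof - 5.0) ** 2 / 25.0)

      n_particles, n_iter = 40, 200
      lo, hi = np.array([0.0, 1.0]), np.array([20.0, 15.0])   # search-box bounds

      pos = rng.uniform(lo, hi, size=(n_particles, 2))
      vel = np.zeros_like(pos)
      pbest, pbest_cost = pos.copy(), transfer_cost(pos)
      gbest = pbest[np.argmin(pbest_cost)].copy()

      w, c1, c2 = 0.7, 1.5, 1.5                 # inertia and acceleration weights
      for _ in range(n_iter):
          r1 = rng.random((n_particles, 1))
          r2 = rng.random((n_particles, 1))
          vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
          pos = np.clip(pos + vel, lo, hi)
          cost = transfer_cost(pos)
          better = cost < pbest_cost
          pbest[better], pbest_cost[better] = pos[better], cost[better]
          gbest = pbest[np.argmin(pbest_cost)].copy()

      print("best (departure time, time of flight):", gbest)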

  17. Using GREENSCOPE Indicators for Sustainable Computer-Aided Process Evaluation and Design

    EPA Science Inventory

    Manufacturing sustainability can be increased by educating those who design, construct, and operate facilities, and by using appropriate tools for process evaluation and design. The U.S. Environmental Protection Agency's GREENSCOPE methodology and tool, for evaluation and design ...

  18. Articulating the Resources for Business Process Analysis and Design

    ERIC Educational Resources Information Center

    Jin, Yulong

    2012-01-01

    Effective process analysis and modeling are important phases of the business process management lifecycle. When many activities and multiple resources are involved, it is very difficult to build a correct business process specification. This dissertation provides a resource perspective of business processes. It aims at a better process analysis…

  19. Predictive Modeling in Plasma Reactor and Process Design

    NASA Technical Reports Server (NTRS)

    Hash, D. B.; Bose, D.; Govindan, T. R.; Meyyappan, M.; Arnold, James O. (Technical Monitor)

    1997-01-01

    Research continues toward the improvement and increased understanding of high-density plasma tools. Such reactor systems are lauded for their independent control of ion flux and energy enabling high etch rates with low ion damage and for their improved ion velocity anisotropy resulting from thin collisionless sheaths and low neutral pressures. Still, with the transition to 300 mm processing, achieving etch uniformity and high etch rates concurrently may be a formidable task for such large diameter wafers for which computational modeling can play an important role in successful reactor and process design. The inductively coupled plasma (ICP) reactor is the focus of the present investigation. The present work attempts to understand the fundamental physical phenomena of such systems through computational modeling. Simulations will be presented using both computational fluid dynamics (CFD) techniques and the direct simulation Monte Carlo (DSMC) method for argon and chlorine discharges. ICP reactors generally operate at pressures on the order of 1 to 10 mTorr. At such low pressures, rarefaction can be significant to the degree that the constitutive relations used in typical CFD techniques become invalid and a particle simulation must be employed. This work will assess the extent to which CFD can be applied and evaluate the degree to which accuracy is lost in prediction of the phenomenon of interest; i.e., etch rate. If the CFD approach is found reasonably accurate and bench-marked with DSMC and experimental results, it has the potential to serve as a design tool due to the rapid time relative to DSMC. The continuum CFD simulation solves the governing equations for plasma flow using a finite difference technique with an implicit Gauss-Seidel Line Relaxation method for time marching toward a converged solution. The equation set consists of mass conservation for each species, separate energy equations for the electrons and heavy species, and momentum equations for the gas

  20. Rethinking the Systems Engineering Process in Light of Design Thinking

    DTIC Science & Technology

    2016-04-30

    Technical Rational Design: the Technical Rational Design approach is a structured approach to design based on a problem...

  1. The Changing Metropolitan Designation Process and Rural America

    ERIC Educational Resources Information Center

    Slifkin, Rebecca T.; Randolph, Randy; Ricketts, Thomas C.

    2004-01-01

    In June 2003, the Office of Management and Budget (OMB) released new county-based designations of Core Based Statistical Areas (CBSAs), replacing Metropolitan Statistical Area designations that were last revised in 1990. In this article, the new designations are briefly described, and counties that have changed classifications are identified.…

  2. Measuring the development process: A tool for software design evaluation

    NASA Technical Reports Server (NTRS)

    Moy, S. S.

    1980-01-01

    The design metrics evaluator (DME), a component of an automated software design analysis system, is described. The DME quantitatively evaluates software design attributes. Its use directs attention to areas of a procedure, module, or complete program having a high potential for error.

  3. Direct selective laser sintering of high performance metals: Machine design, process development and process control

    NASA Astrophysics Data System (ADS)

    Das, Suman

    1998-11-01

    development of machine, processing and control technologies during this research effort enabled successful production of a number of integrally canned test specimens in Alloy 625 (Inconel® 625 superalloy) and Ti-6Al-4V alloy. The overall goal of this research was to develop direct SLS of metals armed with a fundamental understanding of the underlying physics. The knowledge gained from experimental and analytical work is essential for three key objectives: machine design, process development and process control. (Abstract shortened by UMI.)

  4. Rethinking Design Process: Using 3D Digital Models as an Interface in Collaborative Session

    ERIC Educational Resources Information Center

    Ding, Suining

    2008-01-01

    This paper describes a pilot study for an alternative design process by integrating a designer-user collaborative session with digital models. The collaborative session took place in a 3D AutoCAD class for a real world project. The 3D models served as an interface for designer-user collaboration during the design process. Students not only learned…

  5. Design, experimentation, and modeling of a novel continuous biodrying process

    NASA Astrophysics Data System (ADS)

    Navaee-Ardeh, Shahram

    Massive production of sludge in the pulp and paper industry has made the effective sludge management increasingly a critical issue for the industry due to high landfill and transportation costs, and complex regulatory frameworks for options such as sludge landspreading and composting. Sludge dewatering challenges are exacerbated at many mills due to improved in-plant fiber recovery coupled with increased production of secondary sludge, leading to a mixed sludge with a high proportion of biological matter which is difficult to dewater. In this thesis, a novel continuous biodrying reactor was designed and developed for drying pulp and paper mixed sludge to economic dry solids level so that the dried sludge can be economically and safely combusted in a biomass boiler for energy recovery. In all experimental runs the economic dry solids level was achieved, proving the process successful. In the biodrying process, in addition to the forced aeration, the drying rates are enhanced by biological heat generated through the microbial activity of mesophilic and thermophilic microorganisms naturally present in the porous matrix of mixed sludge. This makes the biodrying process more attractive compared to the conventional drying techniques because the reactor is a self-heating process. The reactor is divided into four nominal compartments and the mixed sludge dries as it moves downward in the reactor. The residence times were 4-8 days, which are 2-3 times shorter than the residence times achieved in a batch biodrying reactor previously studied by our research group for mixed sludge drying. A process variable analysis was performed to determine the key variable(s) in the continuous biodrying reactor. Several variables were investigated, namely: type of biomass feed, pH of biomass, nutrition level (C/N ratio), residence times, recycle ratio of biodried sludge, and outlet relative humidity profile along the reactor height. The key variables that were identified in the continuous

  6. Collaborative Course Design: Changing the Process, Acknowledging the Context, and Implications for Academic Development

    ERIC Educational Resources Information Center

    Ziegenfuss, Donna Harp; Lawler, Patricia A.

    2008-01-01

    This research study describes the experiences and perceptions of an instructor and an instructional design specialist who collaborated on the design and implementation of a university course using a new course design process. Findings uncovered differences between an informal collaboration process and the adaptation of that process for…

  7. 46 CFR 164.019-9 - Procedure for acceptance of revisions of design, process, or materials.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 6 2010-10-01 2010-10-01 false Procedure for acceptance of revisions of design, process... Device Components § 164.019-9 Procedure for acceptance of revisions of design, process, or materials. (a) The manufacturer shall not change the design, material, manufacturing process, or construction of...

  8. Disruption of adaptor protein 2μ (AP-2μ) in cochlear hair cells impairs vesicle reloading of synaptic release sites and hearing.

    PubMed

    Jung, SangYong; Maritzen, Tanja; Wichmann, Carolin; Jing, Zhizi; Neef, Andreas; Revelo, Natalia H; Al-Moyed, Hanan; Meese, Sandra; Wojcik, Sonja M; Panou, Iliana; Bulut, Haydar; Schu, Peter; Ficner, Ralf; Reisinger, Ellen; Rizzoli, Silvio O; Neef, Jakob; Strenzke, Nicola; Haucke, Volker; Moser, Tobias

    2015-11-03

    Active zones (AZs) of inner hair cells (IHCs) indefatigably release hundreds of vesicles per second, requiring each release site to reload vesicles at tens per second. Here, we report that the endocytic adaptor protein 2μ (AP-2μ) is required for release site replenishment and hearing. We show that hair cell-specific disruption of AP-2μ slows IHC exocytosis immediately after fusion of the readily releasable pool of vesicles, despite normal abundance of membrane-proximal vesicles and intact endocytic membrane retrieval. Sound-driven postsynaptic spiking was reduced in a use-dependent manner, and the altered interspike interval statistics suggested a slowed reloading of release sites. Sustained strong stimulation led to accumulation of endosome-like vacuoles, fewer clathrin-coated endocytic intermediates, and vesicle depletion of the membrane-distal synaptic ribbon in AP-2μ-deficient IHCs, indicating a further role of AP-2μ in clathrin-dependent vesicle reformation on a timescale of many seconds. Finally, we show that AP-2 sorts its IHC-cargo otoferlin. We propose that binding of AP-2 to otoferlin facilitates replenishment of release sites, for example, via speeding AZ clearance of exocytosed material, in addition to a role of AP-2 in synaptic vesicle reformation.

  9. A study of optimizing processes for metallized textile design application

    NASA Astrophysics Data System (ADS)

    Guo, Ronghui

    The purpose of this research is to find an optimum electroless plating process in order to obtain relatively low surface resistance and to improve the functional properties and appearance of nickel-plated and copper-plated polyester fabrics. The optimization results indicate that the NiSO4 concentration and the temperature of the plating bath are the most important factors influencing the surface resistance of electroless nickel-plated polyester fabric, whereas the NiSO4 concentration and the pH of the plating bath are the most significant factors affecting electroless copper plating. The micro-structures and properties of nickel, copper, and nickel/copper multi-layer plated polyester fabrics have been studied. In the case of electroless nickel plating, the nickel deposit layer becomes more uniform and continuous when prepared at higher NiSO4 concentration and higher bath temperature. As for the electroless copper plating, the surface morphology of the copper deposits indicates that the average diameter of the particles increases with the rise of NiSO4 concentration and pH. The surface morphology of nickel/copper multi-layer deposits reveals the presence of ultra-fine nodules, and the deposits are compact and uniform in size. EMI SE increases with the rise of Ni2+ concentration and bath temperature for electroless nickel plating, and with the rise of Ni2+ concentration and pH of the plating solution for electroless copper plating on polyester fabric. At the same deposit weight, the EMI SE of nickel/copper-plated fabric is considerably higher than that of the nickel-plated fabric, but slightly lower than that of the copper-plated fabric. However, the anti-corrosive property of nickel/copper-plated fabrics is significantly superior to that of the copper-plated fabrics, but slightly inferior to that of the nickel-plated fabric. Design application effects have been explored by controlling the plating conditions. The electroless plating parameters play an

  10. Development of Integrated Programs for Aerospace-vehicle Design (IPAD): Product manufacture interactions with the design process

    NASA Technical Reports Server (NTRS)

    Crowell, H. A.

    1979-01-01

    The product manufacturing interactions with the design process and the IPAD requirements to support the interactions are described. The data requirements supplied to manufacturing by design are identified and quantified. Trends in computer-aided manufacturing are discussed and the manufacturing process of the 1980's is anticipated.

  11. Developing a 3D Game Design Authoring Package to Assist Students' Visualization Process in Design Thinking

    ERIC Educational Resources Information Center

    Kuo, Ming-Shiou; Chuang, Tsung-Yen

    2013-01-01

    The teaching of 3D digital game design requires the development of students' meta-skills, from story creativity to 3D model construction, and even the visualization process in design thinking. The characteristics a good game designer should possess have been identified as including the ability to redesign things, creative thinking, and the ability to…

  12. Integrating optical fabrication and metrology into the optical design process

    NASA Astrophysics Data System (ADS)

    Harvey, James E.

    2014-12-01

    Image degradation due to scattered radiation from residual optical fabrication errors is a serious problem in many short wavelength (X-ray/EUV) imaging systems. Most commercially available image analysis codes (ZEMAX, Code V, ASAP, FRED, etc.) currently require the scatter behavior (BSDF data) to be provided as input in order to calculate the image quality of such systems. This BSDF data is difficult to measure and rarely available for the operational wavelengths of interest. Since the smooth-surface approximation is often not satisfied at these short wavelengths, the classical Rayleigh-Rice expression, which indicates that the BRDF is directly proportional to the surface PSD, cannot be used to calculate BRDFs from surface metrology data for even slightly rough surfaces. However, an FFTLog numerical Hankel transform algorithm enables the practical use of the computationally intensive Generalized Harvey-Shack (GHS) surface scatter theory [1] to calculate BRDFs from surface PSDs for increasingly short wavelengths that violate the smooth-surface approximation implicit in the Rayleigh-Rice surface scatter theory [2-3]. The recent numerical validation [4] of the GHS theory (a generalized linear systems formulation of surface scatter theory), and an analysis of image degradation due to surface scatter in the presence of aberrations [5], have provided credence to the development of a systems engineering analysis of image quality as degraded not only by diffraction effects and geometrical aberrations, but also by scattering effects due to residual optical fabrication errors. These advances, combined with the continuing increase in computer speed, leave us poised to fully integrate optical metrology and fabrication into the optical design process.
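
    For orientation, the smooth-surface (Rayleigh-Rice) relation referred to above is commonly written in the standard textbook form below; it is quoted here only to fix notation and is not reproduced from the paper itself:

      $$ \mathrm{BRDF}(\theta_i,\theta_s) \;=\; \frac{16\pi^2}{\lambda^4}\,\cos\theta_i\,\cos\theta_s\;Q\;\mathrm{PSD}(f_x,f_y), \qquad f_x=\frac{\sin\theta_s\cos\phi_s-\sin\theta_i}{\lambda},\quad f_y=\frac{\sin\theta_s\sin\phi_s}{\lambda}, $$

    where Q is a polarization-dependent reflectance factor and PSD is the two-dimensional surface power spectral density. The relation holds only when the smooth-surface condition, roughly $(4\pi\hat{\sigma}\cos\theta_i/\lambda)^2 \ll 1$, is satisfied; this is precisely the restriction that the GHS theory relaxes.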

  13. Active pharmaceutical ingredient (API) production involving continuous processes--a process system engineering (PSE)-assisted design framework.

    PubMed

    Cervera-Padrell, Albert E; Skovby, Tommy; Kiil, Søren; Gani, Rafiqul; Gernaey, Krist V

    2012-10-01

    A systematic framework is proposed for the design of continuous pharmaceutical manufacturing processes. Specifically, the design framework focuses on organic chemistry based, active pharmaceutical ingredient (API) synthetic processes, but could potentially be extended to biocatalytic and fermentation-based products. The method exploits the synergic combination of continuous flow technologies (e.g., microfluidic techniques) and process systems engineering (PSE) methods and tools for faster process design and increased process understanding throughout the whole drug product and process development cycle. The design framework structures the many different and challenging design problems (e.g., solvent selection, reactor design, and design of separation and purification operations), driving the user from the initial drug discovery steps--where process knowledge is very limited--toward the detailed design and analysis. Examples from the literature of PSE methods and tools applied to pharmaceutical process design and novel pharmaceutical production technologies are provided throughout the text, assisting in the accumulation and interpretation of process knowledge. Different criteria are suggested for the selection of batch and continuous processes so that the whole design results in low capital and operational costs as well as a low environmental footprint. The design framework has been applied to the retrofit of an existing batch-wise process used by H. Lundbeck A/S to produce an API: zuclopenthixol. Some of its batch operations were successfully converted into continuous mode, obtaining higher yields that allowed a significant simplification of the whole process. The material and environmental footprint of the process--evaluated through the process mass intensity index, that is, kg of material used per kg of product--was reduced to half of its initial value, with potential for further reduction. The case study includes reaction steps typically used by the pharmaceutical
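
    For reference, the process mass intensity metric mentioned above has the simple form below (the definition is restated from the abstract; the illustrative figures are hypothetical, not values from the study):

      $$ \mathrm{PMI} \;=\; \frac{\sum_i m_{i,\mathrm{in}}}{m_{\mathrm{product}}} \quad \left[\frac{\mathrm{kg\ material}}{\mathrm{kg\ product}}\right], $$

    so that, for example, halving a hypothetical PMI of 120 kg/kg to 60 kg/kg means 60 kg less material consumed, and ultimately wasted or recycled, per kilogram of API produced.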

  14. An Examination of the Decision-Making Process Used by Designers in Multiple Disciplines

    ERIC Educational Resources Information Center

    Stefaniak, Jill E.; Tracey, Monica W.

    2014-01-01

    Design-thinking is an inductive and participatory process in which designers are required to manage constraints, generate solutions, and follow project timelines in order to complete project goals. The researchers used this exploration study to look at how designers in various disciplinary fields approach design projects. Designers were asked to…

  15. Design Ideas, Reflection, and Professional Identity: How Graduate Students Explore the Idea Generation Process

    ERIC Educational Resources Information Center

    Hutchinson, Alisa; Tracey, Monica W.

    2015-01-01

    Within design thinking, designers are responsible for generating, testing, and refining design ideas as a means to refine the design problem and arrive at an effective solution. Thus, understanding one's individual idea generation experiences and processes can be seen as a component of professional identity for designers, which involves the…

  16. Affordable Design: A Methodology to Implement Process-Based Manufacturing Cost into the Traditional Performance-Focused Multidisciplinary Design Optimization

    NASA Technical Reports Server (NTRS)

    Bao, Han P.; Samareh, J. A.

    2000-01-01

    The primary objective of this paper is to demonstrate the use of process-based manufacturing and assembly cost models in a traditional performance-focused multidisciplinary design and optimization process. The use of automated cost-performance analysis is an enabling technology that could bring realistic process-based manufacturing and assembly cost into multidisciplinary design and optimization. In this paper, we present a new methodology for incorporating process costing into a standard multidisciplinary design optimization process. Material, manufacturing process, and assembly process costs could then be used as the objective function for the optimization method. A case study involving forty-six different configurations of a simple wing is presented, indicating that a design based on performance criteria alone may not necessarily be the most affordable as far as manufacturing and assembly cost is concerned.
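
    The pattern of using process-based cost as the optimization objective can be illustrated with a minimal sketch. Everything below (configuration names, rates, and the cost relationship itself) is hypothetical and only shows how a cost objective can be traded against a performance constraint; it is not the cost model used in the paper.

      # Minimal sketch (hypothetical data): choosing among candidate wing
      # configurations by process-based cost, subject to a performance constraint.
      from dataclasses import dataclass

      @dataclass
      class Config:
          name: str
          material_cost: float      # $ for raw material
          mfg_hours: float          # process-based manufacturing labor estimate
          assy_hours: float         # process-based assembly labor estimate
          performance: float        # e.g., normalized structural performance score

      LABOR_RATE = 95.0             # $/hour, assumed

      def total_cost(c: Config) -> float:
          """Material + manufacturing + assembly cost (the optimization objective)."""
          return c.material_cost + LABOR_RATE * (c.mfg_hours + c.assy_hours)

      def cheapest_feasible(configs, min_performance):
          """Return the lowest-cost configuration that meets the performance floor."""
          feasible = [c for c in configs if c.performance >= min_performance]
          return min(feasible, key=total_cost) if feasible else None

      candidates = [
          Config("built-up aluminum",   12_000, 310, 220, 0.92),
          Config("integrally stiffened", 18_500, 260, 140, 0.95),
          Config("composite skin",       26_000, 380,  90, 1.00),
      ]
      best = cheapest_feasible(candidates, min_performance=0.9)
      print(best.name, round(total_cost(best)))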

  17. Role of Graphics Tools in the Learning Design Process

    ERIC Educational Resources Information Center

    Laisney, Patrice; Brandt-Pomares, Pascale

    2015-01-01

    This paper discusses the design activities of students in secondary school in France. Graphics tools are now part of the capacity of design professionals. It is therefore apt to reflect on their integration into technological education. Has the use of intermediate graphical tools changed students' performance, and if so in what direction, in…

  18. Modeling Web-Based Educational Systems: Process Design Teaching Model

    ERIC Educational Resources Information Center

    Rokou, Franca Pantano; Rokou, Elena; Rokos, Yannis

    2004-01-01

    Using modeling languages is essential to the construction of educational systems based on software engineering principles and methods. Furthermore, the instructional design is undoubtedly the cornerstone of the design and development of educational systems. Although several methodologies and languages have been proposed for the specification of…

  19. Student Evaluation of CALL Tools during the Design Process

    ERIC Educational Resources Information Center

    Nesbitt, Dallas

    2013-01-01

    This article discusses the comparative effectiveness of student input at different times during the design of CALL tools for learning kanji, the Japanese characters of Chinese origin. The CALL software "package" consisted of tools to facilitate the writing, reading and practising of kanji characters in context. A pre-design questionnaire…

  20. Development of Chemical Process Design and Control for Sustainability

    EPA Science Inventory

    This contribution describes a novel process systems engineering framework that couples advanced control with sustainability evaluation and decision making for the optimization of process operations to minimize environmental impacts associated with products, materials, and energy....

  1. Innovation Process Design: A Change Management and Innovation Dimension Perspective

    NASA Astrophysics Data System (ADS)

    Peisl, Thomas; Reger, Veronika; Schmied, Juergen

    The authors propose an innovative approach to the management of innovation, integrating business, process, and maturity dimensions. The core element of the concept is the adaptation of ISO/IEC 15504 to the innovation process, including 14 innovation drivers. Two managerial models are applied to conceptualize and visualize the respective innovation strategies: the Balanced Scorecard and a Barriers in Change Processes Model. An illustrative case study shows a practical implementation process.

  2. Enhancing healthcare process design with human factors engineering and reliability science, part 1: setting the context.

    PubMed

    Boston-Fleischhauer, Carol

    2008-01-01

    The design and implementation of efficient, effective, and safe processes are never-ending challenges in healthcare. Less than optimal performance levels and rising concerns about patient safety suggest that traditional process design methods are insufficient to meet design requirements. In this 2-part series, the author presents human factors engineering and reliability science as important knowledge to enhance existing operational and clinical process design methods in healthcare. An examination of these theories and their application approaches is presented, together with examples.

  3. World methanol situation poses challenge in process design

    SciTech Connect

    Haggin, J.

    1984-07-16

    A review is presented of the technology and economics of methanol production processes. Synthesis gas production routes based on methane and coal are compared. Since methane-based synthesis gas is hydrogen rich and coal-based synthesis gas is carbon rich, the combination of both processes, as suggested by the M.W. Kellogg Co., should be economically attractive. A liquid-phase synthesis in the developmental stages, and two reactor configurations under consideration for its use, are discussed. The Wentworth system of catalytic processing, a Lurgi process using coal and methane for methanol, a Lurgi process for utilizing methanol in a variation of the Mobil methanol-to-gasoline process, and another Lurgi process to produce a methanol fuel mixture (consisting of methanol and oxygenates) for direct use as a motor fuel are also discussed.

  4. The space station - An overview of the design process

    NASA Technical Reports Server (NTRS)

    Covington, C.

    1983-01-01

    The design factors being considered in the NASA space-station development program are summarized. The currently envisioned mission requirements are listed, and the system architecture is defined as a core station, mission-dedicated elements, and supporting equipment such as an orbit maneuvering vehicle. System design factors discussed include orbit selection, contamination control, autonomy, system safety, technology implementation, long life, reliability and maintainability, and cost; subsystem design factors include structural considerations, electrical power, environmental control and life support, data management, communications and tracking, onboard propulsion, habitability, and crew support. Configurational design is seen as driven by a number of factors, primarily the need to fit all components into the Shuttle payload bay for assembly in LEO by the Shuttle crew.

  5. PROCESS DESIGN MANUAL FOR LAND TREATMENT OF MUNICIPAL WASTEWATER

    EPA Science Inventory

    The USEPA guidance on land treatment of municipal and industrial wastewater is updated for the first time since 1984. The significant new technological changes include phytoremediation, vadose zone monitoring, new design approaches to surface irrigation, center pivot irrigation,...

  6. Digital geometry processing applied in customized medical implant design.

    PubMed

    Xiao-Feng, Zhu; Cheng-Tao, Wang

    2005-01-01

    Standard medical implants are used in most implantation cases, but in some special cases only implants conforming to an individual patient's skeletal morphology can serve the purpose. This paper proposes a new approach to the design and fabrication of customized exact-fit medical implants. With a real surgical case as the example, technical design details are presented, and two algorithms are given: one for segmentation based on object features and one for triangular mesh defragmentation.

  7. Bed occupancy monitoring: data processing and clinician user interface design.

    PubMed

    Pouliot, Melanie; Joshi, Vilas; Goubran, Rafik; Knoefel, Frank

    2012-01-01

    Unobtrusive and continuous monitoring of patients, especially at their place of residence, is becoming a significant part of the healthcare model. A variety of sensors are being used to monitor different patient conditions. Bed occupancy monitoring provides clinicians a quantitative measure of bed entry/exit patterns and may provide information relating to sleep quality. This paper presents a bed occupancy monitoring system using a bed pressure mat sensor. A clinical trial was performed involving 8 patients to collect bed occupancy data. The trial period for each patient ranged from 5 to 10 weeks. These data were analyzed using a participatory design methodology incorporating clinician feedback to obtain bed occupancy parameters. The parameters extracted include the number of bed exits per night, the weekly average number of bed exits (including minimum and maximum), the time of day of a particular exit, and the amount of uninterrupted bed occupancy per night. The design of a clinical user interface plays a significant role in the acceptance of such patient monitoring systems by clinicians. The clinician user interface proposed in this paper was designed to be intuitive, easy to navigate, and not cause information overload. An iterative design methodology was used for the interface design. The interface design is extendible to incorporate data from multiple sensors. This allows the interface to be part of a comprehensive remote patient monitoring system.
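
    A minimal sketch of how the parameters named above could be derived from pressure-mat data is given below. The interval format, the attribution of an episode to the night it starts, and the sample timestamps are assumptions for illustration only, not details of the system described in the paper.

      # Minimal sketch (hypothetical data): deriving bed-occupancy parameters
      # from a list of (enter, exit) occupancy intervals produced by a mat sensor.
      from datetime import datetime, timedelta
      from collections import defaultdict

      def nightly_stats(intervals):
          """intervals: list of (enter, exit) datetimes, one per occupancy episode."""
          exits_per_night = defaultdict(int)
          longest_occupancy = defaultdict(timedelta)
          for enter, leave in intervals:
              night = enter.date()                     # attribute the episode to its start date
              exits_per_night[night] += 1              # each episode ends with one bed exit
              longest_occupancy[night] = max(longest_occupancy[night], leave - enter)
          return exits_per_night, longest_occupancy

      def weekly_exit_summary(exits_per_night):
          counts = list(exits_per_night.values())
          return min(counts), max(counts), sum(counts) / len(counts)

      data = [
          (datetime(2012, 3, 1, 22, 10), datetime(2012, 3, 2, 2, 5)),
          (datetime(2012, 3, 2, 2, 40),  datetime(2012, 3, 2, 6, 55)),
          (datetime(2012, 3, 2, 23, 0),  datetime(2012, 3, 3, 6, 30)),
      ]
      per_night, longest = nightly_stats(data)
      print(weekly_exit_summary(per_night))            # (min, max, average) exits per night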

  8. Processing Systems Optimization Through Automatic Design and Reorganization of Program Modules.

    ERIC Educational Resources Information Center

    Nunamaker, J. F., Jr.; And Others

    A methodology is described for an automatic design system initially defined in terms of logical processes or program modules. Processes and files are grouped and reorganized in such a way as to produce an optimal design with respect to a specific target machine. Performance criteria for the optimal design are defined in terms of transport volume…

  9. AMS-02 antiprotons reloaded

    SciTech Connect

    Kappl, Rolf; Reinert, Annika; Winkler, Martin Wolfgang E-mail: areinert@th.physik.uni-bonn.de

    2015-10-01

    The AMS-02 collaboration has released preliminary data on the antiproton fraction in cosmic rays. The surprisingly hard antiproton spectrum at high rigidity has triggered speculations about a possible primary antiproton component originating from dark matter annihilations. In this note, we employ newly available AMS-02 boron to carbon data to update the secondary antiproton flux within the standard two-zone diffusion model. The new background permits a considerably better fit to the measured antiproton fraction compared to previous estimates. This is mainly a consequence of the smaller slope of the diffusion coefficient favored by the new AMS-02 boron to carbon data.

  10. Plutonium age dating reloaded

    NASA Astrophysics Data System (ADS)

    Sturm, Monika; Richter, Stephan; Aregbe, Yetunde; Wellum, Roger; Mayer, Klaus; Prohaska, Thomas

    2014-05-01

    Although the age determination of plutonium has been a pillar of nuclear forensic investigations for many years, additional research in the field of plutonium age dating is still needed and leads to new insights, as the present work shows. Plutonium is commonly dated with the help of the 241Pu/241Am chronometer using gamma spectrometry; in fewer cases the 240Pu/236U chronometer has been used. The age dating results of the 239Pu/235U chronometer and the 238Pu/234U chronometer are scarcely applied in addition to the 240Pu/236U chronometer, although they can be obtained simultaneously from the same mass spectrometric experiments as the age dating result of the latter. The reliability of the result can be tested when the results of different chronometers are compared. The 242Pu/238U chronometer is normally not evaluated at all due to its sensitivity to contamination with natural uranium. This apparent 'weakness', which renders the age dating results of the 242Pu/238U chronometer almost useless for nuclear forensic investigations, turns out to be an advantage when looked at from another perspective: the 242Pu/238U chronometer can be utilized as an indicator for uranium contamination of plutonium samples and can even help to identify the nature of this contamination. To illustrate this, the age dating results of all four Pu/U clocks mentioned above are discussed for one plutonium sample (NBS 946) that shows no signs of uranium contamination and for three additional plutonium samples. If the 242Pu/238U chronometer yields an older 'age' than the other Pu/U chronometers, contamination with a small amount of enriched uranium, or with natural or depleted uranium, is possible, for example. If the age dating result of the 239Pu/235U chronometer is also influenced, the nature of the contamination can be identified; in this case enriched uranium is a likely cause of the mismatch between the age dating results of the Pu/U chronometers.
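
    The chronometers discussed above all rest on the same textbook parent-daughter relation, shown here as general background rather than as a formula taken from the paper. Assuming the uranium daughter was completely removed at the time of purification and decays negligibly over forensic timescales,

      $$ \frac{N_{\mathrm{U}}}{N_{\mathrm{Pu}}}(t) \;\approx\; e^{\lambda_{\mathrm{Pu}}t}-1 \qquad\Longrightarrow\qquad t \;\approx\; \frac{1}{\lambda_{\mathrm{Pu}}}\ln\!\left(1+\frac{N_{\mathrm{U}}}{N_{\mathrm{Pu}}}\right), \qquad \lambda_{\mathrm{Pu}}=\frac{\ln 2}{t_{1/2}}. $$

    Any uranium contamination adds to N_U and therefore biases the apparent age upward, which is why the contamination-sensitive 242Pu/238U clock can serve as a contamination indicator.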

  11. Oral Insulin Reloaded

    PubMed Central

    Heinemann, Lutz; Plum-Mörschel, Leona

    2014-01-01

    Optimal coverage of insulin needs is the paramount aim of insulin replacement therapy in patients with diabetes mellitus. To apply insulin without breaking the skin barrier by a needle and/or to allow a more physiological provision of insulin are the main reasons triggering the continuous search for alternative routes of insulin administration. Despite numerous attempts over the past 9 decades to develop an insulin pill, no insulin for oral dosing is commercially available. By way of a structured approach, we aim to provide a systematic update on the most recent developments toward an orally available insulin formulation with a clear focus on data from clinical-experimental and clinical studies. Thirteen companies that claim to be working on oral insulin formulations were identified. However, only 6 of these companies published new clinical trial results within the past 5 years. Interestingly, these clinical data reports make up a mere 4% of the considerably high total number of publications on the development of oral insulin formulations within this time period. While this picture clearly reflects the rising research interest in orally bioavailable insulin formulations, it also highlights the fact that the lion’s share of research efforts is still allocated to the preclinical stages. PMID:24876606

  12. Process Design of Wastewater Treatment for the NREL Cellulosic Ethanol Model

    SciTech Connect

    Steinwinder, T.; Gill, E.; Gerhardt, M.

    2011-09-01

    This report describes a preliminary process design for treating the wastewater from NREL's cellulosic ethanol production process to the quality levels required for recycle. In this report, Brown and Caldwell address three main tasks: 1) characterization of the effluent from NREL's ammonia-conditioned hydrolyzate fermentation process; 2) development of the wastewater treatment process design; and 3) development of a capital and operational cost estimate for the treatment concept option. This wastewater treatment design was incorporated into NREL's cellulosic ethanol process design update published in May 2011 (NREL/TP-5100-47764).

  13. Designing Advanced Ceramic Waste Forms for Electrochemical Processing Salt Waste

    SciTech Connect

    Ebert, W. L.; Snyder, C. T.; Frank, Steven; Riley, Brian

    2016-03-01

    This report describes the scientific basis underlying the approach being followed to design and develop “advanced” glass-bonded sodalite ceramic waste form (ACWF) materials that can (1) accommodate higher salt waste loadings than the waste form developed in the 1990s for EBR-II waste salt and (2) provide greater flexibility for immobilizing extreme waste salt compositions. This is accomplished by using a binder glass having a much higher Na2O content than the glass compositions used previously, to provide enough Na+ to react with all of the Cl– in the waste salt and generate the maximum amount of sodalite. To demonstrate the approach and select a composition for further studies, the phase compositions and degradation behaviors of prototype ACWF products made using five new binder glass formulations and 11-14 mass% representative LiCl/KCl-based salt waste were evaluated and compared with results of similar tests run with CWF products made using the original binder glass and 8 mass% of the same salt. About twice the amount of sodalite was generated in all ACWF materials, and the microstructures and degradation behaviors confirmed our understanding of the reactions occurring during waste form production and the efficacy of the approach. However, the porosities of the resulting ACWF materials were higher than is desired. These results indicate that the capacity of these ACWF waste forms to accommodate LiCl/KCl-based salt wastes becomes limited by porosity due to the low glass-to-sodalite volume ratio. Three of the new binder glass compositions were acceptable, and there is no benefit to further increasing the Na content as initially planned. Instead, further studies are needed to develop and evaluate alternative production methods to decrease the porosity, such as by increasing the amount of binder glass in the formulation or by processing waste forms in a hot isostatic press. Increasing the amount of binder glass to eliminate porosity will decrease

  14. Designing and Securing an Event Processing System for Smart Spaces

    ERIC Educational Resources Information Center

    Li, Zang

    2011-01-01

    Smart spaces, or smart environments, represent the next evolutionary development in buildings, banking, homes, hospitals, transportation systems, industries, cities, and government automation. By riding the tide of sensor and event processing technologies, the smart environment captures and processes information about its surroundings as well as…

  15. Impact of Process Protocol Design on Virtual Team Effectiveness

    ERIC Educational Resources Information Center

    Cordes, Christofer Sean

    2013-01-01

    This dissertation examined the influence of action process dimensions on team decision performance, and attitudes toward team work environment and procedures given different degrees of collaborative technology affordance. Process models were used to provide context for understanding team behavior in the experimental task, and clarify understanding…

  16. DESIGNING SUSTAINABLE PROCESSES WITH SIMULATION: THE WASTE REDUCTION (WAR) ALGORITHM

    EPA Science Inventory

    The WAR Algorithm, a methodology for determining the potential environmental impact (PEI) of a chemical process, is presented with modifications that account for the PEI of the energy consumed within that process. From this theory, four PEI indexes are used to evaluate the envir...
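
    Schematically, WAR-type analyses balance potential environmental impact (PEI) around the process. The form below is a generic sketch to fix ideas; the EPA's exact index definitions should be taken from the WAR documentation rather than from this note:

      $$ \dot I_{\mathrm{gen}} \;=\; \dot I_{\mathrm{out}} - \dot I_{\mathrm{in}}, \qquad \dot I_{\mathrm{out}} \;=\; \sum_{j\in\mathrm{streams}} \dot M_j \sum_k x_{kj}\,\psi_k, $$

    where ψ_k is the specific PEI of chemical k and x_{kj} its mass fraction in stream j. Dividing the output and generation rates by the product flow gives per-mass counterparts, and variants of these quantities are typically the indexes used to compare design and operating alternatives.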

  17. Data processing with microcode designed with source coding

    DOEpatents

    McCoy, James A; Morrison, Steven E

    2013-05-07

    Programming for a data processor to execute a data processing application is provided using microcode source code. The microcode source code is assembled to produce microcode that includes digital microcode instructions with which to signal the data processor to execute the data processing application.

  18. The Role of Collaboration in a Comprehensive Programme Design Process in Inclusive Education

    ERIC Educational Resources Information Center

    Zundans-Fraser, Lucia; Bain, Alan

    2016-01-01

    This study focused on the role of collaboration in a comprehensive programme design process in inclusive education. The participants were six members of an inclusive education team and an educational designer who together comprised the design team. The study examined whether collaboration was evident in the practice of programme design and…

  19. Design alternatives for process group membership and multicast

    NASA Technical Reports Server (NTRS)

    Birman, Kenneth P.; Cooper, Robert; Gleeson, Barry

    1991-01-01

    Process groups are a natural tool for distributed programming, and are increasingly important in distributed computing environments. However, there is little agreement on the most appropriate semantics for process group membership and group communication. These issues are of special importance in the Isis system, a toolkit for distributed programming. Isis supports several styles of process group, and a collection of group communication protocols spanning a range of atomicity and ordering properties. This flexibility makes Isis adaptable to a variety of applications, but is also a source of complexity that limits performance. This paper reports on a new architecture that arose from an effort to simplify Isis process group semantics. Our findings include a refined notion of how the clients of a group should be treated, what the properties of a multicast primitive should be when systems contain large numbers of overlapping groups, and a new construct called the causality domain. As an illustration, we apply the architecture to the problem of converting processes into fault-tolerant process groups in a manner that is 'transparent' to other processes in the system.
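
    As a generic illustration of the causal-ordering property such multicast primitives provide (this is not the Isis API, just a minimal sketch of the standard vector-clock delivery rule):

      # Minimal sketch: the classic causal-delivery test with vector clocks.
      def can_deliver(msg_vc, sender, local_vc):
          """Deliver a message only if it is the next one expected from its sender
          and it does not depend on any message we have not yet delivered."""
          if msg_vc.get(sender, 0) != local_vc.get(sender, 0) + 1:
              return False                      # not the next message from its sender
          return all(ts <= local_vc.get(p, 0)   # no unseen causal dependencies
                     for p, ts in msg_vc.items() if p != sender)

      local = {"a": 1, "b": 0}
      print(can_deliver({"a": 2, "b": 0}, "a", local))  # True: next message from a
      print(can_deliver({"a": 2, "b": 1}, "a", local))  # False: depends on unseen b:1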

  20. Development of Chemical Process Design and Control for ...

    EPA Pesticide Factsheets

    This contribution describes a novel process systems engineering framework that couples advanced control with sustainability evaluation and decision making for the optimization of process operations to minimize environmental impacts associated with products, materials, and energy. The implemented control strategy combines a biologically inspired method with optimal control concepts for finding more sustainable operating trajectories. The sustainability assessment of process operating points is carried out by using the U.S. E.P.A.’s Gauging Reaction Effectiveness for the ENvironmental Sustainability of Chemistries with a multi-Objective Process Evaluator (GREENSCOPE) tool that provides scores for the selected indicators in the economic, material efficiency, environmental and energy areas. The indicator scores describe process performance on a sustainability measurement scale, effectively determining which operating point is more sustainable if there are more than several steady states for one specific product manufacturing. Through comparisons between a representative benchmark and the optimal steady-states obtained through implementation of the proposed controller, a systematic decision can be made in terms of whether the implementation of the controller is moving the process towards a more sustainable operation. The effectiveness of the proposed framework is illustrated through a case study of a continuous fermentation process for fuel production, whose materi

  1. Towards a Web-Based Handbook of Generic, Process-Oriented Learning Designs

    ERIC Educational Resources Information Center

    Marjanovic, Olivera

    2005-01-01

    Process-oriented learning designs are innovative learning activities that include a set of inter-related learning tasks and are generic (could be used across disciplines). An example includes a problem-solving process widely used in problem-based learning today. Most of the existing process-oriented learning designs are not documented, let alone…

  2. Design and manufacturing tools for laser beam processing

    NASA Astrophysics Data System (ADS)

    Kaierle, Stefan; Fuerst, B.; Kittel, Jochen; Kreutz, Ernst-Wolfgang; Poprawe, Reinhart

    1999-08-01

    Today's situation, with increasingly short time-to-market limits and growing variant spectra, calls for advanced methods in the manufacturing domain. A big potential for achieving faster and better manufacturing results lies in the application of offline programming, especially when processing small lot sizes. The main advantage of offline programming is a notable reduction of the idle (standstill) times of manufacturing systems: with this technology, no time-consuming teach-in on the robots is necessary. A technology module based on CAD/CAM techniques--mainly for 3D welding applications--is described which permits offline path and process planning, including simulation and visualization of the processing task.

  3. An Elective Course on Computer-Aided Process Design.

    ERIC Educational Resources Information Center

    Sommerfeld, Jude T.

    1979-01-01

    Describes an undergraduate chemical engineering course which has been offered at the Georgia Institute of Technology. The objectives, structure, instructional materials and content of this course, which emphasizes the structure and usage of computer-aided design systems, are also included. (HM)

  4. Online Group Work Design: Processes, Complexities, and Intricacies

    ERIC Educational Resources Information Center

    Kleinsasser, Robert; Hong, Yi-Chun

    2016-01-01

    This paper describes the challenges of designing and implementing online group work. We are responsible for a seven-and-a-half-week online literacy and bi-literacy graduate course in a Bilingual/English as a Second Language (BLE/ESL) Master of Arts program. One of the tasks includes online literacy circle exchanges where students are encouraged…

  5. High School Student Modeling in the Engineering Design Process

    ERIC Educational Resources Information Center

    Mentzer, Nathan; Huffman, Tanner; Thayer, Hilde

    2014-01-01

    A diverse group of 20 high school students from four states in the US were individually provided with an engineering design challenge. Students chosen were in capstone engineering courses and had taken multiple engineering courses. As students considered the problem and developed a solution, observational data were recorded and artifacts…

  6. Supporting Learning with Process Tools: Theory and Design Issues.

    ERIC Educational Resources Information Center

    Goodrum, David A.; Knuth, Randy A.

    This report begins by discussing the implications of a constructivist epistemology for instructional design and development, as well as drawbacks of the content database in instruction, the lack of special tools to help the learner actively construct knowledge, and emphasis on learning environments based on individual learning. The whole learning…

  7. PROCESS DESIGN MANUAL FOR SLUDGE TREATMENT AND DISPOSAL

    EPA Science Inventory

    The purpose of this manual is to provide the engineering community and related industry with a new source of information to be used in the planning, design, and operation of present and future wastewater pollution control facilities. This manual supplements this existing knowledg...

  8. Investigating Preservice Mathematics Teachers' Manipulative Material Design Processes

    ERIC Educational Resources Information Center

    Sandir, Hakan

    2016-01-01

    Students use concrete manipulatives to form an imperative affiliation between conceptual and procedural knowledge (Balka, 1993). Hence, it is necessary to design specific mathematics manipulatives that focus on different mathematical concepts. Preservice teachers need to know how to make and use manipulatives that stimulate students' thinking as…

  9. New economic, process conditions stimulate changes in boiler design

    SciTech Connect

    Schwieger, B.

    1981-12-01

    Paper mills are specifying higher steam pressures and temperatures for their recovery boilers, and are paying greater attention to features that improve availability, to counter the increasing costs of energy and downtime. In refineries, sour feedstocks and new catalysts are forcing engineers to rethink CO-boiler design concepts.

  10. Conjecture Mapping to Optimize the Educational Design Research Process

    ERIC Educational Resources Information Center

    Wozniak, Helen

    2015-01-01

    While educational design research promotes closer links between practice and theory, reporting its outcomes from iterations across multiple contexts is often constrained by the volumes of data generated, and the context bound nature of the research outcomes. Reports tend to focus on a single iteration of implementation without further research to…

  11. Process Design Manual: Wastewater Treatment Facilities for Sewered Small Communities.

    ERIC Educational Resources Information Center

    Leffel, R. E.; And Others

    This manual attempts to describe new treatment methods, and discuss the application of new techniques for more effectively removing a broad spectrum of contaminants from wastewater. Topics covered include: fundamental design considerations, flow equalization, headworks components, clarification of raw wastewater, activated sludge, package plants,…

  12. NPS CubeSat Launcher Design, Process and Requirements

    DTIC Science & Technology

    2009-06-01

    Stereolithography (SLA) • Laminated Object Manufacturing • Electron Beam Melting. Each of these technologies varies dramatically by materials used, price... Like many engineering projects, there are often re-designs or mistakes requiring rework, so it was impractical to plan on building only ten P-POD

  13. Preparing Instructional Designers for Game-Based Learning: Part III. Game Design as a Collaborative Process

    ERIC Educational Resources Information Center

    Hirumi, Atsusi; Appelman, Bob; Rieber, Lloyd; Van Eck, Richard

    2010-01-01

    In this three part series, four professors who teach graduate level courses on the design of instructional video games discuss their perspectives on preparing instructional designers to optimize game-based learning. Part I set the context for the series and one of four panelists discussed what he believes instructional designers should know about…

  14. The Process of Designing for Learning: Understanding University Teachers' Design Work

    ERIC Educational Resources Information Center

    Bennett, Sue; Agostinho, Shirley; Lockyer, Lori

    2017-01-01

    Interest in how to support the design work of university teachers has led to research and development initiatives that include technology-based design-support tools, online repositories, and technical specifications. Despite these initiatives, remarkably little is known about the design work that university teachers actually do. This paper…

  15. Model-based design of peptide chromatographic purification processes.

    PubMed

    Gétaz, David; Stroehlein, Guido; Butté, Alessandro; Morbidelli, Massimo

    2013-04-05

    In this work we present a general procedure for the model-based optimization of a polypeptide crude mixture purification process through its application to a case of industrial relevance. This is done to show how beneficial modeling can be for optimizing complex chromatographic processes in the industrial environment. The target peptide elution profile was modeled with a two-site adsorption equilibrium isotherm exhibiting two inflection points. The variation of the isotherm parameters with the modifier concentration was accounted for. The adsorption isotherm parameters of the target peptide were obtained by the inverse method. The elution of the impurities was approximated by lumping them into pseudo-impurities and by regressing their adsorption isotherm parameters directly as a function of the corresponding parameters of the target peptide. After model calibration and validation by comparison with suitable experimental data, Pareto optimizations of the process were carried out so as to select the optimal batch process.

  16. Transparent process migration: Design alternatives and the Sprite implementation

    NASA Technical Reports Server (NTRS)

    Douglis, Fred; Ousterhout, John

    1991-01-01

    The Sprite operating system allows executing processes to be moved between hosts at any time. We use this process migration mechanism to offload work onto idle machines, and also to evict migrated processes when idle workstations are reclaimed by their owners. Sprite's migration mechanism provides a high degree of transparency both for migrated processes and for users. Idle machines are identified, and eviction is invoked, automatically by daemon processes. On Sprite it takes up to a few hundred milliseconds on SPARCstation 1 workstations to perform a remote exec, while evictions typically occur in a few seconds. The pmake program uses remote invocation to invoke tasks concurrently. Compilations commonly obtain speedup factors in the range of three to six; they are limited primarily by contention for centralized resources such as file servers. CPU-bound tasks such as simulations can make more effective use of idle hosts, obtaining as much as eight-fold speedup over a period of hours. Process migration has been in regular service for over two years.

  17. New Materials Design Through Friction Stir Processing Techniques

    SciTech Connect

    Buffa, G.; Fratini, L.; Shivpuri, R.

    2007-04-07

    Friction Stir Welding (FSW) has attracted considerable interest in the scientific community and, in recent years, also in the industrial environment, due to the advantages of this solid state welding process with respect to the classic welding processes. The complex material flow occurring during the process plays a fundamental role, since it determines dramatic changes in the material microstructure of the so-called weld nugget, which affects the effectiveness of the joints. What is more, Friction Stir Processing (FSP) is mainly being considered for producing high-strain-rate-superplastic (HSRS) microstructures in commercial aluminum alloys. The aim of the present research is the development of a locally composite material through the Friction Stir Processing (FSP) of two AA7075-T6 blanks and a different material insert. The results of a preliminary experimental campaign, carried out by varying the additional material placed at the sheet interface under different conditions, are presented. Micro- and macro-observation of the joints obtained made it possible to investigate the effects of the process on the overall joint performance.

  18. SmartWeld/SmartProcess - intelligent model based system for the design and validation of welding processes

    SciTech Connect

    Mitchner, J.

    1996-04-01

    Diagrams are presented on an intelligent model-based system for the design and validation of welding processes. Key capabilities identified include 'right the first time' manufacturing, continuous improvement, and on-line quality assurance.

  19. Zero-Release Mixed Waste Process Facility Design and Testing

    SciTech Connect

    Richard D. Boardman; John A. Deldebbio; Robert J. Kirkham; Martin K. Clemens; Robert Geosits; Ping Wan

    2004-02-01

    A zero-release offgas cleaning system for mixed-waste thermal treatment processes has been evaluated through experimental scoping tests and process modeling. The principles can possibly be adapted to a fluidized-bed calcination or steam reforming process, a waste melter, a rotary-kiln process, and possibly other waste treatment thermal processes. The basic concept of a zero-release offgas cleaning system is to recycle the bulk of the offgas stream to the thermal treatment process. A slip stream is taken off the offgas recycle to separate and purge benign constituents that may build up in the gas, such as water vapor, argon, nitrogen, and CO2. Contaminants are separated from the slip stream and returned to the thermal unit for eventual destruction or incorporation into the waste immobilization media. In the current study, a standard packed-bed scrubber, followed by gas separation membranes, is proposed for removal of contaminants from the offgas recycle slipstream. The scrub solution is continuously regenerated by cooling and precipitating sulfate, nitrate, and other salts that reach a solubility limit in the scrub solution. Mercury is also separated by the scrubber. A miscible chemical oxidizing agent was shown to effectively oxidize mercury and also NO, thus increasing their removal efficiency. The current study indicates that the proposed process is a viable option for reducing offgas emissions. Consideration of the proposed closed-system offgas cleaning loop is warranted when emission limits are stringent, or when a reduction in the total gas emission volume is desired. Although the current closed loop appears to be technically feasible, economic considerations must also be evaluated on a case-by-case basis.
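
    The recycle-with-slipstream concept can be summarized by a schematic steady-state balance (a generic sketch, not taken from the report): for each benign constituent i that enters or is generated in the loop at a net rate G_i, accumulation is avoided only if the slip stream removes it at the same rate,

      $$ F_{\mathrm{slip}}\,y_i \;=\; G_i \qquad\Longrightarrow\qquad F_{\mathrm{slip}} \;\ge\; \max_i \frac{G_i}{y_i^{\mathrm{max}}}, $$

    where y_i is the fraction of constituent i in the slip stream and y_i^max the highest concentration the downstream scrubber/membrane train can tolerate; this balance is what sets the minimum purge and hence the minimum emissions volume.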

  20. Design and development of a layer-based additive manufacturing process for the realization of metal parts of designed mesostructure

    NASA Astrophysics Data System (ADS)

    Williams, Christopher Bryant

    Low-density cellular materials, metallic bodies with gaseous voids, are a unique class of materials that are characterized by their high strength, low mass, good energy absorption characteristics, and good thermal and acoustic insulation properties. In an effort to take advantage of this entire suite of positive mechanical traits, designers are tailoring the cellular mesostructure for multiple design objectives. Unfortunately, existing cellular material manufacturing technologies limit the design space, as they are restricted to certain part mesostructures, material types, and macrostructures. The opportunity to improve the design of existing products, and the ability to reap the benefits of cellular materials in new applications, are the driving forces behind this research. As such, the primary research goal of this work is to design, embody, and analyze a manufacturing process that provides a designer the ability to specify the material type, material composition, void morphology, and mesostructure topology for any conceivable part geometry. The accomplishment of this goal is achieved in three phases of research: (1) Design---Following a systematic design process and a rigorous selection exercise, a layer-based additive manufacturing process is designed that is capable of meeting the unique requirements of fabricating cellular material geometry. Specifically, metal parts of designed mesostructure are fabricated via three-dimensional printing of metal oxide ceramic powder followed by post-processing in a reducing atmosphere. (2) Embodiment---The primary research hypothesis is verified through the use of the designed manufacturing process chain to successfully realize metal parts of designed mesostructure. (3) Modeling & Evaluation---The designed manufacturing process is modeled in this final research phase so as to increase understanding of experimental results and to establish a foundation for future analytical modeling research. In addition to an analysis of

  1. Photovoltaic/diesel hybrid systems: The design process

    NASA Astrophysics Data System (ADS)

    Jones, G. J.; Chapman, R. N.

    A photovoltaic/storage system by itself may be uneconomical for stand-alone applications with large energy demands. However, by combining the PV system with a back-up energy source, such as a diesel, gasoline, or propane/thermoelectric generator, system economics can be improved. Such PV/fossil hybrid systems are being used, but their design has required detailed modeling to determine the optimal mix of photovoltaics and back-up energy. Recent data on diesel field reliability and a new design technique for stand-alone systems have overcome this problem. The approach provides the means for sizing the photovoltaic system to obtain a near optimal hybrid system, with about a 90% savings in back-up fuel costs. System economics are determined by comparing PV capital cost to the present value of the displaced diesel operation and maintenance costs.
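
    The economic comparison described above, PV capital cost versus the present value of displaced diesel operation and maintenance costs, can be sketched as follows. The discount rate, lifetime, and cost figures are assumptions for illustration, not values from the paper; only the ~90% fuel-savings figure is quoted from the abstract.

      # Minimal sketch (hypothetical numbers): PV capital cost vs. present value
      # of displaced diesel O&M costs over the system lifetime.
      def present_value(annual_saving, discount_rate, years):
          """Present value of a constant annual saving over the system lifetime."""
          return sum(annual_saving / (1.0 + discount_rate) ** t
                     for t in range(1, years + 1))

      pv_capital_cost = 60_000.0          # $ installed PV + storage (assumed)
      diesel_om_cost = 14_000.0           # $/yr fuel + maintenance without PV (assumed)
      displaced_fraction = 0.90           # ~90% back-up fuel savings cited above
      savings = present_value(displaced_fraction * diesel_om_cost, 0.08, 20)
      print(f"PV of displaced diesel O&M: ${savings:,.0f} vs capital ${pv_capital_cost:,.0f}")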

  2. Cost model relationships between textile manufacturing processes and design details for transport fuselage elements

    NASA Technical Reports Server (NTRS)

    Metschan, Stephen L.; Wilden, Kurtis S.; Sharpless, Garrett C.; Andelman, Rich M.

    1993-01-01

    Textile manufacturing processes offer potential cost and weight advantages over traditional composite materials and processes for transport fuselage elements. In the current study, design cost modeling relationships between textile processes and element design details were developed. Such relationships are expected to help future aircraft designers to make timely decisions on the effect of design details and overall configurations on textile fabrication costs. The fundamental advantage of a design cost model is to ensure that the element design is cost-effective for the intended process. Trade studies on the effects of processing parameters also help to optimize the manufacturing steps for a particular structural element. Two methods of analyzing design detail/process cost relationships developed for the design cost model were pursued in the current study. The first makes use of existing databases and alternative cost modeling methods (e.g., detailed estimating). The second compares design cost model predictions with data collected during the fabrication of seven-foot circumferential frames for ATCAS crown test panels. The process used in this case involves 2D dry braiding and resin transfer molding of curved 'J' cross-section frame members having design details characteristic of the baseline ATCAS crown design.
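
    A toy parametric sketch of the kind of design-detail/process cost relationship described above (braided, resin-transfer-molded frame members) is shown below. The functional form and every coefficient are hypothetical, intended only to show how a designer could run quick trade studies; this is not the ATCAS cost model.

      # Toy parametric cost relationship (all coefficients hypothetical).
      def frame_cost(length_m: float, n_plies: int, curvature: float,
                     braid_rate_m_per_hr: float = 1.5, labor_rate: float = 80.0,
                     rtm_setup_cost: float = 450.0) -> float:
          """Rough fabrication cost estimate for one curved 'J' frame member."""
          braid_hours = n_plies * length_m / braid_rate_m_per_hr
          handling_penalty = 1.0 + 0.3 * curvature      # tighter curvature -> more touch labor
          return rtm_setup_cost + labor_rate * braid_hours * handling_penalty

      # Trade study: sensitivity of cost to ply count for a 2.1 m frame.
      for plies in (4, 6, 8):
          print(plies, round(frame_cost(2.1, plies, curvature=0.4)))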

  3. Designing optimized industrial process analysers for closed loop control

    PubMed Central

    Grevesmuehl, Bernard; Kradjel, Cynthia; Kellner, Hanno

    1991-01-01

    Manufacturers are now looking closely at ways of optimizing ‘quality’ and increasing process efficiency while reducing manufacturing costs. Near infra-red (NIR) technology is a popular solution to this challenge: it provides manufacturers with rapid and reliable in-process analysis and thousands of systems have already been installed in the food, chemical, pharmaceutical and agricultural markets. For over 10 years, NIR has been successfully applied to at-line process analysis. Rugged and easy-to-operate filter analysers are traditionally located in the control room–process operators can then ‘grab samples’ and obtain results in less than a minute. There are many practical advantages to using at-line filter systems. Products from many lines can be run on one system, and, since there is no direct process interface, installation, operation and maintenance are quite simple. Many manufacturers, however, are now striving to achieve on-line closed loop control, in these cases the benefit of obtaining continuous measurement is well worth the effort required to automate the analysis. PMID:18924898

  4. Integrating the Affective Domain into the Instructional Design Process

    DTIC Science & Technology

    1992-03-01

    domains. Yet, instructional design models and practices have focused primarily on the acquisition of knowledge and psychomotor skills. Concern for the...the skills involved in the reactive and interactive domains are as amenable to the general principles of instruction as are cognitive and psychomotor skills. He also sees a parallel between the automation of affective domain skills (reflexive, conditioned activity versus behavior resulting from a

  5. Nencki Affective Word List (NAWL): the cultural adaptation of the Berlin Affective Word List-Reloaded (BAWL-R) for Polish.

    PubMed

    Riegel, Monika; Wierzba, Małgorzata; Wypych, Marek; Żurawski, Łukasz; Jednoróg, Katarzyna; Grabowska, Anna; Marchewka, Artur

    2015-12-01

    In the present article, we introduce the Nencki Affective Word List (NAWL), created in order to provide researchers with a database of 2,902 Polish words, including nouns, verbs, and adjectives, with ratings of emotional valence, arousal, and imageability. Measures of several objective psycholinguistic features of the words (frequency, grammatical class, and number of letters) are also controlled. The database is a Polish adaptation of the Berlin Affective Word List-Reloaded (BAWL-R; Võ et al., Behavior Research Methods 41:534-538, 2009), commonly used to investigate the affective properties of German words. Affective normative ratings were collected from 266 Polish participants (136 women and 130 men). The emotional ratings and psycholinguistic indexes provided by NAWL can be used by researchers to better control the verbal materials they apply and to adjust them to specific experimental questions or issues of interest. The NAWL is freely accessible to the scientific community for noncommercial use as supplementary material to this article.

  6. Microstructure Sensitive Design and Processing in Solid Oxide Electrolyzer Cell

    SciTech Connect

    Dr. Hamid Garmestani; Dr. Stephen Herring

    2009-06-12

    The aim of this study was to develop an inexpensive manufacturing process for deposition of functionally graded thin films of LSM oxides with porosity-graded microstructures for use as IT-SOFC cathodes. The spray pyrolysis method was chosen as a low-temperature processing technique for deposition of porous LSM films onto dense YSZ substrates. The effort was directed toward the optimization of the processing conditions for deposition of high quality LSM films with a variety of morphologies, ranging from dense to porous microstructures. Results of optimization studies of the spray parameters revealed that the substrate surface temperature is the most critical parameter influencing the roughness, morphology, porosity, cracking and crystallinity of the film.

  7. The amount of ergonomics and user involvement in 151 design processes.

    PubMed

    Kok, Barbara N E; Slegers, Karin; Vink, Peter

    2012-01-01

    Ergonomics, usability and user-centered design are terms that are well known among designers. Yet, products often seem to fail to meet users' needs, resulting in a gap between expected and experienced usability. To understand the possible causes of this gap, the actions taken by designers during the design process are studied in this paper. This can show whether and how certain actions influence the user-friendliness of the designed products. The aim of this research was to understand whether ergonomic principles and methods are included in the design process, whether users are involved in this process, and whether the experience of the designer (in ergonomics/user involvement) has an effect on the usability of the end product. In this study the design processes of 151 tangible products created by design students were analyzed. In 75% of the cases some ergonomic principles were applied, whereas users were involved in only one third of the design cases. Hardly any correlation was found between the designers' experience with ergonomic principles and the way they applied them, and no correlation was found between the designers' experience with user involvement and the users' involvement in the design process.

  8. Seventeen Projects Carried out by Students Designing for and with Disabled Children: Identifying Designers' Difficulties during the Whole Design Process

    ERIC Educational Resources Information Center

    Magnier, Cecile; Thomann, Guillaume; Villeneuve, Francois

    2012-01-01

    This article aims to identify the difficulties that may arise when designing assistive devices for disabled children. Seventeen design projects involving disabled children, engineering students, and special schools were analysed. A content analysis of the design reports was performed. For this purpose, a coding scheme was built based on a review…

  9. Guidelines for Designing and Managing a Planning Process.

    ERIC Educational Resources Information Center

    Miller, John Edgar; Thompson, Hugh L.

    Management guidelines for colleges and universities that include policies, goals, and procedures are presented. Guidelines include the following: (1) trustees must have final authority for the institution; (2) the planning process should recognize that policy and direction generally flow down (the organizational structure) while process…

  10. Waste Feed Delivery Purex Process Connector Design Pressure

    SciTech Connect

    BRACKENBURY, P.J.

    2000-04-11

    The pressure-retaining capability of the PUREX process connector is documented. A context is provided for the connector's current use within existing Projects. Previous testing and structural analysis campaigns are outlined. The deficient condition of the current inventory of connectors and assembly wrenches is highlighted. A brief history of the connector is provided. A bibliography of pertinent references is included.

  11. EVALUATING AND DESIGNING CHEMICAL PROCESSES FOR ENVIRONMENTAL SUSTAINABILITY

    EPA Science Inventory

    Chemicals and chemical processes are at the heart of most environmental problems. This isn't surprising, since chemicals make up all of the products we use in our lives. The common use of chemicals makes them of high interest for systems analysis, particularly because of environ...

  12. Process-Oriented Design: Conversational Interfaces for Global Accessibility

    ERIC Educational Resources Information Center

    Robertson, Amanda

    2005-01-01

    The ability of the Internet to serve as a bridge to cultural understanding relies in great part on issues related to accessibility. My focus in this article is on accessibility as it relates to providing individuals with the full capabilities of the Internet to facilitate a process of association and learning, which can alleviate many issues that…

  13. INCORPORATING ENVIRONMENTAL AND ECONOMIC CONSIDERATIONS INTO PROCESS DESIGN: THE WASTE REDUCTION (WAR) ALGORITHM

    EPA Science Inventory

    A general theory known as the WAste Reduction (WAR) algorithm has been developed to describe the flow and the generation of potential environmental impact through a chemical process. This theory integrates environmental impact assessment into chemical process design. Potential en...

  14. Design for human factors (DfHF): a grounded theory for integrating human factors into production design processes.

    PubMed

    Village, Judy; Searcy, Cory; Salustri, Filipo; Patrick Neumann, W

    2015-01-01

    The 'design for human factors' grounded theory explains 'how' human factors (HF) went from a reactive, after-injury programme in safety, to being proactively integrated into each step of the production design process. In this longitudinal case study collaboration with engineers and HF Specialists in a large electronics manufacturer, qualitative data (e.g. meetings, interviews, observations and reflections) were analysed using a grounded theory methodology. The central tenet in the theory is that when HF Specialists acclimated to the engineering process, language and tools, and strategically aligned HF to the design and business goals of the organisation, HF became a means to improve business performance. This led to engineers 'pulling' HF Specialists onto their team. HF targets were adopted into engineering tools to communicate HF concerns quantitatively, drive continuous improvement, visibly demonstrate change and lead to benchmarking. Senior management held engineers accountable for HF as a key performance indicator, thus integrating HF into the production design process. Practitioner Summary: Research and practice lack explanations about how HF can be integrated early in design of production systems. This three-year case study and the theory derived demonstrate how ergonomists changed their focus to align with design and business goals to integrate HF into the design process.

  15. Design of Concurrency Controls for Transaction Processing Systems.

    DTIC Science & Technology

    1982-04-02

    ...there has been a proliferation of proposed methods, but their performance varies under different systems and... This is meant to be a basis for policy development and experimentation. The global memory manager designs support multi-version objects... transaction-processing system was developed for a... distributed multi-microprocessor. This... used transaction control in... functions required by the

  16. Process Design Report for Stover Feedstock: Lignocellulosic Biomass to Ethanol Process Design and Economics Utilizing Co-Current Dilute Acid Prehydrolysis and Enzymatic Hydrolysis for Corn Stover

    SciTech Connect

    Aden, A.; Ruth, M.; Ibsen, K.; Jechura, J.; Neeves, K.; Sheehan, J.; Wallace, B.; Montague, L.; Slayton, A.; Lukas, J.

    2002-06-01

    The U.S. Department of Energy (DOE) is promoting the development of ethanol from lignocellulosic feedstocks as an alternative to conventional petroleum-based transportation fuels. DOE funds both fundamental and applied research in this area and needs a method for predicting cost benefits of many research proposals. To that end, the National Renewable Energy Laboratory (NREL) has modeled many potential process designs and estimated the economics of each process during the last 20 years. This report is an update of the ongoing process design and economic analyses at NREL.

  17. Robust design of binary countercurrent adsorption separation processes

    SciTech Connect

    Storti, G.; Mazzotti, M.; Morbidelli, M.; Carra, S.

    1993-03-01

    The separation of a binary mixture, using a third component having intermediate adsorptivity as desorbent, in a four-section countercurrent adsorption separation unit is considered. A procedure for the optimal and robust design of the unit is developed in the frame of Equilibrium Theory, using a model where the adsorption equilibria are described through the constant-selectivity stoichiometric model, while mass-transfer resistances and axial mixing are neglected. By requiring that the unit achieve complete separation, it is possible to identify a set of implicit constraints on the operating parameters, that is, the flow rate ratios in the four sections of the unit. From these constraints, explicit bounds on the operating parameters are obtained, thus yielding a region in the operating parameter space which can be drawn a priori in terms of the adsorption equilibrium constants and the feed composition. This result provides a very convenient tool to determine both optimal and robust operating conditions. The latter issue is addressed by first analyzing the various possible sources of disturbances, as well as their effect on the separation performance. Next, the criteria for the robust design of the unit are discussed. Finally, these theoretical findings are compared with a set of experimental results obtained in a six-port simulated moving bed adsorption separation unit operated in the vapor phase.

  18. INTEC CPP-603 Basin Water Treatment System Closure: Process Design

    SciTech Connect

    Kimmitt, Raymond Rodney; Faultersack, Wendell Gale; Foster, Jonathan Kay; Berry, Stephen Michael

    2002-09-01

    This document describes the engineering activities that have been completed in support of the closure plan for the Idaho Nuclear Technology and Engineering Center (INTEC) CPP-603 Basin Water Treatment System. This effort includes detailed assessments of methods and equipment for performing work in four areas: 1. A cold (nonradioactive) mockup system for testing equipment and procedures for vessel cleanout and vessel demolition. 2. Cleanout of process vessels to meet standards identified in the closure plan. 3. Dismantlement and removal of vessels, should it not be possible to clean them to required standards in the closure plan. 4. Cleanout or removal of pipelines and pumps associated with the CPP-603 basin water treatment system. Cleanout standards for the pipes will be the same as those used for the process vessels.

  19. An interfaces approach to TES ground data system processing design with the Science Investigator-led Processing System (SIPS)

    NASA Technical Reports Server (NTRS)

    Kurian, R.; Grifin, A.

    2002-01-01

    Developing production-quality software to process the large volumes of scientific data is the responsibility of the TES Ground Data System, which is being developed at the Jet Propulsion Laboratory together with support contractor Raytheon/ITSS. The large data volume and processing requirements of the TES pose significant challenges to the design.

  20. Design, processing and testing of LSI arrays: Hybrid microelectronics task

    NASA Technical Reports Server (NTRS)

    Himmel, R. P.; Stuhlbarg, S. M.; Ravetti, R. G.; Zulueta, P. J.

    1979-01-01

    Mathematical cost factors were generated for both hybrid microcircuit and printed wiring board packaging methods. A mathematical cost model was created for analysis of microcircuit fabrication costs. The costing factors were refined and reduced to formulae for computerization. Efficient methods were investigated for low-cost packaging of LSI devices as a function of density and reliability. Technical problem areas such as wafer bumping, inner/outer lead bonding, testing on tape, and tape processing were investigated.

  1. Process options for nominal 2-K helium refrigeration system designs

    NASA Astrophysics Data System (ADS)

    Knudsen, Peter; Ganni, Venkatarao

    2012-06-01

    Nominal 2-K helium refrigeration systems are frequently used for superconducting radio frequency and magnet string technologies used in accelerators. This paper examines the trade-offs and approximate performance of four basic types of processes used for the refrigeration of these technologies: direct vacuum pumping on a helium bath, direct vacuum pumping using full or partial refrigeration recovery, cold compression, and hybrid compression (i.e., a blend of cold and warm sub-atmospheric compression).

  2. Process Options for Nominal 2-K Helium Refrigeration System Designs

    SciTech Connect

    Peter Knudsen, Venkatarao Ganni

    2012-07-01

    Nominal 2-K helium refrigeration systems are frequently used for superconducting radio frequency and magnet string technologies used in accelerators. This paper examines the trade-offs and approximate performance of four basic types of processes used for the refrigeration of these technologies: direct vacuum pumping on a helium bath, direct vacuum pumping using full or partial refrigeration recovery, cold compression, and hybrid compression (i.e., a blend of cold and warm sub-atmospheric compression).

  3. Concept Study for Military Port Design Using Natural Processes.

    DTIC Science & Technology

    1982-06-15

    asymmetric pyrolysis of ammonium salts to form NH3 plus ammonium acid salts and added operations involving carbonation, and non-core process steps...drying of ammonium salts for pyrolysis in step 5. A second stage absorption may be desirable to upgrade the sulfate content by depressurizing and...strength than used in step 1. The bisulfate or bifluoride ion is used to regenerate the resin in acid form, and the acid is eluted with ammonium or sodium

  4. Geothermal injection treatment: process chemistry, field experiences, and design options

    SciTech Connect

    Kindle, C.H.; Mercer, B.W.; Elmore, R.P.; Blair, S.C.; Myers, D.A.

    1984-09-01

    The successful development of geothermal reservoirs to generate electric power will require the injection disposal of approximately 700,000 gal/h (2.6 x 10^6 L/h) of heat-depleted brine for every 50,000 kW of generating capacity. To maintain injectability, the spent brine must be compatible with the receiving formation. The factors that influence this brine/formation compatibility, and tests to quantify them, are discussed in this report. Some form of treatment will be necessary prior to injection for most situations; the process chemistry involved to avoid and/or accelerate the formation of precipitate particles is also discussed. The treatment processes, either avoidance or controlled-precipitation approaches, are described in terms of their principles and demonstrated applications in the geothermal field and, when such experience is limited, in other industrial use. Monitoring techniques for tracking particulate growth and the effects of process parameters on corrosion and well injectability are presented. Examples of brine injection, preinjection treatment, and recovery from injectivity loss are examined and related to the aspects listed above.

  5. Towards health care process description framework: an XML DTD design.

    PubMed

    Staccini, P; Joubert, M; Quaranta, J F; Aymard, S; Fieschi, D; Fieschi, M

    2001-01-01

    The development of health care and hospital information systems has to meet users' needs as well as requirements such as the tracking of all care activities and the support of quality improvement. The use of process-oriented analysis is of value to provide analysts with: (i) a systematic description of activities; (ii) the elicitation of the data needed to perform and record care tasks; (iii) the selection of relevant decision-making support. But paper-based tools are not a very suitable way to manage and share the documentation produced during this step. The purpose of this work is to propose a method to implement the results of process analysis using XML techniques (eXtensible Markup Language). It is based on the IDEF0 activity modeling language (Integration DEfinition for Function modeling). A hierarchical description of a process and its components has been defined through a flat XML file with a grammar of proper metadata tags. Perspectives of this method are discussed.
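
    As a rough illustration of serializing an IDEF0-style activity hierarchy into a flat XML structure with metadata tags, the sketch below builds such a description with Python's xml.etree.ElementTree. The tag names, the add_activity helper and the transfusion example are hypothetical, not the authors' actual DTD.

    ```python
    # Minimal sketch (hypothetical tag names): an IDEF0-style activity hierarchy
    # serialized to XML, in the spirit of the flat-file-with-metadata-tags
    # approach described above. Not the authors' actual DTD.
    import xml.etree.ElementTree as ET

    def add_activity(parent, name, inputs=(), controls=(), outputs=(), mechanisms=()):
        """Append an <activity> node with IDEF0-style ICOM children."""
        act = ET.SubElement(parent, "activity", {"name": name})
        for tag, items in (("input", inputs), ("control", controls),
                           ("output", outputs), ("mechanism", mechanisms)):
            for item in items:
                ET.SubElement(act, tag).text = item
        return act

    root = ET.Element("process", {"name": "blood-product transfusion"})
    prescribe = add_activity(root, "prescribe transfusion",
                             inputs=["patient record"],
                             controls=["transfusion guideline"],
                             outputs=["prescription"],
                             mechanisms=["physician"])
    add_activity(prescribe, "verify compatibility",
                 inputs=["prescription", "lab results"],
                 outputs=["compatibility report"],
                 mechanisms=["lab system"])

    print(ET.tostring(root, encoding="unicode"))
    ```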

  6. Making Explicit in Design Education: Generic Elements in the Design Process

    ERIC Educational Resources Information Center

    van Dooren, Elise; Boshuizen, Els; van Merriënboer, Jeroen; Asselbergs, Thijs; van Dorst, Machiel

    2014-01-01

    In general, designing is conceived as a complex, personal, creative and open-ended skill. Performing a well-developed skill is mainly an implicit activity. In teaching, however, it is essential to make explicit. Learning a complex skill like designing is a matter of doing and becoming aware how to do it. For teachers and students therefore, it…

  7. Advanced Simulation Technology to Design Etching Process on CMOS Devices

    NASA Astrophysics Data System (ADS)

    Kuboi, Nobuyuki

    2015-09-01

    Prediction and control of plasma-induced damage is needed to mass-produce high-performance CMOS devices. In particular, side-wall (SW) etching with low damage is a key process for the next generation of MOSFETs and FinFETs. To predict and control the damage, we have developed a SiN etching simulation technique for CHxFy/Ar/O2 plasma processes using a three-dimensional (3D) voxel model. This model includes new concepts for gas transport in the pattern, detailed surface reactions on a SiN reactive layer divided into several thin slabs and a C-F polymer layer dependent on the H/N ratio, and the use of "smart voxels". We successfully predicted etching properties such as the etch rate, polymer layer thickness, and selectivity for Si, SiO2, and SiN films along with process variations, and demonstrated the time-dependent 3D damage distribution during SW etching on MOSFETs and FinFETs. We confirmed that a large amount of Si damage accumulated over time in the source/drain region during the over-etch step, in spite of the existing 15-nm SiO2 layer, and that the Si fin was directly damaged by a large amount of high-energy H during the removal step of the parasitic fin spacer, leading to Si fin damage to a depth of 14 to 18 nm. By analyzing the results of these simulations and our previous simulations, we found that it is important to carefully control the dose of high-energy H, the incident energy of H, the polymer layer thickness, and the over-etch time, considering the effects of the pattern structure, chamber-wall condition, and wafer open-area ratio. In collaboration with Masanaga Fukasawa and Tetsuya Tatsumi, Sony Corporation. We thank Mr. T. Shigetoshi and Mr. T. Kinoshita of Sony Corporation for their assistance with the experiments.

  8. Design and image processing for tactile endoscope system

    NASA Astrophysics Data System (ADS)

    Yamada, Kenji; Susuki, Yuto; Nagakura, Toshiaki; Ishihara, Ken; Ohno, Yuko

    2010-08-01

    We have developed a new type of tactile endoscope with a silicone rubber membrane. The system consists of the silicone rubber membrane, an image sensor and an illumination system. The surface of the silicone rubber membrane carries patterns fabricated by nanotechnology. These patterns are deformed when pressed against tissue such as cancerous or colon tissue. The deformed pattern is captured by the image sensor and analyzed by image processing. In this paper, the proposed architecture is presented. With several test targets, the characteristics of the prototype system are evaluated in computational simulation.

  9. The Use of Executive Control Processes in Engineering Design by Engineering Students and Professional Engineers

    ERIC Educational Resources Information Center

    Dixon, Raymond A.; Johnson, Scott D.

    2012-01-01

    A cognitive construct that is important when solving engineering design problems is executive control process, or metacognition. It is a central feature of human consciousness that enables one "to be aware of, monitor, and control mental processes." The framework for this study was conceptualized by integrating the model for creative design, which…

  10. A Comparison of Diary Method Variations for Enlightening Form Generation in the Design Process

    ERIC Educational Resources Information Center

    Babapour, Maral; Rehammar, Bjorn; Rahe, Ulrike

    2012-01-01

    This paper presents two studies in which an empirical approach was taken to understand and explain form generation and decisions taken in the design process. In particular, the activities addressing aesthetic aspects when exteriorising form ideas in the design process have been the focus of the present study. Diary methods were the starting point…

  11. A Digital Tool Set for Systematic Model Design in Process-Engineering Education

    ERIC Educational Resources Information Center

    van der Schaaf, Hylke; Tramper, Johannes; Hartog, Rob J.M.; Vermue, Marian

    2006-01-01

    One of the objectives of the process technology curriculum at Wageningen University is that students learn how to design mathematical models in the context of process engineering, using a systematic problem analysis approach. Students find it difficult to learn to design a model and little material exists to meet this learning objective. For these…

  12. A Tutorial Design Process Applied to an Introductory Materials Engineering Course

    ERIC Educational Resources Information Center

    Rosenblatt, Rebecca; Heckler, Andrew F.; Flores, Katharine

    2013-01-01

    We apply a "tutorial design process", which has proven to be successful for a number of physics topics, to design curricular materials or "tutorials" aimed at improving student understanding of important concepts in a university-level introductory materials science and engineering course. The process involves the identification…

  13. Analyzing Team Based Engineering Design Process in Computer Supported Collaborative Learning

    ERIC Educational Resources Information Center

    Lee, Dong-Kuk; Lee, Eun-Sang

    2016-01-01

    The engineering design process has been largely implemented in a collaborative project format. Recently, technological advancement has helped collaborative problem solving processes such as engineering design to have efficient implementation using computers or online technology. In this study, we investigated college students' interaction and…

  14. 30 CFR 910.764 - Process for designating areas unsuitable for surface coal mining operations.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... surface coal mining operations. 910.764 Section 910.764 Mineral Resources OFFICE OF SURFACE MINING... WITHIN EACH STATE GEORGIA § 910.764 Process for designating areas unsuitable for surface coal mining operations. Part 764 of this chapter, State Processes for Designating Areas Unsuitable for Surface...

  15. 30 CFR 903.764 - Process for designating areas unsuitable for surface coal mining operations.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... surface coal mining operations. 903.764 Section 903.764 Mineral Resources OFFICE OF SURFACE MINING... WITHIN EACH STATE ARIZONA § 903.764 Process for designating areas unsuitable for surface coal mining operations. Part 764 of this chapter, State Processes for Designating Areas Unsuitable for Surface...

  16. 30 CFR 905.764 - Process for designating areas unsuitable for surface coal mining operations.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... surface coal mining operations. 905.764 Section 905.764 Mineral Resources OFFICE OF SURFACE MINING... WITHIN EACH STATE CALIFORNIA § 905.764 Process for designating areas unsuitable for surface coal mining operations. Part 764 of this chapter, State Processes for Designating Areas Unsuitable for Surface...

  17. 30 CFR 912.764 - Process for designating areas unsuitable for surface coal mining operations.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... surface coal mining operations. 912.764 Section 912.764 Mineral Resources OFFICE OF SURFACE MINING... WITHIN EACH STATE IDAHO § 912.764 Process for designating areas unsuitable for surface coal mining operations. Part 764 of this chapter, State Processes for Designating Areas Unsuitable for Surface...

  18. Integrating ergonomics in design processes: a case study within an engineering consultancy firm.

    PubMed

    Sørensen, Lene Bjerg; Broberg, Ole

    2012-01-01

    This paper reports on a case study within an engineering consultancy firm, where engineering designers and ergonomists were working together on the design of a new hospital sterile processing plant. The objective of the paper is to gain a better understanding of the premises for integrating ergonomics into engineering design processes and of how different factors either promote or limit the integration. Based on a grounded theory approach, a model illustrating these factors is developed, and different hypotheses about how these factors promote and/or limit the integration of ergonomics into design processes are presented along with the model.

  19. Design and Construction Process of Two LEED Certified University Buildings: A Collective Case Study

    ERIC Educational Resources Information Center

    Rich, Kim

    2011-01-01

    This study was conducted at the early stages of integrating LEED into the design process, when a clearer understanding of what sustainable and ecological design involves emerged over the course of designing and building two academic buildings on a university campus. In this case study, due to utilizing a grounded theory…

  20. Design for Review - Applying Lessons Learned to Improve the FPGA Review Process

    NASA Technical Reports Server (NTRS)

    Figueiredo, Marco A.; Li, Kenneth E.

    2014-01-01

    Flight Field Programmable Gate Array (FPGA) designs are required to be independently reviewed. This paper provides recommendations to Flight FPGA designers to properly prepare their designs for review in order to facilitate the review process, and reduce the impact of the review time in the overall project schedule.

  1. A Process Model for Developing Learning Design Patterns with International Scope

    ERIC Educational Resources Information Center

    Lotz, Nicole; Law, Effie Lai-Chong; Nguyen-Ngoc, Anh Vu

    2014-01-01

    This paper investigates the process of identifying design patterns in international collaborative learning environments. In this context, design patterns are referred to as structured descriptions of best practice with pre-defined sections such as problem, solution and consequences. We pay special attention to how the scope of a design pattern is…

  2. Design considerations for solar industrial process heat systems: nontracking and line focus collector technologies

    SciTech Connect

    Kutscher, C.F.

    1981-03-01

    Items are listed that should be considered in each aspect of the design of a solar industrial process heat system. The collector technologies covered are flat-plate, evacuated tube, and line focus. Qualitative design considerations are stressed rather than specific design recommendations. (LEW)

  3. Business Process Design Method Based on Business Event Model for Enterprise Information System Integration

    NASA Astrophysics Data System (ADS)

    Kobayashi, Takashi; Komoda, Norihisa

    The traditional business process design methods, of which the use case is the most typical, provide no useful framework for designing the activity sequence. As a result, design efficiency and quality vary widely with the designer's experience and skill. In this paper, to solve this problem, we propose a model of business events and their state transitions (a basic business event model) based on the language/action perspective, a result from the cognitive science domain. In business process design using this model, event occurrence conditions are decided so that the events synchronize with one another. We also propose a design pattern for deciding the event occurrence conditions (a business event improvement strategy). Lastly, we apply the business process design method based on the business event model and the business event improvement strategy to a credit card issuing process and estimate its effect.
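
    To make the occurrence-condition idea concrete, the sketch below models a handful of hypothetical credit-card-issue events whose occurrence conditions are sets of prerequisite events; an event fires only once its condition holds, which keeps the activity sequence synchronized. The event names and conditions are illustrative assumptions, not the authors' model.

    ```python
    # Minimal sketch of a business-event state model (hypothetical event names):
    # each event has an occurrence condition expressed over the states of other
    # events, and an event occurs only when its condition holds.
    occurred = set()

    # condition: the set of events that must have occurred before this event may occur
    event_conditions = {
        "application_received": set(),
        "credit_checked": {"application_received"},
        "card_issued": {"application_received", "credit_checked"},
        "card_activated": {"card_issued"},
    }

    def runnable_events():
        return [e for e, cond in event_conditions.items()
                if e not in occurred and cond <= occurred]

    # Drive the process until no further event can occur.
    while True:
        ready = runnable_events()
        if not ready:
            break
        for event in ready:
            occurred.add(event)
            print("event occurred:", event)
    ```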

  4. Yucca Mountain Project: ESF Title I design control process review report

    SciTech Connect

    1989-01-19

    The Exploratory Shaft Facility (ESF) Title 1 Design Control Process Review was initiated in response to direction from the Office of Civilian Radioactive Waste Management (OCRWM) (letter: Kale to Gertz, NRC Concerns on Title 1 Design Control Process, November 17, 1988). The direction was to identify the existing documentation that described "... the design control process and the quality assurance that governed ..." (a) the development of the requirements documents for the ESF design, (b) the various interfaces between activities, (c) analyses and definitions leading to additional requirements in the System Design Requirements Documents, and (d) completion of Title 1 Design. This report provides historical information for general use in determining the extent of the quality assurance program in existence during the ESF Title 1 Design.

  5. Development of Conceptual Design Support Tool Founded on Formalization of Conceptual Design Process for Regenerative Life Support Systems

    NASA Astrophysics Data System (ADS)

    Miyajima, Hiroyuki; Yuhara, Naohiro

    Regenerative Life Support Systems (RLSS), which maintain human lives by recycling substances essential for living, are comprised of humans, plants, and material circulation systems. The plants supply food to the humans and regenerate water and gases by photosynthesis, while the material circulation systems physicochemically recycle and circulate the substances disposed of by humans and plants. RLSS is attracting attention as manned space activities shift from short trips to long-term stays at bases such as a space station, a lunar base, or a Mars base. The present typical space base is the International Space Station (ISS), a manned experimental base for prolonged stays, where the RLSS recycles only water and air. To accommodate prolonged and extended manned activity at future space bases, an RLSS that also implements food production and regeneration of resources using plants is expected. The configuration of an RLSS should be designed to suit its own duty, which may give rise to design requirements for RLSS configurations with no precedent. Accordingly, it is necessary to establish a conceptual design method for generalized RLSS. It is difficult, however, to systematize the design process by analyzing previous designs, because there are only a few ground-experimental facilities, namely CEEF (Closed Ecology Experiment Facilities) in Japan, BIO-Plex (Bioregenerative Planetary Life Support Systems Test Complex) in the U.S., and BIOS-3 in Russia. For these reasons, a conceptual design method that does not rely on previous design examples is required for generalized RLSS. This study formalizes a conceptual design process and develops a conceptual design support tool for RLSS based on this design process.

  6. Target design for materials processing very far from equilibrium

    NASA Astrophysics Data System (ADS)

    Barnard, John J.; Schenkel, Thomas

    2016-10-01

    Local heating and electronic excitations can trigger phase transitions or novel material states that can be stabilized by rapid quenching. An example on the few-nanometer scale is the phase transitions induced by the passage of swift heavy ions in solids, where nitrogen-vacancy color centers form locally in diamond when ions heat the diamond matrix to warm-dense-matter conditions at 0.5 eV. We optimize mask geometries for target materials such as silicon and diamond to induce phase transitions by intense ion pulses (e.g., from NDCX-II or from laser-plasma acceleration). The goal is to rapidly heat a solid target volumetrically and to trigger a phase transition or local lattice reconstruction followed by rapid cooling. The stabilized phase can then be studied ex situ. We performed HYDRA simulations that calculate peak temperatures for a series of excitation conditions and cooling rates of crystal targets with micro-structured masks. A simple analytical model, which includes ion heating and radial, diffusive cooling, was developed and agrees closely with the HYDRA simulations. The model gives scaling laws that can guide the design of targets over a wide range of parameters, including those for NDCX-II and the proposed BELLA-i. This work was performed under the auspices of the U.S. DOE under contracts DE-AC52-07NA27344 (LLNL) and DE-AC02-05CH11231 (LBNL) and was supported by the US DOE Office of Science, Fusion Energy Sciences. LLNL-ABS-697271.

  7. Design criteria for Waste Coolant Processing Facility and preliminary proposal 722 for Waste Coolant Processing Facility

    SciTech Connect

    Not Available

    1991-09-27

    This document contains the design criteria to be used by the architect-engineer (A-E) in the performance of Titles 1 and 2 design for the construction of a facility to treat the biodegradable, water soluble, waste machine coolant generated at the Y-12 plant. The purpose of this facility is to reduce the organic loading of coolants prior to final treatment at the proposed West Tank Farm Treatment Facility.

  8. Transition to an individual-room NICU design: process and outcome measures.

    PubMed

    Milford, Cheryl A; Zapalo, Barbara J; Davis, Glenda

    2008-01-01

    Redesign of a neonatal intensive care unit is a major budget undertaking, demanding accountability for its equipment and feasibility of design. It must be philosophically based and driven by research supporting best practice. The NICU at the Magee-Womens Hospital of the University of Pittsburgh Medical Center, a Level III, 74-bed unit, has made the change from a ward design to an individual-room design suitable for family-centered, developmentally supportive care. This article presents the design process as it occurred. Unique to this process are the involvement of NICU-graduate families and the use of transition teams. Guidelines and recommendations are offered to others interested in designing and practicing in an individual-room NICU. Outcome data demonstrate staff adjustment to the new design and practice model. A comparison of this NICU design is made with the Recommended Standards for Newborn ICU Design.

  9. Design and processing of all-oxide composites

    SciTech Connect

    Lundberg, R.; Eckerbom, L.

    1995-12-01

    All-oxide ceramic composites were studied as a material with potential for long-lifetime applications at temperatures in the 1400-1600°C range in combustion environments. The properties of available polycrystalline and single-crystal oxide fibres were summarised. The literature on stable weak interfaces in all-oxide composites was reviewed. Composites with single-crystal fibres, a polycrystalline matrix of the same material as the fibres, and a compatible, high-temperature-stable weak oxide interphase were suggested to be the most promising approach. Processing of all-oxide composites was performed. ZrO2-coated sapphire fibres in reaction-bonded alumina and in hot-pressed alumina showed crack deflection and fibre pull-out. In reaction-bonded mullite, crack deflection and pull-out were observed even for uncoated sapphire fibres; this was attributed to thermal expansion mismatch. A recently started European project aiming at the development, scale-up and property evaluation of all-oxide composites is briefly outlined.

  10. An integral design strategy combining optical system and image processing to obtain high resolution images

    NASA Astrophysics Data System (ADS)

    Wang, Jiaoyang; Wang, Lin; Yang, Ying; Gong, Rui; Shao, Xiaopeng; Liang, Chao; Xu, Jun

    2016-05-01

    In this paper, an integral design that combines the optical system with image processing is introduced to obtain high-resolution images, and its performance is evaluated and demonstrated. Traditional imaging methods often separate the two technical procedures of optical system design and image processing, preventing efficient cooperation between the optical and digital elements. Therefore, an innovative approach is presented that combines the merit function used during optical design with the constraint conditions of the image processing algorithms. Specifically, an optical imaging system with low resolution is designed to collect the image signals that are indispensable for image processing, while the ultimate goal is to obtain high-resolution images from the final system. In order to optimize the global performance, the optimization function of the ZEMAX software is utilized and the number of optimization cycles is controlled. A Wiener filter algorithm is then adopted to process the simulated images, and mean squared error (MSE) is taken as the evaluation criterion. The results show that, although the optical figures of merit for the optical imaging system are not the best, it can provide image signals that are more suitable for image processing. In conclusion, the integral design of the optical system and image processing can find the overall optimal solution that is missed by the traditional design methods. Especially when designing a complex optical system, this integral design strategy has obvious advantages in simplifying the structure and reducing cost, as well as in obtaining high-resolution images, which gives it a promising perspective for industrial application.
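
    As a rough, self-contained illustration of the restoration-plus-MSE evaluation step described above (a sketch under assumed blur and noise parameters, not the authors' actual optical model), the following snippet blurs a 1-D signal, adds noise, restores it with a frequency-domain Wiener filter and reports the MSE.

    ```python
    # Minimal sketch: restore a blurred, noisy 1-D signal with a frequency-domain
    # Wiener filter and score the result with mean squared error (MSE).
    import numpy as np

    rng = np.random.default_rng(0)
    n = 256
    x = np.zeros(n); x[60:90] = 1.0; x[150:160] = 0.5          # "true" scene
    psf = np.exp(-0.5 * ((np.arange(n) - n // 2) / 3.0) ** 2)  # Gaussian blur kernel
    psf /= psf.sum()

    H = np.fft.fft(np.fft.ifftshift(psf))                      # transfer function of the blur
    y = np.real(np.fft.ifft(np.fft.fft(x) * H))                # blurred image
    y += rng.normal(scale=0.01, size=n)                        # sensor noise

    K = 0.01                                                   # noise-to-signal ratio (tuning constant)
    wiener = np.conj(H) / (np.abs(H) ** 2 + K)
    x_hat = np.real(np.fft.ifft(np.fft.fft(y) * wiener))

    mse = np.mean((x_hat - x) ** 2)
    print(f"MSE of restored signal: {mse:.5f}")
    ```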

  11. Some trends and proposals for the inclusion of sustainability in the design of manufacturing process

    NASA Astrophysics Data System (ADS)

    Fradinho, J.; Nedelcu, D.; Gabriel-Santos, A.; Gonçalves-Coelho, A.; Mourão, A.

    2015-11-01

    Production processes are designed to meet requirements of three different natures: quality, cost and time. Environmental concerns have expanded the field of conceptual design through the introduction of sustainability requirements, driven by growing societal awareness of environmental issues. One could say that the major concern has been the definition of metrics or indices for sustainability. However, those metrics usually lack consistency. More than ever, there is a need for an all-inclusive view at every level of decision-making, from the establishment of the design requirements to the implementation of the solutions. According to the Axiomatic Design Theory, sustainable designs are usually coupled designs, which should be avoided. This raises a concern related to the very nature of sustainability: the cross effects between the actions that should be considered in the attempt to decouple the design solutions. In terms of production, one should clarify the characterization of the sustainability of production systems. The objectives of this paper are: i) to analyse some trends for approaching the sustainability of production processes; ii) to define sustainability in terms of requirements for the design of production processes; iii) to make some proposals based on the Axiomatic Design Theory, in order to establish the principles with which the guidelines for designing production processes must comply; and iv) to discuss how to introduce this matter in teaching both manufacturing technology and the design of production systems.

  12. Analysis of Design-Build Processes, Best Practices, and Applications to the Department of Defense

    DTIC Science & Technology

    2006-06-01

    Order requirements such as ATFP, sustainable design, and LEED; and that regional market conditions be utilized to determine minimum quality... outlining a new design-build procurement process titled "Market Style Design-Build". One of the outcomes proposed by this working group was to... responsibility of the acquisition team during acquisition planning and market research. The contracting team should review the latest completed design

  13. HAL/SM system functional design specification. [systems analysis and design analysis of central processing units

    NASA Technical Reports Server (NTRS)

    Ross, C.; Williams, G. P. W., Jr.

    1975-01-01

    The functional design of a preprocessor and subsystems is described. A structure chart and a data flow diagram are included for each subsystem, followed immediately by a group of intermodule interface definitions (one definition per module) for that subsystem. Each intermodule interface definition consists of the identification of the module, the function the module is to perform, the identification and definition of parameter interfaces to the module, and any design notes associated with the module. Compilers and computer libraries are also described.

  14. Elementary teachers' mental models of engineering design processes: A comparison of two communities of practice

    NASA Astrophysics Data System (ADS)

    McMahon, Ann P.

    Educating K-12 students in the processes of design engineering is gaining popularity in public schools. Several states have adopted standards for engineering design despite the fact that no common agreement exists on what should be included in the K-12 engineering design process. Furthermore, little pre-service and in-service professional development exists that will prepare teachers to teach a design process that is fundamentally different from the science teaching process found in typical public schools. This study provides a glimpse into what teachers think happens in engineering design compared to articulated best practices in engineering design. Wenger's communities of practice work and van Dijk's multidisciplinary theory of mental models provide the theoretical bases for comparing the mental models of two groups of elementary teachers (one group that teaches engineering and one that does not) to the mental models of design engineers (including this engineer/researcher/educator and professionals described elsewhere). The elementary school teachers and this engineer/researcher/educator observed the design engineering process enacted by professionals, then answered questions designed to elicit their mental models of the process they saw in terms of how they would teach it to elementary students. The key finding is this: Both groups of teachers embedded the cognitive steps of the design process into the matrix of the social and emotional roles and skills of students. Conversely, the engineers embedded the social and emotional aspects of the design process into the matrix of the cognitive steps of the design process. In other words, teachers' mental models show that they perceive that students' social and emotional communicative roles and skills in the classroom drive their cognitive understandings of the engineering process, while the mental models of this engineer/researcher/educator and the engineers in the video show that we perceive that cognitive

  15. Design Process of Flight Vehicle Structures for a Common Bulkhead and an MPCV Spacecraft Adapter

    NASA Technical Reports Server (NTRS)

    Aggarwal, Pravin; Hull, Patrick V.

    2015-01-01

    Designing and manufacturing space flight vehicle structures is a skill set that has grown considerably at NASA during the last several years. Beginning with the Ares program and continuing with the Space Launch System (SLS), in-house designs were produced for both the Upper Stage and the SLS Multipurpose Crew Vehicle (MPCV) spacecraft adapter. Specifically, critical design review (CDR) level analysis and flight production drawings were produced for the above-mentioned hardware. The experience of this in-house design work led to increased manufacturing infrastructure for both Marshall Space Flight Center (MSFC) and the Michoud Assembly Facility (MAF), improved skill sets in both analysis and design, and hands-on experience in building and testing (MSA) full-scale hardware. The hardware design and development processes, from initiation to CDR and finally flight, resulted in many challenges and experiences that produced valuable lessons. This paper builds on these recent NASA experiences in designing and fabricating flight hardware and examines the design/development processes used, as well as the challenges and lessons learned, from the initial design, loads estimation and mass constraints, to structural optimization and affordability, to release of production drawings and hardware manufacturing. While there are many documented design processes that a design engineer can follow, these unique experiences can offer insight into designing hardware in current program environments and present solutions to many of the challenges experienced by the engineering team.

  16. Materials and Process Design for High-Temperature Carburizing: Integrating Processing and Performance

    SciTech Connect

    D. Apelian

    2007-07-23

    The objective of the project is to develop an integrated process for fast, high-temperature carburizing. The new process results in an order of magnitude reduction in cycle time compared to conventional carburizing and represents significant energy savings in addition to a corresponding reduction of scrap associated with distortion free carburizing steels.

  17. Quality by design approach of a pharmaceutical gel manufacturing process, part 1: determination of the design space.

    PubMed

    Rosas, Juan G; Blanco, Marcel; González, Josep M; Alcalá, Manel

    2011-10-01

    This work was conducted in the framework of a quality by design project involving the production of a pharmaceutical gel. Preliminary work included the identification of the quality target product profiles (QTPPs) from historical values for previously manufactured batches, as well as the critical quality attributes for the process (viscosity and pH), which were used to construct a D-optimal experimental design. The experimental design comprised 13 gel batches, three of which were replicates at the domain center intended to assess the reproducibility of the target process. The viscosity and pH models established exhibited very high linearity and negligible lack of fit (LOF). Thus, R(2) was 0.996 for viscosity and 0.975 for pH, and LOF was 0.53 for the former parameter and 0.84 for the latter. The process proved reproducible at the domain center. Water content and temperature were the most influential factors for viscosity, and water content and acid neutralized fraction were the most influential factors for pH. A desirability function was used to find the best compromise to optimize the QTPPs. The body of information was used to identify and define the design space for the process. A model capable of combining the two response variables into a single one was constructed to facilitate monitoring of the process.
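
    As a generic illustration of how a desirability function can combine two responses into a single figure of merit, the sketch below uses assumed, simplified linear response models in two coded factors and assumed acceptance limits (not the fitted models or limits from the gel study), scans the coded factor space and reports the settings with the highest overall desirability.

    ```python
    # Minimal sketch of a desirability-function trade-off (illustrative models and
    # limits): each response is mapped to [0, 1] and the geometric mean is maximized.
    import numpy as np

    def desirability_target(y, low, target, high):
        """Derringer-Suich 'target is best' desirability."""
        d = np.where(y < target, (y - low) / (target - low),
                                 (high - y) / (high - target))
        return np.clip(d, 0.0, 1.0)

    # Assumed linear response models in two coded factors x1, x2 (e.g. water content, temperature).
    def viscosity(x1, x2): return 12000 - 1500 * x1 - 800 * x2
    def pH(x1, x2):        return 5.5 - 0.3 * x1 + 0.05 * x2

    x1, x2 = np.meshgrid(np.linspace(-1, 1, 41), np.linspace(-1, 1, 41))
    d_visc = desirability_target(viscosity(x1, x2), 9000, 12000, 15000)
    d_pH   = desirability_target(pH(x1, x2), 5.0, 5.5, 6.0)
    D = np.sqrt(d_visc * d_pH)                      # overall desirability

    i, j = np.unravel_index(np.argmax(D), D.shape)
    print(f"best coded settings: x1={x1[i, j]:+.2f}, x2={x2[i, j]:+.2f}, D={D[i, j]:.3f}")
    ```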

  18. Virtual Display Design and Evaluation of Clothing: A Design Process Support System

    ERIC Educational Resources Information Center

    Zhang, Xue-Fang; Huang, Ren-Qun

    2014-01-01

    This paper proposes a new computer-aided educational system for clothing visual merchandising and display. It aims to provide an operating environment that supports the various stages of display design in a user-friendly and intuitive manner. First, this paper provides a brief introduction to current software applications in the field of…

  19. A Model of Creative Design Process for Fostering Creativity of Students in Design Education

    ERIC Educational Resources Information Center

    Wong, Yi Lin; Siu, Kin Wai Michael

    2012-01-01

    Creativity, which is concerned with problem solving, is essential if we are to generate new solutions to the massive and complex problems in the unknown future. Our next generation needs an educational platform where they can be taught to possess creativity. Design education is such a way to foster students' creativity. Therefore, it is essential…

  20. Integrating a Genetic Algorithm Into a Knowledge-Based System for Ordering Complex Design Processes

    NASA Technical Reports Server (NTRS)

    Rogers, James L.; McCulley, Collin M.; Bloebaum, Christina L.

    1996-01-01

    The design cycle associated with large engineering systems requires an initial decomposition of the complex system into design processes which are coupled through the transference of output data. Some of these design processes may be grouped into iterative subcycles. In analyzing or optimizing such a coupled system, it is essential to be able to determine the best ordering of the processes within these subcycles to reduce design cycle time and cost. Many decomposition approaches assume the capability is available to determine what design processes and couplings exist and what order of execution will be imposed during the design cycle. Unfortunately, this is often a complex problem and beyond the capabilities of a human design manager. A new feature, a genetic algorithm, has been added to DeMAID (Design Manager's Aid for Intelligent Decomposition) to allow the design manager to rapidly examine many different combinations of ordering processes in an iterative subcycle and to optimize the ordering based on cost, time, and iteration requirements. Two sample test cases are presented to show the effects of optimizing the ordering with a genetic algorithm.
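
    As a toy illustration of the ordering problem (not DeMAID's implementation), the sketch below encodes an ordering of coupled design processes as a permutation and evolves it with a small genetic algorithm so that as few couplings as possible point backwards; the coupling matrix is randomly generated for the example.

    ```python
    # Minimal sketch: a permutation-coded genetic algorithm that reorders coupled
    # design processes to minimize feedback couplings, i.e. data that must flow
    # "backwards" and force extra iterations.
    import random

    random.seed(1)
    N = 8
    # couplings[i][j] is True when process i produces data needed by process j (assumed)
    couplings = [[random.random() < 0.3 and i != j for j in range(N)] for i in range(N)]

    def feedback_cost(order):
        """Count couplings whose source appears after its target in the ordering."""
        pos = {p: k for k, p in enumerate(order)}
        return sum(1 for i in range(N) for j in range(N)
                   if couplings[i][j] and pos[i] > pos[j])

    def order_crossover(a, b):
        """OX crossover: copy a slice from parent a, fill the rest in parent b's order."""
        lo, hi = sorted(random.sample(range(N), 2))
        child = [None] * N
        child[lo:hi] = a[lo:hi]
        fill = [g for g in b if g not in child]
        for k in range(N):
            if child[k] is None:
                child[k] = fill.pop(0)
        return child

    def mutate(order, rate=0.2):
        if random.random() < rate:
            i, j = random.sample(range(N), 2)
            order[i], order[j] = order[j], order[i]
        return order

    population = [random.sample(range(N), N) for _ in range(40)]
    for generation in range(100):
        population.sort(key=feedback_cost)
        parents = population[:20]                     # truncation selection
        children = [mutate(order_crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(20)]
        population = parents + children

    best = min(population, key=feedback_cost)
    print("best ordering:", best, "feedback couplings:", feedback_cost(best))
    ```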

  1. Establishment of a design space for biopharmaceutical purification processes using DoE.

    PubMed

    Amadeo, Ignacio; Mauro, Laura; Ortí, Eduardo; Forno, Guillermina

    2014-01-01

    Recent trends in the pharmaceutical sector are changing the way protein purification processes are designed and executed, moving from operating the process at a fixed point to allowing a permissible region in the operating space known as the design space. This trend is driving product development to design quality into the manufacturing process (Quality by Design) rather than rely exclusively on testing quality in the product. A typical purification step has numerous operating parameters that can impact its performance. Therefore, optimization and robustness analysis of purification processes can be time-consuming, since they are mainly grounded on experimental work. A valuable approach consists in combining an adequate risk analysis technique for selecting the relevant factors influencing process performance with the design of experiments methodology. The latter allows many process variables to be studied at the same time; thus, the number of tests is reduced in comparison with the conventional approach based on trial and error. These multivariate studies permit a detailed exploration of the experimental range and lay the foundation for applying Quality by Design principles. This article outlines a recommended sequence of activities toward the establishment of an expanded design space for a purification process.
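
    A minimal sketch of the multivariate-study idea, with hypothetical factors and levels for a capture chromatography step (not the factors from the article): a two-level full factorial plus replicated center points covers all factor combinations in far fewer runs than sequential trial and error.

    ```python
    # Minimal sketch of a two-level full-factorial design with center points
    # (hypothetical factors and levels for a capture chromatography step).
    from itertools import product

    factors = {                      # assumed low/high settings
        "load_density_g_per_L": (20, 40),
        "elution_pH": (3.2, 3.8),
        "conductivity_mS_cm": (5, 15),
    }

    runs = [dict(zip(factors, combo)) for combo in product(*factors.values())]
    center = {name: sum(levels) / 2 for name, levels in factors.items()}
    runs += [center] * 3             # replicated center points to estimate noise/curvature

    for k, run in enumerate(runs, 1):
        print(f"run {k:2d}: {run}")
    print(f"{len(runs)} experiments instead of a one-factor-at-a-time campaign")
    ```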

  2. Design Considerations of Polishing Lap for Computer-Controlled Cylindrical Polishing Process

    NASA Technical Reports Server (NTRS)

    Khan, Gufran S.; Gubarev, Mikhail; Arnold, William; Ramsey, Brian D.

    2009-01-01

    This paper establishes a relationship between the polishing process parameters and the generation of mid spatial-frequency error. The consideration of the polishing lap design to optimize the process in order to keep residual errors to a minimum and optimization of the process (speeds, stroke, etc.) and to keep the residual mid spatial-frequency error to a minimum, is also presented.

  3. Model-based design space determination of peptide chromatographic purification processes.

    PubMed

    Gétaz, David; Butté, Alessandro; Morbidelli, Massimo

    2013-04-05

    Operating a chemical process at fixed operating conditions often leads to suboptimal process performance. It is important to be able to vary the operating conditions depending on possible changes in feed composition, product requirements or economics. This flexibility in the manufacturing process was facilitated by the publication of the PAT initiative from the U.S. FDA [1]. In this work, the implementation of Quality by Design in the development of a chromatographic purification process is discussed. A procedure to determine the design space of the process using chromatographic modeling is presented. Moreover, the risk of batch failure and the critical process parameters (CPPs) are assessed by modeling. The ideal cut strategy is adopted, and therefore only yield and productivity are considered as critical quality attributes (CQAs). The general trends in CQA variation within the design space are discussed. The effect of process disturbances is also considered. It is shown that process disturbances significantly decrease the design space and that only simultaneous and specific changes in multiple process parameters (i.e., the CPPs) lead to batch failure. The reliability of the obtained results is proven by comparing the model predictions to suitable experimental data. The case study presented in this work proves the reliability of process development using a model-based approach.
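
    The model-based design-space idea can be sketched generically: scan candidate operating points, evaluate the CQAs with a process model, and keep the points where all acceptance limits are met. The toy response surfaces, parameter ranges and limits below are assumptions for illustration, not the paper's chromatographic model.

    ```python
    # Minimal sketch: grid-scan two process parameters, evaluate the CQAs (yield and
    # productivity) with a toy model, and keep the points inside the design space.
    import numpy as np

    def cqa_model(load, gradient_length):
        """Toy response surfaces: yield falls with load, productivity rises with it."""
        yield_pct = 98.0 - 0.35 * load - 0.10 * gradient_length
        productivity = 0.04 * load * (60.0 / gradient_length)
        return yield_pct, productivity

    loads = np.linspace(10, 60, 51)          # g protein / L resin (assumed CPP range)
    gradients = np.linspace(10, 40, 31)      # column volumes (assumed CPP range)

    design_space = []
    for load in loads:
        for gl in gradients:
            y, p = cqa_model(load, gl)
            if y >= 85.0 and p >= 1.0:       # assumed CQA acceptance limits
                design_space.append((load, gl))

    print(f"{len(design_space)} of {loads.size * gradients.size} grid points lie inside the design space")
    ```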

  4. The MSFC Collaborative Engineering Process for Preliminary Design and Concept Definition Studies

    NASA Technical Reports Server (NTRS)

    Mulqueen, Jack; Jones, David; Hopkins, Randy

    2011-01-01

    This paper describes a collaborative engineering process developed by the Marshall Space Flight Center's Advanced Concepts Office for performing rapid preliminary design and mission concept definition studies for potential future NASA missions. The process has been developed and demonstrated for a broad range of mission studies, including human space exploration missions, space transportation system studies and in-space science missions. The paper will describe the design team structure and the specialized analytical tools that have been developed to enable a unique rapid design process. The collaborative engineering process consists of an integrated analysis approach for mission definition, vehicle definition and system engineering. The relevance of the collaborative process elements to the standard NASA NPR 7120.1 system engineering process will be demonstrated. The study definition process flow for each study discipline will be outlined, beginning with the study planning process, followed by definition of ground rules and assumptions, definition of study trades, mission analysis and subsystem analyses, leading to a standardized set of mission concept study products. The flexibility of the collaborative engineering design process to accommodate a wide range of study objectives, from technology definition and requirements definition to preliminary design studies, will be addressed. The paper will also describe the applicability of the collaborative engineering process, including an integrated systems analysis approach for evaluating the functional requirements of evolving system technologies and capabilities needed to meet the needs of future NASA programs.

  5. A Framework for Preliminary Design of Aircraft Structures Based on Process Information. Part 1

    NASA Technical Reports Server (NTRS)

    Rais-Rohani, Masoud

    1998-01-01

    This report discusses the general framework and development of a computational tool for preliminary design of aircraft structures based on process information. The described methodology is suitable for multidisciplinary design optimization (MDO) activities associated with integrated product and process development (IPPD). The framework consists of three parts: (1) product and process definitions; (2) engineering synthesis, and (3) optimization. The product and process definitions are part of input information provided by the design team. The backbone of the system is its ability to analyze a given structural design for performance as well as manufacturability and cost assessment. The system uses a database on material systems and manufacturing processes. Based on the identified set of design variables and an objective function, the system is capable of performing optimization subject to manufacturability, cost, and performance constraints. The accuracy of the manufacturability measures and cost models discussed here depend largely on the available data on specific methods of manufacture and assembly and associated labor requirements. As such, our focus in this research has been on the methodology itself and not so much on its accurate implementation in an industrial setting. A three-tier approach is presented for an IPPD-MDO based design of aircraft structures. The variable-complexity cost estimation methodology and an approach for integrating manufacturing cost assessment into design process are also discussed. This report is presented in two parts. In the first part, the design methodology is presented, and the computational design tool is described. In the second part, a prototype model of the preliminary design Tool for Aircraft Structures based on Process Information (TASPI) is described. Part two also contains an example problem that applies the methodology described here for evaluation of six different design concepts for a wing spar.
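
    The constrained sizing pattern described above can be sketched with a generic optimizer; the formulas, numbers and cost model below are illustrative assumptions (not TASPI's manufacturability or cost models), showing minimum-weight sizing of a spar cap subject to a performance (stress) constraint and a manufacturing-cost ceiling.

    ```python
    # Minimal sketch: size a rectangular spar cap for minimum weight subject to a
    # stress constraint and a cost ceiling, using a general-purpose optimizer.
    import numpy as np
    from scipy.optimize import minimize

    LENGTH, RHO, LOAD = 2.0, 1600.0, 5.0e4          # m, kg/m^3, N  (assumed)
    COST_PER_KG, COST_CEILING = 55.0, 400.0         # $/kg, $       (assumed)

    def mass(x):
        width, thickness = x
        return RHO * LENGTH * width * thickness

    def stress_margin(x):                           # >= 0 when stress <= 300 MPa
        width, thickness = x
        return 300e6 - LOAD / (width * thickness)

    def cost_margin(x):                             # >= 0 when cost <= ceiling
        return COST_CEILING - COST_PER_KG * mass(x)

    result = minimize(mass, x0=[0.05, 0.02], method="SLSQP",
                      bounds=[(0.01, 0.2), (0.005, 0.05)],
                      constraints=[{"type": "ineq", "fun": stress_margin},
                                   {"type": "ineq", "fun": cost_margin}])
    print("width, thickness [m]:", result.x, " mass [kg]:", mass(result.x))
    ```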

  6. Column flotation: Processes, designs and practices. Process engineering for the chemical, metals and minerals industry, Volume 2

    SciTech Connect

    Rubinstein, J.B. (Flotation Equipment and Process Engineering Dept.)

    1994-01-01

    Practically all mined ores of non-ferrous and rare metals and an increasing share of industrial minerals and coal are processed through flotation. This book presents the analysis of a wide range of problems in the process theory of flotation columns, including the first published analysis of models of flotation froths. The experience of pilot tests and commercial applications of column flotation for mineral processing and in waste water treatment circuits is also considered. This is the first book to consider column flotation design and operation experience and to present data on column parameters. Topics include: design of flotation columns; aerators in flotation columns; experimental methods of column aerohydrodynamics investigation; aerohydrodynamic characteristics of flotation columns; experimental investigation of the flotation process in columns; kinetics aspects of column flotation; scaling-up methods for flotation columns; structure and mass transfer in flotation froths; column flotation practice; and column flotation control.

  7. Coal conversion systems design and process modeling. Volume 1: Application of MPPR and Aspen computer models

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The development of a coal gasification system design and mass and energy balance simulation program for the TVA and other similar facilities is described. The materials-process-product model (MPPM) and the Advanced System for Process Engineering (ASPEN) computer program were selected from available steady-state and dynamic models. The MPPM was selected to serve as the basis for development of the system-level design model structure because it provided the capability for process-block material and energy balances and high-level systems sizing and costing. The ASPEN simulation serves as the basis for assessing detailed component models for the system design modeling program. The ASPEN components were analyzed to identify particular process blocks and data packages (physical properties) which could be extracted and used in the system design modeling program. While ASPEN physical property calculation routines are capable of generating the physical properties required for process simulation, not all required physical property data are available, and missing data must be user-entered.
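
    To give a flavor of the block-level material balances such a system design model performs, here is a minimal sketch of a single process block with assumed split fractions. The stream names, flows, and fractions are illustrative only and are not taken from the TVA study.

```python
# Minimal sketch of a block-level material balance: a process block splits its
# combined inlet flow into product streams by assumed yield fractions.
# All stream names, flows, and fractions are illustrative, not plant data.
def gasifier_balance(coal_kg_per_h, oxygen_kg_per_h, steam_kg_per_h):
    feed = coal_kg_per_h + oxygen_kg_per_h + steam_kg_per_h
    splits = {"raw_gas": 0.82, "slag": 0.12, "fines": 0.06}  # assumed fractions
    outputs = {name: frac * feed for name, frac in splits.items()}
    assert abs(sum(outputs.values()) - feed) < 1e-6          # mass is conserved
    return outputs

print(gasifier_balance(coal_kg_per_h=8.0e5, oxygen_kg_per_h=6.5e5, steam_kg_per_h=2.0e5))
```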

  8. A novel double loop control model design for chemical unstable processes.

    PubMed

    Cong, Er-Ding; Hu, Ming-Hui; Tu, Shan-Tung; Xuan, Fu-Zhen; Shao, Hui-He

    2014-03-01

    In this manuscript, based on the Smith predictor control scheme for unstable processes in industry, an improved double-loop control model is proposed for unstable chemical processes. The inner loop stabilizes the integrating or unstable process and transforms the original process into a stable first-order plus dead-time process. The outer loop enhances the set-point response performance. A disturbance controller is designed to enhance the disturbance response performance. The improved control system is simple and has a clear physical meaning, and its characteristic equation is easy to stabilize. The three controllers in the improved scheme are designed separately, so each controller is easy to design and good control performance can be achieved for each closed-loop transfer function separately. The robust stability of the proposed control scheme is analyzed. Finally, case studies illustrate that the improved method can give better system performance than existing design methods.
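
    As a rough illustration of the double-loop idea, and not the authors' exact controller design, the sketch below stabilizes an unstable first-order process with an inner proportional loop and then closes an outer PI loop for set-point tracking. The process model and gains are assumed values, and the dead time is omitted for brevity.

```python
# Sketch of a double-loop structure: an inner proportional loop stabilizes an
# unstable first-order process, then an outer PI loop shapes the set-point response.
# The process model, the gains, and the omission of dead time are all assumptions.
import numpy as np
import control as ct

P = ct.tf([1], [1, -1])                  # unstable process 1/(s - 1) (assumed)
inner = ct.feedback(P, ct.tf([2], [1]))  # inner loop with gain 2 -> 1/(s + 1), stable
C = ct.tf([0.8, 0.4], [1, 0])            # outer PI controller (0.8 s + 0.4)/s (assumed)
outer = ct.feedback(ct.series(C, inner), 1)

t, y = ct.step_response(outer, T=np.linspace(0, 20, 500))
print(f"final value ~ {y[-1]:.3f}")      # should settle near 1.0
```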

  9. Detailed design procedure for solar industrial-process-heat systems: overview

    SciTech Connect

    Kutscher, C F

    1982-12-01

    A large number of handbooks have been written on the subject of designing solar heating and cooling systems for buildings. One of these is summarized here. Design Approaches for Solar Industrial Process Heat Systems, published in September 1982, addresses the complete spectrum of problems associated with the design of a solar IPH system. A highly general method, derived from computer simulations, is presented for determining actual energy delivered to the process load. Also covered are siting and selection of subsystem components, cost estimation, safety and environmental considerations, and installation concerns. An overview of the design methodology developed is given and some specific examples of technical issues addressed are provided.
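
    The handbook's simulation-derived delivery method is not reproduced in this record; as a stand-in, the sketch below uses the standard steady-state flat-plate collector balance (Hottel-Whillier form) to estimate energy delivered to a process load. All parameter values are assumed for illustration.

```python
# Rough estimate of solar collector energy delivery using the steady-state
# flat-plate balance Qu = A * FR * [(tau*alpha) * G - UL * (Tin - Tamb)],
# clipped at zero. Parameter values are assumed, not from the handbook.
def useful_energy_w(area_m2, irradiance_w_m2, t_in_c, t_amb_c,
                    fr_tau_alpha=0.7, fr_ul=4.5):
    gain = fr_tau_alpha * irradiance_w_m2      # absorbed solar flux term, W/m^2
    loss = fr_ul * (t_in_c - t_amb_c)          # thermal loss term, W/m^2
    return area_m2 * max(gain - loss, 0.0)

# Example: 200 m^2 array, 800 W/m^2 irradiance, 60 C inlet, 20 C ambient
print(useful_energy_w(200.0, 800.0, 60.0, 20.0))   # ~76 kW delivered to the load
```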

  10. Improving the Aircraft Design Process Using Web-Based Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.; Follen, Gregory J. (Technical Monitor)

    2000-01-01

    Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.

  11. Improving the Aircraft Design Process Using Web-based Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.

    2003-01-01

    Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.

  12. The Impact of Building Information Modeling on the Architectural Design Process

    NASA Astrophysics Data System (ADS)

    Moreira, P. F.; Silva, Neander F.; Lima, Ecilamar M.

    Many benefits of Building Information Modeling (BIM) have been suggested by several authors and by software vendors. In this paper we describe an experiment in which two groups of designers were observed developing an assigned design task. One of the groups used a BIM system, while the other used a standard computer-aided drafting system. The results show that some of the promises of BIM hold true, such as consistency maintenance and error avoidance in the design documentation process. Other promises, such as changing the design process itself, also showed potential, but more research is needed to determine the depth of such changes.

  13. RF design and processing of a power coupler for third harmonic superconducting cavities

    SciTech Connect

    Li, Jianjian; Harms, Elvin; Kubicki, Tom; Nicklaus, Dennis; Olis, Daniel; Prieto, Peter; Reid, John; Solyak, Nikolay; Wong, Thomas; /IIT, Chicago

    2007-06-01

    The FLASH user facility, which provides free-electron laser radiation, is built on the basis of the TTF project at DESY. Fermilab is responsible for the design and processing of a third-harmonic, 3.9 GHz superconducting cavity, which is powered via a coaxial power coupler. Six power couplers have been manufactured at CPI following successful design of the power coupler, including RF simulation, multipacting calculation, and thermal analysis. The power couplers are currently being tested and processed with high pulsed power in an elaborate test stand at Fermilab. This paper presents the RF design and processing work of the power coupler.

  14. The Iterative Design Process in Research and Development: A Work Experience Paper

    NASA Technical Reports Server (NTRS)

    Sullivan, George F. III

    2013-01-01

    The iterative design process is one of many strategies used in new product development. Top-down development strategies, like waterfall development, place a heavy emphasis on planning and simulation. The iterative process, on the other hand, is better suited to the management of small to medium scale projects. Over the past four months, I have worked with engineers at Johnson Space Center on a multitude of electronics projects. By describing the work I have done these last few months, analyzing the factors that have driven design decisions, and examining the testing and verification process, I will demonstrate that iterative design is the obvious choice for research and development projects.

  15. The PhOCoe Model--ergonomic pattern mapping in participatory design processes.

    PubMed

    Silva e Santos, Marcello

    2012-01-01

    The discipline and practice of human factors and ergonomics is quite rich in terms of the availability of analysis, development and evaluation tools and methods for its various processes. However, we lack effective instruments for comprehensively mapping or regulating cognitive and organizational impacts, especially environmental ones. Moreover, when ergonomic transformations through design - such as a new workstation design or even an entire new facility - are at play, ergonomics professionals tend to stay at bay, relying solely on design professionals and engineers. There is vast empirical evidence showing that the participation of ergonomists as project facilitators may contribute to an effective professional synergy amongst the various stakeholders in a multidisciplinary venue. When that happens, everyone wins, users and designers alike, because conflicts raised in the midst of option selection are dissipated in favor of more convergent design alternatives. This paper presents a method for participatory design in which users are encouraged to actively participate in the whole design process by sharing their real work activities with the design team. The negotiated results inferred from the ergonomic action and translated into a new design are then compiled into an "Ergonomic Pattern Manual". This handbook contains essential ergonomics-oriented design guidelines to be consulted in recurring design situations where similar patterns might be used. The main drive is simple: nobody knows better than the workers themselves what an adequate workplace design solution (equipment, workstation, office layout) should be.

  16. Human-system interface design review guideline -- Process and guidelines: Final report. Revision 1, Volume 1

    SciTech Connect

    1996-06-01

    NUREG-0700, Revision 1, provides human factors engineering (HFE) guidance to the US Nuclear Regulatory Commission staff for its: (1) review of the human-system interface (HSI) design submittals prepared by licensees or applicants for a license or design certification of commercial nuclear power plants, and (2) performance of HSI reviews that could be undertaken as part of an inspection or other type of regulatory review involving HSI design or incidents involving human performance. The guidance consists of a review process and HFE guidelines. The document describes those aspects of the HSI design review process that are important to the identification and resolution of human engineering discrepancies that could adversely affect plant safety. Guidance is provided that could be used by the staff to review an applicant's HSI design review process or to guide the development of an HSI design review plan, e.g., as part of an inspection activity. The document also provides detailed HFE guidelines for the assessment of HSI design implementations. NUREG-0700, Revision 1, consists of three stand-alone volumes. Volume 1 consists of two major parts. Part 1 describes those aspects of the HSI design review process that are important to identifying and resolving human engineering discrepancies. Part 2 contains detailed guidelines for a human factors engineering review, which identify criteria for assessing the implementation of an applicant's or licensee's HSI design.

  17. Defining process design space for biotech products: case study of Pichia pastoris fermentation.

    PubMed

    Harms, Jean; Wang, Xiangyang; Kim, Tina; Yang, Xiaoming; Rathore, Anurag S

    2008-01-01

    The concept of "design space" has been proposed in the ICH Q8 guideline and is gaining momentum in its application in the biotech industry. It has been defined as "the multidimensional combination and interaction of input variables (e.g., material attributes) and process parameters that have been demonstrated to provide assurance of quality." This paper presents a stepwise approach for defining process design space for a biologic product. A case study, involving P. pastoris fermentation, is presented to facilitate this. First, risk analysis via Failure Modes and Effects Analysis (FMEA) is performed to identify parameters for process characterization. Second, small-scale models are created and qualified prior to their use in these experimental studies. Third, studies are designed using Design of Experiments (DOE) in order for the data to be amenable for use in defining the process design space. Fourth, the studies are executed and the results analyzed for decisions on the criticality of the parameters as well as on establishing process design space. For the application under consideration, it is shown that the fermentation unit operation is very robust with a wide design space and no critical operating parameters. The approach presented here is not specific to the illustrated case study. It can be extended to other biotech unit operations and processes that can be scaled down and characterized at small scale.
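
    To make the first step concrete, here is a minimal sketch of FMEA-style risk ranking using the risk priority number (RPN = severity x occurrence x detection) to choose parameters for process characterization. The parameter names, scores, and the cut-off threshold are hypothetical examples, not the study's data.

```python
# FMEA-style ranking: risk priority number RPN = severity x occurrence x detection.
# Parameters and 1-10 scores below are hypothetical, not the published case study's.
fermentation_params = [
    # (parameter, severity, occurrence, detection)
    ("induction temperature", 8, 4, 3),
    ("methanol feed rate",    9, 5, 4),
    ("dissolved oxygen",      6, 3, 2),
    ("pH set point",          5, 2, 2),
]

ranked = sorted(((name, s * o * d) for name, s, o, d in fermentation_params),
                key=lambda item: item[1], reverse=True)

for name, rpn in ranked:
    flag = "characterize" if rpn >= 60 else "monitor"   # assumed RPN threshold
    print(f"{name:22s} RPN={rpn:3d} -> {flag}")
```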

  18. Holistic and Consistent Design Process for Hollow Structures Based on Braided Textiles and RTM

    NASA Astrophysics Data System (ADS)

    Gnädinger, Florian; Karcher, Michael; Henning, Frank; Middendorf, Peter

    2014-06-01

    The present paper elaborates a holistic and consistent design process for 2D braided composites in conjunction with Resin Transfer Moulding (RTM). These technologies allow cost-effective production of composites due to their high degree of automation. Literature can be found that deals with specific tasks of the respective technologies, but no work is available that embraces the complete process chain. Therefore, an overall design process is developed within the present paper. It is based on a correlated conduction of sub-design processes for the braided preform, RTM injection, mandrel plus mould, and manufacturing. For each sub-process, both the individual tasks and reasonable methods to accomplish them are presented. The information flow within the design process is specified and interdependences are illustrated. Composite designers are equipped with an efficient set of tools because the respective methods account for the complexity of the part. The design process is applied to a demonstrator in a case study. The individual sub-design processes are carried out as examples to judge the feasibility of the presented work. For validation, predicted braiding angles and fibre volume fractions are compared with measured ones, and a filling and curing simulation based on PAM-RTM is checked against mould filling studies. Tool concepts for an RTM mould and for mandrels that realise undercuts are tested. The individual process parameters for manufacturing are derived from previous design steps. Furthermore, the compatibility of the chosen fibre and matrix system is investigated based on scanning electron microscope (SEM) images. The annual production volume of the demonstrator part is estimated based on these findings.
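
    One of the predicted quantities mentioned, the braid angle, can be estimated from the standard 2D-braiding kinematic relation between carrier angular speed, mandrel radius, and take-up speed. The sketch below uses that textbook relation with assumed machine settings; it is not the paper's process data or simulation.

```python
# Standard 2D-braiding kinematics: the braid angle (measured from the mandrel axis)
# follows tan(alpha) = omega * R / v, where omega is the carrier angular speed,
# R the mandrel radius, and v the take-up speed. Machine settings are assumed.
import math

def braid_angle_deg(omega_rad_s, mandrel_radius_m, takeup_speed_m_s):
    return math.degrees(math.atan(omega_rad_s * mandrel_radius_m / takeup_speed_m_s))

print(braid_angle_deg(omega_rad_s=6.0, mandrel_radius_m=0.03, takeup_speed_m_s=0.12))
# ~56 degrees for these assumed settings
```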

  19. H(infinity) PID controller design for runaway processes with time delay.

    PubMed

    Zhang, Weidong; Xu, Xiaoming

    2002-07-01

    This paper presents an efficient method for designing proportional-integral-derivative (PID) controllers for runaway processes with time delay. The method is developed based on the H(infinity) control theory in the frequency domain. The constraints imposed by the internal stability and asymptotic properties of the closed-loop system are first investigated; a new procedure is then developed for analytically designing the controller, and simple design formulas are obtained. It is shown that the new controller can be designed to meet specified time-domain performances. Typical design examples are provided to illustrate the proposed method.
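
    To show how such a design might be checked in simulation, and not to reproduce the paper's H(infinity) derivation, the sketch below closes a PID loop around an unstable first-order plus dead-time model, approximating the delay with a Pade expansion. The plant parameters and PID gains are assumed values.

```python
# Closed-loop check of a PID controller on an unstable first-order plus dead-time
# plant, with the delay handled by a Pade approximation. The plant parameters and
# PID gains are assumed for illustration; they are not the paper's tuning.
import numpy as np
import control as ct

theta = 0.2                                   # dead time, s (assumed)
plant = ct.tf([1], [1, -1])                   # unstable pole at s = +1 (assumed)
num, den = ct.pade(theta, 3)                  # 3rd-order Pade delay approximation
delayed_plant = ct.series(plant, ct.tf(num, den))

kp, ki, kd, tf_ = 3.0, 1.0, 0.5, 0.05         # assumed gains, filtered derivative
pid = ct.tf([kp], [1]) + ct.tf([ki], [1, 0]) + ct.tf([kd, 0], [tf_, 1])

closed = ct.feedback(ct.series(pid, delayed_plant), 1)
t, y = ct.step_response(closed, T=np.linspace(0, 15, 1500))
print(f"overshoot ~ {100 * (y.max() - 1):.1f} %, final value ~ {y[-1]:.3f}")
```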

  20. Optimal cure cycle design for autoclave processing of thick composites laminates: A feasibility study

    NASA Technical Reports Server (NTRS)

    Hou, Jean W.

    1985-01-01

    The thermal analysis and the calculation of thermal sensitivity of a cure cycle in autoclave processing of thick composite laminates were studied. A finite element program for the thermal analysis and for calculating design derivatives of the temperature distribution and the degree of cure was developed and verified. It was found that direct differentiation was the best approach for the thermal design sensitivity analysis. In addition, the direct differentiation approach provided time histories of the design derivatives, which are of great value to cure cycle designers. Direct differentiation is to be used for further study, i.e., optimal cure cycle design.
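
    As a toy illustration of the direct-differentiation idea, far simpler than the report's finite element cure model, the sketch below marches a 1D explicit heat-conduction step and simultaneously marches the derivative of temperature with respect to the thermal diffusivity, yielding a time history of the design derivative. The geometry, properties, and boundary temperature are assumed, and cure kinetics are omitted.

```python
# Toy direct-differentiation sensitivity for 1D transient heat conduction:
# march T_t = a * T_xx explicitly and, in the same loop, march S = dT/da using
# the differentiated update rule. All values are illustrative assumptions.
import numpy as np

nx, thickness, a = 41, 0.02, 1.0e-7   # nodes, laminate thickness (m), diffusivity (m^2/s)
dx = thickness / (nx - 1)
dt = 1.0                              # s, satisfies explicit stability dt <= dx^2 / (2a)

T = np.full(nx, 20.0)                 # initial temperature field, degC
S = np.zeros(nx)                      # sensitivity field dT/da
T[0] = T[-1] = 180.0                  # assumed autoclave wall temperature, degC

def laplacian(u):
    out = np.zeros_like(u)
    out[1:-1] = (u[:-2] - 2.0 * u[1:-1] + u[2:]) / (dx * dx)
    return out

for _ in range(2000):
    lapT = laplacian(T)                                        # from current T
    T[1:-1] += dt * a * lapT[1:-1]                             # state update
    S[1:-1] += dt * (lapT[1:-1] + a * laplacian(S)[1:-1])      # differentiated update

mid = nx // 2
print(f"centre T = {T[mid]:.1f} C, dT/da = {S[mid]:.3e} C per (m^2/s)")
```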

  1. Overall challenges in incorporating micro-mechanical models into materials design process

    NASA Astrophysics Data System (ADS)

    Bennoura, M.; Aboutajeddine, A.

    2016-10-01

    Using materials in engineering design has historically been handled with the paradigm of selecting appropriate materials from the finite set of available material databases. Recent trends, however, have moved toward tailoring materials to meet overall system performance requirements, through a process called material design. An important building block of this process is the set of micromechanical models that relate microstructure to properties. Unfortunately, these models fall short and carry considerable uncertainty arising from assumptions and idealizations, which unavoidably impacts material design strategy. In this work, candidate methods for dealing with micromechanical model uncertainties and their drawbacks in material design are investigated. Robust design methods for quantifying uncertainty and managing or mitigating its impact on design performance are reviewed first. These methods include principles for classifying uncertainty, mathematical techniques for evaluating its degree, and design methods for generating design alternatives that are relatively insensitive to sources of uncertainty and flexible enough to admit design changes or variations. The last section of this paper addresses the limits of the existing approaches from a material modelling perspective and identifies research opportunities to overcome the impediments to incorporating micromechanical models in the material design process.
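
    As a concrete, deliberately simple illustration of propagating micromechanical model uncertainty into a design quantity, the sketch below pushes Monte Carlo samples of fibre and matrix properties through the rule-of-mixtures stiffness estimate. The distributions and nominal values are assumed, not taken from the paper.

```python
# Monte Carlo propagation of input uncertainty through a simple micromechanical
# model (rule of mixtures for longitudinal composite stiffness). The distributions
# and nominal values below are assumed for illustration.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

E_f = rng.normal(230.0, 10.0, n)     # fibre modulus, GPa (assumed)
E_m = rng.normal(3.5, 0.3, n)        # matrix modulus, GPa (assumed)
V_f = rng.uniform(0.55, 0.62, n)     # fibre volume fraction (assumed)

E_c = V_f * E_f + (1.0 - V_f) * E_m  # rule of mixtures, longitudinal direction

print(f"E_c mean = {E_c.mean():.1f} GPa, std = {E_c.std():.1f} GPa")
print(f"5th percentile (robust design allowable) = {np.percentile(E_c, 5):.1f} GPa")
```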

  2. Design of a high-speed digital processing element for parallel simulation

    NASA Technical Reports Server (NTRS)

    Milner, E. J.; Cwynar, D. S.

    1983-01-01

    A prototype of a custom-designed computer to be used as a processing element in a multiprocessor-based jet engine simulator is described. The purpose of the custom design was to give the computer the speed and versatility required to simulate a jet engine in real time. Real-time simulations are needed for closed-loop testing of digital electronic engine controls. The prototype computer has a microcycle time of 133 nanoseconds. This speed was achieved by prefetching the next instruction while the current one is executing, transporting data over high-speed data busses, and using state-of-the-art components such as a very large scale integration (VLSI) multiplier. Included are discussions of processing element requirements, design philosophy, the architecture of the custom-designed processing element, the comprehensive instruction set, the diagnostic support software, and the development status of the custom design.

  3. Sorption of distillery spent wash onto fly ash: kinetics, mechanism, process design and factorial design.

    PubMed

    Krishna Prasad, R; Srivastava, S N

    2009-01-30

    Batch and continuous experiments were performed for the sorption of distillery spent wash onto fly ash particles. The Freundlich isotherm and pseudo-second-order kinetic equations were found to fit the equilibrium data closely. The Weber-Morris intraparticle diffusion equation was used to probe the sorption mechanism, and the fitted equation for sorption from 10% diluted spent wash is qt = 1.1344 t^0.5 + 33.304. Optimization using a 2^3 factorial design of experiments gave an optimal color removal of 93% at 5% dilution, 10 g adsorbent dosage, and 293 K. The actual color removal at these optimal conditions was 92.24%, close to the factorial design prediction. A complete error analysis was carried out using six non-linear error functions: chi-square, sum of square errors (SSE), composite fractional error function (HYBRD), derivative of Marquardt's percent standard deviation (MPSD), average relative error (ARE), and sum of absolute errors (EABS). The free energy of adsorption at 293 K (ΔG° = -1574.67 J), enthalpy change (ΔH° = -32.5487 kJ), and entropy change (ΔS° = 105 J/K) were calculated to assess the nature of the adsorption. Adsorption in a packed column was evaluated using the bed depth service time (BDST), Thomas, and Adams-Bohart models.
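
    A minimal sketch of fitting the Weber-Morris intraparticle diffusion form qt = k_id * t^0.5 + C to kinetic data is shown below. The (t, qt) data points are invented for illustration; only the functional form follows the study.

```python
# Least-squares fit of the Weber-Morris intraparticle diffusion model
# q_t = k_id * sqrt(t) + C. The (t, q_t) data below are invented for illustration.
import numpy as np

t = np.array([5, 10, 20, 40, 60, 90, 120], dtype=float)     # contact time, min
qt = np.array([36.0, 37.1, 38.5, 40.6, 42.2, 44.1, 45.8])   # uptake, mg/g

k_id, C = np.polyfit(np.sqrt(t), qt, 1)   # slope = k_id, intercept = C
print(f"q_t = {k_id:.4f} * t^0.5 + {C:.3f}")
```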

  4. Simulation and design of high precision unit processes via numerical methods

    NASA Astrophysics Data System (ADS)

    Stafford, Roger

    1988-08-01

    SDRC has developed new computer codes specifically tailored for precise and fast simulations of manufacturing processes. Critical aspects of unit processes involve nonlinear transient heat transfer coupled with slow creeping flow. Finite element methods are chosen, and numerical algorithms specifically suited to the problem are adopted. Key elements of these simulations are outlined. SDRC has integrated unit process simulations with CAD/CAM design systems, analysis graphics systems, automated inspection, and databases. An example illustrates the data flow, simulation results, and how engineers are using these tools to design new processes for large, complex parts.

  5. Comprehensive design and process flow configuration for micro and nano tech devices

    NASA Astrophysics Data System (ADS)

    Hahn, Kai; Schmidt, Thilo; Mielke, Matthias; Ortloff, Dirk; Popp, Jens; Brück, Rainer

    2010-04-01

    The development of micro and nano tech devices based on semiconductor manufacturing processes comprises the structural design as well as the definition of the manufacturing process flow. The approach is characterized by application-specific fabrication flows, i.e., fabrication processes (built up from a large variety of process steps and materials) that depend on the final product. Technology constraints have a great impact on the device design and vice versa. In this paper we introduce a comprehensive methodology and, based on it, an environment for customer-oriented product engineering of MEMS products. The development is currently carried out in an international multi-site research project.

  6. Conceptual designs and assessments of a coal gasification demonstration plant. Volume III. Texaco process

    SciTech Connect

    Not Available

    1980-10-01

    This volume contains detailed information on the conceptual design and assessment of the facility required to process approximately 20,000 tons per day of coal to produce medium Btu gas using the Texaco gasification process. The report includes process descriptions, flow diagrams and equipment lists for the various subsystems associated with the gasifiers along with descriptions of the overall facility. The facility is analyzed from both an economic and environmental standpoint. Problems of construction are addressed together with an overall design and construction schedule for the total facility. Resource requirements are summarized along with suggested development areas, both process and environmental.

  7. Conceptual designs and assessments of a coal gasification demonstration plant. Volume II. Koppers-Totzek process

    SciTech Connect

    Not Available

    1980-10-01

    This volume of the report contains detailed information on the conceptual design and assessment of the facility required to process approximately 20,000 tons per day of coal to produce medium Btu gas using the Koppers-Totzek gasification process. The report includes process descriptions, flow diagrams and equipment lists for the various subsystems associated with the gasifiers along with descriptions of the overall facility. The facility is analyzed from both an economic and environmental standpoint. Problems of construction are addressed together with an overall design and construction schedule for the total facility. Resource requirements are summarized along with suggested development areas, both process and environmental.

  8. Conceptual designs and assessments of a coal gasification demonstration plant. Volume IV. Babcock and Wilcox process

    SciTech Connect

    Not Available

    1980-10-01

    This volume of the report contains detailed information on the conceptual design and assessment of the facility required to process approximately 20,000 tons per day of coal to produce medium Btu gas using the Babcock and Wilcox gasification process. The report includes process descriptions, flow diagrams and equipment lists for the various subsystems associated with the gasifiers along with descriptions of the overall facility. The facility is analyzed from both an economic and environmental standpoint. Problems of construction are addressed together with an overall design and construction schedule for the total facility. Resource requirements are summarized along with suggested development areas, both process and environmental.

  9. Process design of a ball joint, considering caulking and pull-out strength.

    PubMed

    Sin, Bong-Su; Lee, Kwon-Hee

    2014-01-01

    A ball joint for an automobile steering system is a pivot component connected to the knuckle and the lower control arm. The manufacturing process for its caulking comprises spinning and deforming. In this study, the process was simulated by flexible multibody dynamics. The caulking was evaluated qualitatively through numerical analysis and by inspecting the plastically deformed shape. The structural responses of a ball joint, namely pull-out strength and stiffness, are commonly investigated in the development process; thus, following the caulking analysis, these structural responses were considered. In addition, three design variables related to the manufacturing process were defined, and the effects of the design variables on pull-out strength, caulking depth, and maximum stress were obtained by applying design of experiments (DOE) with an L9 orthogonal array. Finally, an optimum design maximizing the pull-out strength was suggested. For the final design, the caulking quality and the pull-out strength were verified by fabricating six samples and testing them.
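
    For reference, the standard L9(3^4) orthogonal array used in such studies is enumerated below, with three of its columns mapped to process factors. The factor names and level values are hypothetical placeholders, since the abstract does not state the actual variables or levels.

```python
# Standard L9(3^4) orthogonal array (levels coded 1-3). Three of its four columns
# are mapped to hypothetical process factors; the study's actual factors and level
# values are not given in the abstract.
L9 = [
    [1, 1, 1, 1], [1, 2, 2, 2], [1, 3, 3, 3],
    [2, 1, 2, 3], [2, 2, 3, 1], [2, 3, 1, 2],
    [3, 1, 3, 2], [3, 2, 1, 3], [3, 3, 2, 1],
]

levels = {                                    # hypothetical factor levels
    "spinning speed (rpm)": [800, 1000, 1200],
    "axial feed (mm/s)":    [0.5, 1.0, 1.5],
    "forming load (kN)":    [10, 15, 20],
}

for run, row in enumerate(L9, start=1):
    setting = {name: vals[row[i] - 1] for i, (name, vals) in enumerate(levels.items())}
    print(f"run {run}: {setting}")
```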

  10. GREENER CHEMICAL PROCESS DESIGN ALTERNATIVES ARE REVEALED USING THE WASTE REDUCTION DECISION SUPPORT SYSTEM (WAR DSS)

    EPA Science Inventory

    The Waste Reduction Decision Support System (WAR DSS) is a Java-based software product providing comprehensive modeling of potential adverse environmental impacts (PEI) predicted to result from newly designed or redesigned chemical manufacturing processes. The purpose of this so...

  11. Automated systems for creative processes in scientific research, design, and robotics

    SciTech Connect

    Glushkov, V.M.; Stognii, A.A.; Biba, I.G.; Vashchenko, N.D.; Galagan, N.I.; Gladun, V.P.; Rabinovich, Z.L.; Sakunov, I.A.; Khomenko, L.V.

    1981-11-01

    The authors give a general description of software that was developed to automate the creative processes in scientific research, design and robotics. The systems APROS, SSP, Analizator-ES and Analizator are discussed. 12 references.

  12. EVALUATING THE ECONOMICS AND ENVIRONMENTAL FRIENDLINESS OF NEWLY DESIGNED OR RETROFITTED CHEMICAL PROCESSES

    EPA Science Inventory

    This work describes a method for using spreadsheet analyses of process designs and retrofits to provide simple and quick economic and environmental evaluations simultaneously. The method focuses attention onto those streams and components that have the largest monetary values and...

  13. Climbing The Knowledge Mountain - The New Solids Processing Design And Management Manual

    EPA Science Inventory

    The USEPA, Water Environment Federation (WEF) and Water Environment Research Foundation (WERF), under a Cooperative Research and Development Agreement (CRADA), are undertaking a massive effort to produce a Solids Processing Design and Management Manual (Manual). The Manual, repr...

  14. Climbing The Knowledge Mountain - The New Solids Processing Design And Management Manual (Presentation)

    EPA Science Inventory

    The USEPA, Water Environment Federation (WEF) and Water Environment Research Foundation (WERF), under a Cooperative Research and Development Agreement (CRADA), are undertaking a massive effort to produce a Solids Processing Design and Management Manual (Manual). The Manual, repr...

  15. PROCESS DESIGN FOR ENVIRONMENT: A MULTI-OBJECTIVE FRAMEWORK UNDER UNCERTAINTY

    EPA Science Inventory

    Designing chemical processes for environment requires consideration of several indexes of environmental impact including ozone depletion and global warming potentials, human and aquatic toxicity, and photochemical oxidation, and acid rain potentials. Current methodologies like t...

  16. Media milling process optimization for manufacture of drug nanoparticles using design of experiments (DOE).

    PubMed

    Nekkanti, Vijaykumar; Marwah, Ashwani; Pillai, Raviraj

    2015-01-01

    Design of experiments (DOE), a component of Quality by Design (QbD), is the systematic and simultaneous evaluation of process variables to develop a product with predetermined quality attributes. This article presents a case study on the effects of process variables in a bead milling process used to manufacture drug nanoparticles. Experiments were designed and results were computed according to a 3-factor, 3-level face-centered central composite design (CCD). The factors investigated were motor speed, pump speed and bead volume. The responses analyzed to evaluate these effects and interactions were milling time, particle size and process yield. Process validation batches were executed using the optimum process conditions obtained from the Design-Expert® software to evaluate both the repeatability and reproducibility of the bead milling technique. Milling time was optimized to <5 h to obtain the desired particle size (d90 < 400 nm). A desirability function was used to optimize the response variables, and the observed responses were in agreement with the predicted values. These results demonstrated the reliability of the selected model for manufacture of drug nanoparticles with predictable quality attributes. The optimization of bead milling process variables by applying DOE resulted in a considerable decrease in milling time to achieve the desired particle size. The study indicates the applicability of the DOE approach to optimize critical process parameters in the manufacture of drug nanoparticles.
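
    A face-centered CCD for three factors consists of the 8 factorial corner points, 6 axial points on the cube faces (alpha = 1), and replicated centre points. The sketch below enumerates that coded design and maps it to hypothetical factor ranges; the study's actual ranges and number of centre replicates are not given in the abstract.

```python
# Coded face-centered central composite design (CCD) for 3 factors: 2^3 corner
# points, 6 face-centered axial points (alpha = 1), and centre points, mapped to
# hypothetical factor ranges (the study's actual settings are not in the abstract).
from itertools import product

corners = list(product([-1, 1], repeat=3))
axials = [tuple(v if i == j else 0 for i in range(3)) for j in range(3) for v in (-1, 1)]
centres = [(0, 0, 0)] * 3                      # assumed number of centre replicates

design = corners + axials + centres            # 8 + 6 + 3 = 17 runs

ranges = {                                     # hypothetical low/high settings
    "motor speed (rpm)": (1200, 2400),
    "pump speed (rpm)":  (50, 150),
    "bead volume (%)":   (60, 85),
}

def decode(coded):
    # Map coded levels -1/0/+1 to actual values within each factor's range.
    return {name: lo + (c + 1) / 2 * (hi - lo)
            for c, (name, (lo, hi)) in zip(coded, ranges.items())}

for run, coded in enumerate(design, start=1):
    print(f"run {run:2d} coded={coded} -> {decode(coded)}")
```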

  17. Collaborative Design Processes: An Active and Reflective Learning Course in Multidisciplinary Collaboration.

    ERIC Educational Resources Information Center

    O'Brien, William J.; Soibelman, Lucio; Elvin, George

    2003-01-01

    In a capstone course, graduate students from two universities participated in collaborative design in the architectural, engineering, and construction industries in multidisciplinary teams via the Internet. Students also developed process designs to integrate technology into multidisciplinary teamwork, combining active and reflective learning.…

  18. Learning Effects of a Science Textbook Designed with Adapted Cognitive Process Principles on Grade 5 Students

    ERIC Educational Resources Information Center

    Cheng, Ming-Chang; Chou, Pei-I; Wang, Ya-Ting; Lin, Chih-Ho

    2015-01-01

    This study investigates how the illustrations in a science textbook, with their design modified according to cognitive process principles, affected students' learning performance. The quasi-experimental design recruited two Grade 5 groups (N = 58) as the research participants. The treatment group (n = 30) used the modified version of the textbook,…

  19. Comparing Simple and Advanced Video Tools as Supports for Complex Collaborative Design Processes

    ERIC Educational Resources Information Center

    Zahn, Carmen; Pea, Roy; Hesse, Friedrich W.; Rosen, Joe

    2010-01-01

    Working with digital video technologies, particularly advanced video tools with editing capabilities, offers new prospects for meaningful learning through design. However, it is also possible that the additional complexity of such tools does "not" advance learning. We compared in an experiment the design processes and learning outcomes…

  20. [Establishment of design space for production process of traditional Chinese medicine preparation].

    PubMed

    Xu, Bing; Shi, Xin-Yuan; Qiao, Yan-Jiang; Wu, Zhi-Sheng; Lin, Zhao-Zhou

    2013-03-01

    The philosophy of quality by design (QbD) is leading a shift in drug manufacturing from the conventional test-based approach to a science- and risk-based approach focused on detailed research into, and understanding of, the production process. As understanding of the manufacturing process deepens, the design space can be determined, and the emphasis of quality control shifts from quality standards to the design space. Establishing the design space is therefore a core step in implementing QbD, and studying methods for building it is of great importance. This essay proposes the concept of a design space for the production process of traditional Chinese medicine (TCM) preparations, gives a systematic introduction to the concept, analyzes the feasibility and significance of building design spaces for TCM production processes, and proposes study approaches, illustrated by examples suited to the characteristics of TCM preparations, as well as future research directions.