Science.gov

Sample records for reload design process

  1. Reload design process at Yankee Atomic Electric Company

    SciTech Connect

    Weader, R.J.

    1986-01-01

    Yankee Atomic Electric Company (YAEC) performs reload design and licensing for its nuclear power plants: Yankee Rowe, Maine Yankee, and Vermont Yankee. Significant savings in labor and computer costs have been achieved in the reload design process by using the SIMULATE nodal code, with inputs from the CASMO assembly burnup code or the LEOPARD pin-cell burnup code, in place of the PDQ diffusion theory code for many of the required calculations for the Yankee Rowe and Maine Yankee pressurized water reactors (PWRs). An efficient process has also evolved for the design of reloads for the Vermont Yankee boiling water reactor (BWR). Because of major differences in the core designs of the three plants, a different reload design process has evolved for each plant.

  2. Modeling and design of a reload PWR core for a 48-month fuel cycle

    SciTech Connect

    McMahon, M.V.; Driscoll, M.J.; Todreas, N.E.

    1997-05-01

    The objective of this research was to use state-of-the-art nuclear and fuel performance packages to evaluate the feasibility and costs of a 48-calendar-month core in existing pressurized water reactor (PWR) designs, considering the full range of practical design and economic considerations. The driving force behind this research is the desire to make nuclear power more economically competitive with fossil fuel options by expanding the scope for achievement of higher capacity factors. Using CASMO/SIMULATE, a core design with fuel enriched to 7 w/o U-235 for a single-batch-loaded, 48-month fuel cycle has been developed. This core achieves an ultra-long cycle length without exceeding current fuel burnup limits. The design uses two different types of burnable poisons. Gadolinium, in the form of gadolinium oxide (Gd2O3) mixed with the UO2 of selected pins, is used to hold down initial reactivity and to control flux peaking throughout the life of the core. A zirconium diboride (ZrB2) integral fuel burnable absorber (IFBA) coating on the Gd2O3-UO2 fuel pellets is added to reduce the critical soluble boron concentration in the reactor coolant to within acceptable limits. Fuel performance issues of concern to this design are also outlined, and areas that will require further research are highlighted.

  3. From Reload to ReCourse: Learning from IMS Learning Design Implementations

    ERIC Educational Resources Information Center

    Griffiths, David; Beauvoir, Phillip; Liber, Oleg; Barrett-Baxendale, Mark

    2009-01-01

    The use of the Web to deliver open, distance, and flexible learning has opened up the potential for social interaction and adaptive learning, but the usability, expressivity, and interoperability of the available tools leave much to be desired. This article explores these issues as they relate to teachers and learning designers through the case of…

  4. Whorf Reloaded: Language Effects on Nonverbal Number Processing in First Grade--A Trilingual Study

    ERIC Educational Resources Information Center

    Pixner, S.; Moeller, K.; Hermanova, V.; Nuerk, H. -C.; Kaufmann, L.

    2011-01-01

    The unit-decade compatibility effect is interpreted to reflect processes of place value integration in two-digit number magnitude comparisons. The current study aimed at elucidating the influence of language properties on the compatibility effect of Arabic two-digit numbers in Austrian, Italian, and Czech first graders. The number word systems of…

  5. Optimal reload strategies for identify-and-destroy missions

    NASA Astrophysics Data System (ADS)

    Hyland, John C.; Smith, Cheryl M.

    2004-09-01

    In this problem an identification vehicle must re-acquire a fixed set of suspected targets and determine whether each suspected target is a mine or a false alarm. If a target is determined to be a mine, the identification vehicle must neutralize it by either delivering one of a limited number of on-board bombs or assigning the neutralization task to one of a limited number of single-shot suicide vehicles. The identification vehicle has the option to reload; the single-shot suicide vehicles, however, cannot be replenished. We have developed an optimal path planning and reload strategy for this identify-and-destroy mission that takes into account the probabilities that suspected targets are mines, the costs to move between targets, the costs to return to and from the reload point, and the cost to reload. The mission is modeled as a discrete multi-dimensional Markov process. At each target position the vehicle decides, based on the known costs, the probabilities, the number of bombs on board (r), and the number of remaining one-shot vehicles (s), whether to move directly on to the next target or to reload before continuing, and whether to destroy any mine with an on-board bomb or with a one-shot suicide vehicle. The approach recursively calculates the minimum expected overall cost conditioned on all possible values of r and s. The recursion is similar to dynamic programming in that it starts at the last suspected target location and works its way backwards to the starting point. The approach also uses a suboptimal traveling salesman strategy to search over candidate deployment locations to calculate the best initial deployment point where the reloads will take place.
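
    As a rough illustration of the backward recursion described above, the following Python sketch computes a minimum expected cost over the state (target index, bombs r, suicide vehicles s) for a fixed visiting order. All costs, probabilities, and variable names are invented for illustration; the published method also optimizes the visiting order and the reload-point location, which this sketch omits.

        from functools import lru_cache

        p_mine    = [0.8, 0.3, 0.6]   # probability each suspected target is a mine (invented)
        move_cost = [1.0, 1.5, 2.0]   # cost to move on to target i (invented)
        RELOAD_COST, BOMB_COST, SUICIDE_COST = 5.0, 0.5, 2.0
        MAX_BOMBS = 2                 # on-board bomb capacity after a reload

        @lru_cache(maxsize=None)
        def expected_cost(i, r, s):
            """Minimum expected cost from target i onward with r bombs and s suicide vehicles left."""
            if i == len(p_mine):                      # all suspected targets handled
                return 0.0
            options = []
            if r > 0 or s > 0:                        # option A: go straight to the next target
                options.append(move_cost[i] + engage(i, r, s))
            # option B: detour to the reload point first, then continue with a full bomb load
            options.append(RELOAD_COST + move_cost[i] + engage(i, MAX_BOMBS, s))
            return min(options)

        def engage(i, r, s):
            """Expected cost of handling target i, choosing bomb vs. one-shot suicide vehicle."""
            false_alarm = expected_cost(i + 1, r, s)
            kill = []
            if r > 0:
                kill.append(BOMB_COST + expected_cost(i + 1, r - 1, s))
            if s > 0:
                kill.append(SUICIDE_COST + expected_cost(i + 1, r, s - 1))
            return p_mine[i] * min(kill) + (1.0 - p_mine[i]) * false_alarm

        print(expected_cost(0, MAX_BOMBS, 1))

    The recursion starts from the last suspected target (the base case i == len(p_mine)) and, through memoization, works backwards to the starting point, which mirrors the dynamic-programming character noted in the abstract.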

  6. NASA reload program

    NASA Technical Reports Server (NTRS)

    Byington, Marshall

    1993-01-01

    Atlantic Research Corporation (ARC) contracted with NASA to manufacture and deliver thirteen small-scale solid rocket motors (SRMs). These motors, containing five distinct propellant formulations, will be used for plume-induced radiation studies. The information contained herein summarizes and documents the program accomplishments and results. Several modifications were made to the scope of work during the course of the program. The effort was on hold from late 1991 through August 1992 while propellant formulation changes were developed. Modifications to the baseline program were completed in late August, and Modification No. 6 was received by ARC on September 14, 1992. The modifications include changes to the propellant formulation and the nozzle design. The required motor deliveries were completed in late December 1992. However, ARC agreed to perform an additional mix and cast effort at no cost to NASA, and another motor was delivered in March 1993.

  7. The Heliogyro Reloaded

    NASA Technical Reports Server (NTRS)

    Wilkie, William K.; Warren, Jerry E.; Thompson, M. W.; Lisman, P. D.; Walkemeyer, P. E.; Guerrant, D. V.; Lawrence, D. A.

    2011-01-01

    The heliogyro is a high-performance, spinning solar sail architecture that uses long (order of kilometers) reflective membrane strips to produce thrust from solar radiation pressure. The heliogyro's membrane blades spin about a central hub and are stiffened by centrifugal forces only, making the design exceedingly lightweight. Blades are also stowed and deployed from rolls, eliminating the deployment and packaging problems associated with handling the extremely large, delicate membrane sheets used in most traditional square-rigged or spinning-disk solar sail designs. The heliogyro solar sail concept was first advanced in the 1960s by MacNeal. A 15 km diameter version was later extensively studied in the 1970s by JPL for an ambitious Comet Halley rendezvous mission, but it was ultimately not selected due to the need for a risk-reduction flight demonstration. Demonstrating system-level feasibility of a large, spinning heliogyro solar sail on the ground is impossible; however, recent advances in microsatellite bus technologies, coupled with the successful flight demonstration of reflectance control technologies on the JAXA IKAROS solar sail, now make an affordable, small-scale heliogyro technology flight demonstration potentially feasible. In this paper, we will present an overview of the history of the heliogyro solar sail concept, with particular attention paid to the MIT 200-meter-diameter heliogyro study of 1989, followed by a description of our updated, low-cost heliogyro flight demonstration concept. Our preliminary heliogyro concept (HELIOS) should be capable of demonstrating an order-of-magnitude improvement in characteristic acceleration over existing solar sail demonstrators (HELIOS target: 0.5 to 1.0 mm/s^2 at 1.0 AU), placing heliogyro technology in the range required to enable a variety of science and human exploration relevant support missions.

  8. Hybrid expert system implementation to determine core reload patterns

    SciTech Connect

    Greek, K.J.; Robinson, A.H.

    1989-01-01

    Determining reactor reload fuel patterns is a computationally intensive problem-solving process for which automation can be of significant benefit. Often much effort is expended in the search for an optimal loading. While any modern programming language could be used to automate the solution, the specialized tools of artificial intelligence (AI) are the most efficient means of introducing the fuel management expert's knowledge into the search for an optimum reload pattern. Prior research in pressurized water reactor refueling strategies developed FORTRAN programs that automated an expert's basic knowledge to direct a search for an acceptable minimum-peak-power loading. Dissatisfaction with maintaining compiled knowledge in FORTRAN programs motivated the development of the SHUFFLE expert system. SHUFFLE is written in Smalltalk, an object-oriented programming language, and evaluates loadings as it generates them using a two-group, two-dimensional nodal power calculation implemented as compiled FORTRAN on a personal computer. This paper reviews the object-oriented representation developed to solve the core reload problem with an expert system tool and its operating prototype, SHUFFLE.
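
    The kind of search SHUFFLE automates can be caricatured as: generate candidate loadings, evaluate each with a power calculation, and keep the one with the lowest peak. The Python sketch below is only an illustration of that loop; the reactivity values, position weights, and the stub "power" evaluation are invented placeholders, whereas the real system uses a two-group nodal calculation and expert heuristics rather than brute-force enumeration.

        from itertools import permutations

        assembly_reactivity = [1.10, 1.05, 0.95, 0.90]   # fresh vs. burned fuel (hypothetical)
        position_weight     = [1.3, 1.1, 0.9, 0.7]       # crude importance of each core position (hypothetical)

        def peak_power(loading):
            """Stub power evaluation: relative power per position, return the peaking factor."""
            powers = [k * w for k, w in zip(loading, position_weight)]
            mean = sum(powers) / len(powers)
            return max(p / mean for p in powers)

        best = min(permutations(assembly_reactivity), key=peak_power)
        print("lowest-peaking loading:", best, "peak:", round(peak_power(best), 3))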

  9. Future integrated design process

    NASA Technical Reports Server (NTRS)

    Meyer, D. D.

    1980-01-01

    The design process is one of the sources used to produce requirements for a computer system to integrate and manage product design data, program management information, and technical computation and engineering data management activities of the aerospace design process. Design activities were grouped chronologically and explored for activity type, activity interface, data quantity, and data flow. The work was based on analysis of the design process of several typical aerospace products, including both conventional and supersonic airplanes and a hydrofoil design. Activities examined included research, preliminary design, detail design, manufacturing interface, product verification, and product support. The design process was then described in an IPAD environment--the future.

  10. Reloading Experiment for Aluminum at High Pressure

    NASA Astrophysics Data System (ADS)

    Rili, Hou; Jianxiang, Peng; Jianhua, Zhang; Mingwu, Tu; Ping, Zhou

    2009-06-01

    In the traditional AC method of measuring a material's dynamic strength, the combination flyer tends to delaminate because of shock waves produced in the projectile by the sudden application of the projectile driving pressure, which often causes the reloading experiment to fail. The maximum reshock experimental pressure for aluminum presented by Huang and Asay in 2005 is only 22 GPa. A technique is described for reloading experiments, with which reloading experiments were performed for 2A12 aluminum alloy shocked to 67.6 GPa. In our experiments, oxygen-free copper and TC4 titanium alloy impactors were used with ultrapure LiF interferometer windows, the 2A12 aluminum alloy samples were backed by PMMA buffers, and VISAR was used to measure the interface particle velocity. Using an approximate double-step-sample method (two shots with different sample thicknesses at the same impact velocity), the Lagrangian longitudinal velocities along the reloading path from the initial shock state were obtained; coupled with unloading experimental data, the bulk velocities were determined, as well as the dynamic yield strength of the 2A12 aluminum alloy.
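
    The arithmetic behind the double-step-sample idea is simple: with two shots at the same impact velocity but different sample thicknesses, the Lagrangian wave speed follows from the difference in arrival times of the corresponding wave feature at the window interface. The numbers below are invented for illustration and are not the paper's data.

        # Hedged illustration of the double-step-sample estimate (invented numbers).
        h1, h2 = 2.0e-3, 4.0e-3      # sample thicknesses, m
        t1, t2 = 0.30e-6, 0.58e-6    # reloading-wave arrival times at the LiF window, s

        c_lagrangian = (h2 - h1) / (t2 - t1)   # Lagrangian longitudinal velocity, m/s
        print(f"Lagrangian longitudinal velocity ~ {c_lagrangian / 1e3:.2f} km/s")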

  11. Bone Mineral Density of the Tarsals and Metatarsals With Reloading

    PubMed Central

    Hastings, Mary Kent; Gelber, Judy; Commean, Paul K; Prior, Fred; Sinacore, David R

    2008-01-01

    Background and Purpose: Bone mineral density (BMD) decreases rapidly with prolonged non–weight bearing (NWB). Maximizing the BMD response to reloading activities after NWB is critical to minimizing fracture risk. Methods for measuring individual tarsal and metatarsal BMD have not been available. This case report describes tarsal and metatarsal BMD with a reloading program, as revealed by quantitative computed tomography (QCT). Case Description: A 24-year-old woman was non–weight bearing for 6 weeks after right talocrural arthroscopy. Tarsal and metatarsal BMD were measured with QCT 9 weeks (before reloading) and 32 weeks (after reloading) after surgery. A 26-week progressive reloading program was completed. Change scores were calculated for BMD before reloading and BMD after reloading for the total foot (average of all tarsals and metatarsals), tarsals, metatarsals, bones of the medial column (calcaneus, navicular, cuneiforms 1 and 2, and metatarsal 1), and bones of the lateral column (calcaneus, cuboid, cuneiform 3, and metatarsals 2–5). The percent differences in BMD between the involved side and the uninvolved side were calculated. Outcomes: Before reloading, BMD of the involved total foot was 9% lower than that on the uninvolved side. After reloading, BMD increased 22% and 21% for the total foot, 16% and 14% for the tarsals, 29% and 30% for the metatarsals, 14% and 15% for the medial column bones, and 28% and 26% for the lateral column bones on the involved and uninvolved sides, respectively. After reloading, BMD of the involved total foot remained 8% lower than that on the uninvolved side. Discussion: The increase in BMD with reloading was not uniform across all pedal bones; the metatarsals showed a greater increase than the tarsals, and the lateral column bones showed a greater increase than the medial column bones. PMID:18388153
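
    The change scores and side-to-side percent differences reported above are straightforward to compute; the short sketch below shows the arithmetic with invented BMD values (not the case-report data).

        # Change-score and side-to-side comparison arithmetic (invented values).
        bmd_pre_involved,   bmd_post_involved   = 280.0, 342.0   # mg/cm^3, QCT
        bmd_pre_uninvolved, bmd_post_uninvolved = 308.0, 372.0

        change_involved   = 100 * (bmd_post_involved - bmd_pre_involved) / bmd_pre_involved
        side_to_side_post = 100 * (bmd_post_involved - bmd_post_uninvolved) / bmd_post_uninvolved
        print(f"involved-side change with reloading: {change_involved:+.0f}%")
        print(f"post-reloading involved vs. uninvolved: {side_to_side_post:+.0f}%")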

  12. Signal Processing Suite Design

    NASA Technical Reports Server (NTRS)

    Sahr, John D.; Mir, Hasan; Morabito, Andrew; Grossman, Matthew

    2003-01-01

    Our role in this project was to participate in the design of the signal processing suite to analyze plasma density measurements on board a small constellation (3 or 4) of satellites in Low Earth Orbit. As we are new to spacecraft experiments, one of the challenges was simply to gain an understanding of the quantity of data that would flow from the satellites, and possibly to interact with the design teams in generating optimal sampling patterns. For example, as the fleet of satellites was intended to fly through the same volume of space (displaced slightly in time and space), the bulk plasma structure should be common among the spacecraft. Therefore, an optimal, limited-bandwidth data downlink would take advantage of this commonality. Also, motivated by techniques in ionospheric radar, we hoped to investigate the possibility of employing aperiodic sampling in order to gain access to a wider spatial spectrum without suffering aliasing in k-space.
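
    The aperiodic-sampling idea can be illustrated with a toy spectral estimate: a signal sampled at irregular times can be analyzed (here with a Lomb-Scargle periodogram) at frequencies above the mean-rate Nyquist limit without the ambiguity that uniform undersampling would cause. The sample rate, tone frequency, and other parameters below are arbitrary and are not taken from the flight instrument.

        import numpy as np
        from scipy.signal import lombscargle

        rng = np.random.default_rng(0)
        t = np.sort(rng.uniform(0.0, 1.0, 200))      # irregular sample times, s (mean rate ~200 Hz)
        f_true = 180.0                               # Hz, above the 100 Hz uniform-sampling Nyquist limit
        y = np.sin(2 * np.pi * f_true * t)

        freqs = np.linspace(1.0, 300.0, 3000)        # trial frequencies, Hz
        pgram = lombscargle(t, y, 2 * np.pi * freqs) # lombscargle expects angular frequencies
        print("recovered frequency ~", freqs[np.argmax(pgram)], "Hz")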

  13. A Process for Design Engineering

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2004-01-01

    The American Institute of Aeronautics and Astronautics Design Engineering Technical Committee has developed a draft Design Engineering Process with the participation of the technical community. This paper reviews similar engineering activities, lays out common terms for the life cycle and proposes a Design Engineering Process.

  14. Ethylene process design optimization

    SciTech Connect

    2001-09-01

    Integration of Advanced Technologies will Update Ethylene Plants. Nearly 93 million tons of ethylene are produced annually in chemical plants worldwide, using an energy intensive process that consumes 2.5 quadrillion Btu per year.

  15. DESIGNING PROCESSES FOR ENVIRONMENTAL PROBLEMS

    EPA Science Inventory

    Designing for the environment requires consideration of environmental impacts. The Generalized WAR Algorithm is the methodology that allows the user to evaluate the potential environmental impact of the design of a chemical process. In this methodology, chemicals are assigned val...

  16. Book Processing Facility Design.

    ERIC Educational Resources Information Center

    Sheahan (Drake)-Stewart Dougall, Marketing and Physical Distribution Consultants, New York, NY.

    The Association of New York Libraries for Technical Services (ANYLTS) is established to develop and run a centralized book processing facility for the public library systems in New York State. ANYLTS plans to receive book orders from the 22 library systems, transmit orders to publishers, receive the volumes from the publishers, print and attach…

  17. Teaching Process Design through Integrated Process Synthesis

    ERIC Educational Resources Information Center

    Metzger, Matthew J.; Glasser, Benjamin J.; Patel, Bilal; Hildebrandt, Diane; Glasser, David

    2012-01-01

    The design course is an integral part of chemical engineering education. A novel approach to the design course was recently introduced at the University of the Witwatersrand, Johannesburg, South Africa. The course aimed to introduce students to systematic tools and techniques for setting and evaluating performance targets for processes, as well as…

  18. Reengineering the Project Design Process

    NASA Technical Reports Server (NTRS)

    Casani, E.; Metzger, R.

    1994-01-01

    In response to NASA's goal of working faster, better and cheaper, JPL has developed extensive plans to minimize cost, maximize customer and employee satisfaction, and implement small- and moderate-size missions. These plans include improved management structures and processes, enhanced technical design processes, the incorporation of new technology, and the development of more economical space- and ground-system designs. The Laboratory's new Flight Projects Implementation Office has been chartered to oversee these innovations and the reengineering of JPL's project design process, including establishment of the Project Design Center and the Flight System Testbed. Reengineering at JPL implies a cultural change whereby the character of its design process will change from sequential to concurrent and from hierarchical to parallel. The Project Design Center will support missions offering high science return, design to cost, demonstrations of new technology, and rapid development. Its computer-supported environment will foster high-fidelity project life-cycle development and cost estimating.

  19. Optimization process in helicopter design

    NASA Technical Reports Server (NTRS)

    Logan, A. H.; Banerjee, D.

    1984-01-01

    In optimizing a helicopter configuration, Hughes Helicopters uses a program called Computer Aided Sizing of Helicopters (CASH), written and updated over the past ten years, and used as an important part of the preliminary design process of the AH-64. First, measures of effectiveness must be supplied to define the mission characteristics of the helicopter to be designed. Then CASH allows the designer to rapidly and automatically develop the basic size of the helicopter (or other rotorcraft) for the given mission. This enables the designer and management to assess the various tradeoffs and to quickly determine the optimum configuration.

  20. The Process Design Courses at Pennsylvania: Impact of Process Simulators.

    ERIC Educational Resources Information Center

    Seider, Warren D.

    1984-01-01

    Describes the use and impact of process design simulators in process design courses. Discusses topics covered, texts used, computer design simulations, and how they are integrated into the process survey course as well as in plant design projects. (JM)

  1. Hydroforming design and process advisor

    SciTech Connect

    Greer, J.T.; Ni, C.M.

    1996-10-10

    The hydroforming process involves hydraulically forming components by conforming them to the inner contours of a die. These contours can be complex and can often cause the material being formed to be stressed to rupture. Considerable process knowledge and materials modeling expertise are required to design hydroform dies and hydroformed parts that can be formed readily without being overly stressed. For this CRADA, materials properties for steel tubes subjected to hydraulic stresses were collected; algorithms were developed that combined the materials properties data with process knowledge; and a user-friendly graphical interface was used to make the system usable by a design engineer. A prototype hydroforming advisor was completed and delivered to GM. The technical objectives of the CRADA were met, allowing for the development of an intelligent design system, prediction of forming properties related to hydroforming, simulation and modeling of process execution, and design optimization. The design advisor provides a rapid and seamless approach to integrating an otherwise enormous and onerous task of analysis and evaluation.

  2. DESIGNING ENVIRONMENTALLY FRIENDLY CHEMICAL PROCESSES

    EPA Science Inventory

    The design of a chemical process involves many aspects: from profitability, flexibility and reliability to safety to the environment. While each of these is important, in this work, the focus will be on profitability and the environment. Key to the study of these aspects is the ...

  3. ESS Cryogenic System Process Design

    NASA Astrophysics Data System (ADS)

    Arnold, P.; Hees, W.; Jurns, J.; Su, X. T.; Wang, X. L.; Weisend, J. G., II

    2015-12-01

    The European Spallation Source (ESS) is a neutron-scattering facility funded and supported in collaboration with 17 European countries in Lund, Sweden. Cryogenic cooling at ESS is vital particularly for the linear accelerator, the hydrogen target moderators, a test stand for cryomodules, the neutron instruments and their sample environments. The paper will focus on specific process design criteria, design decisions and their motivations for the helium cryoplants and auxiliary equipment. Key issues for all plants and their process concepts are energy efficiency, reliability, smooth turn-down behaviour and flexibility. The accelerator cryoplant (ACCP) and the target moderator cryoplant (TMCP) in particular need to be prepared for a range of refrigeration capacities due to the intrinsic uncertainties regarding heat load definitions. Furthermore the paper addresses questions regarding process arrangement, 2 K cooling methodology, LN2 precooling, helium storage, helium purification and heat recovery.

  4. The maintenance of sex in bacteria is ensured by its potential to reload genes.

    PubMed

    Szöllosi, Gergely J; Derényi, Imre; Vellai, Tibor

    2006-12-01

    Why sex is maintained in nature is a fundamental question in biology. Natural genetic transformation (NGT) is a sexual process by which bacteria actively take up exogenous DNA and use it to replace homologous chromosomal sequences. As it has been demonstrated, the role of NGT in repairing deleterious mutations under constant selection is insufficient for its survival, and the lack of other viable explanations have left no alternative except that DNA uptake provides nucleotides for food. Here we develop a novel simulation approach for the long-term dynamics of genome organization (involving the loss and acquisition of genes) in a bacterial species consisting of a large number of spatially distinct populations subject to independently fluctuating ecological conditions. Our results show that in the presence of weak interpopulation migration NGT is able to subsist as a mechanism to reload locally lost, intermittently selected genes from the collective gene pool of the species through DNA uptake from migrants. Reloading genes and combining them with those in locally adapted genomes allow individual cells to readapt faster to environmental changes. The machinery of transformation survives under a wide range of model parameters readily encompassing real-world biological conditions. These findings imply that the primary role of NGT is not to serve the cell with food, but to provide homologous sequences for restoring genes that have disappeared from or become degraded in the local population. PMID:17028325
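
    A toy simulation can convey the gene-reloading mechanism described above, though it is far simpler than the authors' model: many local populations intermittently lose an unused gene, and transformation from migrant DNA "reloads" it from the species-wide gene pool. All rates and variable names below are invented for illustration only.

        import random

        random.seed(1)
        n_pops, generations = 50, 2000
        migration, loss_rate, reload_rate = 0.2, 0.05, 0.5
        has_gene = [True] * n_pops

        for _ in range(generations):
            carrier_fraction = sum(has_gene) / n_pops       # migrants sampled from the whole species
            for i in range(n_pops):
                gene_needed = random.random() < 0.5         # independently fluctuating local conditions
                if not gene_needed and random.random() < loss_rate:
                    has_gene[i] = False                     # unused gene degrades or is lost locally
                if not has_gene[i] and random.random() < migration * carrier_fraction * reload_rate:
                    has_gene[i] = True                      # NGT reloads the gene from migrant DNA

        print(f"populations still carrying the gene: {sum(has_gene)}/{n_pops}")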

  5. Automation of Design Engineering Processes

    NASA Technical Reports Server (NTRS)

    Torrey, Glenn; Sawasky, Gerald; Courey, Karim

    2004-01-01

    A method, and a computer program that helps to implement the method, have been developed to automate and systematize the retention and retrieval of all the written records generated during the process of designing a complex engineering system. It cannot be emphasized strongly enough that all the written records as used here is meant to be taken literally: it signifies not only final drawings and final engineering calculations but also such ancillary documents as minutes of meetings, memoranda, requests for design changes, approval and review documents, and reports of tests. One important purpose served by the method is to make the records readily available to all involved users via their computer workstations from one computer archive while eliminating the need for voluminous paper files stored in different places. Another important purpose served by the method is to facilitate the work of engineers who are charged with sustaining the system and were not involved in the original design decisions. The method helps the sustaining engineers to retrieve information that enables them to retrace the reasoning that led to the original design decisions, thereby helping them to understand the system better and to make informed engineering choices pertaining to maintenance and/or modifications of the system. The software used to implement the method is written in Microsoft Access. All of the documents pertaining to the design of a given system are stored in one relational database in such a manner that they can be related to each other via a single tracking number.
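
    The core data-structure idea, every design record tied to a single tracking number in one relational archive, can be sketched briefly. The actual system is written in Microsoft Access; the SQLite sketch below is only an illustration, and all table and column names are invented.

        import sqlite3

        db = sqlite3.connect(":memory:")
        db.executescript("""
            CREATE TABLE design_item (
                tracking_no INTEGER PRIMARY KEY,
                title       TEXT,
                subsystem   TEXT
            );
            CREATE TABLE document (
                doc_id      INTEGER PRIMARY KEY,
                tracking_no INTEGER REFERENCES design_item(tracking_no),
                doc_type    TEXT,   -- drawing, meeting minutes, change request, test report ...
                filename    TEXT
            );
        """)
        db.execute("INSERT INTO design_item VALUES (1001, 'Reaction wheel bracket', 'ACS')")
        db.executemany("INSERT INTO document(tracking_no, doc_type, filename) VALUES (?, ?, ?)",
                       [(1001, 'meeting minutes', 'mins_2004_03_02.txt'),
                        (1001, 'change request',  'cr_117.pdf')])

        # A sustaining engineer can later pull every record behind a design decision:
        for row in db.execute("""SELECT d.doc_type, d.filename FROM document d
                                 JOIN design_item i ON i.tracking_no = d.tracking_no
                                 WHERE i.tracking_no = 1001"""):
            print(row)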

  6. Regulation of proteolysis during reloading of the unweighted soleus muscle.

    PubMed

    Taillandier, Daniel; Aurousseau, Eveline; Combaret, Lydie; Guezennec, Charles-Yannick; Attaix, Didier

    2003-05-01

    There is little information on the mechanisms responsible for muscle recovery following a catabolic condition. To address this point, we reloaded unweighted animals and investigated protein turnover during recovery from this highly catabolic state and the role of proteolysis in the reorganization of the soleus muscle. During early recovery (18 h of reloading) both muscle protein synthesis and breakdown were elevated (+65%, P<0.001 and +22%, P<0.05, respectively). However, only the activation of non-lysosomal and Ca(2+)-independent proteolysis was responsible for increased protein breakdown. Accordingly, mRNA levels for ubiquitin and 20S proteasome subunits C8 and C9 were markedly elevated (from +89 to +325%, P<0.03) and actively transcribed as shown by the analysis of polyribosomal profiles. In contrast, both cathepsin D and 14-kDa-ubiquitin conjugating enzyme E2 mRNA levels decreased, suggesting that the expression of such genes is an early marker of reversed muscle wasting. Following 7 days of reloading, protein synthesis was still elevated and there was no detectable change in protein breakdown rates. Accordingly, mRNA levels for all the proteolytic components tested were back to control values even though an accumulation of high molecular weight ubiquitin conjugates was still detectable. This suggests that soleus muscle remodeling was still going on. Taken together, our observations suggest that enhanced protein synthesis and breakdown are both necessary to recover from muscle atrophy and result in catch-up growth. The observed non-coordinate regulation of proteolytic systems is presumably required to target specific classes of substrates (atrophy-specific protein isoforms, damaged proteins) for replacement and/or elimination. PMID:12672458

  7. Myocardial Reloading after Extracorporeal Membrane Oxygenation Alters Substrate Metabolism While Promoting Protein Synthesis

    SciTech Connect

    Kajimoto, Masaki; Priddy, Colleen M.; Ledee, Dolena; Xu, Chun; Isern, Nancy G.; Olson, Aaron; Des Rosiers, Christine; Portman, Michael A.

    2013-08-19

    Extracorporeal membrane oxygenation (ECMO) unloads the heart, providing a bridge to recovery in children after myocardial stunning. Mortality after ECMO remains high. Cardiac substrate and amino acid requirements upon weaning are unknown and may impact recovery. We assessed the hypothesis that ventricular reloading modulates both substrate entry into the citric acid cycle (CAC) and myocardial protein synthesis. Fourteen immature piglets (7.8-15.6 kg) were separated into 2 groups based on ventricular loading status: 8-hour ECMO (UNLOAD) and post-wean from ECMO (RELOAD). We infused [2-13C]-pyruvate as an oxidative substrate and [13C6]-L-leucine as a tracer of amino acid oxidation and protein synthesis into the coronary artery. RELOAD showed marked elevations in myocardial oxygen consumption above baseline and UNLOAD. Pyruvate uptake was markedly increased, though RELOAD decreased the pyruvate contribution to oxidative CAC metabolism. RELOAD also increased absolute concentrations of all CAC intermediates, while maintaining or increasing 13C-molar percent enrichment. RELOAD also significantly increased cardiac fractional protein synthesis rates by >70% over UNLOAD. Conclusions: RELOAD produced a high energy metabolic requirement and rebound protein synthesis. Relative pyruvate decarboxylation decreased with RELOAD while promoting anaplerotic pyruvate carboxylation and amino acid incorporation into protein rather than into the CAC for oxidation. These perturbations may serve as therapeutic targets to improve contractile function after ECMO.

  8. Reloading Continuous GPS in Northwest Mexico

    NASA Astrophysics Data System (ADS)

    Gonzalez-Garcia, J. J.; Suarez-Vidal, F.; Gonzalez-Ortega, J. A.

    2007-05-01

    For more than 10 years we have tried to follow in the steps of the Southern California Integrated GPS Network (SCIGN) and the Plate Boundary Observatory (PBO) in the USA; this has put us in a position to contribute to developing a modern GPS network in Mexico. During 1998 and 2001, three stations were deployed in Northwest Mexico in concert with the development of SCIGN: SPMX in north-central Baja California state at the National Astronomical Observatory (UNAM) in the Sierra San Pedro Martir; CORX on Isla Coronados Sur, offshore San Diego, Ca./Tijuana, Mexico; and GUAX on Guadalupe Island, 150 miles offshore of the Baja California peninsula, which provides a unique site on the Pacific plate in the North America/Pacific boundary zone in Las Californias. The former IGS station at CICESE, Ensenada (CICE), installed in 1995, was replaced by CIC1 in 1999. In 2004 and 2005, with partial support from SCIGN and UNAVCO to the University of Arizona, a volunteer team from UNAVCO, Caltech, the U.S. Geological Survey, Universidad de la Sierra at Moctezuma, Sonora, and CICESE built two new shallow-braced GPS sites in northwest Mexico. The first site, USMX, is located in east-central Sonora, and the second, YESX, is located high in the Sierra Madre Occidental at Yecora, near the southern border of Sonora and Chihuahua. All data are openly available at SOPAC and/or UNAVCO. The existing information has been valuable for resolving the "total" plate motion between the Pacific plate (GUAX) and the North America plate (USMX and YESX) in the north-central Gulf of California. Over the last year we have developed the capability for GPS data processing using GAMIT/GLOBK, and after gaining some practice with survey-mode data processing we can become a GPS processing center in Mexico. Currently only 2 sites are operational: CIC1 and USMX. With new energy we are ready to contribute to the establishment of a modern GPS network in Mexico for science, hazard monitoring, and infrastructure.

  9. Digital Earth reloaded - Beyond the next generation

    NASA Astrophysics Data System (ADS)

    Ehlers, M.; Woodgate, P.; Annoni, A.; Schade, S.

    2014-02-01

    Digital replicas (or 'mirror worlds') of complex entities and systems are now routine in many fields such as aerospace engineering; archaeology; medicine; or even fashion design. The Digital Earth (DE) concept as a digital replica of the entire planet occurs in Al Gore's 1992 book Earth in the Balance and was popularized in his speech at the California Science Center in January 1998. It played a pivotal role in stimulating the development of a first generation of virtual globes, typified by Google Earth that achieved many elements of this vision. Almost 15 years after Al Gore's speech, the concept of DE needs to be re-evaluated in the light of the many scientific and technical developments in the fields of information technology, data infrastructures, citizens' participation, and earth observation that have taken place since. This paper intends to look beyond the next generation predominantly based on the developments of fields outside the spatial sciences, where concepts, software, and hardware with strong relationships to DE are being developed without referring to this term. It also presents a number of guiding criteria for future DE developments.

  10. The Snark was a Boojum - reloaded

    PubMed Central

    2015-01-01

    In this article, we refer to an original opinion paper written by Prof. Frank Beach in 1950 (“The Snark was a Boojum”). In his manuscript, Beach explicitly criticised the field of comparative psychology because of the disparity between the original understanding of comparativeness and its practical, overly specialised implementation. Specialisation encompassed both experimental species (rats accounted for 70% of all subjects) and test paradigms (dominated by conditioning/learning experiments). Herein, we attempt to evaluate the extent to which these considerations apply to current behavioural neuroscience. Such evaluation is particularly interesting in the context of “translational research” that has recently gained growing attention. As a community, we believe that preclinical findings are intended to inform clinical practice at the level of therapies and knowledge advancements. Yet, limited reproducibility of experimental results and failures to translate preclinical research into clinical trials indicate that these expectations are not entirely fulfilled. Theoretical considerations suggest that, before concluding that a given phenomenon is of relevance to our species, it should be observed in more than a single experimental model (be it an animal strain or species) and tested in more than a single standardized test battery. Yet, current approaches appear limited in terms of variability and overspecialised in terms of operative procedures. Specifically, as in 1950, rodents (mice instead of rats) still constitute the vast majority of animal species investigated. Additionally, the scientific community strives to homogenise experimental test strategies, thereby not only limiting the generalizability of the findings, but also working against the design of innovative approaches. Finally, we discuss the importance of evolutionary-adaptive considerations within the field of laboratory research. Specifically, resting upon empirical evidence indicating that

  11. The Snark was a Boojum - reloaded.

    PubMed

    Macrì, Simone; Richter, S Helene

    2015-01-01

    In this article, we refer to an original opinion paper written by Prof. Frank Beach in 1950 ("The Snark was a Boojum"). In his manuscript, Beach explicitly criticised the field of comparative psychology because of the disparity between the original understanding of comparativeness and its practical, overly specialised implementation. Specialisation encompassed both experimental species (rats accounted for 70% of all subjects) and test paradigms (dominated by conditioning/learning experiments). Herein, we attempt to evaluate the extent to which these considerations apply to current behavioural neuroscience. Such evaluation is particularly interesting in the context of "translational research" that has recently gained growing attention. As a community, we believe that preclinical findings are intended to inform clinical practice at the level of therapies and knowledge advancements. Yet, limited reproducibility of experimental results and failures to translate preclinical research into clinical trials indicate that these expectations are not entirely fulfilled. Theoretical considerations suggest that, before concluding that a given phenomenon is of relevance to our species, it should be observed in more than a single experimental model (be it an animal strain or species) and tested in more than a single standardized test battery. Yet, current approaches appear limited in terms of variability and overspecialised in terms of operative procedures. Specifically, as in 1950, rodents (mice instead of rats) still constitute the vast majority of animal species investigated. Additionally, the scientific community strives to homogenise experimental test strategies, thereby not only limiting the generalizability of the findings, but also working against the design of innovative approaches. Finally, we discuss the importance of evolutionary-adaptive considerations within the field of laboratory research. Specifically, resting upon empirical evidence indicating that developing

  12. GAX absorption cycle design process

    SciTech Connect

    Priedeman, D.K.; Christensen, R.N.

    1999-07-01

    This paper presents an absorption system design process that relies on computer simulations that are validated by experimental findings. An ammonia-water absorption heat pump cycle at 3 refrigeration tons (RT) and chillers at 3.3 RT and 5 RT (10.5 kW, 11.6 kW, and 17.6 kW) were initially modeled and then built and tested. The experimental results were used to calibrate both the cycle simulation and the component simulations, yielding computer design routines that could accurately predict component and cycle performance. Each system was a generator-absorber heat exchange (GAX) cycle, and all were sized for residential and light commercial use, where very little absorption equipment is currently used. The specific findings of the 5 RT (17.6 kW) chiller are presented. Modeling incorporated a heat loss from the gas-fired generator and pressure drops in both the evaporator and absorber. Simulation results and experimental findings agreed closely and validated the modeling method and simulation software.

  13. Design Thinking in Elementary Students' Collaborative Lamp Designing Process

    ERIC Educational Resources Information Center

    Kangas, Kaiju; Seitamaa-Hakkarainen, Pirita; Hakkarainen, Kai

    2013-01-01

    Design and Technology education is potentially a rich environment for successful learning, if the management of the whole design process is emphasised, and students' design thinking is promoted. The aim of the present study was to unfold the collaborative design process of one team of elementary students, in order to understand their multimodal…

  14. ESS Accelerator Cryoplant Process Design

    NASA Astrophysics Data System (ADS)

    Wang, X. L.; Arnold, P.; Hees, W.; Hildenbeutel, J.; Weisend, J. G., II

    2015-12-01

    The European Spallation Source (ESS) is a neutron-scattering facility being built with extensive international collaboration in Lund, Sweden. The ESS accelerator will deliver protons with 5 MW of power to the target at 2.0 GeV, with a nominal current of 62.5 mA. The superconducting part of the accelerator is about 300 meters long and contains 43 cryomodules. The ESS accelerator cryoplant (ACCP) will provide the cooling for the cryomodules and the cryogenic distribution system that delivers the helium to the cryomodules. The ACCP will cover three cryogenic circuits: Bath cooling for the cavities at 2 K, the thermal shields at around 40 K and the power couplers thermalisation with 4.5 K forced helium cooling. The open competitive bid for the ACCP took place in 2014 with Linde Kryotechnik AG being selected as the vendor. This paper summarizes the progress in the ACCP development and engineering. Current status including final cooling requirements, preliminary process design, system configuration, machine concept and layout, main parameters and features, solution for the acceptance tests, exergy analysis and efficiency is presented.

  15. Optimal design of solidification processes

    NASA Technical Reports Server (NTRS)

    Dantzig, Jonathan A.; Tortorelli, Daniel A.

    1991-01-01

    An optimal design algorithm is presented for the analysis of general solidification processes, and is demonstrated for the growth of GaAs crystals in a Bridgman furnace. The system is optimal in the sense that the prespecified temperature distribution in the solidifying materials is obtained to maximize product quality. The optimization uses traditional numerical programming techniques which require the evaluation of cost and constraint functions and their sensitivities. The finite element method is incorporated to analyze the crystal solidification problem, evaluate the cost and constraint functions, and compute the sensitivities. These techniques are demonstrated in the crystal growth application by determining an optimal furnace wall temperature distribution to obtain the desired temperature profile in the crystal, and hence to maximize the crystal's quality. Several numerical optimization algorithms are studied to determine the proper convergence criteria, effective 1-D search strategies, appropriate forms of the cost and constraint functions, etc. In particular, we incorporate the conjugate gradient and quasi-Newton methods for unconstrained problems. The efficiency and effectiveness of each algorithm is presented in the example problem.
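
    As a hedged sketch of the optimization loop described above: furnace wall temperatures are the design variables, a forward thermal model predicts the crystal temperature profile, and a quasi-Newton method drives that profile toward a prescribed target. The real work uses a finite element model with computed sensitivities; the stub model, zone count, and temperatures below are invented for illustration.

        import numpy as np
        from scipy.optimize import minimize

        z = np.linspace(0.0, 1.0, 20)              # axial position along the crystal (normalized)
        target = 1500.0 - 300.0 * z                # desired crystal temperature profile, K

        def crystal_profile(wall_temps):
            """Stub forward model: smear the wall temperatures onto the crystal axis."""
            zw = np.linspace(0.0, 1.0, wall_temps.size)
            return np.interp(z, zw, wall_temps) * 0.95   # crude 'thermal loss' factor

        def cost(wall_temps):
            return np.sum((crystal_profile(wall_temps) - target) ** 2)

        x0 = np.full(8, 1400.0)                    # initial guess for 8 wall heater zones, K
        result = minimize(cost, x0, method="BFGS") # quasi-Newton, as in the abstract
        print("optimized wall temperatures:", np.round(result.x, 1))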

  16. Development of Innovative Design Processor

    SciTech Connect

    Park, Y.S.; Park, C.O.

    2004-07-01

    The nuclear design analysis requires time-consuming and error-prone model-input preparation, code runs, output analysis, and quality assurance processes. To reduce human effort and improve design quality and productivity, the Innovative Design Processor (IDP) is being developed. The two basic principles of IDP are document-oriented design and web-based design. Document-oriented design means that, if the designer writes a design document called an active document and feeds it to a special program, the final document, with complete analysis, tables, and plots, is produced automatically. Active documents can be written with ordinary HTML editors or created automatically on the web, which is the other framework of IDP. Using the proper mix of server-side and client-side programming under the LAMP (Linux/Apache/MySQL/PHP) environment, the design process on the web is modeled in a design-wizard style so that even a novice designer can produce the design document easily. This automation using the IDP is now being implemented for all reload designs of Korea Standard Nuclear Power Plant (KSNP)-type PWRs. The introduction of this process will allow a large reduction in all KSNP reload design efforts and provide a platform for the design and R and D tasks of KNFC. (authors)
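
    The "active document" idea, a design document with embedded expressions that a processor evaluates into a finished report, can be illustrated with a few lines of Python. This is only an illustration of the concept; the actual IDP is built on LAMP/PHP, and the placeholder syntax, field names, and values below are invented.

        import re

        active_document = """
        Reload Design Summary
        ---------------------
        Number of fresh assemblies : {{ n_fresh }}
        Average enrichment         : {{ round(total_u235 / total_u, 3) }} w/o
        """

        design_inputs = {"n_fresh": 64, "total_u235": 19.2, "total_u": 460.0, "round": round}

        def render(doc, env):
            """Replace each {{ expression }} with its evaluated value."""
            return re.sub(r"\{\{(.+?)\}\}",
                          lambda m: str(eval(m.group(1), {"__builtins__": {}}, env)),
                          doc)

        print(render(active_document, design_inputs))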

  17. Improvement of characteristic statistic algorithm and its application on equilibrium cycle reloading optimization

    SciTech Connect

    Hu, Y.; Liu, Z.; Shi, X.; Wang, B.

    2006-07-01

    A brief introduction to the characteristic statistic algorithm (CSA) is given in the paper; CSA is a new global optimization algorithm for solving the problem of PWR in-core fuel management optimization. CSA is modified by the adoption of a back-propagation neural network and fast local adjustment. The modified CSA is then applied to PWR equilibrium cycle reloading optimization, and the corresponding optimization code, CSA-DYW, is developed. CSA-DYW is used to optimize the equilibrium cycle of the 18-month reloading of the Daya Bay nuclear plant Unit 1 reactor. The results show that CSA-DYW has high efficiency and good global performance on PWR equilibrium cycle reloading optimization. (authors)

  18. Managing Analysis Models in the Design Process

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2006-01-01

    Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.

  19. 76 FR 70368 - Disaster Designation Process

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-14

    ... USDA Secretarial disaster designation process. FSA proposes to simplify the processes and delegate them... rule would update the language to reflect current practice. The current regulations require that a... proposes to simplify the USDA Secretarial designation process from a six-step process to a two-step...

  20. Photonic IC design software and process design kits

    NASA Astrophysics Data System (ADS)

    Korthorst, Twan; Stoffer, Remco; Bakker, Arjen

    2015-04-01

    This review discusses photonic IC design software tools, examines existing design flows for photonics design and how these fit different design styles and describes the activities in collaboration and standardization within the silicon photonics group from Si2 and by members of the PDAFlow Foundation to improve design flows. Moreover, it will address the lowering of access barriers to the technology by providing qualified process design kits (PDKs) and improved integration of photonic integrated circuit simulations, physical simulations, mask layout, and verification.

  1. Graphic Design in Libraries: A Conceptual Process

    ERIC Educational Resources Information Center

    Ruiz, Miguel

    2014-01-01

    Providing successful library services requires efficient and effective communication with users; therefore, it is important that content creators who develop visual materials understand key components of design and, specifically, develop a holistic graphic design process. Graphic design, as a form of visual communication, is the process of…

  2. Instructional Design Processes and Traditional Colleges

    ERIC Educational Resources Information Center

    Vasser, Nichole

    2010-01-01

    Traditional colleges who have implemented distance education programs would benefit from using instructional design processes to develop their courses. Instructional design processes provide the framework for designing and delivering quality online learning programs in a highly-competitive educational market. Traditional college leaders play a…

  3. 77 FR 41248 - Disaster Designation Process

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-13

    ... designation regulations to provide for changes in the designation process (76 FR 70368-70374). In general, that rule proposed to simplify the disaster designation process and to delegate the authority for... 759.6 has also been changed from the proposed rule to remove proposed language referring to a...

  4. Reengineering the JPL Spacecraft Design Process

    NASA Technical Reports Server (NTRS)

    Briggs, C.

    1995-01-01

    This presentation describes the factors that have emerged in the evolved process of reengineering the unmanned spacecraft design process at the Jet Propulsion Laboratory in Pasadena, California. Topics discussed include: New facilities, new design factors, new system-level tools, complex performance objectives, changing behaviors, design integration, leadership styles, and optimization.

  5. NASA System Engineering Design Process

    NASA Technical Reports Server (NTRS)

    Roman, Jose

    2011-01-01

    This slide presentation reviews NASA's use of systems engineering for the complete life cycle of a project. Systems engineering is a methodical, disciplined approach for the design, realization, technical management, operations, and retirement of a system. Each phase of a NASA project is terminated with a Key decision point (KDP), which is supported by major reviews.

  6. Information-Processing Models and Curriculum Design

    ERIC Educational Resources Information Center

    Calfee, Robert C.

    1970-01-01

    "This paper consists of three sections--(a) the relation of theoretical analyses of learning to curriculum design, (b) the role of information-processing models in analyses of learning processes, and (c) selected examples of the application of information-processing models to curriculum design problems." (Author)

  7. The Architectural and Interior Design Planning Process.

    ERIC Educational Resources Information Center

    Cohen, Elaine

    1994-01-01

    Explains the planning process in designing effective library facilities and discusses library building requirements that result from electronic information technologies. Highlights include historical structures; Americans with Disabilities Act; resource allocation; electrical power; interior spaces; lighting; design development; the roles of…

  8. NANEX: Process design and optimization.

    PubMed

    Baumgartner, Ramona; Matić, Josip; Schrank, Simone; Laske, Stephan; Khinast, Johannes; Roblegg, Eva

    2016-06-15

    Previously, we introduced a one-step nano-extrusion (NANEX) process for transferring aqueous nano-suspensions into solid formulations directly in the liquid phase. Nano-suspensions were fed into molten polymers via a side-feeding device, and excess water was eliminated via devolatilization. However, the drug content in nano-suspensions is restricted to 30% (w/w), and obtaining sufficiently high drug loadings in the final formulation requires the processing of large amounts of water and thus a fundamental process understanding. To this end, we investigated four polymers with different physicochemical characteristics (Kollidon(®) VA64, Eudragit(®) E PO, HPMCAS and PEG 20000) in terms of their maximum water uptake/removal capacity. Process parameters such as throughput and screw speed were adapted, and their effect on the mean residence time and filling degree was studied. Additionally, one-dimensional discretization modeling was performed to examine the complex interactions between the screw geometry and the process parameters during water addition/removal. It was established that polymers with a certain water miscibility/solubility can be manufactured via NANEX. Long residence times of the molten polymer in the extruder and low filling degrees in the degassing zone favored the addition/removal of significant amounts of water. The residual moisture content in the final extrudates was comparable to that of extrudates manufactured without water. PMID:27090153

  9. Process Design Manual for Nitrogen Control.

    ERIC Educational Resources Information Center

    Parker, Denny S.; And Others

    This manual presents theoretical and process design criteria for the implementation of nitrogen control technology in municipal wastewater treatment facilities. Design concepts are emphasized through examination of data from full-scale and pilot installations. Design data are included on biological nitrification and denitrification, breakpoint…

  10. Reinventing The Design Process: Teams and Models

    NASA Technical Reports Server (NTRS)

    Wall, Stephen D.

    1999-01-01

    The future of space mission designing will be dramatically different from the past. Formerly, performance-driven paradigms emphasized data return, with cost and schedule being secondary issues. Now and in the future, costs are capped and schedules fixed; these two variables must be treated as independent in the design process. Accordingly, JPL has redesigned its design process. At the conceptual level, design times have been reduced by properly defining the required design depth, improving the linkages between tools, and managing team dynamics. In implementation-phase design, system requirements will be held in crosscutting models, linked to subsystem design tools through a central database that captures the design and supplies needed configuration management and control. Mission goals will then be captured in timelining software that drives the models, testing their capability to execute the goals. Metrics are used to measure and control both processes and to ensure that design parameters converge through the design process within schedule constraints. This methodology manages margins controlled by acceptable risk levels. Thus, teams can evolve risk tolerance (and cost) as they would any engineering parameter. This new approach allows more design freedom for a longer time, which tends to encourage revolutionary and unexpected improvements in design.

  11. Hydrocarbon Processing's process design and optimization '96

    SciTech Connect

    1996-06-01

    This paper compiles information on hydrocarbon processes, describing the application, objective, economics, commercial installations, and licensor. Processes include: alkylation, ammonia, catalytic reformer, crude fractionator, crude unit, vacuum unit, dehydration, delayed coker, distillation, ethylene furnace, FCCU, polymerization, gas sweetening, hydrocracking, hydrogen, hydrotreating (naphtha, distillate, and resid desulfurization), natural gas processing, olefins, polyethylene terephthalate, refinery, styrene, sulfur recovery, and VCM furnace.

  12. Reloading partly recovers bone mineral density and mechanical properties in hind limb unloaded rats

    NASA Astrophysics Data System (ADS)

    Zhao, Fan; Li, Dijie; Arfat, Yasir; Chen, Zhihao; Liu, Zonglin; Lin, Yu; Ding, Chong; Sun, Yulong; Hu, Lifang; Shang, Peng; Qian, Airong

    2014-12-01

    Skeletal unloading results in decreased bone formation and bone mass. The bone mass lost during long-term space flight cannot be fully recovered. Therefore, it is necessary to develop effective countermeasures to prevent spaceflight-induced bone loss. Hindlimb unloading (HLU) simulates effects of weightlessness and is utilized extensively to examine the response of musculoskeletal systems to certain aspects of space flight. The purpose of this study is to investigate the effects of a 4-week HLU in rats and subsequent reloading on the bone mineral density (BMD) and mechanical properties of load-bearing bones. After HLU for 4 weeks, the rats were subjected to reloading for 1 week, 2 weeks and 3 weeks, and the BMD of the femur, tibia and lumbar spine was assessed by dual-energy X-ray absorptiometry (DXA) every week. The mechanical properties of the femur were determined by a three-point bending test. Dry bone and bone ash of the femur were obtained through the oven-drying method and weighed. Serum alkaline phosphatase (ALP) and serum calcium were examined through ELISA and atomic absorption spectrometry. The results showed that 4 weeks of HLU significantly decreased the body weight of rats, and reloading for 1 week, 2 weeks or 3 weeks did not recover the weight loss induced by HLU. However, after 2 weeks of reloading, the BMD of the femur and tibia of HLU rats partly recovered (+10.4%, +2.3%). After 3 weeks of reloading, the reduction of BMD, energy absorption, bone mass and mechanical properties of bone induced by HLU recovered to some extent. The changes in serum ALP and serum calcium induced by HLU were also recovered after reloading. Our results indicate that a short period of reloading cannot completely recover bone after a period of unloading; thus some interventions, such as mechanical vibration or pharmaceuticals, are necessary to help bone recovery.

  13. Muscle regeneration during hindlimb unloading results in a reduction in muscle size after reloading

    NASA Technical Reports Server (NTRS)

    Mozdziak, P. E.; Pulvermacher, P. M.; Schultz, E.

    2001-01-01

    The hindlimb-unloading model was used to study the ability of muscle injured in a weightless environment to recover after reloading. Satellite cell mitotic activity and DNA unit size were determined in injured and intact soleus muscles from hindlimb-unloaded and age-matched weight-bearing rats at the conclusion of 28 days of hindlimb unloading, 2 wk after reloading, and 9 wk after reloading. The body weights of hindlimb-unloaded rats were significantly (P < 0.05) less than those of weight-bearing rats at the conclusion of hindlimb unloading, but they were the same (P > 0.05) as those of weight-bearing rats 2 and 9 wk after reloading. The soleus muscle weight, soleus muscle weight-to-body weight ratio, myofiber diameter, number of nuclei per millimeter, and DNA unit size were significantly (P < 0.05) smaller for the injured soleus muscles from hindlimb-unloaded rats than for the soleus muscles from weight-bearing rats at each recovery time. Satellite cell mitotic activity was significantly (P < 0.05) higher in the injured soleus muscles from hindlimb-unloaded rats than from weight-bearing rats 2 wk after reloading, but it was the same (P > 0.05) as in the injured soleus muscles from weight-bearing rats 9 wk after reloading. The injured soleus muscles from hindlimb-unloaded rats failed to achieve weight-bearing muscle size 9 wk after reloading, because incomplete compensation for the decrease in myonuclear accretion and DNA unit size expansion occurred during the unloading period.

  14. WORKSHOP ON ENVIRONMENTALLY CONSCIOUS CHEMICAL PROCESS DESIGN

    EPA Science Inventory

    To encourage the consideration of environmental issues during chemical process design, the USEPA has developed techniques and software tools to evaluate the relative environmental impact of a chemical process. These techniques and tools aid in the risk management process by focus...

  15. Engineering design: A cognitive process approach

    NASA Astrophysics Data System (ADS)

    Strimel, Greg Joseph

    The intent of this dissertation was to identify the cognitive processes used by advanced pre-engineering students to solve complex engineering design problems. Students in technology and engineering education classrooms are often taught to use an ideal engineering design process that has been generated mostly by educators and curriculum developers. However, the review of literature showed that it is unclear as to how advanced pre-engineering students cognitively navigate solving a complex and multifaceted problem from beginning to end. Additionally, it was unclear how a student thinks and acts throughout their design process and how this affects the viability of their solution. Therefore, Research Objective 1 was to identify the fundamental cognitive processes students use to design, construct, and evaluate operational solutions to engineering design problems. Research Objective 2 was to determine identifiers within student cognitive processes for monitoring aptitude to successfully design, construct, and evaluate technological solutions. Lastly, Research Objective 3 was to create a conceptual technological and engineering problem-solving model integrating student cognitive processes for the improved development of problem-solving abilities. The methodology of this study included multiple forms of data collection. The participants were first given a survey to determine their prior experience with engineering and to provide a description of the subjects being studied. The participants were then presented an engineering design challenge to solve individually. While they completed the challenge, the participants verbalized their thoughts using an established "think aloud" method. These verbalizations were captured along with participant observational recordings using point-of-view camera technology. Additionally, the participant design journals, design artifacts, solution effectiveness data, and teacher evaluations were collected for analysis to help achieve the

  16. Chemical Process Design: An Integrated Teaching Approach.

    ERIC Educational Resources Information Center

    Debelak, Kenneth A.; Roth, John A.

    1982-01-01

    Reviews a one-semester senior plant design/laboratory course, focusing on course structure, student projects, laboratory assignments, and course evaluation. Includes discussion of laboratory exercises related to process waste water and sludge. (SK)

  17. Molecular Thermodynamics for Chemical Process Design

    ERIC Educational Resources Information Center

    Prausnitz, J. M.

    1976-01-01

    Discusses that aspect of thermodynamics which is particularly important in chemical process design: the calculation of the equilibrium properties of fluid mixtures, especially as required in phase-separation operations. (MLH)

  18. Numerical simulations supporting the process design of ring rolling processes

    NASA Astrophysics Data System (ADS)

    Jenkouk, V.; Hirt, G.; Seitz, J.

    2013-05-01

    In conventional finite element analysis (FEA) of radial-axial ring rolling (RAR), the motions of all tools are usually defined prior to simulation in the preprocessing step. However, the real process holds up to 8 degrees of freedom (DOF) that are controlled by industrial control systems according to actual sensor values and preselected control strategies. Since the histories of the motions are unknown before the experiment and depend on sensor data, conventional FEA cannot represent the process before the experiment. In order to enable the use of FEA in the process design stage, this approach integrates the industrially applied control algorithms of the real process, including all relevant sensors and actuators, into the FE model of ring rolling. Additionally, the process design of a novel process, axial profiling, in which a profiled roll is used for rolling axially profiled rings, is supported by FEA. Using this approach, suitable control strategies can be tested in a virtual environment before processing.
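
    A minimal sketch of the idea in this record: tool motions are driven by sensed quantities inside the simulation loop rather than prescribed beforehand. The growth-rate law, the "sensor" and "actuator" names, and all numbers below are hypothetical placeholders; this is not the authors' control algorithm, nor any FE solver's API.

```python
# Illustrative closed-loop coupling of a ring-growth control strategy to a
# time-stepping simulation. In a real implementation, the "sensor" and
# "actuator" values would be exchanged with the FE solver every increment.

def simulate_ring_rolling(d_target=1.2, d0=0.8, wall0=0.08, dt=0.05, growth_rate=0.02):
    d, wall = d0, wall0                      # outer diameter and wall thickness [m]
    t = 0.0
    while d < d_target:
        # "Sensor" reading drives the strategy: desired diameter growth this increment.
        desired_dd = growth_rate * dt
        # "Actuator" command: radial mandrel feed that realizes the desired growth
        # under an idealized constancy of the ring cross-section (d * wall ~ const).
        feed = desired_dd * wall / d
        wall -= feed                         # wall thins as the mandrel feeds in
        d += desired_dd                      # ring diameter grows accordingly
        t += dt
    return t, d, wall

t_end, d_end, wall_end = simulate_ring_rolling()
print(f"finished at t={t_end:.2f} s, diameter={d_end:.3f} m, wall={wall_end:.4f} m")
```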

  19. The Engineering Process in Construction & Design

    ERIC Educational Resources Information Center

    Stoner, Melissa A.; Stuby, Kristin T.; Szczepanski, Susan

    2013-01-01

    Recent research suggests that high-impact activities in science and math classes promote positive attitudinal shifts in students. By implementing high-impact activities, such as designing a school and a skate park, mathematical thinking can be linked to the engineering design process. This hands-on approach, when possible, to demonstrate or…

  20. Distributed Group Design Process: Lessons Learned.

    ERIC Educational Resources Information Center

    Eseryel, Deniz; Ganesan, Radha

    A typical Web-based training development team consists of a project manager, an instructional designer, a subject-matter expert, a graphic artist, and a Web programmer. The typical scenario involves team members working together in the same setting during the entire design and development process. What happens when the team is distributed, that is…

  1. HYNOL PROCESS ENGINEERING: PROCESS CONFIGURATION, SITE PLAN, AND EQUIPMENT DESIGN

    EPA Science Inventory

    The report describes the design of the hydropyrolysis reactor system of the Hynol process. (NOTE: A bench scale methanol production facility is being constructed to demonstrate the technical feasibility of producing methanol from biomass using the Hynol process. The plant is bein...

  2. Chemical kinetics and oil shale process design

    SciTech Connect

    Burnham, A.K.

    1993-07-01

    Oil shale processes are reviewed with the goal of showing how chemical kinetics influences the design and operation of different processes for different types of oil shale. Reaction kinetics are presented for organic pyrolysis, carbon combustion, carbonate decomposition, and sulfur and nitrogen reactions.

  3. Planar Inlet Design and Analysis Process (PINDAP)

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Gruber, Christopher R.

    2005-01-01

    The Planar Inlet Design and Analysis Process (PINDAP) is a collection of software tools that allow the efficient aerodynamic design and analysis of planar (two-dimensional and axisymmetric) inlets. The aerodynamic analysis is performed using the Wind-US computational fluid dynamics (CFD) program. A major element of PINDAP is a Fortran 90 code, also named PINDAP, that establishes the parametric design of the inlet and efficiently models the geometry and generates the grid for CFD analysis as those design parameters change. The use of PINDAP is demonstrated for subsonic, supersonic, and hypersonic inlets.
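
    As a toy illustration of parametric geometry generation in the spirit of PINDAP (this is not the PINDAP Fortran code), the sketch below regenerates a two-angle compression-ramp contour from a handful of design parameters; the parameter names and the simple linear-ramp shape are assumptions made for the example.

```python
# Illustrative parametric definition of a two-ramp planar inlet contour.
# Parameter names (ramp angles, lengths) are hypothetical, chosen only to show
# how geometry can be regenerated automatically when design parameters change.
import math

def planar_inlet_ramp(theta1_deg, theta2_deg, L1, L2, n=50):
    """Return (x, y) points along a two-angle compression ramp."""
    th1, th2 = math.radians(theta1_deg), math.radians(theta2_deg)
    pts = []
    for i in range(n + 1):
        s = i / n * (L1 + L2)
        if s <= L1:
            x, y = s, s * math.tan(th1)
        else:
            x = s
            y = L1 * math.tan(th1) + (s - L1) * math.tan(th2)
        pts.append((x, y))
    return pts

ramp = planar_inlet_ramp(theta1_deg=8.0, theta2_deg=14.0, L1=1.0, L2=0.6)
print(f"{len(ramp)} points, ramp exit height = {ramp[-1][1]:.3f}")
```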

  4. 9 CFR 325.18 - Diverting of shipments, breaking of seals, and reloading by carrier in emergency; reporting to...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... seals, and reloading by carrier in emergency; reporting to Regional Director. 325.18 Section 325.18... CERTIFICATION TRANSPORTATION § 325.18 Diverting of shipments, breaking of seals, and reloading by carrier in...) In case of wreck or similar extraordinary emergency, the Department seals on a railroad car or...

  5. 9 CFR 325.18 - Diverting of shipments, breaking of seals, and reloading by carrier in emergency; reporting to...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... seals, and reloading by carrier in emergency; reporting to Regional Director. 325.18 Section 325.18... CERTIFICATION TRANSPORTATION § 325.18 Diverting of shipments, breaking of seals, and reloading by carrier in...) In case of wreck or similar extraordinary emergency, the Department seals on a railroad car or...

  6. Temporal changes in sarcomere lesions of rat adductor longus muscles during hindlimb reloading

    NASA Technical Reports Server (NTRS)

    Krippendorf, B. B.; Riley, D. A.

    1994-01-01

    Focal sarcomere disruptions were previously observed in adductor longus muscles of rats flown approximately two weeks aboard the Cosmos 1887 and 2044 biosatellite flights. These lesions, characterized by breakage and loss of myofilaments and Z-line streaming, resembled damage induced by unaccustomed exercise that includes eccentric contractions, in which muscles lengthen as they develop tension. We hypothesized that sarcomere lesions in atrophied muscles of spaceflown rats were not produced in microgravity by muscle unloading but resulted from muscle reloading upon re-exposure to terrestrial gravity. To test this hypothesis, we examined temporal changes in sarcomere integrity of adductor longus muscles from rats subjected to 12.5 days of hindlimb suspension unloading and subsequent reloading by return to vivarium cages for 0, 6, 12, or 48 hours of normal weight bearing. Our ultrastructural observations suggested that muscle unloading (0 h reloading) induced myofibril misalignment associated with myofiber atrophy. Muscle reloading for 6 hours induced focal sarcomere lesions in which cross striations were abnormally widened. Such lesions were electron lucent due to extensive myofilament loss. Lesions in reloaded muscles showed rapid restructuring. By 12 hours of reloading, lesions were moderately stained foci, and by 48 hours darkly stained foci in which the pattern of cross striations was indistinct at the light and electron microscopic levels. These lesions were spanned by Z-line-like electron-dense filamentous material. Our findings suggest a new role for Z-line streaming in lesion restructuring: rather than an antecedent to damage, this type of Z-line streaming may be indicative of rapid, early sarcomere repair.

  7. Myocardial Reloading After Extracorporeal Membrane Oxygenation Alters Substrate Metabolism While Promoting Protein Synthesis

    PubMed Central

    Kajimoto, Masaki; O'Kelly Priddy, Colleen M.; Ledee, Dolena R.; Xu, Chun; Isern, Nancy; Olson, Aaron K.; Rosiers, Christine Des; Portman, Michael A.

    2013-01-01

    Background Extracorporeal membrane oxygenation (ECMO) unloads the heart, providing a bridge to recovery in children after myocardial stunning. ECMO also induces stress, which can adversely affect the ability to reload or wean the heart from the circuit. Metabolic impairments induced by altered loading and/or stress conditions may impact weaning. However, cardiac substrate and amino acid requirements upon weaning are unknown. We assessed the hypothesis that ventricular reloading with ECMO modulates both substrate entry into the citric acid cycle (CAC) and myocardial protein synthesis. Methods and Results Sixteen immature piglets (7.8 to 15.6 kg) were separated into 2 groups based on ventricular loading status: 8-hour ECMO (UNLOAD) and postwean from ECMO (RELOAD). We infused into the coronary artery [2-13C]-pyruvate as an oxidative substrate and [13C6]-L-leucine as an indicator for amino acid oxidation and protein synthesis. Upon RELOAD, each functional parameter that had been decreased substantially by ECMO recovered to near-baseline levels, with the exception of minimum dP/dt. Accordingly, myocardial oxygen consumption also increased, indicating that overall mitochondrial metabolism was reestablished. At the metabolic level, compared with UNLOAD, RELOAD altered the contribution of various substrates/pathways to tissue pyruvate formation, favoring exogenous pyruvate over glycolysis, and to acetyl-CoA formation, shifting away from pyruvate decarboxylation toward endogenous substrate, presumably fatty acids. Furthermore, there was also a significant increase in tissue concentrations of all CAC intermediates (≈80%), suggesting enhanced anaplerosis, and in fractional protein synthesis rates (>70%). Conclusions RELOAD alters both cytosolic and mitochondrial energy substrate metabolism, while favoring leucine incorporation into protein synthesis rather than oxidation in the CAC. Improved understanding of factors governing these metabolic perturbations may

  8. Intracellular Ca2+ transients in mouse soleus muscle after hindlimb unloading and reloading

    NASA Technical Reports Server (NTRS)

    Ingalls, C. P.; Warren, G. L.; Armstrong, R. B.; Hamilton, S. L. (Principal Investigator)

    1999-01-01

    The objective of this study was to determine whether altered intracellular Ca(2+) handling contributes to the specific force loss in the soleus muscle after unloading and/or subsequent reloading of mouse hindlimbs. Three groups of female ICR mice were studied: 1) unloaded mice (n = 11) that were hindlimb suspended for 14 days, 2) reloaded mice (n = 10) that were returned to their cages for 1 day after 14 days of hindlimb suspension, and 3) control mice (n = 10) that had normal cage activity. Maximum isometric tetanic force (P(o)) was determined in the soleus muscle from the left hindlimb, and resting free cytosolic Ca(2+) concentration ([Ca(2+)](i)), tetanic [Ca(2+)](i), and 4-chloro-m-cresol-induced [Ca(2+)](i) were measured in the contralateral soleus muscle by confocal laser scanning microscopy. Unloading and reloading increased resting [Ca(2+)](i) above control by 36% and 24%, respectively. Although unloading reduced P(o) and specific force by 58% and 24%, respectively, compared with control mice, there was no difference in tetanic [Ca(2+)](i). P(o), specific force, and tetanic [Ca(2+)](i) were reduced by 58%, 23%, and 23%, respectively, in the reloaded animals compared with control mice; however, tetanic [Ca(2+)](i) was not different between unloaded and reloaded mice. These data indicate that although hindlimb suspension results in disturbed intracellular Ca(2+) homeostasis, changes in tetanic [Ca(2+)](i) do not contribute to force deficits. Compared with unloading, 24 h of physiological reloading in the mouse does not result in further changes in maximal strength or tetanic [Ca(2+)](i).

  9. Macrocell design for concurrent signal processing

    SciTech Connect

    Pope, S.P.; Brodersen, R.W.

    1983-01-01

    Macrocells serve as subsystems at the top level of the hardware design hierarchy. The authors present the macrocell design technique as applied to the implementation of real-time, sampled-data signal processing functions. The design of such circuits is particularly challenging due to the computationally intensive nature of signal-processing algorithms and the constraints of real-time operation. The most efficient designs make use of a high degree of concurrency, a property facilitated by the macrocell approach. Two circuit projects whose development resulted largely from the macrocell methodology described are used as examples throughout the report: a linear-predictive vocoder circuit and a front-end filter-bank chip for a speech recognition system. Both are monolithic multiprocessor implementations: the LPC vocoder circuit contains three processors, the filter-bank chip two processors. 10 references.

  10. Design of penicillin fermentation process simulation system

    NASA Astrophysics Data System (ADS)

    Qi, Xiaoyu; Yuan, Zhonghu; Qi, Xiaoxuan; Zhang, Wenqi

    2011-10-01

    Real-time monitoring of batch processes attracts increasing attention. It can ensure safety and provide products with consistent quality. The design of a simulation system for batch process fault diagnosis is therefore of great significance. In this paper, penicillin fermentation, a typical non-linear, dynamic, multi-stage batch production process, is taken as the research object. A visual, human-machine interactive simulation software system based on the Windows operating system is developed. The simulation system provides an effective platform for research on batch process fault diagnosis.

  11. Teaching sustainable design: A collaborative process

    SciTech Connect

    Theis, C.C.

    1997-12-31

    This paper describes a collaborative educational experience in the Schools of Architecture and Landscape Architecture at Louisiana State University. During the Fall Semester of 1996 an upper-level architectural design studio worked with a peer group of landscape architecture students on the design of a master plan for an environmentally sensitive residential development on Cat Island, a barrier island located approximately eight miles south of Gulfport, Mississippi. This paper presents the methodology and results of the project, describes the collaborative process, and assesses both the viability of the design solutions and the value of the educational experience.

  12. Process design for Al backside contacts

    SciTech Connect

    Chalfoun, L.L.; Kimerling, L.C.

    1995-08-01

    It is known that properly alloyed aluminum backside contacts can improve silicon solar cell efficiency. To use this knowledge to the fullest advantage, we have studied the gettering process that occurs during contact formation and the microstructure of the contact and backside junction region. With an understanding of the alloying step, optimized fabrication processes can be designed. To study gettering, single-crystal silicon wafers were coated with aluminum on both sides and subjected to heat treatments. Results are described.

  13. Flexible Processing and the Design of Grammar

    ERIC Educational Resources Information Center

    Sag, Ivan A.; Wasow, Thomas

    2015-01-01

    We explore the consequences of letting the incremental and integrative nature of language processing inform the design of competence grammar. What emerges is a view of grammar as a system of local monotonic constraints that provide a direct characterization of the signs (the form-meaning correspondences) of a given language. This…

  14. Dynamic Process Simulation for Analysis and Design.

    ERIC Educational Resources Information Center

    Nuttall, Herbert E., Jr.; Himmelblau, David M.

    A computer program for the simulation of complex continuous processes in real time in an interactive mode is described. The program is user oriented, flexible, and provides both numerical and graphic output. The program has been used in classroom teaching and computer-aided design. Typical input and output are illustrated for a sample problem to…

  15. Interface design in the process industries

    NASA Technical Reports Server (NTRS)

    Beaverstock, M. C.; Stassen, H. G.; Williamson, R. A.

    1977-01-01

    Every operator runs his plant in accord with his own mental model of the process. In this sense, one characteristic of an ideal man-machine interface is that it be in harmony with that model. With this theme in mind, the paper first reviews the functions of the process operator and compares them with those of human operators in control situations previously studied outside the industrial environment (pilots, air traffic controllers, helmsmen, etc.). A brief history of the operator interface in the process industry and the traditional methodology employed in its design is then presented. Finally, a much more fundamental approach utilizing a model definition of the human operator's behavior is presented.

  16. Using scoping as a design process

    SciTech Connect

    Mulvihill, P.R. ); Jacobs, P. )

    1998-07-01

    Skillful use of the scoping phase of environmental assessment (EA) is critical in cases involving a wide diversity of stakeholders and perspectives. Scoping can exert a strong influence in shaping a relevant impact assessment and increasing the probability of a process that satisfies stakeholders. This article explores key challenges facing scoping processes conducted in highly pluralistic settings. Elements of a notable case study, the scoping process conducted in 1992 for the proposed Great Whale Hydroelectric project in Northern Quebec, are discussed to illustrate innovative approaches. When used as a design process, scoping can ensure that EA reflects the different value sets and cultures that are at play, particularly where diverse knowledge systems and ways of describing environmental components and impacts exist. Since it sets the stage for subsequent steps in the EA process, scoping needs to be a sufficiently broad umbrella to accommodate diverse approaches to identifying, classifying, and assessing impacts.

  17. Design of intelligent controllers for exothermal processes

    NASA Astrophysics Data System (ADS)

    Nagarajan, Ramachandran; Yaacob, Sazali

    2001-10-01

    Chemical industries, such as resin or soap manufacturing, have reaction systems that work with at least two chemicals. Mixing these chemicals, even at room temperature, can initiate an exothermic reaction, which produces a sudden increase of heat energy within the mixture. The quantity of heat and the dynamics of heat generation are unknown, unpredictable, and time varying. Proper control of this heat must be accomplished in order to achieve a high-quality product. Uncontrolled or poorly controlled heat yields an unusable product, and the process may damage materials and systems or even harm people. Controlling the heat of an exothermic reaction cannot be achieved with conventional control methods such as PID control or identification and control, because all of these methods require at least an approximate mathematical model of the exothermic process, and modeling such a process has yet to be properly conceived. This paper discusses a design methodology for controlling such a process. A pilot plant of a reaction system has been constructed and utilized for designing and incorporating the proposed fuzzy-logic-based intelligent controller. Both a conventional and then an adaptive form of fuzzy logic control were used in testing the performance. The test results confirm the effectiveness of the controllers in controlling exothermic heat.

  18. DESIGNING ENVIRONMENTAL, ECONOMIC AND ENERGY EFFICIENT CHEMICAL PROCESSES

    EPA Science Inventory

    The design and improvement of chemical processes can be very challenging. The earlier energy conservation, process economics and environmental aspects are incorporated into the process development, the easier and less expensive it is to alter the process design. Process emissio...

  19. Composting process design criteria. II. Detention time

    SciTech Connect

    Haug, R.T.

    1986-09-01

    Attention has always been directed to detention time as a criterion for the design and operation of composting systems. Perhaps this is a logical outgrowth of work on liquid-phase systems, where detention time is a fundamental design parameter. Unlike liquid-phase systems, however, the interpretation of detention time and the actual values required for design have not been universally accepted in the case of composting. As a case in point, most compost systems incorporate facilities for curing the compost product. However, curing is often considered after the fact, or as an add-on with little relationship to the first-stage, high-rate phase, whether reactor (in-vessel), static pile, or windrow. Design criteria for curing, and the relationships between the first-stage (high-rate) and second-stage (curing) phases of a composting system, have been unclear. In Part 2 of this paper, the concepts of hydraulic retention time (HRT) and solids residence time (SRT) are applied to the composting process. Definitions and design criteria for each are proposed. Based on these criteria, the first and second stages can be designed and integrated into a complete composting system.
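
    For reference, the liquid-phase definitions that this record adapts to composting are, in their standard textbook forms (the symbols here are the usual ones, not necessarily the author's notation):

```latex
% Standard definitions (notation assumed for this note):
% V = working volume of the reactor, Q = volumetric feed rate,
% M = mass of solids held in the system, \dot{m}_{out} = solids mass flow leaving.
\mathrm{HRT} = \frac{V}{Q},
\qquad
\mathrm{SRT} = \frac{M}{\dot{m}_{\mathrm{out}}}
```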

  20. Multiwavelet design for cardiac signal processing.

    PubMed

    Peeters, R L M; Karel, J M H; Westra, R L; Haddad, S A P; Serdijn, W A

    2006-01-01

    An approach for designing multiwavelets is introduced, for use in cardiac signal processing. The parameterization of the class of multiwavelets is in terms of associated FIR polyphase all-pass filters. Orthogonality and a balanced vanishing moment of order 1 are built into the parameterization. An optimization criterion is developed to associate the wavelets with different meaningful segments of a signal. This approach is demonstrated on the simultaneous detection of QRS-complexes and T-peaks in ECG signals. PMID:17946917
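
    As a rough illustration of the detection task this record addresses (not the authors' multiwavelet/polyphase method), the sketch below band-pass filters a synthetic ECG-like trace and picks QRS candidates with a simple peak detector; the filter band, thresholds, and synthetic signal are all assumptions made for the example.

```python
# Generic QRS-candidate detection on a synthetic trace; illustrative only,
# not the multiwavelet approach described in the record.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

fs = 360.0                                   # sampling rate [Hz], assumed
t = np.arange(0, 10, 1 / fs)
# Synthetic "ECG": one narrow spike per second plus baseline wander and noise.
ecg = np.zeros_like(t)
ecg[(np.arange(len(t)) % int(fs)) == 0] = 1.0
ecg = np.convolve(ecg, np.hanning(int(0.04 * fs)), mode="same")
ecg += 0.2 * np.sin(2 * np.pi * 0.3 * t) + 0.02 * np.random.randn(len(t))

# Band-pass roughly around QRS energy (5-15 Hz is a common choice).
b, a = butter(2, [5 / (fs / 2), 15 / (fs / 2)], btype="band")
filtered = filtfilt(b, a, ecg)

# Peak picking with a refractory period of about 250 ms.
peaks, _ = find_peaks(np.abs(filtered),
                      height=0.5 * np.abs(filtered).max(),
                      distance=int(0.25 * fs))
print(f"detected {len(peaks)} QRS candidates in 10 s (expected ~10)")
```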

  1. Chip Design Process Optimization Based on Design Quality Assessment

    NASA Astrophysics Data System (ADS)

    Häusler, Stefan; Blaschke, Jana; Sebeke, Christian; Rosenstiel, Wolfgang; Hahn, Axel

    2010-06-01

    Nowadays, managing product development projects is increasingly challenging. In particular, the IC design of ASICs with both analog and digital components (mixed-signal design) is becoming more and more complex, while the time-to-market window narrows at the same time. Still, high quality standards must be fulfilled. Projects and their status are becoming less transparent due to this complexity, which makes the planning and execution of projects rather difficult. Therefore, there is a need for efficient project control. A main challenge is the objective evaluation of the current development status: are all requirements successfully verified? Are all intermediate goals achieved? Companies often develop special solutions that are not reusable in other projects, which makes the quality measurement process itself less efficient and produces too much overhead. The method proposed in this paper is a contribution to solving these issues. It is applied at a German design house for analog mixed-signal IC design. This paper presents the results of a case study and introduces an optimized project scheduling based on quality assessment results.

  2. Thinking and the Design Process. DIUL-RR-8414.

    ERIC Educational Resources Information Center

    Moulin, Bernard

    Designed to focus attention on the design process in such computer science activities as information systems design, database design, and expert systems design, this paper examines three main phases of the design process: understanding the context of the problem, identifying the problem, and finding a solution. The processes that these phases…

  3. Computer-aided software development process design

    NASA Technical Reports Server (NTRS)

    Lin, Chi Y.; Levary, Reuven R.

    1989-01-01

    The authors describe an intelligent tool designed to aid managers of software development projects in planning, managing, and controlling the development process of medium- to large-scale software projects. Its purpose is to reduce uncertainties in the budget, personnel, and schedule planning of software development projects. It is based on a dynamic model of the software development and maintenance life-cycle process. This dynamic process is composed of a number of time-varying, interacting developmental phases, each characterized by its intended functions and requirements. System dynamics is used as the modeling methodology. The resulting Software LIfe-Cycle Simulator (SLICS) and the hybrid expert simulation system of which it is a subsystem are described.

  4. Forging process design for risk reduction

    NASA Astrophysics Data System (ADS)

    Mao, Yongning

    In this dissertation, forging process design has been investigated with a primary concern for risk reduction. Different forged components have been studied, especially those that could cause catastrophic loss if failure occurs. As an effective modeling methodology, finite element analysis is applied extensively in this work. Three examples, a titanium compressor disk, a superalloy turbine disk, and a titanium hip prosthesis, are discussed to demonstrate this approach. Discrete defects such as hard alpha anomalies are known to cause disastrous failure if they are present in stress-critical components. In this research, hard-alpha inclusion movement during forging of a titanium compressor disk is studied by finite element analysis. By combining the results from the finite element method (FEM), regression modeling, and Monte Carlo simulation, it is shown that changing the forging path is able to mitigate the failure risk of the components during service. The second example concerns a turbine disk made of the superalloy IN 718. The effect of forging on microstructure is the main consideration in this study, since microstructure defines the as-forged disk properties. For specific forging conditions, the preform has its own effect on the microstructure. Through a sensitivity study it is found that forging temperature and speed have significant influence on the microstructure. In order to choose processing parameters that optimize the microstructure, the dependence of microstructure on die speed and temperature is thoroughly studied using design of numerical experiments. For various desired goals, optimal solutions are determined. The narrow processing window of titanium alloys makes isothermal forging a preferred way to produce forged parts without forging defects. However, the cost of isothermal forging (dies at the same temperature as the workpiece) limits its wide application. In this research, it has been demonstrated that with proper process design, the die

  5. Affective Norms for 4900 Polish Words Reload (ANPW_R): Assessments for Valence, Arousal, Dominance, Origin, Significance, Concreteness, Imageability and, Age of Acquisition

    PubMed Central

    Imbir, Kamil K.

    2016-01-01

    In studies that combine understanding of emotions and language, there is growing demand for good-quality experimental materials. To meet this expectation, a large set of 4,905 Polish words was assessed by 400 participants in order to provide a well-established research method for everyone interested in emotional word processing. The Affective Norms for Polish Words Reloaded (ANPW_R) is designed as an extension to the previously introduced ANPW dataset and provides assessments for eight different affective and psycholinguistic measures: Valence, Arousal, Dominance, Origin, Significance, Concreteness, Imageability, and subjective Age of Acquisition. The ANPW_R is now the largest available dataset of affective words for Polish, including affective scores that have not been measured in any other dataset (concreteness and age-of-acquisition scales). Additionally, the ANPW_R allows for testing hypotheses concerning dual-mind models of emotion and activation (origin and subjective significance scales). Participants in the current study assessed all 4,905 words in the list within 1 week, at their own pace in home sessions, using eight different Self-Assessment Manikin (SAM) scales. Each measured dimension was evaluated by 25 women and 25 men. The ANPW_R norms appeared to be reliable in split-half estimation and congruent with previous normative studies in Polish. The quadratic relation between valence and arousal was found to be in line with previous findings. In addition, nine other relations appeared to be better described by a quadratic rather than a linear function. The ANPW_R provides well-established research materials for use in psycholinguistic and affective studies in Polish-speaking samples. PMID:27486423
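
    The record notes that several relations in the norms are better described by a quadratic than a linear function; the snippet below shows the kind of comparison involved, on invented data (the arrays are not the ANPW_R values).

```python
# Compare linear vs. quadratic fits by R^2 on synthetic valence/arousal-like data.
# The data below are invented for illustration; they are not ANPW_R norms.
import numpy as np

rng = np.random.default_rng(0)
valence = rng.uniform(1, 9, 300)
# U-shaped relation: arousal highest at both valence extremes.
arousal = 0.25 * (valence - 5.0) ** 2 + 3.0 + rng.normal(0, 0.5, valence.size)

def r_squared(x, y, degree):
    coeffs = np.polyfit(x, y, degree)
    resid = y - np.polyval(coeffs, x)
    return 1.0 - resid.var() / y.var()

print(f"linear    R^2 = {r_squared(valence, arousal, 1):.3f}")
print(f"quadratic R^2 = {r_squared(valence, arousal, 2):.3f}")
```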

  6. Reliability Methods for Shield Design Process

    NASA Technical Reports Server (NTRS)

    Tripathi, R. K.; Wilson, J. W.

    2002-01-01

    Providing protection against the hazards of space radiation is a major challenge to the exploration and development of space. The great cost of added radiation shielding is a potential limiting factor in deep space operations. In this enabling technology, we have developed methods for optimized shield design over multi-segmented missions involving multiple work and living areas in the transport and duty phases of space missions. The total shield mass over all pieces of equipment and habitats is optimized subject to career dose and dose-rate constraints. An important component of this technology is the estimation of the two most commonly identified uncertainties in radiation shield design: the shielding properties of the materials used, and the understanding of the biological response of the astronaut to the radiation leaking through the materials into the living space. The largest uncertainty, of course, is in the biological response, especially to high charge and energy (HZE) ions of the galactic cosmic rays. These uncertainties are blended with the optimization design procedure to formulate reliability-based methods for shield design processes. The details of the methods are discussed.
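
    To make the optimization statement concrete, here is a toy constrained-minimization sketch: total shield mass over several segments is minimized subject to a career-dose bound, with an assumed exponential attenuation law and made-up coefficients. It is not the authors' transport models or reliability treatment.

```python
# Toy shield-mass optimization: minimize a total-mass proxy subject to a
# career dose limit, with an assumed exponential attenuation per segment.
# Attenuation lengths, unshielded doses, areas, and the dose limit are hypothetical.
import numpy as np
from scipy.optimize import minimize

unshielded_dose = np.array([40.0, 25.0, 15.0])   # cSv per segment (assumed)
atten_length = np.array([30.0, 30.0, 30.0])      # g/cm^2 e-folding length (assumed)
area = np.array([20.0, 35.0, 10.0])              # m^2 of each habitat wall (assumed)
dose_limit = 40.0                                 # career dose limit (assumed)

def total_mass(x):                                # x = areal density per segment [g/cm^2]
    return float(np.sum(x * area))                # relative mass proxy

def dose_margin(x):                               # >= 0 when the dose limit is met
    return dose_limit - float(np.sum(unshielded_dose * np.exp(-x / atten_length)))

res = minimize(total_mass, x0=np.full(3, 5.0),
               bounds=[(0.0, 100.0)] * 3,
               constraints=[{"type": "ineq", "fun": dose_margin}])
print("areal densities [g/cm^2]:", np.round(res.x, 2),
      " dose margin:", round(dose_margin(res.x), 3))
```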

  7. Innovative machine designs for radiation processing

    NASA Astrophysics Data System (ADS)

    Vroom, David

    2007-12-01

    In the 1990s, Raychem Corporation established a program to investigate the commercialization of several promising applications involving the combined use of its core competencies in materials science, radiation chemistry, and e-beam radiation technology. The applications investigated included those that would extend Raychem's well-known heat-recoverable polymer and wire and cable product lines, as well as new potential applications such as remediation of contaminated aqueous streams. A central part of the program was the development of new accelerator technology designed to improve quality, lower processing costs, and efficiently process conformable materials such as liquids. A major emphasis with this new irradiation technology was to treat the accelerator and product handling systems as one integrated system, not as two complementary systems.

  8. Designer cell signal processing circuits for biotechnology.

    PubMed

    Bradley, Robert W; Wang, Baojun

    2015-12-25

    Microorganisms are able to respond effectively to diverse signals from their environment and internal metabolism owing to their inherent sophisticated information processing capacity. A central aim of synthetic biology is to control and reprogramme the signal processing pathways within living cells so as to realise repurposed, beneficial applications ranging from disease diagnosis and environmental sensing to chemical bioproduction. To date most examples of synthetic biological signal processing have been built based on digital information flow, though analogue computing is being developed to cope with more complex operations and larger sets of variables. Great progress has been made in expanding the categories of characterised biological components that can be used for cellular signal manipulation, thereby allowing synthetic biologists to more rationally programme increasingly complex behaviours into living cells. Here we present a current overview of the components and strategies that exist for designer cell signal processing and decision making, discuss how these have been implemented in prototype systems for therapeutic, environmental, and industrial biotechnological applications, and examine emerging challenges in this promising field. PMID:25579192

  9. Designer cell signal processing circuits for biotechnology

    PubMed Central

    Bradley, Robert W.; Wang, Baojun

    2015-01-01

    Microorganisms are able to respond effectively to diverse signals from their environment and internal metabolism owing to their inherent sophisticated information processing capacity. A central aim of synthetic biology is to control and reprogramme the signal processing pathways within living cells so as to realise repurposed, beneficial applications ranging from disease diagnosis and environmental sensing to chemical bioproduction. To date most examples of synthetic biological signal processing have been built based on digital information flow, though analogue computing is being developed to cope with more complex operations and larger sets of variables. Great progress has been made in expanding the categories of characterised biological components that can be used for cellular signal manipulation, thereby allowing synthetic biologists to more rationally programme increasingly complex behaviours into living cells. Here we present a current overview of the components and strategies that exist for designer cell signal processing and decision making, discuss how these have been implemented in prototype systems for therapeutic, environmental, and industrial biotechnological applications, and examine emerging challenges in this promising field. PMID:25579192

  10. Effects of Unloading and Reloading on Expressions of Skeletal Muscle Membrane Proteins in Mice

    NASA Astrophysics Data System (ADS)

    Ohno, Y.; Ikuta, A.; Goto, A.; Sugiura, T.; Ohira, Y.; Yoshioka, T.; Goto, K.

    2013-02-01

    The effects of unloading and reloading on the expression levels of tripartite motif-containing 72 (TRIM72) and caveolin-3 (Cav-3) in the soleus muscle of mice were investigated. Male C57BL/6J mice (11 weeks old) were randomly assigned to control and hindlimb-suspended groups. Some of the mice in the hindlimb-suspended group were subjected to continuous hindlimb suspension (HS) for 2 weeks with or without 7 days of ambulation recovery. Following HS, the muscle weight and the protein expression levels of TRIM72 and Cav-3 in the soleus were decreased. On the other hand, gradual increases in muscle mass, TRIM72, and Cav-3 were observed after reloading following HS. It is therefore suggested that mechanical loading plays a key role in the regulatory system for the protein expression of TRIM72 and Cav-3.

  11. Design of Nanomaterial Synthesis by Aerosol Processes

    PubMed Central

    Buesser, Beat; Pratsinis, Sotiris E.

    2013-01-01

    Aerosol synthesis of materials is a vibrant field of particle technology and chemical reaction engineering. Examples include the manufacture of carbon blacks, fumed SiO2, pigmentary TiO2, ZnO vulcanizing catalysts, filamentary Ni, and optical fibers, materials that impact transportation, construction, pharmaceuticals, energy, and communications. Parallel to this, development of novel, scalable aerosol processes has enabled synthesis of new functional nanomaterials (e.g., catalysts, biomaterials, electroceramics) and devices (e.g., gas sensors). This review provides an access point for engineers to the multiscale design of aerosol reactors for the synthesis of nanomaterials using continuum, mesoscale, molecular dynamics, and quantum mechanics models spanning 10 and 15 orders of magnitude in length and time, respectively. Key design features are the rapid chemistry; the high particle concentrations but low volume fractions; the attainment of a self-preserving particle size distribution by coagulation; the ratio of the characteristic times of coagulation and sintering, which controls the extent of particle aggregation; and the narrowing of the aggregate primary particle size distribution by sintering. PMID:22468598
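
    The key design ratio mentioned in this record can be stated as a simple qualitative criterion. In the usual notation (assumed here), with τ_s the characteristic sintering (coalescence) time and τ_c the characteristic collision/coagulation time:

```latex
% Qualitative criterion (standard form; notation assumed for this note):
\frac{\tau_s}{\tau_c} \ll 1 \;\Rightarrow\; \text{compact, near-spherical particles (full coalescence)},
\qquad
\frac{\tau_s}{\tau_c} \gg 1 \;\Rightarrow\; \text{fractal-like aggregates of primary particles}.
```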

  12. Parametric Design within an Atomic Design Process (ADP) applied to Spacecraft Design

    NASA Astrophysics Data System (ADS)

    Ramos Alarcon, Rafael

    This thesis describes research investigating the development of a model for the initial design of complex systems, with application to spacecraft design. The design model is called an atomic design process (ADP) and contains four fundamental stages (specifications, configurations, trade studies and drivers) that constitute the minimum steps of an iterative process that helps designers find a feasible solution. Representative design models from the aerospace industry are reviewed and are compared with the proposed model. The design model's relevance, adaptability and scalability features are evaluated through a focused design task exercise with two undergraduate teams and a long-term design exercise performed by a spacecraft payload team. The implementation of the design model is explained in the context in which the model has been researched. This context includes the organization (a student-run research laboratory at the University of Michigan), its culture (academically oriented), members that have used the design model and the description of the information technology elements meant to provide support while using the model. This support includes a custom-built information management system that consolidates relevant information that is currently being used in the organization. The information is divided in three domains: personnel development history, technical knowledge base and laboratory operations. The focused study with teams making use of the design model to complete an engineering design exercise consists of the conceptual design of an autonomous system, including a carrier and a deployable lander that form the payload of a rocket with an altitude range of over 1000 meters. Detailed results from each of the stages of the design process while implementing the model are presented, and an increase in awareness of good design practices in the teams while using the model are explained. A long-term investigation using the design model consisting of the

  13. High Lifetime Solar Cell Processing and Design

    NASA Technical Reports Server (NTRS)

    Swanson, R. M.

    1985-01-01

    In order to maximize efficiency a solar cell must: (1) absorb as much light as possible in electron-hole production, (2) transport as large a fraction as possible of the electrons to the n-type terminal and holes to the p-type terminal without their first recombining, and (3) produce as high a terminal voltage as possible. Step (1) is largely fixed by the spectrum of sunlight and the fundamental absorption characteristics of silicon, although some improvements are possible through texturing-induced light trapping and back surface reflectors. Steps (2) and (3) are, however, dependent on the recombination mechanisms of the cell. Recombination, in turn, is strongly influenced by cell processing and design. Some of the lessons learned during the development of the point-contact cell are discussed, including the cell's dependence on recombination, surface recombination, and contact recombination. Results show the overwhelming influence of contact recombination on the operation of the cell when the other sources of recombination are reduced by careful processing.
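
    Since this record stresses that terminal voltage is limited by recombination, the standard ideal-diode relation makes the link explicit. This is textbook background, not the paper's derivation, with symbols in their usual meanings:

```latex
% Open-circuit voltage of an ideal cell: J_sc = light-generated current density,
% J_0 = recombination (saturation) current density, k = Boltzmann constant,
% T = temperature, q = elementary charge. Lower recombination (smaller J_0)
% raises V_oc.
V_{oc} = \frac{kT}{q}\,\ln\!\left(\frac{J_{sc}}{J_0} + 1\right)
```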

  14. Universal Design: Process, Principles, and Applications

    ERIC Educational Resources Information Center

    Burgstahler, Sheryl

    2009-01-01

    Designing any product or environment involves the consideration of many factors, including aesthetics, engineering options, environmental issues, safety concerns, industry standards, and cost. Typically, designers focus their attention on the average user. In contrast, universal design (UD), according to the Center for Universal Design," is the…

  15. Feasibility study of an Integrated Program for Aerospace vehicle Design (IPAD). Volume 2: The design process

    NASA Technical Reports Server (NTRS)

    Gillette, W. B.; Turner, M. J.; Southall, J. W.; Whitener, P. C.; Kowalik, J. S.

    1973-01-01

    The extent to which IPAD is to support the design process is identified. Case studies of representative aerospace products were developed as models to characterize the design process and to provide design requirements for the IPAD computing system.

  16. PROCESS DESIGN MANUAL FOR STRIPPING OF ORGANICS

    EPA Science Inventory

    Procedures and correlations for designing and costing stripping towers for the removal of organics from aqueous streams are presented. The emphasis is on practical methods suitable for engineering estimates. The designs cover steam strippers with and without condensers and reflux...

  17. Creativity Processes of Students in the Design Studio

    ERIC Educational Resources Information Center

    Huber, Amy Mattingly; Leigh, Katharine E.; Tremblay, Kenneth R., Jr.

    2012-01-01

    The creative process is a multifaceted and dynamic path of thinking required to execute a project in design-based disciplines. The goal of this research was to test a model outlining the creative design process by investigating student experiences in a design project assignment. The study used an exploratory design to collect data from student…

  18. Launch Vehicle Design Process: Characterization, Technical Integration, and Lessons Learned

    NASA Technical Reports Server (NTRS)

    Blair, J. C.; Ryan, R. S.; Schutzenhofer, L. A.; Humphries, W. R.

    2001-01-01

    Engineering design is a challenging activity for any product. Since launch vehicles are highly complex and interconnected and have extreme energy densities, their design represents a challenge of the highest order. The purpose of this document is to delineate and clarify the design process associated with the launch vehicle for space flight transportation. The goal is to define and characterize a baseline for the space transportation design process. This baseline can be used as a basis for improving effectiveness and efficiency of the design process. The baseline characterization is achieved via compartmentalization and technical integration of subsystems, design functions, and discipline functions. First, a global design process overview is provided in order to show responsibility, interactions, and connectivity of overall aspects of the design process. Then design essentials are delineated in order to emphasize necessary features of the design process that are sometimes overlooked. Finally the design process characterization is presented. This is accomplished by considering project technical framework, technical integration, process description (technical integration model, subsystem tree, design/discipline planes, decision gates, and tasks), and the design sequence. Also included in the document are a snapshot relating to process improvements, illustrations of the process, a survey of recommendations from experienced practitioners in aerospace, lessons learned, references, and a bibliography.

  19. Memory reloaded: memory load effects in the attentional blink.

    PubMed

    Visser, Troy A W

    2010-06-01

    When two targets are presented in rapid succession, identification of the first is nearly perfect, while identification of the second is impaired when it follows the first by less than about 700 ms. According to bottleneck models, this attentional blink (AB) occurs because the second target is unable to gain access to capacity-limited working memory processes already occupied by the first target. Evidence for this hypothesis, however, has been mixed, with recent reports suggesting that increasing working memory load does not affect the AB. The present paper explores possible reasons for failures to find a link between memory load and the AB and shows that a reliable effect of load can be obtained when the item directly after T1 (Target 1) is omitted. This finding provides initial evidence that working memory load can influence the AB and additional evidence for a link between T1 processing time and the AB predicted by bottleneck models. PMID:19787551

  20. A survey of the Oyster Creek reload licensing model

    SciTech Connect

    Alammar, M.A. )

    1991-01-01

    The Oyster Creek RETRAN licensing model was submitted for approval by the U.S. Nuclear Regulatory Commission in September 1987. This paper discusses the technical issues and concerns that were raised during the review process and how they were resolved. The technical issues are grouped into three major categories: the adequacy of the model benchmark against plant data; uncertainty analysis and model convergence with respect to various critical parameters (code correlations, nodalization, time step, etc.); and model application and usage.

  1. Space bioreactor: Design/process flow

    NASA Technical Reports Server (NTRS)

    Cross, John H.

    1987-01-01

    The design of the space bioreactor stems from three considerations. First, and foremost, it must sustain cells in microgravity. Closely related is the ability to take advantage of the weightlessness and microgravity. Lastly, it should fit into a bioprocess. The design of the space bioreactor is described in view of these considerations. A flow chart of the bioreactor is presented and discussed.

  2. Computer Applications in the Design Process.

    ERIC Educational Resources Information Center

    Winchip, Susan

    Computer Assisted Design (CAD) and Computer Assisted Manufacturing (CAM) are emerging technologies now being used in home economics and interior design applications. A microcomputer in a computer network system is capable of executing computer graphic functions such as three-dimensional modeling, as well as utilizing office automation packages to…

  3. Lunar fiberglass: Properties and process design

    NASA Technical Reports Server (NTRS)

    Dalton, Robert; Nichols, Todd

    1987-01-01

    A Clemson University ceramic engineering design for a lunar fiberglass plant is presented. The properties of glass fibers and metal-matrix composites are examined. Lunar geology is also discussed. A raw material and site are selected based on this information. A detailed plant design is presented, and summer experiments to be carried out at Johnson Space Center are reviewed.

  4. Shock/reload response of water and aqueous solutions of ammonium nitrate

    NASA Astrophysics Data System (ADS)

    Morley, Mike; Williamson, David

    2011-06-01

    The response of water and aqueous solutions of ammonium nitrate to shock loading below 10 GPa has been experimentally investigated. In addition to determination of the principal Hugoniot, equation-of-state data have been measured through "shock/reload" experiments using gas-gun-driven plate impact. A Mie-Grüneisen type equation of state has been applied to the liquids under investigation. The effects of initial temperature, and of weight percentage of ammonium nitrate, on the volume-dependent Grüneisen parameter are reported.
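
    For readers unfamiliar with the form referenced in this record, a Mie-Grüneisen equation of state referenced to the principal Hugoniot is commonly written as below (standard textbook form with a linear shock-velocity fit; the authors' specific calibration is not reproduced here):

```latex
% Mie-Gruneisen EOS referenced to the Hugoniot states p_H(v), e_H(v),
% with Gamma(v) the volume-dependent Gruneisen parameter, and a linear
% shock-velocity / particle-velocity fit for the principal Hugoniot:
p - p_H(v) = \frac{\Gamma(v)}{v}\,\bigl[e - e_H(v)\bigr],
\qquad
U_s = c_0 + s\,u_p
```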

  5. The influence of peak shock stress on the quasi-static reload response of HCP metals

    SciTech Connect

    Cerreta, E. K.; Gray, G. T. III; Trujillo, C. P.; Brown, D. W.; Tome, C. N.

    2007-12-12

    Textured, high-purity hafnium has been shock loaded at 5 and 11 GPa, below the pressure reported for the α→ω phase transformation, 23 GPa. The specimens were 'soft caught' for post-shock characterization. Substructure of the shocked materials was investigated through transmission electron microscopy and texture evolution due to shock loading was probed with neutron diffraction. The deformation behavior of as-annealed hafnium under quasi-static conditions was compared to its response following shock prestraining. Reload response was correlated to defect generation and storage due to shock loading and compared with observations in other HCP metals such as Ti and Zr.

  6. Instructional Design and Directed Cognitive Processing.

    ERIC Educational Resources Information Center

    Bovy, Ruth Colvin

    This paper argues that the information processing model provides a promising basis on which to build a comprehensive theory of instruction. Characteristics of the major information processing constructs are outlined including attention, encoding and rehearsal, working memory, long term memory, retrieval, and metacognitive processes, and a unifying…

  7. Moral judgment reloaded: a moral dilemma validation study

    PubMed Central

    Christensen, Julia F.; Flexas, Albert; Calabrese, Margareta; Gut, Nadine K.; Gomila, Antoni

    2014-01-01

    We propose a revised set of moral dilemmas for studies on moral judgment. We selected a total of 46 moral dilemmas available in the literature and fine-tuned them in terms of four conceptual factors (Personal Force, Benefit Recipient, Evitability, and Intention) and methodological aspects of the dilemma formulation (word count, expression style, question formats) that have been shown to influence moral judgment. Second, we obtained normative codings of arousal and valence for each dilemma, showing that emotional arousal in response to moral dilemmas depends crucially on the factors Personal Force, Benefit Recipient, and Intentionality. Third, we validated the dilemma set, confirming that people's moral judgment is sensitive to all four conceptual factors and to their interactions. Results are discussed in the context of this field of research, also outlining the relevance of our RT effects for the Dual Process account of moral judgment. Finally, we suggest tentative theoretical avenues for future testing, particularly stressing the importance of the factor Intentionality in moral judgment. Additionally, given the importance of cross-cultural studies in the quest for universals in human moral cognition, we provide the new set of dilemmas in six languages (English, French, German, Spanish, Catalan, and Danish). The norming values provided here refer to the Spanish dilemma set. PMID:25071621

  8. SETI reloaded: Next generation radio telescopes, transients and cognitive computing

    NASA Astrophysics Data System (ADS)

    Garrett, Michael A.

    2015-08-01

    The Search for Extra-terrestrial Intelligence (SETI) using radio telescopes is an area of research that is now more than 50 years old. Thus far, both targeted and wide-area surveys have yet to detect artificial signals from intelligent civilisations. In this paper, I argue that the incidence of co-existing intelligent and communicating civilisations is probably small in the Milky Way. While this makes successful SETI searches a very difficult pursuit indeed, the huge impact of even a single detection requires us to continue the search. A substantial increase in the overall performance of radio telescopes (and in particular future wide-field instruments such as the Square Kilometre Array - SKA) provides renewed optimism in the field. Evidence for this is already to be seen in the success of SETI researchers in acquiring observations on some of the world's most sensitive radio telescope facilities via open, peer-reviewed processes. The increasing interest in the dynamic radio sky, and our ability to detect new and rapid transient phenomena such as Fast Radio Bursts (FRBs), is also greatly encouraging. While the nature of FRBs is not yet fully understood, I argue they are unlikely to be the signature of distant extra-terrestrial civilisations. As astronomers face a data avalanche on all sides, advances made in related areas such as advanced Big Data analytics and cognitive computing are crucial to enable serendipitous discoveries to be made. In any case, as the era of the SKA fast approaches, the prospects of a SETI detection have never been better.

  9. Automating the design process - Progress, problems, prospects, potential.

    NASA Technical Reports Server (NTRS)

    Heldenfels, R. R.

    1973-01-01

    The design process for large aerospace vehicles is discussed, with particular emphasis on structural design. Problems with current procedures are identified. Then, the contributions possible from automating the design process (defined as the best combination of men and computers) are considered. Progress toward automated design in the aerospace and other communities is reviewed, including NASA studies of the potential development of Integrated Programs for Aerospace-Vehicle Design (IPAD). The need for and suggested directions of future research on the design process, both technical and social, are discussed. Although much progress has been made to exploit the computer in design, it is concluded that technology is available to begin using the computer to speed communications and management as well as calculations in the design process and thus build man-computer teams that can design better, faster and cheaper.

  10. Study on Product Innovative Design Process Driven by Ideal Solution

    NASA Astrophysics Data System (ADS)

    Zhang, Fuying; Lu, Ximei; Wang, Ping; Liu, Hui

    Product innovative design in companies today relies heavily on individual members’ experience and creative ideation, as well as their skills in integrating creativity and innovation tools with design methods agilely. Creative ideation and inventive idea generation are two crucial stages in the product innovative design process. The ideal solution is the desired final idea for a given problem, and the target that product design strives to reach. In this paper, a product innovative design process driven by the ideal solution is proposed. This design process encourages designers to overcome their psychological inertia and to foster creativity in a systematic way for acquiring breakthrough creative and innovative solutions in a narrowing sphere of solution-seeking, and it leads rapidly to effective product innovative design. A case study example is also presented to illustrate the effectiveness of the proposed design process.

  11. Designing Educative Curriculum Materials: A Theoretically and Empirically Driven Process

    ERIC Educational Resources Information Center

    Davis, Elizabeth A.; Palincsar, Annemarie Sullivan; Arias, Anna Maria; Bismack, Amber Schultz; Marulis, Loren M.; Iwashyna, Stefanie K.

    2014-01-01

    In this article, the authors argue for a design process in the development of educative curriculum materials that is theoretically and empirically driven. Using a design-based research approach, they describe their design process for incorporating educative features intended to promote teacher learning into existing, high-quality curriculum…

  12. VCM Process Design: An ABET 2000 Fully Compliant Project

    ERIC Educational Resources Information Center

    Benyahia, Farid

    2005-01-01

    A long experience in undergraduate vinyl chloride monomer (VCM) process design projects is shared in this paper. The VCM process design is shown to be fully compliant with ABET 2000 criteria by virtue of its abundance in chemical engineering principles, integration of interpersonal and interdisciplinary skills in design, safety, economics, and…

  13. Learning Objects: A User-Centered Design Process

    ERIC Educational Resources Information Center

    Branon, Rovy F., III

    2011-01-01

    Design research systematically creates or improves processes, products, and programs through an iterative progression connecting practice and theory (Reinking, 2008; van den Akker, 2006). Developing new instructional systems design (ISD) processes through design research is necessary when new technologies emerge that challenge existing practices…

  14. Information Architecture without Internal Theory: An Inductive Design Process.

    ERIC Educational Resources Information Center

    Haverty, Marsha

    2002-01-01

    Suggests that information architecture design is primarily an inductive process, partly because it lacks internal theory and partly because it is an activity that supports emergent phenomena (user experiences) from basic design components. Suggests a resemblance to Constructive Induction, a design process that locates the best representational…

  15. Optimality criteria design and stress constraint processing

    NASA Technical Reports Server (NTRS)

    Levy, R.

    1982-01-01

    Methods for pre-screening stress constraints into either primary or side-constraint categories are reviewed; a projection method, which is developed from prior cycle stress resultant history, is introduced as an additional screening parameter. Stress resultant projections are also employed to modify the traditional stress-ratio, side-constraint boundary. A special application of structural modification reanalysis is applied to the critical stress constraints to provide feasible designs that are preferable to those obtained by conventional scaling. Sample problem executions show relatively short run times and fewer design cycle iterations to achieve low structural weights; those attained are comparable to the minimum values developed elsewhere.
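
    As a rough illustration of the resizing implied by the stress-ratio side constraints discussed above, the following minimal Python sketch applies the classical fully stressed design (stress-ratio) rule for one cycle; the member areas, stresses, and allowables are hypothetical, and the sketch is not the paper's projection method.

      # One cycle of the classical stress-ratio resizing rule underlying the
      # stress-ratio side-constraint boundary. All member data are hypothetical.
      def stress_ratio_resize(areas, stresses, allowables, min_area=1e-4):
          """Return updated areas A_new = A_old * |sigma| / sigma_allow, clipped at a minimum gauge."""
          new_areas = []
          for a, s, s_allow in zip(areas, stresses, allowables):
              ratio = abs(s) / s_allow        # >1 overstressed, <1 understressed
              new_areas.append(max(a * ratio, min_area))
          return new_areas

      # Example: three bar members (areas in m^2, stresses and allowables in MPa)
      print(stress_ratio_resize([2.0e-3, 1.5e-3, 1.0e-3], [180.0, 90.0, 260.0], [250.0, 250.0, 250.0]))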

  16. Erlang Behaviours: Programming with Process Design Patterns

    NASA Astrophysics Data System (ADS)

    Cesarini, Francesco; Thompson, Simon

    Erlang processes run independently of each other, each using separate memory and communicating with each other by message passing. These processes, while executing different code, do so following a number of common patterns. By examining different examples of Erlang-style concurrency in client/server architectures, we identify the generic and specific parts of the code and extract the generic code to form a process skeleton. In Erlang, the most commonly used patterns have been implemented in library modules, commonly referred to as OTP behaviours. They contain the generic code framework for concurrency and error handling, simplifying the complexity of concurrent programming and protecting the developer from many common pitfalls.
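
    To make the "process skeleton" idea concrete outside Erlang, the following minimal Python sketch separates a generic receive/dispatch loop from server-specific callbacks; the callback names (init, handle_call) are illustrative analogues and not OTP's actual API.

      # Generic server loop written once; only the callbacks change per server.
      import threading, queue

      class ProcessSkeleton:
          def __init__(self, init, handle_call):
              self._inbox = queue.Queue()
              self._state = init()
              self._handle_call = handle_call
              threading.Thread(target=self._loop, daemon=True).start()

          def _loop(self):
              while True:
                  reply_box, request = self._inbox.get()      # receive a message
                  reply, self._state = self._handle_call(request, self._state)
                  reply_box.put(reply)                        # send the reply back

          def call(self, request):
              reply_box = queue.Queue(maxsize=1)
              self._inbox.put((reply_box, request))
              return reply_box.get()

      # A specific server (a counter) is expressed only through its callbacks.
      counter = ProcessSkeleton(init=lambda: 0,
                                handle_call=lambda req, n: (n + 1, n + 1) if req == "bump" else (n, n))
      print(counter.call("bump"), counter.call("bump"))       # -> 1 2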

  17. Design Science Research for Business Process Design: Organizational Transition at Intersport Sweden

    NASA Astrophysics Data System (ADS)

    Lind, Mikael; Rudmark, Daniel; Seigerroth, Ulf

    Business processes need to be aligned with business strategies. This paper elaborates on experiences from a business process design effort in an action research project performed at Intersport Sweden. The purpose of this project was to create a solid base for taking the retail chain Intersport into a new organizational state where the new process design is aligned with strategic goals. Although business process modeling is concerned with creating artifacts, information systems design science research has traditionally had little impact on research on business process models. In this paper, we address the question of how design science research can contribute to business process design. Three heuristic guidelines for creating organizational commitment and strategic alignment in process design are presented. The guidelines are derived from the successful actions taken in the research project. The development of these guidelines is used as a basis to reflect upon the contribution of design science research to business process design.

  18. Processes and Knowledge in Designing Instruction.

    ERIC Educational Resources Information Center

    Greeno, James G.; And Others

    Results from a study of problem solving in the domain of instructional design are presented. Subjects were eight teacher trainees who were recent graduates of or were enrolled in the Stanford Teacher Education Program at Stanford University (California). Subjects studied a computer-based tutorial--the VST2000--about a fictitious vehicle. The…

  19. Biochemical Engineering. Part II: Process Design

    ERIC Educational Resources Information Center

    Atkinson, B.

    1972-01-01

    Describes types of industrial techniques involving biochemical products, specifying the advantages and disadvantages of batch and continuous processes, and contrasting biochemical and chemical engineering. See SE 506 318 for Part I. (AL)

  20. Understanding the Processes behind Student Designing: Cases from Singapore

    ERIC Educational Resources Information Center

    Lim, Susan Siok Hiang; Lim-Ratnam, Christina; Atencio, Matthew

    2013-01-01

    A common perception of designing is that it represents a highly complex activity that is manageable by only a few. However it has also been argued that all individuals are innately capable of designing. Taking up this latter view, we explored the processes behind student designing in the context of Design and Technology (D&T), a subject taught at…

  1. Reducing Design Cycle Time and Cost Through Process Resequencing

    NASA Technical Reports Server (NTRS)

    Rogers, James L.

    2004-01-01

    In today's competitive environment, companies are under enormous pressure to reduce the time and cost of their design cycle. One method for reducing both time and cost is to develop an understanding of the flow of the design processes and the effects of the iterative subcycles that are found in complex design projects. Once these aspects are understood, the design manager can make decisions that take advantage of decomposition, concurrent engineering, and parallel processing techniques to reduce the total time and the total cost of the design cycle. One software tool that can aid in this decision-making process is the Design Manager's Aid for Intelligent Decomposition (DeMAID). The DeMAID software minimizes the feedback couplings that create iterative subcycles, groups processes into iterative subcycles, and decomposes the subcycles into a hierarchical structure. The real benefits of producing the best design in the least time and at a minimum cost are obtained from sequencing the processes in the subcycles.
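
    As a toy illustration of the resequencing idea that DeMAID automates (not DeMAID's own algorithm), the following Python sketch counts feedback couplings — dependencies on tasks that appear later in the sequence — and picks the ordering that minimizes them for a small, hypothetical set of design tasks.

      from itertools import permutations

      # deps[task] = set of tasks whose output this task needs (hypothetical example)
      deps = {"aero": {"weights"}, "weights": {"structures"},
              "structures": {"aero"}, "costing": {"aero", "weights", "structures"}}

      def feedback_count(order):
          pos = {t: i for i, t in enumerate(order)}
          return sum(1 for t in order for d in deps[t] if pos[d] > pos[t])

      best = min(permutations(deps), key=feedback_count)
      print(best, feedback_count(best))   # the aero/weights/structures cycle leaves one unavoidable feedback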

  2. NASA Now: Engineering Design Process: Hubble Space Telescope

    NASA Video Gallery

    In this episode of NASA Now, NASA engineer Russ Werneth discusses the continuous nature of the engineering design process and shares what it was like to design and plan the spacewalks that were key...

  3. Integrating Thermal Tools Into the Mechanical Design Process

    NASA Technical Reports Server (NTRS)

    Tsuyuki, Glenn T.; Siebes, Georg; Novak, Keith S.; Kinsella, Gary M.

    1999-01-01

    The intent of mechanical design is to deliver a hardware product that meets or exceeds customer expectations, while reducing cycle time and cost. To this end, an integrated mechanical design process enables the idea of parallel development (concurrent engineering). This represents a shift from the traditional mechanical design process. With such a concurrent process, there are significant issues that have to be identified and addressed before re-engineering the mechanical design process to facilitate concurrent engineering. These issues also assist in the integration and re-engineering of the thermal design sub-process since it resides within the entire mechanical design process. With these issues in mind, a thermal design sub-process can be re-defined in a manner that has a higher probability of acceptance, thus enabling an integrated mechanical design process. However, the actual implementation is not always problem-free. Experience in applying the thermal design sub-process to actual situations provides the evidence for improvement, but more importantly, for judging the viability and feasibility of the sub-process.

  4. The application of image processing software: Photoshop in environmental design

    NASA Astrophysics Data System (ADS)

    Dong, Baohua; Zhang, Chunmi; Zhuo, Chen

    2011-02-01

    In the process of environmental design and creation, the design sketch holds a very important position in that it not only illuminates the design's idea and concept but also shows the design's visual effects to the client. In the field of environmental design, computer-aided design has brought significant improvements. Many types of specialized design software for environmental performance drawings and post-production artistic processing have been implemented. Additionally, with the use of this software, working efficiency has greatly increased and drawings have become more specific and more specialized. By analyzing the application of the Photoshop image processing software in environmental design, and by comparing and contrasting traditional hand drawing with drawing supported by modern technology, this essay will further explore the way for computer technology to play a bigger role in environmental design.

  5. INTEGRATION OF SYSTEMS ENGINEERING AND PROCESS INTENSIFICATION IN THE DESIGN OF PROCESSES FOR UTILIZING BIOBASED GLYCEROL

    EPA Science Inventory

    The expected results include an integrated process and mechanical design including a fabrication plan for the glycerol dehydration reactor, comprehensive heat and material balance, environmental impact assessment and comprehensive safety review. The resulting process design w...

  6. Algorithmic Processes for Increasing Design Efficiency.

    ERIC Educational Resources Information Center

    Terrell, William R.

    1983-01-01

    Discusses the role of algorithmic processes as a supplementary method for producing cost-effective and efficient instructional materials. Examines three approaches to problem solving in the context of developing training materials for the Naval Training Command: application of algorithms, quasi-algorithms, and heuristics. (EAO)

  7. Adding Users to the Website Design Process

    ERIC Educational Resources Information Center

    Tomeo, Megan L.

    2012-01-01

    Alden Library began redesigning its website over a year ago. Throughout the redesign process the students, faculty, and staff that make up the user base were added to the conversation by utilizing several usability test methods. This article focuses on the usability testing conducted at Alden Library and delves into future usability testing, which…

  8. A design optimization process for Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Chamberlain, Robert G.; Fox, George; Duquette, William H.

    1990-01-01

    The Space Station Freedom Program is used to develop and implement a process for design optimization. Because the relative worth of arbitrary design concepts cannot be assessed directly, comparisons must be based on designs that provide the same performance from the point of view of station users; such designs can be compared in terms of life cycle cost. Since the technology required to produce a space station is widely dispersed, a decentralized optimization process is essential. A formulation of the optimization process is provided and the mathematical models designed to facilitate its implementation are described.
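
    As a minimal numerical illustration of comparing equal-performance design options by life cycle cost, as described above, the following sketch discounts annual operations cost over the program life; all cost figures and the discount rate are hypothetical.

      def life_cycle_cost(development, annual_operations, years, discount_rate=0.07):
          ops = sum(annual_operations / (1 + discount_rate) ** t for t in range(1, years + 1))
          return development + ops

      options = {"Option A": life_cycle_cost(4.0e9, 0.9e9, years=15),
                 "Option B": life_cycle_cost(5.2e9, 0.6e9, years=15)}
      for name, lcc in sorted(options.items(), key=lambda kv: kv[1]):
          print(f"{name}: {lcc / 1e9:.2f} billion dollars life cycle cost")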

  9. CAD tool environment for MEMS process design support

    NASA Astrophysics Data System (ADS)

    Schmidt, T.; Wagener, A.; Popp, J.; Hahn, K.; Bruck, R.

    2005-07-01

    MEMS fabrication processes are characterized by numerous usable process steps, materials and effects for fabricating the intended microstructure. Up to now, CAD support in this domain has concentrated mainly on structural design (e.g. simulation programs on an FEM basis). These tools often assume fixed interfaces to the fabrication process, such as material parameters or design rules. Taking into account that MEMS design requires structural design (defining the lateral 2-dim shapes) concurrently with process design (responsible for the third dimension), it turns out that technology interfaces consisting only of sets of static data are no longer sufficient. For successful design flows in these areas it is necessary to incorporate a higher degree of process-related data. A broader interface between process configuration on the one side and the application design on the other side is needed. This paper proposes a novel approach: a process management system that allows the specification of processes for specific applications. The system is based on a dedicated database environment that is able to store and manage all process-related design constraints linked to the fabrication process data itself. The interdependencies between application-specific processes and all stages of the design flow will be discussed, and the complete software system PRINCE, meeting the requirements of this new approach, will be introduced. Based on a concurrent design methodology presented in the beginning of this paper, a system is presented that supports application-specific process design. The paper will highlight the incorporated tools and the present status of the software system. A complete configuration of an Si thin-film process example will demonstrate the usage of PRINCE.

  10. Jemboss reloaded.

    PubMed

    Mullan, Lisa

    2004-06-01

    Bioinformatics tools are freely available from websites all over the world. Often they are presented as web services, although there are many tools for download and use on a local machine. This tutorial section looks at Jemboss, a Java-based graphical user interface (GUI) for the EMBOSS bioinformatics suite, which combines the advantages of both web service and downloaded software. PMID:15260898

  11. Rolling Reloaded

    ERIC Educational Resources Information Center

    Jones, Simon A.; Nieminen, John M.

    2008-01-01

    Not so long ago a new observation about rolling motion was described: for a rolling wheel, there is a set of points with instantaneous velocities directed at or away from the centre of the wheel; these points form a circle whose diameter connects the centre of the wheel to the wheel's point of contact with the ground (Sharma 1996 "Eur. J. Phys."…

  12. Launch Vehicle Design Process Description and Training Formulation

    NASA Technical Reports Server (NTRS)

    Atherton, James; Morris, Charles; Settle, Gray; Teal, Marion; Schuerer, Paul; Blair, James; Ryan, Robert; Schutzenhofer, Luke

    1999-01-01

    A primary NASA priority is to reduce the cost and improve the effectiveness of launching payloads into space. As a consequence, significant improvements are being sought in the effectiveness, cost, and schedule of the launch vehicle design process. In order to provide a basis for understanding and improving the current design process, a model has been developed for this complex, interactive process, as reported in the references. This model requires further expansion in some specific design functions. Also, a training course for less-experienced engineers is needed to provide understanding of the process, to provide guidance for its effective implementation, and to provide a basis for major improvements in launch vehicle design process technology. The objective of this activity is to expand the description of the design process to include all pertinent design functions, and to develop a detailed outline of a training course on the design process for launch vehicles for use in educating engineers whose experience with the process has been minimal. Building on a previously-developed partial design process description, parallel sections have been written for the Avionics Design Function, the Materials Design Function, and the Manufacturing Design Function. Upon inclusion of these results, the total process description will be released as a NASA TP. The design function sections herein include descriptions of the design function responsibilities, interfaces, interactive processes, decisions (gates), and tasks. Associated figures include design function planes, gates, and tasks, along with other pertinent graphics. Also included is an expanded discussion of how the design process is divided, or compartmentalized, into manageable parts to achieve efficient and effective design. A detailed outline for an intensive two-day course on the launch vehicle design process has been developed herein, and is available for further expansion. The course is in an interactive lecture

  13. MIDAS: a framework for integrated design and manufacturing process

    NASA Astrophysics Data System (ADS)

    Chung, Moon Jung; Kwon, Patrick; Pentland, Brian

    2000-10-01

    In this paper, we present the development of a framework for managing design and manufacturing processes in a distributed environment. The framework offers the following facilities: (1) representing complicated engineering design processes, (2) coordinating design activities and executing the process in a distributed environment, and (3) supporting collaborative design by sharing data and processes. Process flow graphs, which consist of tasks and the corresponding input and output data, are used to depict the engineering design process on a process modeling browser. The engineering activities in the represented processes can be executed in a distributed environment through the cockpit of the framework. Communication among the engineers involved, to support collaborative design, takes place on the collaborative design browser, with SML underlying data structures representing process information to make the framework extensible and platform-independent. The formal and flexible approach of the proposed framework to integrating the engineering design processes can also be effectively applied to coordinate concurrent engineering activities in a distributed environment.

  14. Ceramic processing: Experimental design and optimization

    NASA Technical Reports Server (NTRS)

    Weiser, Martin W.; Lauben, David N.; Madrid, Philip

    1992-01-01

    The objectives of this paper are to: (1) gain insight into the processing of ceramics and how green processing can affect the properties of ceramics; (2) investigate the technique of slip casting; (3) learn how heat treatment and temperature contribute to density, strength, and effects of under and over firing to ceramic properties; (4) experience some of the problems inherent in testing brittle materials and learn about the statistical nature of the strength of ceramics; (5) investigate orthogonal arrays as tools to examine the effect of many experimental parameters using a minimum number of experiments; (6) recognize appropriate uses for clay based ceramics; and (7) measure several different properties important to ceramic use and optimize them for a given application.

  15. Clutter suppression interferometry system design and processing

    NASA Astrophysics Data System (ADS)

    Knight, Chad; Deming, Ross; Gunther, Jake

    2015-05-01

    Clutter suppression interferometry (CSI) has received extensive attention due to its multi-modal capability to detect slow-moving targets, and concurrently form high-resolution synthetic aperture radar (SAR) images from the same data. The ability to continuously augment SAR images with geo-located ground moving target indicators (GMTI) provides valuable real-time situational awareness that is important for many applications. CSI can be accomplished with minimal hardware and processing resources. This makes CSI a natural candidate for applications where size, weight and power (SWaP) are constrained, such as unmanned aerial vehicles (UAVs) and small satellites. This paper will discuss the theory for optimal CSI system configuration, focusing on a sparse time-varying transmit and receive array manifold due to SWaP considerations. The underlying signal model will be presented and discussed, as well as the potential benefits that a sparse time-varying transmit-receive manifold provides. The high-level processing objectives will be detailed and examined on simulated data. Then actual SAR data collected with the Space Dynamics Laboratory (SDL) FlexSAR radar system will be analyzed. The simulated data, contrasted with actual SAR data, help illustrate the challenges and limitations found in practice versus theory. A novel approach incorporating sparse signal processing is discussed that has the potential to reduce false-alarm rates and improve detections.

  16. H-Coal process and plant design

    DOEpatents

    Kydd, Paul H.; Chervenak, Michael C.; DeVaux, George R.

    1983-01-01

    A process for converting coal and other hydrocarbonaceous materials into useful and more valuable liquid products. The process comprises: feeding coal and/or other hydrocarbonaceous materials with a hydrogen-containing gas into an ebullated catalyst bed reactor; passing the reaction products from the reactor to a hot separator where the vaporous and distillate products are separated from the residuals; introducing the vaporous and distillate products from the separator directly into a hydrotreater where they are further hydrogenated; passing the residuals from the separator successively through flash vessels at reduced pressures where distillates are flashed off and combined with the vaporous and distillate products to be hydrogenated; transferring the unseparated residuals to a solids concentrating and removal means to remove a substantial portion of solids therefrom and recycling the remaining residual oil to the reactor; and passing the hydrogenated vaporous and distillate products to an atmospheric fractionator where the combined products are fractionated into separate valuable liquid products. The hydrogen-containing gas is generated from sources within the process.

  17. DESIGNING ENVIRONMENTALLY FRIENDLY CHEMICAL PROCESSES WITH FUGITIVE AND OPEN EMISSIONS

    EPA Science Inventory

    Designing a chemical process normally includes aspects of economic and environmental disciplines. In this work we describe methods to quickly and easily evaluate the economics and potential environmental impacts of a process, with the hydrodealkylation of toluene as an example. ...

  18. Experimental design for improved ceramic processing, emphasizing the Taguchi Method

    SciTech Connect

    Weiser, M.W. . Mechanical Engineering Dept.); Fong, K.B. )

    1993-12-01

    Ceramic processing often requires substantial experimentation to produce acceptable product quality and performance. This is a consequence of ceramic processes depending upon a multitude of factors, some of which can be controlled and others that are beyond the control of the manufacturer. Statistical design of experiments is a procedure that allows quick, economical, and accurate evaluation of processes and products that depend upon several variables. Designed experiments are sets of tests in which the variables are adjusted methodically. A well-designed experiment yields unambiguous results at minimal cost. A poorly designed experiment may reveal little information of value even with complex analysis, wasting valuable time and resources. This article will review the most common experimental designs, including both nonstatistical designs and the much more powerful statistical experimental designs. The Taguchi Method developed by Genichi Taguchi will be discussed in some detail. The Taguchi Method, based upon fractional factorial experiments, is a powerful tool for optimizing product and process performance.
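
    As a minimal, hypothetical sketch of the kind of orthogonal-array (fractional factorial) analysis the Taguchi Method builds on, the following Python example evaluates main effects from an L4 (2^3) array; the factor names and responses are illustrative only.

      # L4 orthogonal array: every pair of columns contains all four level combinations.
      L4 = [(-1, -1, -1), (-1, +1, +1), (+1, -1, +1), (+1, +1, -1)]
      factors = ["sinter_temp", "binder_content", "press_pressure"]
      response = [310.0, 355.0, 342.0, 298.0]   # e.g. flexural strength (MPa) per run

      for j, name in enumerate(factors):
          high = [r for row, r in zip(L4, response) if row[j] == +1]
          low = [r for row, r in zip(L4, response) if row[j] == -1]
          print(f"{name}: main effect = {sum(high)/len(high) - sum(low)/len(low):+.1f} MPa")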

  19. 9 CFR 325.18 - Diverting of shipments, breaking of seals, and reloading by carrier in emergency; reporting to...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 9 Animals and Animal Products 2 2011-01-01 2011-01-01 false Diverting of shipments, breaking of... CERTIFICATION TRANSPORTATION § 325.18 Diverting of shipments, breaking of seals, and reloading by carrier in emergency; reporting to Regional Director. (a) Shipments of inspected and passed product that bear...

  20. Development of Integrated Programs for Aerospace-vehicle design (IPAD): Reference design process

    NASA Technical Reports Server (NTRS)

    Meyer, D. D.

    1979-01-01

    The airplane design process and its interfaces with manufacturing and customer operations are documented to be used as criteria for the development of integrated programs for the analysis, design, and testing of aerospace vehicles. Topics cover: design process management, general purpose support requirements, design networks, and technical program elements. Design activity sequences are given for both supersonic and subsonic commercial transports, naval hydrofoils, and military aircraft.

  1. New Vistas in Chemical Product and Process Design.

    PubMed

    Zhang, Lei; Babi, Deenesh K; Gani, Rafiqul

    2016-06-01

    Design of chemicals-based products is broadly classified into those that are process centered and those that are product centered. In this article, the designs of both classes of products are reviewed from a process systems point of view; developments related to the design of the chemical product, its corresponding process, and its integration are highlighted. Although significant advances have been made in the development of systematic model-based techniques for process design (also for optimization, operation, and control), much work is needed to reach the same level for product design. Timeline diagrams illustrating key contributions in product design, process design, and integrated product-process design are presented. The search for novel, innovative, and sustainable solutions must be matched by consideration of issues related to the multidisciplinary nature of problems, the lack of data needed for model development, solution strategies that incorporate multiscale options, and reliability versus predictive power. The need for an integrated model-experiment-based design approach is discussed together with benefits of employing a systematic computer-aided framework with built-in design templates. PMID:27088667

  2. Process Design Manual for Land Treatment of Municipal Wastewater.

    ERIC Educational Resources Information Center

    Crites, R.; And Others

    This manual presents a procedure for the design of land treatment systems. Slow rate, rapid infiltration, and overland flow processes for the treatment of municipal wastewaters are given emphasis. The basic unit operations and unit processes are discussed in detail, and the design concepts and criteria are presented. The manual includes design…

  3. Debating Professional Designations for Evaluators: Reflections on the Canadian Process

    ERIC Educational Resources Information Center

    Cousins, J. Bradley; Cullen, Jim; Malik, Sumbal; Maicher, Brigitte

    2009-01-01

    This paper provides a reflective account of a consultation process on professional designations for evaluators initiated and coordinated by the Canadian Evaluation Society (CES). Described are: (1) the forces leading CES to generate discussion and debate about professional designations for Canadian evaluators, (2) the process of developing and…

  4. Laser processing with specially designed laser beam

    NASA Astrophysics Data System (ADS)

    Asratyan, A. A.; Bulychev, N. A.; Feofanov, I. N.; Kazaryan, M. A.; Krasovskii, V. I.; Lyabin, N. A.; Pogosyan, L. A.; Sachkov, V. I.; Zakharyan, R. A.

    2016-04-01

    The possibility of using laser systems to form beams with special spatial configurations has been studied. The laser systems applied had a self-conjugate cavity based on the elements of copper vapor lasers (LT-5Cu, LT-10Cu, LT-30Cu) with an average power of 5, 10, or 30 W. The active elements were pumped by current pulses of duration 80-100 ns. The duration of laser generation pulses was up to 25 ns. The generator unit included an unstable cavity, where one reflector was a special mirror with a reflecting coating. Various original optical schemes used were capable of exploring spatial configurations and energy characteristics of output laser beams in their interaction with micro- and nanoparticles fabricated from various materials. In these experiments, the beam dimensions of the obtained zones varied from 0.3 to 5 µm, which is comparable with the minimum permissible dimensions determined by the optical elements applied. This method is useful in transforming a large amount of information at the laser pulse repetition rate of 10-30 kHz. It was possible to realize the high-precision micromachining and microfabrication of microscale details by direct writing, cutting and drilling (with the cutting width and through-hole diameters ranging from 3 to 100 µm) and produce microscale, deep, intricate and narrow grooves on substrate surfaces of metals and nonmetal materials. This system is used for producing high-quality microscale details without moving the object under treatment. It can also be used for microcutting and microdrilling in a variety of metals such as molybdenum, copper and stainless steel, with a thickness of up to 300 µm, and in nonmetals such as silicon, sapphire and diamond with a thickness ranging from 10 µm to 1 mm with different thermal parameters and specially designed laser beam.

  5. Solid propellant processing factor in rocket motor design

    NASA Technical Reports Server (NTRS)

    1971-01-01

    The ways in which propellant processing is affected by choices made in designing rocket engines are described. Tradeoff studies, design proof or scaleup studies, and special design features required to obtain high product quality and optimum processing costs are presented. Processing is considered to include the operational steps involved with the lining and preparation of the motor case for the grain; the procurement of propellant raw materials; and propellant mixing, casting or extrusion, curing, machining, and finishing. The design criteria, recommended practices, and propellant formulations are included.

  6. The concepts of energy, environment, and cost for process design

    SciTech Connect

    Abu-Khader, M.M.; Speight, J.G.

    2004-05-01

    The process industries (specifically, energy and chemicals) are characterized by a variety of reactors and reactions to bring about successful process operations. The design of energy-related and chemical processes and their evolution is a complex process that determines the competitiveness of these industries, as well as their environmental impact. Thus, we have developed an Enviro-Energy Concept designed to facilitate sustainable industrial development. The Complete Onion Model represents a complete methodology for chemical process design and illustrates all of the requirements to achieve the best possible design within the accepted environmental standards. Currently, NOx emissions from industrial processes continue to receive maximum attention; therefore, the problem of NOx emissions from industrial sources such as power stations and nitric acid plants is considered. Selective Catalytic Reduction (SCR) is one of the most promising and effective commercial technologies and is considered the Best Available Control Technology (BACT) for NOx reduction. The NOx emissions problem is solved either by modifying the chemical process design and/or by installing an end-of-pipe technology. The degree of integration between the process design and the installed technology plays a critical role in the capital cost evaluation. Therefore, integrating process units and then optimizing the design has a vital effect on the total cost. Both the environmental regulations and the cost evaluation are the boundary constraints of the optimum solution.

  7. Design of smart imagers with image processing

    NASA Astrophysics Data System (ADS)

    Serova, Evgeniya N.; Shiryaev, Yury A.; Udovichenko, Anton O.

    2005-06-01

    This paper is devoted to the creation of novel CMOS APS imagers with focal-plane parallel image preprocessing for smart technical vision and electro-optical systems based on neural implementation. Using an analysis of the main features of biological vision, the desired artificial vision characteristics are defined, and the image processing tasks that can be implemented by smart focal-plane preprocessing CMOS imagers with neural networks are determined. The eventual results are important for medicine and aerospace ecological monitoring, and for assessing the complexity of, and approaches to, CMOS APS neural net implementation. To reduce real image preprocessing time, special methods based on edge detection and neighboring frame subtraction will be considered and simulated. To select optimal methods and mathematical operators for edge detection, various medical, technical and aerospace images will be tested. An important research direction will be devoted to the analogue implementation of the main preprocessing operations (addition, subtraction, neighboring frame subtraction, modulus, and edge detection of pixel signals) in the focal plane of CMOS APS imagers. We present the following results: an edge detection algorithm suitable for analog realization, and patented focal-plane circuits for analog image preprocessing (edge detection and motion detection).
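
    As a rough digital stand-in for two of the preprocessing operations named above (neighboring frame subtraction and edge detection), the following NumPy sketch computes a motion mask and a gradient-based edge map; the operators and thresholds are illustrative and do not represent the paper's analogue focal-plane circuits.

      import numpy as np

      def frame_difference(prev_frame, curr_frame, threshold=16):
          """Binary motion mask from the absolute difference of consecutive frames."""
          diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
          return (diff > threshold).astype(np.uint8)

      def edge_map(frame, threshold=32):
          """Edge map from horizontal and vertical first differences (gradient magnitude)."""
          f = frame.astype(np.int16)
          gx = np.abs(np.diff(f, axis=1, prepend=f[:, :1]))
          gy = np.abs(np.diff(f, axis=0, prepend=f[:1, :]))
          return ((gx + gy) > threshold).astype(np.uint8)

      rng = np.random.default_rng(0)
      prev = rng.integers(0, 255, (64, 64), dtype=np.uint8)
      curr = prev.copy()
      curr[20:30, 20:30] = 255            # a bright patch appears between frames
      print(frame_difference(prev, curr).sum(), edge_map(curr).sum())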

  8. XML-based product information processing method for product design

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen Yu

    2012-01-01

    The design knowledge of modern mechatronic products centers on information processing in knowledge-intensive engineering; product design innovation is therefore essentially an innovation in knowledge and information processing. Following an analysis of the role of mechatronic product design knowledge and of information management features, a unified model of an XML-based product information processing method is proposed. The information processing model of product design includes functional knowledge, structural knowledge, and their relationships. XML-based expressions of product function elements, product structure elements, and the mapping relationship between function and structure are proposed. The information processing of a parallel friction roller is given as an example, which demonstrates that this method is clearly helpful for knowledge-based design systems and product innovation.
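
    As a minimal sketch of what an XML expression of function elements, structure elements, and their mapping could look like, the following example uses Python's standard xml.etree module; the element and attribute names are assumptions, not the paper's actual schema.

      import xml.etree.ElementTree as ET

      product = ET.Element("product", name="parallel_friction_roller")
      functions = ET.SubElement(product, "functions")
      structures = ET.SubElement(product, "structures")
      mappings = ET.SubElement(product, "mappings")

      ET.SubElement(functions, "function", id="F1", description="transmit torque")
      ET.SubElement(structures, "structure", id="S1", description="roller pair")
      ET.SubElement(mappings, "map", function="F1", structure="S1")

      print(ET.tostring(product, encoding="unicode"))

      # Reading the function-to-structure mapping back out, e.g. for a knowledge-based design system:
      for m in product.findall("./mappings/map"):
          print(m.get("function"), "->", m.get("structure"))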

  9. XML-based product information processing method for product design

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen Yu

    2011-12-01

    The design knowledge of modern mechatronic products centers on information processing in knowledge-intensive engineering; product design innovation is therefore essentially an innovation in knowledge and information processing. Following an analysis of the role of mechatronic product design knowledge and of information management features, a unified model of an XML-based product information processing method is proposed. The information processing model of product design includes functional knowledge, structural knowledge, and their relationships. XML-based expressions of product function elements, product structure elements, and the mapping relationship between function and structure are proposed. The information processing of a parallel friction roller is given as an example, which demonstrates that this method is clearly helpful for knowledge-based design systems and product innovation.

  10. 32nm design rule and process exploration flow

    NASA Astrophysics Data System (ADS)

    Zhang, Yunqiang; Cobb, Jonathan; Yang, Amy; Li, Ji; Lucas, Kevin; Sethi, Satyendra

    2008-10-01

    Semiconductor manufacturers spend hundreds of millions of dollars and years of development time to create a new manufacturing process and to design frontrunner products to work on the new process. A considerable percentage of this large investment is aimed at producing the process design rules and related lithography technology to pattern the new products successfully. Significant additional cost and time are needed in both process and design development if the design rules or lithography strategy must be modified. Therefore, early and accurate prediction of both process design rules and lithography options is necessary for minimizing cost and timing in semiconductor development. This paper describes a methodology to determine the optimum design rules and lithography conditions with high accuracy early in the development lifecycle. We present results from the 32nm logic node, but the methodology can be extended to the 22nm node or any other node. This work involves: automated generation of extended realistic logic test layouts utilizing programmed test structures for a variety of design rules; determining a range of optical illumination and process conditions to test for each critical design layer; using these illumination conditions to create an extrapolatable process-window OPC model which is matched to rigorous TCAD lithography focus-exposure full chemically amplified resist models; creating reticle enhancement technique (RET) and OPC recipes which are flexible enough to be used over a variety of design rule and illumination conditions; and OPC verification to find, categorize and report all patterning issues found in the different design and illumination variations. In this work we describe in detail the individual steps in the methodology, and provide results of its use for 32nm node design rule and process optimization.

  11. Integration of MGDS design into the licensing process

    SciTech Connect

    1997-12-01

    This paper presents an overview of how the Mined Geologic Disposal System (MGDS) design for a potential repository is integrated into the licensing process. The integration process employs a two-fold approach: (1) ensure that the MGDS design complies with applicable Nuclear Regulatory Commission (NRC) licensing requirements, and (2) ensure that the MGDS design is appropriately reflected in a license application that is acceptable to the NRC for performing acceptance and compliance reviews.

  12. Defining process design space for monoclonal antibody cell culture.

    PubMed

    Abu-Absi, Susan Fugett; Yang, LiYing; Thompson, Patrick; Jiang, Canping; Kandula, Sunitha; Schilling, Bernhard; Shukla, Abhinav A

    2010-08-15

    The concept of design space has been taking root as a foundation of in-process control strategies for biopharmaceutical manufacturing processes. During mapping of the process design space, the multidimensional combination of operational variables is studied to quantify the impact on process performance in terms of productivity and product quality. An efficient methodology to map the design space for a monoclonal antibody cell culture process is described. A failure modes and effects analysis (FMEA) was used as the basis for the process characterization exercise. This was followed by an integrated study of the inoculum stage of the process, which includes progressive shake flask and seed bioreactor steps. The operating conditions for the seed bioreactor were studied in an integrated fashion with the production bioreactor using a two-stage design of experiments (DOE) methodology to enable optimization of operating conditions. A two-level Resolution IV design was followed by a central composite design (CCD). These experiments enabled identification of the edge of failure and classification of the operational parameters as non-key, key or critical. In addition, the models generated from the data provide further insight into balancing productivity of the cell culture process with product quality considerations. Finally, process- and product-related impurity clearance was evaluated by studies linking the upstream process with downstream purification. Production bioreactor parameters that directly influence antibody charge variants and glycosylation in CHO systems were identified. PMID:20589669
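
    As a minimal sketch of the design-of-experiments building blocks described above — a two-level factorial for screening, augmented with centre replicates and face-centred axial points toward a central composite design — the following Python example generates the run list; the factor names and ranges are hypothetical.

      from itertools import product

      factors = {"temperature_C": (35.0, 37.0), "pH": (6.8, 7.2), "seed_density": (0.3, 0.7)}

      def coded_to_real(coded):
          return {name: (lo + hi) / 2 + c * (hi - lo) / 2
                  for c, (name, (lo, hi)) in zip(coded, factors.items())}

      k = len(factors)
      factorial = list(product([-1, +1], repeat=k))                      # 2^k screening runs
      center = [(0,) * k] * 3                                            # centre-point replicates
      axial = [tuple(a if i == j else 0 for i in range(k))
               for j in range(k) for a in (-1, +1)]                      # face-centred star points

      for run in factorial + center + axial:
          print(coded_to_real(run))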

  13. Perspectives on the design of safer nanomaterials and manufacturing processes

    PubMed Central

    Geraci, Charles; Heidel, Donna; Sayes, Christie; Hodson, Laura; Schulte, Paul; Eastlake, Adrienne; Brenner, Sara

    2015-01-01

    A concerted effort is being made to insert Prevention through Design principles into discussions of sustainability, occupational safety and health, and green chemistry related to nanotechnology. Prevention through Design is a set of principles that includes solutions to design out potential hazards in nanomanufacturing including the design of nanomaterials, and strategies to eliminate exposures and minimize risks that may be related to the manufacturing processes and equipment at various stages of the lifecycle of an engineered nanomaterial. PMID:26435688

  14. Designing a process for executing projects under an international agreement

    NASA Technical Reports Server (NTRS)

    Mohan, S. N.

    2003-01-01

    Projects executed under an international agreement require special arrangements in order to operate within confines of regulations issued by the State Department and the Commerce Department. In order to communicate enterprise-level guidance and procedural information uniformly to projects based on interpretations that carry the weight of institutional authority, a process was developed. This paper provides a script for designing processes in general, using this particular process for context. While the context is incidental, the method described is applicable to any process in general. The paper will expound on novel features utilized for dissemination of the procedural details over the Internet following such process design.

  15. Evaluating two process scale chromatography column header designs using CFD.

    PubMed

    Johnson, Chris; Natarajan, Venkatesh; Antoniou, Chris

    2014-01-01

    Chromatography is an indispensable unit operation in the downstream processing of biomolecules. Scaling of chromatographic operations typically involves a significant increase in the column diameter. At this scale, the flow distribution within a packed bed could be severely affected by the distributor design in process scale columns. Different vendors offer process scale columns with varying design features. The effect of these design features on the flow distribution in packed beds and the resultant effect on column efficiency and cleanability needs to be properly understood in order to prevent unpleasant surprises on scale-up. Computational Fluid Dynamics (CFD) provides a cost-effective means to explore the effect of various distributor designs on process scale performance. In this work, we present a CFD tool that was developed and validated against experimental dye traces and tracer injections. Subsequently, the tool was employed to compare and contrast two commercially available header designs. PMID:24616438

  16. Concurrent materials and process selection in conceptual design

    SciTech Connect

    Kleban, S.D.

    1998-07-01

    The sequential manner in which materials and processes for a manufactured product are selected is inherently less than optimal. Designers' tendency to choose processes and materials with which they are familiar exacerbates this problem. A method for concurrent selection of materials and a joining process based on product requirements, using a knowledge-based, constraint-satisfaction approach, is presented.
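
    As a minimal sketch of concurrent selection by constraint filtering — the spirit, not the implementation, of the knowledge-based approach described above — the following Python example screens material and joining-process pairs against product requirements; the materials, processes, and limits are hypothetical.

      materials = {"Al 6061":   {"max_service_C": 150, "weldable": True},
                   "PEEK":      {"max_service_C": 250, "weldable": False},
                   "Ti-6Al-4V": {"max_service_C": 400, "weldable": True}}
      joining = {"TIG welding":      {"needs_weldable": True},
                 "adhesive bonding": {"needs_weldable": False, "max_service_C": 120}}
      requirements = {"service_temperature_C": 180}

      def feasible(mat, proc):
          ok = materials[mat]["max_service_C"] >= requirements["service_temperature_C"]
          ok &= materials[mat]["weldable"] or not joining[proc]["needs_weldable"]
          ok &= joining[proc].get("max_service_C", float("inf")) >= requirements["service_temperature_C"]
          return ok

      print([(m, p) for m in materials for p in joining if feasible(m, p)])
      # -> [('Ti-6Al-4V', 'TIG welding')] with these hypothetical numbers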

  17. Context-Aware Design for Process Flexibility and Adaptation

    ERIC Educational Resources Information Center

    Yao, Wen

    2012-01-01

    Today's organizations face continuous and unprecedented changes in their business environment. Traditional process design tools tend to be inflexible and can only support rigidly defined processes (e.g., order processing in the supply chain). This considerably restricts their real-world applications value, especially in the dynamic and…

  18. Dynamic Characteristics Analysis of Analogue Networks Design Process

    NASA Astrophysics Data System (ADS)

    Zemliak, Alexander M.

    The process of designing analogue circuits is formulated as a controlled dynamic system. For the analysis of such a system's properties, it is suggested to use the concept of a Lyapunov function for a dynamic system. Various forms of the Lyapunov function are suggested. Analyzing the behavior of the Lyapunov function and its first derivative allowed us to determine a significant correlation between this function's properties and the processor time used to design the circuit. Numerical results prove the possibility of forecasting the behavior of various design strategies and the processor time required, based on the properties of the Lyapunov function for the circuit design process.
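
    As a generic illustration (a sketch assuming a gradient-type design trajectory, not necessarily the authors' exact construction), a Lyapunov function for such a design process can be written as

      \dot{\mathbf{x}} = -\nabla F(\mathbf{x}), \qquad
      V(\mathbf{x}) = F(\mathbf{x}) - F(\mathbf{x}^{*}) \ge 0, \qquad
      \dot{V} = \nabla F(\mathbf{x}) \cdot \dot{\mathbf{x}}
              = -\lVert \nabla F(\mathbf{x}) \rVert^{2} \le 0,

    where F is the design objective, x the vector of circuit variables, and x* the final design point; monitoring how V and its first derivative decay along the trajectory is what allows the processor time of different design strategies to be forecast.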

  19. Integrating rock mechanics issues with repository design through design process principles and methodology

    SciTech Connect

    Bieniawski, Z.T.

    1996-04-01

    A good designer needs not only knowledge for designing (technical know-how that is used to generate alternative design solutions) but also must have knowledge about designing (appropriate principles and systematic methodology to follow). Concepts such as "design for manufacture" or "concurrent engineering" are widely used in industry. In the field of rock engineering, only limited attention has been paid to the design process, because the design of structures in rock masses presents unique challenges to designers as a result of the uncertainties inherent in the characterization of geologic media. However, a stage has now been reached where we are able to sufficiently characterize rock masses for engineering purposes and identify the rock mechanics issues involved, but are still lacking engineering design principles and methodology to maximize our design performance. This paper discusses the principles and methodology of the engineering design process directed at integrating site characterization activities with the design, construction and performance of an underground repository. Using the latest information from the Yucca Mountain Project on geology, rock mechanics and starter tunnel design, the current lack of integration is pointed out, and it is shown how rock mechanics issues can be effectively interwoven with repository design through a systematic design process methodology leading to improved repository performance. In essence, the design process is seen as the use of design principles within an integrating design methodology, leading to innovative problem solving. In particular, a new concept of "Design for Constructibility and Performance" is introduced. This is discussed with respect to ten rock mechanics issues identified for repository design and performance.

  20. SYSTEMATIC PROCEDURE FOR DESIGNING PROCESSES WITH MULTIPLE ENVIRONMENTAL OBJECTIVES

    EPA Science Inventory

    Evaluation of multiple objectives is very important in designing environmentally benign processes. It requires a systematic procedure for solving multiobjective decision-making problems, due to the complex nature of the problems, the need for complex assessments, and complicated ...

  1. Case study: Lockheed-Georgia Company integrated design process

    NASA Technical Reports Server (NTRS)

    Waldrop, C. T.

    1980-01-01

    A case study of the development of an Integrated Design Process is presented. The approach taken in preparing for the development of an integrated design process includes some of the IPAD approaches such as developing a Design Process Model, cataloging Technical Program Elements (TPE's), and examining data characteristics and interfaces between contiguous TPE's. The implementation plan is based on an incremental development of capabilities over a period of time with each step directed toward, and consistent with, the final architecture of a total integrated system. Because of time schedules and different computer hardware, this system will not be the same as the final IPAD release; however, many IPAD concepts will no doubt prove applicable as the best approach. Full advantage will be taken of the IPAD development experience. A scenario that could be typical for many companies, even outside the aerospace industry, in developing an integrated design process for an IPAD-type environment is represented.

  2. Using GREENSCOPE for Sustainable Process Design: An Educational Opportunity

    EPA Science Inventory

    Increasing sustainability can be approached through the education of those who design, construct, and operate facilities. As chemical engineers learn elements of process systems engineering, they can be introduced to sustainability concepts. The EPA’s GREENSCOPE methodology and...

  3. An investigation of radiometer design using digital processing techniques

    NASA Technical Reports Server (NTRS)

    Lawrence, R. W.

    1981-01-01

    The use of digital signal processing techniques in Dicke switching radiometer design was investigated. The general approach was to develop an analytical model of the existing analog radiometer and identify factors which adversely affect its performance. A digital processor was then proposed to verify the feasibility of using digital techniques to minimize these adverse effects and improve the radiometer performance. Analyses and preliminary test results comparing the digital and analog processing approaches to radiometer design are presented.
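
    As a rough numerical stand-in for the digital processing idea (not the paper's actual processor design), the following NumPy sketch alternately samples an antenna and a reference load, as in Dicke switching, and differences the averages so that slow gain fluctuations largely cancel; the temperatures, noise level, and gain drift are hypothetical.

      import numpy as np

      rng = np.random.default_rng(1)
      n_cycles, samples_per_half = 500, 32
      T_antenna, T_reference, T_receiver = 120.0, 300.0, 400.0   # kelvin

      estimates = []
      for k in range(n_cycles):
          gain = 1.0 + 0.05 * np.sin(2 * np.pi * k / 200)        # slow gain fluctuation
          ant = gain * (T_antenna + T_receiver + 5.0 * rng.standard_normal(samples_per_half))
          ref = gain * (T_reference + T_receiver + 5.0 * rng.standard_normal(samples_per_half))
          estimates.append(ant.mean() - ref.mean())              # proportional to T_antenna - T_reference

      print(np.mean(estimates))   # close to (mean gain) * (T_antenna - T_reference), about -180 K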

  4. PROCESS DESIGN MANUAL: LAND TREATMENT OF MUNICIPAL WASTEWATER

    EPA Science Inventory

    The manual presents a rational procedure for the design of land treatment systems. Slow rate, rapid infiltration, and overland flow processes for the treatment of municipal wastewaters are discussed in detail, and the design concepts and criteria are presented. A two-phased plann...

  5. Relating Right Brain Studies to the Design Process.

    ERIC Educational Resources Information Center

    Hofland, John

    Intended for teachers of theatrical design who need to describe a design process for their students, this paper begins by giving a brief overview of recent research that has described the different functions of the right and left cerebral hemispheres. It then notes that although the left hemisphere tends to dominate the right hemisphere, it is the…

  6. METHODS FOR INTEGRATING ENVIRONMENTAL CONSIDERATIONS INTO CHEMICAL PROCESS DESIGN DECISIONS

    EPA Science Inventory

    The objective of this cooperative agreement was to postulate a means by which an engineer could routinely include environmental considerations in day-to-day conceptual design problems; a means that could easily integrate with existing design processes, and thus avoid massive retr...

  7. Risk Informed Design as Part of the Systems Engineering Process

    NASA Technical Reports Server (NTRS)

    Deckert, George

    2010-01-01

    This slide presentation reviews the importance of Risk Informed Design (RID) as an important feature of the systems engineering process. RID is based on the principle that risk is a design commodity such as mass, volume, cost or power. It also reviews Probabilistic Risk Assessment (PRA) as it is used in the product life cycle in the development of NASA's Constellation Program.

  8. Design, Control and in Situ Visualization of Gas Nitriding Processes

    PubMed Central

    Ratajski, Jerzy; Olik, Roman; Suszko, Tomasz; Dobrodziej, Jerzy; Michalski, Jerzy

    2010-01-01

    The article presents a complex system for the design, in situ visualization and control of a commonly used surface treatment process: the gas nitriding process. In the computer design conception, analytical mathematical models and artificial intelligence methods were used. As a result, it became possible to perform poly-optimization and poly-parametric simulations of the course of the process, combined with visualization of the changes in process parameter values as a function of time, and to predict the properties of nitrided layers. For in situ visualization of the growth of the nitrided layer, computer procedures were developed which correlate the direct and differential voltage-time runs of the process result sensor (a magnetic sensor) with the corresponding layer growth stage. These procedures make it possible to combine, during the process, the registered voltage-time runs with the models of the process. PMID:22315536

  9. Design, control and in situ visualization of gas nitriding processes.

    PubMed

    Ratajski, Jerzy; Olik, Roman; Suszko, Tomasz; Dobrodziej, Jerzy; Michalski, Jerzy

    2010-01-01

    The article presents a complex system for the design, in situ visualization and control of a commonly used surface treatment process: the gas nitriding process. In the computer design conception, analytical mathematical models and artificial intelligence methods were used. As a result, it became possible to perform poly-optimization and poly-parametric simulations of the course of the process, combined with visualization of the changes in process parameter values as a function of time, and to predict the properties of nitrided layers. For in situ visualization of the growth of the nitrided layer, computer procedures were developed which correlate the direct and differential voltage-time runs of the process result sensor (a magnetic sensor) with the corresponding layer growth stage. These procedures make it possible to combine, during the process, the registered voltage-time runs with the models of the process. PMID:22315536

  10. Sketching in Design Journals: An Analysis of Visual Representations in the Product Design Process

    ERIC Educational Resources Information Center

    Lau, Kimberly; Oehlberg, Lora; Agogino, Alice

    2009-01-01

    This paper explores the sketching behavior of designers and the role of sketching in the design process. Observations from a descriptive study of sketches provided in design journals, characterized by a protocol measuring sketching activities, are presented. A distinction is made between journals that are entirely tangible and those that contain…

  11. Reducing the complexity of the software design process with object-oriented design

    NASA Technical Reports Server (NTRS)

    Schuler, M. P.

    1991-01-01

    Designing software is a complex process. How object-oriented design (OOD), coupled with formalized documentation and tailored object diagramming techniques, can reduce the complexity of the software design process is described and illustrated. The described OOD methodology uses a hierarchical decomposition approach in which parent objects are decomposed into layers of lower-level child objects. A method of tracking the assignment of requirements to design components is also included. Increases in the reusability, portability, and maintainability of the resulting products are also discussed. This method was built on a combination of existing technology, teaching experience, consulting experience, and feedback from design method users. The discussed concepts are applicable to hierarchical OOD processes in general. Emphasis is placed on improving the design process by documenting the details of the procedures involved and incorporating improvements into those procedures as they are developed.
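
    To make the parent/child decomposition and requirement-tracking idea concrete, here is a minimal Python sketch; it is an illustration under assumed names (DesignObject, GroundStation, REQ-001, etc.), not the report's actual notation or tooling.

        # Hierarchical object decomposition with simple requirement tracking.
        # All names (GroundStation, REQ-001, ...) are hypothetical illustrations.
        class DesignObject:
            def __init__(self, name, requirements=None):
                self.name = name
                self.requirements = set(requirements or [])  # requirement IDs assigned here
                self.children = []                           # lower-level child objects

            def decompose(self, child):
                """Attach a child object one layer below this parent."""
                self.children.append(child)
                return child

            def covered(self):
                """All requirement IDs traced to this object or any descendant."""
                ids = set(self.requirements)
                for child in self.children:
                    ids |= child.covered()
                return ids

            def unassigned(self, all_requirements):
                """Requirements levied on the system but not yet traced to any object."""
                return set(all_requirements) - self.covered()

        system = DesignObject("GroundStation")
        telemetry = system.decompose(DesignObject("TelemetryHandler", {"REQ-001"}))
        telemetry.decompose(DesignObject("FrameDecoder", {"REQ-002"}))
        print(sorted(system.covered()))                              # ['REQ-001', 'REQ-002']
        print(system.unassigned({"REQ-001", "REQ-002", "REQ-003"}))  # {'REQ-003'}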

  12. Review of primary spaceflight-induced and secondary reloading-induced changes in slow antigravity muscles of rats

    NASA Astrophysics Data System (ADS)

    Riley, D. A.

    We have examined the light and electron microscopic properties of hindlimb muscles of rats flown in space for 1-2 weeks on Cosmos biosatellite flights 1887 and 2044 and Space Shuttle missions Spacelab-3, Spacelab Life Sciences-1 and Spacelab Life Sciences-2. Tissues were obtained both inflight and postflight permitting definition of primary microgravity-induced changes and secondary reentry and gravity reloading-induced alterations. Spaceflight causes atrophy and expression of fast fiber characteristics in slow antigravity muscles. The stresses of reentry and reloading reveal that atrophic muscles show increased susceptibility to interstitial edema and ischemic-anoxic necrosis as well as muscle fiber tearing with disruption of contractile proteins. These results demonstrate that the effects of spaceflight on skeletal muscle are multifaceted, and major changes occur both inflight and following return to Earth's gravity.

  13. Theory and Practice Meets in Industrial Process Design -Educational Perspective-

    NASA Astrophysics Data System (ADS)

    Aramo-Immonen, Heli; Toikka, Tarja

    A software engineer should see himself as a business process designer in an enterprise resource planning (ERP) re-engineering project, and software engineers and managers should engage in design dialogue. The objective of this paper is to discuss the motives for studying design research in connection with management education in order to envision and understand the soft human issues in the management context. A second goal is to develop means of practicing social skills between designers and managers. The article explores the affective components of design thinking in the industrial management domain. The conceptual part of the paper discusses the concepts of network and project economy, creativity, communication, use of metaphors, and design thinking. Finally, an empirical research plan and first empirical results from design method experiments among multi-disciplinary groups of master-level students in industrial engineering and management and in software engineering are introduced.

  14. Application of hazard assessment techniques in the CISF design process

    SciTech Connect

    Thornton, J.R.; Henry, T.

    1997-10-29

    The Department of Energy has submitted to the NRC staff for review a topical safety analysis report (TSAR) for a Centralized Interim Storage Facility (CISF). The TSAR will be used in licensing the CISF when and if a site is designated. CISF design events are identified based on a thorough review of design basis events (DBEs) previously identified by dry storage system suppliers and licensees and through the application of hazard assessment techniques. A Preliminary Hazards Assessment (PHA) is performed to identify design events applicable to a Phase 1 non-site-specific CISF. A PHA is deemed necessary since the Phase 1 CISF is distinguishable from previous dry storage applications in several significant operational scope and design basis aspects. In addition to assuring that all design events applicable to the Phase 1 CISF are identified, the PHA served as an integral part of the CISF design process by identifying potential important-to-safety and defense-in-depth facility design and administrative control features. This paper describes the Phase 1 CISF design event identification process and summarizes significant PHA contributions to the CISF design.

  15. Design requirements for operational earth resources ground data processing

    NASA Technical Reports Server (NTRS)

    Baldwin, C. J.; Bradford, L. H.; Burnett, E. S.; Hutson, D. E.; Kinsler, B. A.; Kugle, D. R.; Webber, D. S.

    1972-01-01

    Realistic tradeoff data and evaluation techniques were studied that permit conceptual design of operational earth resources ground processing systems. Methodology for determining user requirements that utilize the limited information available from users is presented along with definitions of sensor capabilities projected into the shuttle/station era. A tentative method is presented for synthesizing candidate ground processing concepts.

  16. COMPUTER ASSISTED PRELIMINARY DESIGN FOR DRINKING WATER TREATMENT PROCESS SYSTEMS

    EPA Science Inventory

    The purpose of the study was to develop an interactive computer program to aid the design engineer in evaluating the performance and cost for any proposed drinking water treatment system consisting of individual unit processes. The 25 unit process models currently in the program ...

  17. DESIGNING CHEMICAL PROCESSES WITH OPEN AND FUGITIVE EMISSIONS

    EPA Science Inventory

    Designing a chemical process normally includes aspects of economic and environmental disciplines. In this work we describe methods to quickly and easily evaluate the economics and potential environmental impacts of a process, with the hydrodealkylation of toluene as an example. Th...

  18. A computer-assisted process for supersonic aircraft conceptual design

    NASA Technical Reports Server (NTRS)

    Johnson, V. S.

    1985-01-01

    Design methodology was developed and existing major computer codes were selected to carry out the conceptual design of supersonic aircraft. A computer-assisted design process resulted from linking the codes together in a logical manner to implement the design methodology. The process does not perform the conceptual design of a supersonic aircraft but it does provide the designer with increased flexibility, especially in geometry generation and manipulation. Use of the computer-assisted process for the conceptual design of an advanced technology Mach 3.5 interceptor showed the principal benefit of the process to be the ability to use a computerized geometry generator and then directly convert the geometry between formats used in the geometry code and the aerodynamics codes. Results from the interceptor study showed that a Mach 3.5 standoff interceptor with a 1000 nautical-mile mission radius and a payload of eight Phoenix missiles appears to be feasible with the advanced technologies considered. A sensitivity study showed that technologies affecting the empty weight and propulsion system would be critical in the final configuration characteristics with aerodynamics having a lesser effect for small perturbations around the baseline.

  19. Incorporating manufacturability constraints into the design process of heterogeneous objects

    NASA Astrophysics Data System (ADS)

    Hu, Yuna; Blouin, Vincent Y.; Fadel, Georges M.

    2004-11-01

    Rapid prototyping (RP) technology, such as Laser Engineering Net Shaping (LENS™), can be used to fabricate heterogeneous objects with gradient variations in material composition. These objects are generally characterized by enhanced functional performance. Past research on the design of such objects has focused on representation, modeling, and functional performance. However, the inherent constraints in RP processes, such as system capability and processing time, lead to heterogeneous objects that may not meet the designer's original intent. To overcome this situation, the research presented in this paper focuses on the identification and implementation of manufacturing constraints into the design process. A node-based finite element modeling technique is used for the representation and analysis, and the multicriteria design problem corresponds to finding the nodal material compositions that minimize structural weight and maximize thermal performance. The optimizer used in this research is a real-valued Evolutionary Strategy (ES), which is well suited for this type of multi-modal problem. Two limitations of the LENS manufacturing process, which have an impact on the design process, are identified and implemented. One of them is related to the manufacturing time, which is considered as an additional criterion to be minimized in the design problem for a preselected tool path. A brake disc rotor made of two materials, aluminum for light weight and steel for superior thermal characteristics, is used to illustrate the tradeoff between manufacturability and functionality.

  20. Evaluation of the FCHART/SLR solar design process

    SciTech Connect

    Fanning, M.W.

    1982-01-01

    The actual heating requirements of 137 passive solar houses were compared to the requirements predicted by a typical design process that used the FCHART/SLR design tool. The calculation of the actual space-heating auxiliary energy needed by the houses during the 1979-1980 heating season was based on fuel bills and appliance use information. The prediction for each residence relied on site-specific weather data for that period, on owner-estimated occupancy patterns, and on measured building characteristics. FCHART/SLR was used with this information to predict the space-heating auxiliary. A statistical comparison of the actual and predicted auxiliaries showed that the design process overpredicted the auxiliary requirement by 60% with 112% standard deviation. A simple heat-loss calculation that ignored the solar contribution proved as accurate a predictor of the heating requirement as the solar design process in some cases.

  1. Natural gas operations: considerations on process transients, design, and control.

    PubMed

    Manenti, Flavio

    2012-03-01

    This manuscript highlights tangible benefits deriving from the dynamic simulation and control of operational transients of natural gas processing plants. Relevant improvements in safety, controllability, operability, and flexibility are obtained not only in the traditional applications, i.e. plant start-up and shutdown, but also in fields that appear time-independent, such as feasibility studies of gas processing plant layout and process design. Specifically, the paper examines the shortcomings of the myopic steady-state approach relative to more detailed studies that take non-steady-state behavior into consideration. A portion of a gas processing facility is considered as a case study. Process transient, design, and control solutions that appear more appealing from a steady-state standpoint are compared with the corresponding dynamic simulation solutions. PMID:22056010

  2. The shielding design process--new plants to decommissioning.

    PubMed

    Jeffries, Graham; Cooper, Andrew; Hobson, John

    2005-01-01

    BNFL have over 25 years experience of designing nuclear plant for the whole-fuel cycle. In the UK, a Nuclear Decommissioning Authority (NDA) is to be set up to ensure that Britain's nuclear legacy is cleaned up safely, securely and cost effectively. The resulting challenges and opportunities for shielding design will be substantial as the shielding design process was originally devised for the design of new plants. Although its underlying principles are equally applicable to decommissioning and remediation of old plants, there are many aspects of detailed application that need to adapt to this radically different operating environment. The paper describes both the common issues and the different challenges of shielding design at different operational phases. Sample applications will be presented of both new plant and decommissioning projects that illustrate not only the robust nature of the processes being used, but also how they lead to cost-effective solutions making a substantive and appropriate contribution to radiological protection goals. PMID:16604700

  3. Computer-aided design tools for economical MEMS fabrication processes

    NASA Astrophysics Data System (ADS)

    Schneider, Christian; Priebe, Andreas; Brueck, Rainer; Hahn, Kai

    1999-03-01

    Since microsystem technology was first introduced in the early 1970s, an enormous market for MST products has developed: airbag sensors, micro pumps, ink jet nozzles, etc., and the market is only just starting up. Offering these products at a reasonable price requires mass production. Meanwhile, computer-based design tools have also been developed in order to reduce the expense of MST design. In contrast to other physical design processes, e.g. in microelectronics, MEMS physical design is characterized by the fact that each product requires a tailored sequence of fabrication steps, usually selected from a variety of processing alternatives. The selection among these alternatives is based on economic constraints. Therefore, the design has a strong influence on the money and time spent to take an MST product to market.

  4. Information Flow in the Launch Vehicle Design/Analysis Process

    NASA Technical Reports Server (NTRS)

    Humphries, W. R., Sr.; Holland, W.; Bishop, R.

    1999-01-01

    This paper describes the results of a team effort aimed at defining the information flow between disciplines at the Marshall Space Flight Center (MSFC) engaged in the design of space launch vehicles. The information flow is modeled at a first level and is described using three types of templates: an N x N diagram, discipline flow diagrams, and discipline task descriptions. It is intended to provide engineers with an understanding of the connections between what they do and where it fits in the overall design process of the project. It is also intended to provide design managers with a better understanding of information flow in the launch vehicle design cycle.

  5. A new design concept for an automated peanut processing facility

    SciTech Connect

    Ertas, A.; Tanju, B.T.; Fair, W.T.; Butts, C.

    1996-12-31

    Peanut quality is a major concern in all phases of the peanut industry from production to manufacturing. Postharvest processing of peanuts can have profound effects on the quality and safety of peanut food products. Curing is a key step in postharvest processing. Curing peanuts improperly can significantly reduce quality and result in significant losses to both farmers and processors. The conventional drying system designed in the 1960s is still being used in peanut processing today. The objective of this paper is to design and develop a new automated peanut drying system for dry climates capable of handling approximately 20 million lbm of peanuts per harvest season.

  6. Concurrent materials and process selection in conceptual design

    SciTech Connect

    Kleban, Stephen D.; Knorovsky, Gerald A.

    2000-08-16

    A method for concurrent selection of materials and a joining process based on product requirements using a knowledge-based, constraint satisfaction approach facilitates the product design and manufacturing process. Using a Windows-based computer video display and a data base of materials and their properties, the designer can ascertain the preferred composition of two parts based on various operating/environmental constraints such as load, temperature, lifetime, etc. Optimum joinder of the two parts may simultaneously be determined using a joining process data base based upon the selected composition of the components as well as the operating/environmental constraints.
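
    A rough Python sketch of the constraint-satisfaction idea (filter candidate materials against operating constraints, then keep only joining processes compatible with the surviving materials) follows; the property values and compatibility table are invented placeholders, not data from the cited system.

        # Concurrent material / joining-process selection by constraint satisfaction.
        # All property values and compatibilities below are hypothetical illustrations.
        MATERIALS = {
            "Al-6061":   {"max_temp_C": 200, "yield_MPa": 276},
            "Ti-6Al-4V": {"max_temp_C": 400, "yield_MPa": 880},
            "304-SS":    {"max_temp_C": 800, "yield_MPa": 215},
        }

        JOINING = {
            "laser_weld": {"compatible": {"Ti-6Al-4V", "304-SS"}},
            "adhesive":   {"compatible": {"Al-6061", "Ti-6Al-4V", "304-SS"}, "max_temp_C": 120},
            "TIG_weld":   {"compatible": {"Al-6061", "304-SS"}},
        }

        def select(load_MPa, service_temp_C):
            """Return (material, joining process) pairs satisfying all constraints."""
            ok_materials = [m for m, p in MATERIALS.items()
                            if p["yield_MPa"] >= load_MPa and p["max_temp_C"] >= service_temp_C]
            pairs = []
            for m in ok_materials:
                for j, spec in JOINING.items():
                    if m in spec["compatible"] and spec.get("max_temp_C", 1e9) >= service_temp_C:
                        pairs.append((m, j))
            return pairs

        print(select(load_MPa=250, service_temp_C=300))  # e.g. [('Ti-6Al-4V', 'laser_weld')]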

  7. POLLUTION PREVENTION IN THE DESIGN OF CHEMICAL PROCESSES USING HIERARCHICAL DESIGN AND SIMULATION

    EPA Science Inventory

    The design of chemical processes is normally an interactive process of synthesis and analysis. When one also desires or needs to limit the amount of pollution generated by the process the difficulty of the task can increase substantially. In this work, we show how combining hier...

  8. Design considerations for fume hoods for process plants.

    PubMed

    Goodfellow, H D; Bender, M

    1980-07-01

    Proper design of fume hoods is a necessary requisite for a clean working environment for many industrial processes. Until recently, the design of these hoods has been rather a trial-and-error approach and not based on sound engineering design principles. Hatch Associates have developed and applied new techniques to establish hood parameters for different industrial processes. The paper reviews the developed techniques and illustrates practical application of these techniques to the solving of difficult and complex fume hood design and operating performance problems. The scope of the paper covers the following subject areas: definitions and general considerations; evaluation of volume and heat flow rates for emission sources; local capture of process emissions; remote capture of process emissions; and case studies of fume hood applications. The purpose of the paper is to detail a coherent approach in the analysis of emission problems which will result in the development of an efficient design of a fume capture hood. An efficient fume hood can provide a safe working place as well as a clean external environment. Although the techniques can be applied to smaller sources, the case studies which will be examined will be for fume hoods in the flow design range of 50 000 CFM to +1 000 000 CFM. PMID:7415967

  9. Parametric design methodology for chemical processes using a simulator

    SciTech Connect

    Diwekar, U.M.; Rubin, E.S. )

    1994-02-01

    Parameter design is a method popularized by the Japanese quality expert G. Taguchi, for designing products and manufacturing processes that are robust in the face of uncontrollable variations. At the design stage, the goal of parameter design is to identify design settings that make the product performance less sensitive to the effects of manufacturing and environmental variations and deterioration. Because parameter design reduces performance variation by reducing the influence of the sources of variation rather than by controlling them, it is a cost-effective technique for improving quality. A recent study on the application of parameter design methodology for chemical processes reported that the use of Taguchi's method was not justified and a method based on Monte Carlo simulation combined with optimization was shown to be more effective. However, this method is computationally intensive as a large number of samples are necessary to achieve the given accuracy. Additionally, determination of the number of sample runs required is based on experimentation due to a lack of systematic sampling methods. In an attempt to overcome these problems, the use of a stochastic modeling capability combined with an optimizer is presented in this paper. The objective is that of providing an effective means for application of parameter design methodologies to chemical processes using the ASPEN simulator. This implementation not only presents a generalized tool for use by chemical engineers at large but also provides systematic estimates of the number of sample runs required to attain the specified accuracy. The stochastic model employs the technique of Latin hypercube sampling instead of the traditional Monte Carlo technique and hence has a great potential to reduce the required number of samples. The methodology is illustrated via an example problem of designing a chemical process.
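
    A minimal Python sketch of the sampling comparison discussed above follows: it propagates two uncertain inputs through a toy performance model using plain Monte Carlo and a hand-rolled Latin hypercube sampler. The model, input ranges, and sample size are assumptions for illustration only; this is not the ASPEN-based implementation.

        # Monte Carlo vs. Latin hypercube sampling (LHS) for propagating input variation
        # through a toy "process model". Model and distributions are invented placeholders.
        import numpy as np

        rng = np.random.default_rng(0)

        def process_yield(temperature, pressure):
            # toy performance model
            return 100.0 - 0.05 * (temperature - 350.0) ** 2 - 0.02 * (pressure - 10.0) ** 2

        def latin_hypercube(n, dims, rng):
            """One stratified sample per equal-probability bin in each dimension."""
            u = (rng.random((n, dims)) + np.arange(n)[:, None]) / n   # stratified uniforms
            for j in range(dims):
                rng.shuffle(u[:, j])                                  # decouple the dimensions
            return u

        n = 200
        mc = rng.random((n, 2))               # plain Monte Carlo draws
        lhs = latin_hypercube(n, 2, rng)      # Latin hypercube draws

        for name, u in [("Monte Carlo", mc), ("Latin hypercube", lhs)]:
            temp = 340.0 + 20.0 * u[:, 0]     # map uniforms to assumed input ranges
            pres = 8.0 + 4.0 * u[:, 1]
            y = process_yield(temp, pres)
            print(f"{name:15s} mean={y.mean():6.2f}  std={y.std():5.2f}")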

  10. Design of a Pu-238 Waste Incineration Process

    SciTech Connect

    Charlesworth, D.L.

    2001-05-29

    Combustible Pu-238 waste is generated as a result of normal operation and decommissioning activity at the Savannah River Plant and is being retrievably stored there. As part of the long-term plan to process the stored waste and current waste in preparation for future disposition, a Pu-238 incineration process is being cold-tested at Savannah River Laboratory (SRL). The incineration process consists of a continuous-feed preparation system, a two-stage, electrically fired incinerator, and a filtration off-gas system. Process equipment has been designed, fabricated, and installed for nonradioactive testing and cold run-in. Design features to maximize the ability to remotely maintain the equipment were incorporated into the process. Interlock, alarm, and control functions are provided by a programmable controller. Cold testing is scheduled to be completed in 1986.

  11. Glucose uptake in rat soleus - Effect of acute unloading and subsequent reloading

    NASA Technical Reports Server (NTRS)

    Henriksen, Eric J.; Tischler, Marc E.

    1988-01-01

    The effect of acutely reduced weight bearing (unloading) on the in vitro uptake of 2-1,2-H-3-deoxy-D-glucose was studied in the soleus muscle by tail casting and suspending rats. After just 4 h, the uptake of 2-deoxy-D-glucose fell (-19 percent) and declined further after an additional 20 h of unloading. This diminution at 24 h was associated with slower oxidation of C-14-glucose and incorporation of C-14-glucose into glycogen. At 3 days of unloading, basal uptake of 2-deoxy-D-glucose did not differ from control. Reloading of the soleus after 1 or 3 days of unloading increased uptake of 2-deoxy-D-glucose above control and returned it to normal within 6 h and 4 days, respectively. These effects of unloading and recovery were caused by local changes in the soleus, because the extensor digitorum longus from the same hindlimbs did not display any alterations in uptake of 2-deoxy-D-glucose or metabolism of glucose.

  12. Expertise in professional software design: a process study.

    PubMed

    Sonnentag, S

    1998-10-01

    Forty professional software designers participated in a study in which they worked on a software design task and reported strategies for accomplishing that task. High performers were identified by a peer-nomination method and by their performance on a design task. Verbal protocol analysis based on a comparison of 12 high and 12 moderate performers indicated that high performers structured their design process by local planning and showed more feedback processing, whereas moderate performers were more engaged in analyzing requirements and verbalizing task-irrelevant cognitions. High performers more often described problem comprehension and cooperation with colleagues as useful strategies. High and moderate performers did not differ with respect to length of experience. None of the differences between the two performance groups could be explained by length of experience. PMID:9806013

  13. Design of launch systems using continuous improvement process

    NASA Technical Reports Server (NTRS)

    Brown, Richard W.

    1995-01-01

    The purpose of this paper is to identify a systematic process for improving ground operations for future launch systems. This approach is based on the Total Quality Management (TQM) continuous improvement process. While the continuous improvement process is normally identified with making incremental changes to an existing system, it can be used on new systems if they use past experience as a knowledge base. In the case of the Reusable Launch Vehicle (RLV), the Space Shuttle operations provide many lessons. The TQM methodology used for this paper will be borrowed from the United States Air Force 'Quality Air Force' Program. There is a general overview of the continuous improvement process, with concentration on the formulation phase. During this phase critical analyses are conducted to determine the strategy and goals for the remaining development process. These analyses include analyzing the mission from the customer's point of view, developing an operations concept for the future, assessing current capabilities, and determining the gap to be closed between current capabilities and future needs and requirements. A brief analysis of the RLV, relative to the Space Shuttle, will be used to illustrate the concept. Using the continuous improvement design concept has many advantages. These include a customer-oriented process which will develop a more marketable product and a better integration of operations and systems during the design phase. However, the use of TQM techniques will require changes, including more discipline in the design process and more emphasis on data gathering for operational systems. The benefits will far outweigh the additional effort.

  14. Product and Process Improvement Using Mixture-Process Variable Designs and Robust Optimization Techniques

    SciTech Connect

    Sahni, Narinder S.; Piepel, Gregory F.; Naes, Tormod

    2009-04-01

    The quality of an industrial product depends on the raw material proportions and the process variable levels, both of which need to be taken into account in designing a product. This article presents a case study from the food industry in which both kinds of variables were studied by combining a constrained mixture experiment design and a central composite process variable design. Based on the natural structure of the situation, a split-plot experiment was designed and models involving the raw material proportions and process variable levels (separately and combined) were fitted. Combined models were used to study: (i) the robustness of the process to variations in raw material proportions, and (ii) the robustness of the raw material recipes with respect to fluctuations in the process variable levels. Further, the expected variability in the robust settings was studied using the bootstrap.
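
    The bootstrap step can be sketched in a few lines of Python: refit the combined model on resampled runs and look at the spread of the predicted response at the chosen setting. The synthetic data and the simple linear model form below are assumptions for illustration, not the article's actual recipe data or split-plot model.

        # Bootstrap the variability of a fitted response at a chosen ("robust") setting.
        # Synthetic data and a plain linear model stand in for the real experiment.
        import numpy as np

        rng = np.random.default_rng(1)

        n = 60
        x1, x2 = rng.uniform(0.2, 0.6, n), rng.uniform(0.1, 0.4, n)   # mixture proportions
        z = rng.uniform(-1, 1, n)                                     # process variable
        y = 5 + 3 * x1 + 2 * x2 - 1.5 * z + rng.normal(0, 0.3, n)     # noisy response

        def fit_and_predict(x1, x2, z, y, setting):
            X = np.column_stack([np.ones_like(x1), x1, x2, z])
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            return np.array([1.0, *setting]) @ beta

        robust_setting = (0.5, 0.3, 0.0)       # candidate robust recipe / process level
        preds = []
        for _ in range(1000):                  # bootstrap by resampling runs with replacement
            idx = rng.integers(0, n, n)
            preds.append(fit_and_predict(x1[idx], x2[idx], z[idx], y[idx], robust_setting))
        preds = np.array(preds)
        print(f"predicted response at robust setting: {preds.mean():.2f} +/- {preds.std():.2f}")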

  15. Bates solar industrial process-steam application: preliminary design review

    SciTech Connect

    Not Available

    1980-01-07

    The design is analyzed for a parabolic trough solar process heat system for a cardboard corrugation fabrication facility in Texas. The program is briefly reviewed, including an analysis of the plant and process. The performance modeling for the system is discussed, and the solar system structural design, collector subsystem, heat transport and distribution subsystem are analyzed. The selection of the heat transfer fluid, and ullage and fluid maintenance are discussed, and the master control system and data acquisition system are described. Testing of environmental degradation of materials is briefly discussed. A brief preliminary cost analysis is included. (LEW)

  16. Waste receiving and processing facility module 1, detailed design report

    SciTech Connect

    Not Available

    1993-10-01

    WRAP 1 baseline documents which guided the technical development of the Title design included: (a) A/E Statement of Work (SOW) Revision 4C: This DOE-RL contractual document specified the workscope, deliverables, schedule, method of performance and reference criteria for the Title design preparation. (b) Functional Design Criteria (FDC) Revision 1: This DOE-RL technical criteria document specified the overall operational criteria for the facility. The document was at Revision 0 at the beginning of the design and advanced to Revision 1 during the tenure of the Title design. (c) Supplemental Design Requirements Document (SDRD) Revision 3: This baseline criteria document prepared by WHC for DOE-RL augments the FDC by providing further definition of the process, operational safety, and facility requirements to the A/E for guidance in preparing the design. The document was at a very preliminary stage at the onset of Title design and was revised in concert with the results of the engineering studies that were performed to resolve the numerous technical issues that the project faced when Title I was initiated, as well as by requirements established during the course of the Title II design.

  17. Advanced computational research in materials processing for design and manufacturing

    SciTech Connect

    Zacharia, T.

    1994-12-31

    The computational requirements for design and manufacture of automotive components have increased dramatically with the drive to produce automobiles with three times the mileage. Automotive component design systems are becoming increasingly reliant on structural analysis, requiring larger and more complex analyses, more three-dimensional analyses, larger model sizes, and routine consideration of transient and non-linear effects. Such analyses must be performed rapidly to minimize delays in the design and development process, which drives the need for parallel computing. This paper briefly describes advanced computational research in superplastic forming and automotive crashworthiness.

  18. Process Improvement Through Tool Integration in Aero-Mechanical Design

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2010-01-01

    Emerging capabilities in commercial design tools promise to significantly improve the multi-disciplinary and inter-disciplinary design and analysis coverage for aerospace mechanical engineers. This paper explores the analysis process for two example problems of a wing and flap mechanical drive system and an aircraft landing gear door panel. The examples begin with the design solid models and include various analysis disciplines such as structural stress and aerodynamic loads. Analytical methods include CFD, multi-body dynamics with flexible bodies and structural analysis. Elements of analysis data management, data visualization and collaboration are also included.

  19. Rethinking ASIC design with next generation lithography and process integration

    NASA Astrophysics Data System (ADS)

    Vaidyanathan, Kaushik; Liu, Renzhi; Liebmann, Lars; Lai, Kafai; Strojwas, Andrzej; Pileggi, Larry

    2013-03-01

    Given the deployment delays for EUV, several next generation lithography (NGL) options are being actively researched. Several cost-effective NGL solutions, such as self-aligned double patterning through sidewall image transfer (SIT) and directed self-assembly (DSA), in conjunction with process integration challenges, mandate grating-like pattern design. As part of the GRATEdd project, we have evaluated the design cost of grating-based design for ASICs (application specific ICs). Based on our observations we have engineered fundamental changes to the primary ASIC design components to make scaling affordable and useful in deeply scaled sub-20 nm technologies: unidirectional-M1 based standard cells, application-specific smart SRAM synthesis, and statistical and self-healing analog design.

  20. The engineering design process as a model for STEM curriculum design

    NASA Astrophysics Data System (ADS)

    Corbett, Krystal Sno

    Engaging pedagogics have been proven to be effective in the promotion of deep learning for science, technology, engineering, and mathematics (STEM) students. In many cases, academic institutions have shown a desire to improve education by implementing more engaging techniques in the classroom. The research framework established in this dissertation has been governed by the axiom that students should obtain a deep understanding of fundamental topics while being motivated to learn through engaging techniques. This research lays a foundation for future analysis and modeling of the curriculum design process where specific educational research questions can be considered using standard techniques. Further, a clear curriculum design process is a key step towards establishing an axiomatic approach for engineering education. A danger is that poor implementation of engaging techniques will counteract the intended effects. Poor implementation might provide students with a "fun" project, but not the desired deep understanding of the fundamental STEM content. Knowing that proper implementation is essential, this dissertation establishes a model for STEM curriculum design, based on the well-established engineering design process. Using this process as a perspective to model curriculum design allows for a structured approach. Thus, the framework for STEM curriculum design, established here, provides a guided approach for seamless integration of fundamental topics and engaging pedagogics. The main steps, or phases, in engineering design are: Problem Formulation, Solution Generation, Solution Analysis, and Solution Implementation. Layering engineering design with education curriculum theory, this dissertation establishes a clear framework for curriculum design. Through ethnographic engagement by this researcher, several overarching themes are revealed through the creation of curricula using the design process. The application of the framework to specific curricula was part of this

  1. Rethinking behavioral health processes by using design for six sigma.

    PubMed

    Lucas, Anthony G; Primus, Kelly; Kovach, Jamison V; Fredendall, Lawrence D

    2015-02-01

    Clinical evidence-based practices are strongly encouraged and commonly utilized in the behavioral health community. However, evidence-based practices that are related to quality improvement processes, such as Design for Six Sigma, are often not used in behavioral health care. This column describes the unique partnership formed between a behavioral health care provider in the greater Pittsburgh area, a nonprofit oversight and monitoring agency for behavioral health services, and academic researchers. The authors detail how the partnership used the multistep process outlined in Design for Six Sigma to completely redesign the provider's intake process. Implementation of the redesigned process increased access to care, decreased bad debt and uncollected funds, and improved cash flow--while consumer satisfaction remained high. PMID:25642607

  2. Integrated Design System (IDS) Tools for the Spacecraft Aeroassist/Entry Vehicle Design Process

    NASA Technical Reports Server (NTRS)

    Olynick, David; Braun, Robert; Langhoff, Steven R. (Technical Monitor)

    1997-01-01

    The definition of the Integrated Design System technology focus area as presented in the NASA Information Technology center of excellence strategic plan is described. The need for IDS tools in the aeroassist/entry vehicle design process is illustrated. Initial and future plans for spacecraft IDS tool development are discussed.

  3. Informed Design: A Contemporary Approach to Design Pedagogy as the Core Process in Technology

    ERIC Educational Resources Information Center

    Burghardt, M. David; Hacker, Michael

    2004-01-01

    In classroom settings, most problems are usually well defined, so students have little experience with open-ended problems. Technological design problems, however, are seldom well defined. The design process begins with broad ideas and concepts and continues in the direction of ever-increasing detail, resulting in an acceptable solution. So using…

  4. Motivating the Notion of Generic Design within Information Processing Theory: The Design Problem Space.

    ERIC Educational Resources Information Center

    Goel, Vinod; Pirolli, Peter

    The notion of generic design, while it has been around for 25 years, is not often articulated, especially within Newell and Simon's (1972) Information Processing Theory framework. Design is merely lumped in with other forms of problem solving activity. Intuitively it is felt that there should be a level of description of the phenomenon which…

  5. Which Events Can Cause Iteration in Instructional Design? An Empirical Study of the Design Process

    ERIC Educational Resources Information Center

    Verstegen, D. M. L.; Barnard, Y. F.; Pilot, A.

    2006-01-01

    Instructional design is not a linear process: designers have to weigh the advantages and disadvantages of alternative solutions, taking into account different kinds of conflicting and changing constraints. To make sure that they eventually choose the most optimal one, they have to keep on collecting information, reconsidering continuously whether…

  6. A Digital Methodology for the Design Process of Aerospace Assemblies with Sustainable Composite Processes & Manufacture

    NASA Astrophysics Data System (ADS)

    McEwan, W.; Butterfield, J.

    2011-05-01

    The well established benefits of composite materials are driving a significant shift in design and manufacture strategies for original equipment manufacturers (OEMs). Thermoplastic composites have advantages over the traditional thermosetting materials with regards to sustainability and environmental impact, features which are becoming increasingly pertinent in the aerospace arena. However, when sustainability and environmental impact are considered as design drivers, integrated methods for part design and product development must be developed so that any benefits of sustainable composite material systems can be assessed during the design process. These methods must include mechanisms to account for process induced part variation and techniques related to re-forming, recycling and decommissioning, which are in their infancy. It is proposed in this paper that predictive techniques related to material specification, part processing and product cost of thermoplastic composite components, be integrated within a Through Life Management (TLM) product development methodology as part of a larger strategy of product system modeling to improve disciplinary concurrency, realistic part performance, and to place sustainability at the heart of the design process. This paper reports the enhancement of digital manufacturing tools as a means of drawing simulated part manufacturing scenarios, real time costing mechanisms, and broader lifecycle performance data capture into the design cycle. The work demonstrates predictive processes for sustainable composite product manufacture and how a Product-Process-Resource (PPR) structure can be customised and enhanced to include design intent driven by `Real' part geometry and consequent assembly.

  7. Inverse Analysis to Formability Design in a Deep Drawing Process

    NASA Astrophysics Data System (ADS)

    Buranathiti, Thaweepat; Cao, Jian

    The deep drawing process is an important process that adds value to flat sheet metal in many industries. An important concern in the design of a deep drawing process is generally formability. This paper presents the connection between formability and inverse analysis (IA), a systematic means of determining an optimal blank configuration for a deep drawing process. IA is presented and explored using a commercial finite element software package. A number of numerical studies on the effect of blank configuration on the quality of a part produced by a deep drawing process were conducted and analyzed. The quality of the drawing processes is analyzed numerically using an explicit incremental nonlinear finite element code. The minimum distance between the elemental principal strains and the strain-based forming limit curve (FLC) is defined as the tearing margin, the key performance index (KPI) indicating the quality of the part. The initial blank configuration is shown to play a highly important role in the quality of the product of the deep drawing process. In addition, it is observed that if a blank configuration does not deviate greatly from the one obtained from IA, the blank can still yield a good product. The strain history around the bottom fillet of the part is also examined. The paper concludes that IA is an important part of the design methodology for deep drawing processes.
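
    A minimal Python sketch of the tearing-margin KPI described above follows; the FLC points and element strains are invented placeholders, and the margin is computed simply as the nearest distance to sampled FLC points rather than to the continuous curve.

        # Tearing margin: smallest distance from any element's (minor, major) principal
        # strain point to a forming limit curve (FLC). All numbers are illustrative.
        import numpy as np

        # forming limit curve sampled as (minor strain, major strain) pairs
        flc = np.array([[-0.2, 0.45], [-0.1, 0.38], [0.0, 0.32], [0.1, 0.36], [0.2, 0.42]])

        # principal strains of a few shell elements from a hypothetical drawing simulation
        elements = np.array([[-0.05, 0.18], [0.02, 0.25], [0.08, 0.30]])

        def tearing_margin(elements, flc):
            """Minimum Euclidean distance in strain space from elements to FLC points."""
            d = np.linalg.norm(elements[:, None, :] - flc[None, :, :], axis=2)
            return d.min()

        print(f"tearing margin = {tearing_margin(elements, flc):.3f}")
        # A larger margin means the blank stays further below the forming limit (safer).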

  8. Noise control, sound, and the vehicle design process

    NASA Astrophysics Data System (ADS)

    Donavan, Paul

    2005-09-01

    For many products, noise and sound are viewed as necessary evils that need to be dealt with in order to bring the product successfully to market. They are generally not product ``exciters'' although some vehicle manufacturers do tune and advertise specific sounds to enhance the perception of their products. In this paper, influencing the design process for the ``evils,'' such as wind noise and road noise, is considered in more detail. There are three ingredients to successfully dealing with the evils in the design process. The first of these is knowing how excess noise affects the end customer in a tangible manner and how that affects customer satisfaction and, ultimately, sales. The second is having and delivering the knowledge of what is required of the design to achieve a satisfactory or even better level of noise performance. The third ingredient is having the commitment of the designers to incorporate the knowledge into their part, subsystem or system. In this paper, the elements of each of these ingredients are discussed in some detail and the attributes of a successful design process are enumerated.

  9. A FRAMEWORK TO DESIGN AND OPTIMIZE CHEMICAL FLOODING PROCESSES

    SciTech Connect

    Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori

    2005-07-01

    The goal of this proposed research is to provide an efficient and user friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user friendly interface to identify the variables that have the most impact on oil recovery using the concept of experimental design and response surface maps, (2) UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objectives of Task 1 are to develop three primary modules representing reservoir, chemical, and well data. The modules will be interfaced with an already available experimental design model. The objective of the Task 2 is to incorporate UTCHEM reservoir simulator and the modules with the strategic variables and developing the response surface maps to identify the significant variables from each module. The objective of the Task 3 is to develop the economic model designed specifically for the chemical processes targeted in this proposal and interface the economic model with UTCHEM production output. Task 4 is on the validation of the framework and performing simulations of oil reservoirs to screen, design and optimize the chemical processes.
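
    As a rough illustration of the screening step (run a small experimental design through the simulator, fit a response surface, and rank variable effects), a minimal Python sketch follows. The stand-in recovery model, variable names, and factorial design are assumptions; they are not UTCHEM or the framework's actual modules.

        # Screening sketch: 2^3 factorial design through a toy simulator, then rank
        # fitted effects from a small response surface. All numbers are illustrative.
        import itertools
        import numpy as np

        def simulator(perm, surf_conc, slug_size):
            # stand-in for a reservoir simulation returning incremental oil recovery
            return 10 + 4 * surf_conc + 2 * slug_size + 1.5 * surf_conc * slug_size + 0.5 * perm

        levels = [-1.0, 1.0]
        runs = np.array(list(itertools.product(levels, repeat=3)))   # 2^3 factorial design
        y = np.array([simulator(*run) for run in runs])

        # response surface with main effects and two-factor interactions
        names = ["perm", "surf_conc", "slug_size", "perm*surf", "perm*slug", "surf*slug"]
        X = np.column_stack([
            np.ones(len(runs)), runs,
            runs[:, 0] * runs[:, 1], runs[:, 0] * runs[:, 2], runs[:, 1] * runs[:, 2],
        ])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        for name, b in sorted(zip(names, beta[1:]), key=lambda t: -abs(t[1])):
            print(f"{name:10s} effect = {b:+.2f}")   # largest effects = variables to keep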

  10. A FRAMEWORK TO DESIGN AND OPTIMIZE CHEMICAL FLOODING PROCESSES

    SciTech Connect

    Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori

    2004-11-01

    The goal of this proposed research is to provide an efficient and user friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user friendly interface to identify the variables that have the most impact on oil recovery using the concept of experimental design and response surface maps, (2) UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objectives of Task 1 are to develop three primary modules representing reservoir, chemical, and well data. The modules will be interfaced with an already available experimental design model. The objective of the Task 2 is to incorporate UTCHEM reservoir simulator and the modules with the strategic variables and developing the response surface maps to identify the significant variables from each module. The objective of the Task 3 is to develop the economic model designed specifically for the chemical processes targeted in this proposal and interface the economic model with UTCHEM production output. Task 4 is on the validation of the framework and performing simulations of oil reservoirs to screen, design and optimize the chemical processes.

  11. A Framework to Design and Optimize Chemical Flooding Processes

    SciTech Connect

    Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori

    2006-08-31

    The goal of this proposed research is to provide an efficient and user friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user friendly interface to identify the variables that have the most impact on oil recovery using the concept of experimental design and response surface maps, (2) UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objectives of Task 1 are to develop three primary modules representing reservoir, chemical, and well data. The modules will be interfaced with an already available experimental design model. The objective of the Task 2 is to incorporate UTCHEM reservoir simulator and the modules with the strategic variables and developing the response surface maps to identify the significant variables from each module. The objective of the Task 3 is to develop the economic model designed specifically for the chemical processes targeted in this proposal and interface the economic model with UTCHEM production output. Task 4 is on the validation of the framework and performing simulations of oil reservoirs to screen, design and optimize the chemical processes.

  12. Experiential Learning: A Course Design Process for Critical Thinking

    ERIC Educational Resources Information Center

    Hamilton, Janet G.; Klebba, Joanne M.

    2011-01-01

    This article describes a course design process to improve the effectiveness of using experiential learning techniques to foster critical thinking skills. The authors examine prior research to identify essential dimensions of experiential learning in relation to higher order thinking. These dimensions provide key insights for the selection of…

  13. A SYSTEMATIC PROCEDURE FOR DESIGNING PROCESSES WITH MULTIPLE ENVIRONMENTAL OBJECTIVES

    EPA Science Inventory

    Evaluation and analysis of multiple objectives are very important in designing environmentally benign processes. They require a systematic procedure for solving multi-objective decision-making problems due to the complex nature of the problems and the need for complex assessment....

  14. DESIGNING EFFICIENT, ECONOMIC AND ENVIRONMENTALLY FRIENDLY CHEMICAL PROCESSES

    EPA Science Inventory

    A catalytic reforming process has been studied using hierarchical design and simulation calculations. Approximations for the fugitive emissions indicate which streams allow the most value to be lost and which have the highest potential environmental impact. One can use this infor...

  15. DESIGNING EFFICIENT, ECONOMIC AND ENVIRONMENTALLY FRIENDLY CHEMICAL PROCESSES

    EPA Science Inventory

    A catalytic reforming process has been studied using hierarchical design and simulation calculations. Approximations for the fugitive emissions indicate which streams allow the most value to be lost and which have the highest potential environmental impact. One can use this inform...

  16. USING GENETIC ALGORITHMS TO DESIGN ENVIRONMENTALLY FRIENDLY PROCESSES

    EPA Science Inventory

    Genetic algorithm calculations are applied to the design of chemical processes to achieve improvements in environmental and economic performance. By finding the set of Pareto (i.e., non-dominated) solutions one can see how different objectives, such as environmental and economic ...
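
    A minimal Python sketch of the Pareto filtering mentioned above follows; the candidate designs and their (cost, environmental impact) scores are invented for illustration.

        # Extract the Pareto (non-dominated) set of candidate process designs scored on
        # two objectives to be minimized: cost and potential environmental impact.
        candidates = {
            "design_A": (12.0, 8.0),   # (cost, environmental impact), illustrative only
            "design_B": (10.0, 9.5),
            "design_C": (14.0, 6.0),
            "design_D": (13.0, 9.0),   # dominated by design_A
        }

        def dominates(p, q):
            """p dominates q if p is no worse in every objective and better in at least one."""
            return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

        pareto = [name for name, p in candidates.items()
                  if not any(dominates(q, p) for other, q in candidates.items() if other != name)]
        print(pareto)   # ['design_A', 'design_B', 'design_C']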

  17. Ingenuity in Action: Connecting Tinkering to Engineering Design Processes

    ERIC Educational Resources Information Center

    Wang, Jennifer; Werner-Avidon, Maia; Newton, Lisa; Randol, Scott; Smith, Brooke; Walker, Gretchen

    2013-01-01

    The Lawrence Hall of Science, a science center, seeks to replicate real-world engineering at the "Ingenuity in Action" exhibit, which consists of three open-ended challenges. These problems encourage children to engage in engineering design processes and problem-solving techniques through tinkering. We observed and interviewed 112…

  18. Portfolio Assessment on Chemical Reactor Analysis and Process Design Courses

    ERIC Educational Resources Information Center

    Alha, Katariina

    2004-01-01

    Assessment determines what students regard as important: if a teacher wants to change students' learning, he/she should change the methods of assessment. This article describes the use of portfolio assessment on five courses dealing with chemical reactor and process design during the years 1999-2001. Although the use of portfolio was a new…

  19. The Role of Dialogic Processes in Designing Career Expectations

    ERIC Educational Resources Information Center

    Bangali, Marcelline; Guichard, Jean

    2012-01-01

    This article examines the role played by dialogic processes in the designing or redesigning of future expectations during a career guidance intervention. It discusses a specific method ("Giving instruction to a double") developed and used during career counseling sessions with two recent doctoral graduates. It intends both to help them outline or…

  20. GREENING OF OXIDATION CATALYSIS THROUGH IMPROVED CATALYST AND PROCESS DESIGN

    EPA Science Inventory


    Greening of Oxidation Catalysis Through Improved Catalysts and Process Design
    Michael A. Gonzalez*, Thomas Becker, and Raymond Smith

    United States Environmental Protection Agency, Office of Research and Development, National Risk Management Research Laboratory, 26 W...

  1. An Exploration of Design Students' Inspiration Process

    ERIC Educational Resources Information Center

    Dazkir, Sibel S.; Mower, Jennifer M.; Reddy-Best, Kelly L.; Pedersen, Elaine L.

    2013-01-01

    Our purpose was to explore how different sources of inspiration influenced two groups of students' inspiration process and their attitudes toward their design projects. Assigned sources of inspiration and instructor's assistance in the search for inspiration varied for two groups of students completing a small culture inspired product…

  2. Processing and circuit design enhance a data converter's radiation tolerance

    SciTech Connect

    Heuner, R.; Zazzu, V.; Pennisi, L.

    1988-12-01

    Rad-hard CMOS/SOS processing has been applied to a novel comparator-inverter circuit design to develop 6 and 8-bit parallel (flash) ADC (analog-to-digital converter) circuits featuring high-speed operation, low power consumption, and total-dose radiation tolerances up to 1 Mrad(Si).

  3. Developing 21st Century Process Skills through Project Design

    ERIC Educational Resources Information Center

    Yoo, Jeong-Ju; MacDonald, Nora M.

    2014-01-01

    The goal of this paper is to illustrate how the promotion of 21st Century process skills can be used to enhance student learning and workplace skill development: thinking, problem solving, collaboration, communication, leadership, and management. As an illustrative case, fashion merchandising and design students conducted research for a…

  4. INCORPORATING INDUSTRIAL ECOLOGY INTO HIERARCHICAL CHEMICAL PROCESS DESIGN

    EPA Science Inventory

    Incorporating Industrial Ecology into Hierarchical Chemical Process Design: Determining Targets for the Exchange of Waste

    The exchange of waste to be used as a recycled feed has long been encouraged by practitioners of industrial ecology. Industrial ecology is a field t...

  5. Quality Control through Design and Process: Gambrel Roof Truss Challenge

    ERIC Educational Resources Information Center

    Ward, Dell; Jones, James

    2011-01-01

    Customers determine whether a product fulfills their needs or satisfies them. "Quality control", then, is the process of finding out what the customer wants, along with designing, producing, delivering, and servicing the product--and ultimately satisfying the customer's expectations. For many years, people considered a product to be of good…

  6. PRELIMINARY DESIGN FOR DRINKING WATER TREATMENT PROCESS SYSTEMS

    EPA Science Inventory

    A computer model has been developed for use in estimating the performance and associated costs of proposed and existing water supply systems. Design procedures and cost-estimating relationships for 25 unit processes that can be used for drinking water treatment are contained with...

  7. Design characteristics for facilities which process hazardous particulate

    SciTech Connect

    Abeln, S.P.; Creek, K.; Salisbury, S.

    1998-12-01

    Los Alamos National Laboratory is establishing a research and processing capability for beryllium. The unique properties of beryllium, including light weight, rigidity, thermal conductivity, heat capacity, and nuclear properties make it critical to a number of US defense and aerospace programs. Concomitant with the unique engineering properties are the health hazards associated with processing beryllium in a particulate form and the potential for worker inhalation of aerosolized beryllium. Beryllium has the lowest airborne standard for worker protection compared to all other nonradioactive metals by more than an order of magnitude. This paper describes the design characteristics of the new beryllium facility at Los Alamos as they relate to protection of the workforce. Design characteristics to be reviewed include; facility layout, support systems to minimize aerosol exposure and spread, and detailed review of the ventilation system design for general room air cleanliness and extraction of particulate at the source.

  8. Tunable photonic filters: a digital signal processing design approach.

    PubMed

    Binh, Le Nguyen

    2009-05-20

    Digital signal processing techniques are used for synthesizing tunable optical filters with variable bandwidth and centered reference frequency, including the tunability of low-pass, high-pass, bandpass, and bandstop optical filters. Potential applications of such filters are discussed, and the design techniques and properties of recursive digital filters are outlined. The basic filter structures, namely, the first-order all-pole optical filter (FOAPOF) and the first-order all-zero optical filter (FOAZOF), are described, and finally the design process of tunable optical filters and the designs of the second-order Butterworth low-pass, high-pass, bandpass, and bandstop tunable optical filters are presented. Indeed, we identify that the all-zero and all-pole networks correspond to the well-known optical principles of interference and resonance, respectively. It is thus very straightforward to implement tunable optical filters, which is a unique feature. PMID:19458728
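
    A minimal Python sketch of the two basic structures named in the abstract follows, evaluating first-order all-pole and all-zero responses on a normalized frequency grid; the coefficient values are illustrative assumptions, not the paper's designs.

        # First-order all-pole filter H(z) = (1 - r) / (1 - r z^{-1}) (resonance) and
        # first-order all-zero filter H(z) = (1 + z^{-1}) / 2 (interference notch).
        import numpy as np

        w = np.linspace(0, np.pi, 512)            # normalized angular frequency
        z_inv = np.exp(-1j * w)

        r = 0.8                                    # pole radius: sharper resonance as r -> 1
        H_all_pole = (1 - r) / (1 - r * z_inv)     # resonant (peaking) response
        H_all_zero = 0.5 * (1 + z_inv)             # interference response, notch at w = pi

        for name, H in [("all-pole", H_all_pole), ("all-zero", H_all_zero)]:
            mag = np.abs(H)
            print(f"{name:8s} |H| at DC = {mag[0]:.2f}, at Nyquist = {mag[-1]:.2f}")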

  9. Architectural design of heterogeneous metallic nanocrystals--principles and processes.

    PubMed

    Yu, Yue; Zhang, Qingbo; Yao, Qiaofeng; Xie, Jianping; Lee, Jim Yang

    2014-12-16

    CONSPECTUS: Heterogeneous metal nanocrystals (HMNCs) are a natural extension of simple metal nanocrystals (NCs), but as a research topic, they have been much less explored until recently. HMNCs are formed by integrating metal NCs of different compositions into a common entity, similar to the way atoms are bonded to form molecules. HMNCs can be built to exhibit an unprecedented architectural diversity and complexity by programming the arrangement of the NC building blocks ("unit NCs"). The architectural engineering of HMNCs involves the design and fabrication of the architecture-determining elements (ADEs), i.e., unit NCs with precise control of shape and size, and their relative positions in the design. Similar to molecular engineering, where structural diversity is used to create more property variations for application explorations, the architectural engineering of HMNCs can similarly increase the utility of metal NCs by offering a suite of properties to support multifunctionality in applications. The architectural engineering of HMNCs calls for processes and operations that can execute the design. Some enabling technologies already exist in the form of classical micro- and macroscale fabrication techniques, such as masking and etching. These processes, when used singly or in combination, are fully capable of fabricating nanoscopic objects. What is needed is a detailed understanding of the engineering control of ADEs and the translation of these principles into actual processes. For simplicity of execution, these processes should be integrated into a common reaction system and yet retain independence of control. The key to architectural diversity is therefore the independent controllability of each ADE in the design blueprint. The right chemical tools must be applied under the right circumstances in order to achieve the desired outcome. In this Account, after a short illustration of the infinite possibility of combining different ADEs to create HMNC design

  10. The role of CFD in the design process

    NASA Astrophysics Data System (ADS)

    Jennions, Ian K.

    1994-05-01

    Over the last decade the role played by CFD codes in turbomachinery design has changed remarkably. While convergence/stability or even the existence of unique solutions was discussed fervently ten years ago, CFD codes now form a valuable part of an overall integrated design system and have caused us to re-think much of what we do. The geometric and physical complexities addressed have also evolved, as have the number of software houses competing with in-house developers to provide solutions to daily design problems. This paper reviews how GE Aircraft Engines (GEAE) uses CFD in the turbomachinery design process and examines many of the issues faced in successful code implementation.

  11. Computer aided microbial safety design of food processes.

    PubMed

    Schellekens, M; Martens, T; Roberts, T A; Mackey, B M; Nicolaï, B M; Van Impe, J F; De Baerdemaeker, J

    1994-12-01

    To reduce the time required for product development, to avoid expensive experimental tests, and to quantify safety risks for fresh products and the consequences of processing, there is a growing interest in computer aided food process design. This paper discusses the application of hybrid object-oriented and rule-based expert system technology to represent the data and knowledge of microbial experts and food engineers. Finite element models for heat transfer calculation routines, microbial growth and inactivation models, and texture kinetics are combined with food composition data, thermophysical properties, process steps, and expert knowledge on the type and quantity of microbial contamination. A prototype system has been developed to evaluate the effects of changes in food composition, process steps, and process parameters on the microbiological safety and textural quality of foods. PMID:7703003
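
    As a hedged illustration (not the prototype system itself), the sketch below applies a classical D/z-value first-order thermal inactivation model of the kind such a design tool might couple to a finite element heat-transfer calculation; all numerical values (D_ref, T_ref, z, and the time-temperature profile) are assumptions for illustration only.

      D_ref = 0.2      # min: decimal reduction time at the reference temperature (assumed)
      T_ref = 121.1    # degC: reference temperature (assumed)
      z = 10.0         # degC: temperature change giving a tenfold change in D (assumed)

      def log_reduction(time_min, temp_c):
          """Log10 reduction achieved by holding at temp_c for time_min minutes."""
          D = D_ref * 10 ** ((T_ref - temp_c) / z)
          return time_min / D

      # Toy time-temperature history at the product cold spot (assumed), as might
      # come from a finite element heat-transfer calculation: (minutes, degC).
      profile = [(2.0, 95.0), (3.0, 105.0), (4.0, 115.0)]
      total = sum(log_reduction(t, T) for t, T in profile)
      print(f"Predicted log10 reduction over the profile: {total:.2f}")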

  12. Information processing systems, reasoning modules, and reasoning system design methods

    DOEpatents

    Hohimer, Ryan E; Greitzer, Frank L; Hampton, Shawn D

    2014-03-04

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.
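
    A minimal sketch of the dispatch idea in this abstract, assuming hypothetical class and type names (not taken from the patent): reasoning modules are registered against ontology classification types, and each processes only the abstractions of its own type from a shared working memory.

      from collections import defaultdict
      from dataclasses import dataclass

      @dataclass
      class Abstraction:
          individual: str        # the ontology individual the abstraction is about
          classification: str    # its classification type in the ontology
          payload: dict

      class ReasoningSystem:
          def __init__(self):
              self._modules = defaultdict(list)

          def register(self, classification, module):
              """Register a callable that processes one Abstraction of this type."""
              self._modules[classification].append(module)

          def process(self, working_memory):
              for abstraction in working_memory:
                  for module in self._modules[abstraction.classification]:
                      module(abstraction)

      # Two modules for two (hypothetical) classification types of the ontology.
      system = ReasoningSystem()
      system.register("Person", lambda a: print("person module saw", a.individual))
      system.register("Event", lambda a: print("event module saw", a.individual))

      # Working memory: abstractions drawn from a semantic graph (toy content).
      semantic_graph = [
          Abstraction("alice", "Person", {}),
          Abstraction("login-42", "Event", {}),
      ]
      system.process(semantic_graph)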

  13. Information processing systems, reasoning modules, and reasoning system design methods

    DOEpatents

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2015-08-18

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  14. Information processing systems, reasoning modules, and reasoning system design methods

    DOEpatents

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2016-08-23

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  15. Penetrator reliability investigation and design exploration : from conventional design processes to innovative uncertainty-capturing algorithms.

    SciTech Connect

    Martinez-Canales, Monica L.; Heaphy, Robert; Gramacy, Robert B.; Taddy, Matt; Chiesa, Michael L.; Thomas, Stephen W.; Swiler, Laura Painton; Hough, Patricia Diane; Lee, Herbert K. H.; Trucano, Timothy Guy; Gray, Genetha Anne

    2006-11-01

    This project focused on research and algorithmic development in optimization under uncertainty (OUU) problems driven by earth penetrator (EP) designs. While taking into account uncertainty, we addressed three challenges in current simulation-based engineering design and analysis processes. The first challenge required leveraging small local samples, already constructed by optimization algorithms, to build effective surrogate models. We used Gaussian Process (GP) models to construct these surrogates. We developed two OUU algorithms using 'local' GPs (OUU-LGP) and one OUU algorithm using 'global' GPs (OUU-GGP) that appear competitive with, or better than, current methods. The second challenge was to develop a methodical design process based on multi-resolution, multi-fidelity models. We developed a Multi-Fidelity Bayesian Auto-regressive process (MF-BAP). The third challenge involved the development of tools that are computationally feasible and accessible. We created MATLAB® and initial DAKOTA implementations of our algorithms.
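
    As a hedged sketch of the surrogate idea only (not the OUU-LGP or OUU-GGP algorithms), the Python fragment below fits a Gaussian process to a small set of "local" samples of an assumed toy response and queries the surrogate mean and uncertainty in place of the expensive simulation; the response function, sample points, and kernel settings are all illustrative assumptions.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, ConstantKernel

      def expensive_response(x):
          # Stand-in for a penetrator simulation output at a design/uncertainty point.
          return np.sin(3 * x) + 0.5 * x ** 2

      # "Local" samples, e.g. points an optimizer has already evaluated near its iterate.
      X = np.array([[0.1], [0.3], [0.5], [0.7], [0.9]])
      y = expensive_response(X).ravel()

      kernel = ConstantKernel(1.0) * RBF(length_scale=0.2)
      gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

      # Query the surrogate instead of rerunning the simulator.
      mean, std = gp.predict(np.array([[0.6]]), return_std=True)
      print(f"surrogate at 0.6: {mean[0]:.3f} +/- {std[0]:.3f}")
      print(f"true response:    {expensive_response(0.6):.3f}")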

  16. Designed CVD growth of graphene via process engineering.

    PubMed

    Yan, Kai; Fu, Lei; Peng, Hailin; Liu, Zhongfan

    2013-10-15

    Graphene, the atomic thin carbon film with honeycomb lattice, holds great promise in a wide range of applications, due to its unique band structure and excellent electronic, optical, mechanical, and thermal properties. Scientists are researching this star material because of the development of various emerging preparation techniques, among which chemical vapor deposition (CVD) has received the fastest advances in the past few years. For the CVD growth of graphene, the ultimate goal is to achieve the highest quality in the largest scale and lowest cost with a precise control of layer thickness, stacking order, and crystallinity. To meet this goal, researchers need a comprehensive understanding and effective controlling of the growth process, especially to its elementary steps. In this Account, we focus on our recent progresses toward the controlled surface growth of graphene and its two-dimensional (2D) hybrids via rational designs of CVD elementary processes, namely, process engineering. A typical CVD process consists of four main elementary steps: (A) adsorption and catalytic decomposition of precursor gas, (B) diffusion and dissolution of decomposed carbon species into bulk metal, (C) segregation of dissolved carbon atoms onto the metal surface, and finally, (D) surface nucleation and growth of graphene. Absence or enhancement of each elementary step would lead to significant changes in the whole growth process. Metals with certain carbon solubility, such as nickel and cobalt, involve all four elementary steps in a typical CVD process, thus providing us an ideal system for process engineering. The elementary segregation process can be completely blocked if molybdenum is introduced into the system as an alloy catalyst, yielding perfect monolayer graphene almost independent of growth parameters. On the other hand, the segregation-only process of predissolved solid carbons is also capable of high-quality graphene growth. By using a synergetic Cu-Ni alloy, we are

  17. Innovative soil treatment process design for removal of trivalent chromium

    SciTech Connect

    Stallings, J.H.; Durkin, M.E.

    1997-12-31

    A soil treatment process has been developed as part of a US Air Force environmental compliance project at Air Force Plant 44, Tucson, AZ, for treating soil contaminated with heavy metals including trivalent chromium, cadmium, copper, and nickel. The process was designed to treat a total of 133,000 tons of soil in a 400 ton per day facility. Features of the soil treatment process include physical treatment and separation, and chemical treatment of the remaining fines using a hypochlorite leach that allows chromium to be solubilized at a high pH. After treatment, the fines are washed in three-stage countercurrent thickeners, and chromium hydroxide cake is recovered from the leach solution as a final product. Treatability studies were conducted in the laboratory, and a pilot plant was built. Process design criteria and flow sheets, material balances, and preliminary equipment selection and sizing for the facility have been completed. The facility was designed to remove Cr at an average concentration of 1230 mg/kg from the soil and to meet a risk-based clean-closure limit of 400 mg/kg of Cr. Capital costs for the 400 tpd plant were estimated at $9.6 million, with an operating and maintenance cost of $54 per ton. Because the process is most economical for large quantities of soil with relatively low concentrations of contaminants, it was not used in final closure, when the estimated volume of contaminated soil removed dropped to 65,000 tons and the concentration of chromium increased up to 4000 mg/kg. However, the process could have application in situations where economics and location warrant.
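
    A back-of-envelope check of the throughput and cost figures quoted above (133,000 tons at 400 tons per day, $54 per ton O&M, $9.6 million capital) is straightforward; the sketch below is simple arithmetic on those stated numbers and assumes nothing further about the process.

      total_soil_tons = 133_000
      throughput_tpd = 400
      om_cost_per_ton = 54          # dollars per ton, as quoted
      capital_cost = 9_600_000      # dollars, as quoted

      operating_days = total_soil_tons / throughput_tpd
      total_om_cost = total_soil_tons * om_cost_per_ton

      print(f"Operating days at 400 tpd: {operating_days:.0f}")
      print(f"Total O&M cost: ${total_om_cost:,.0f}")
      print(f"Capital plus O&M per ton: ${(capital_cost + total_om_cost) / total_soil_tons:.2f}")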

  18. Designing large-scale conservation corridors for pattern and process.

    PubMed

    Rouget, Mathieu; Cowling, Richard M; Lombard, Amanda T; Knight, Andrew T; Kerley, Graham I H

    2006-04-01

    A major challenge for conservation assessments is to identify priority areas that incorporate biological patterns and processes. Because large-scale processes are mostly oriented along environmental gradients, we propose to accommodate them by designing regional-scale corridors to capture these gradients. Based on systematic conservation planning principles such as representation and persistence, we identified large tracts of untransformed land (i.e., conservation corridors) for conservation that would achieve biodiversity targets for pattern and process in the Subtropical Thicket Biome of South Africa. We combined least-cost path analysis with a target-driven algorithm to identify the best option for capturing key environmental gradients while considering biodiversity targets and conservation opportunities and constraints. We identified seven conservation corridors on the basis of subtropical thicket representation, habitat transformation and degradation, wildlife suitability, irreplaceability of vegetation types, protected area networks, and future land-use pressures. These conservation corridors covered 21.1% of the planning region (ranging from 600 to 5200 km2) and successfully achieved targets for biological processes and to a lesser extent for vegetation types. The corridors we identified are intended to promote the persistence of ecological processes (gradients and fixed processes) and fulfill half of the biodiversity pattern target. We compared the conservation corridors with a simplified corridor design consisting of a fixed-width buffer along major rivers. Conservation corridors outperformed river buffers in seven out of eight criteria. Our corridor design can provide a tool for quantifying trade-offs between various criteria (biodiversity pattern and process, implementation constraints and opportunities). A land-use management model was developed to facilitate implementation of conservation actions within these corridors. PMID:16903115
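
    As a hedged illustration of the least-cost path component of such an analysis (not the authors' target-driven algorithm), the sketch below finds the cheapest route across a small raster "resistance" surface between two anchor cells using Dijkstra's algorithm; the cost grid is an assumed toy example, whereas real corridor designs use GIS rasters and many additional criteria.

      import heapq
      import numpy as np

      # Toy resistance surface: low values are cheap to traverse (assumed data).
      cost = np.array([
          [1, 1, 5, 5, 1],
          [1, 9, 9, 5, 1],
          [1, 1, 1, 1, 1],
          [5, 5, 9, 1, 1],
          [5, 5, 9, 1, 1],
      ], dtype=float)

      def least_cost_path(cost, start, goal):
          """Dijkstra over a 4-connected raster; returns (path cells, accumulated cost)."""
          rows, cols = cost.shape
          dist = {start: cost[start]}
          prev = {}
          heap = [(cost[start], start)]
          while heap:
              d, cell = heapq.heappop(heap)
              if cell == goal:
                  break
              if d > dist.get(cell, float("inf")):
                  continue
              r, c = cell
              for nbr in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                  if 0 <= nbr[0] < rows and 0 <= nbr[1] < cols:
                      nd = d + cost[nbr]
                      if nd < dist.get(nbr, float("inf")):
                          dist[nbr] = nd
                          prev[nbr] = cell
                          heapq.heappush(heap, (nd, nbr))
          path, cell = [goal], goal
          while cell != start:
              cell = prev[cell]
              path.append(cell)
          return path[::-1], dist[goal]

      path, total = least_cost_path(cost, (0, 0), (4, 4))
      print("corridor cells:", path)
      print("accumulated cost:", total)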

  19. RATES OF REACTION AND PROCESS DESIGN DATA FOR THE HYDROCARB PROCESS

    EPA Science Inventory

    The report provides experimental and process design data in support of studies for developing the coprocessing of fossil fuels with biomass by the Hydrocarb process. The experimental work includes the hydropyrolysis of biomass and the thermal decomposition of methane in a 2.44 m ...

  20. Operation and design of selected industrial process heat field tests

    SciTech Connect

    Kearney, D. W.

    1981-02-01

    The DOE program of solar industrial process heat field tests has shown solar energy to be compatible with numerous industrial needs. Both the operational projects and the detailed designs of systems that are not yet operational have resulted in valuable insights into design and hardware practice. Typical of these insights are the experiences discussed for the four projects reviewed. Future solar IPH systems should benefit greatly not only from the availability of present information, but also from the wealth of operating experience from projects due to start up in 1981.

  1. A Taguchi study of the aeroelastic tailoring design process

    NASA Technical Reports Server (NTRS)

    Bohlmann, Jonathan D.; Scott, Robert C.

    1991-01-01

    A Taguchi study was performed to determine the important players in the aeroelastic tailoring design process and to find the best composition of the optimization's objective function. The Wing Aeroelastic Synthesis Procedure (TSO) was used to ascertain the effects that factors such as composite laminate constraints, roll effectiveness constraints, and built-in wing twist and camber have on the optimum, aeroelastically tailored wing skin design. The results show the Taguchi method to be a viable engineering tool for computational inquiries, and provide some valuable lessons about the practice of aeroelastic tailoring.
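
    As a hedged toy illustration of the Taguchi screening idea (not the TSO study itself), the sketch below evaluates an L4 orthogonal array over three two-level factors and ranks them by main effect on a response; the factor names and the response model are assumptions for illustration only.

      import numpy as np

      # L4(2^3) orthogonal array: rows are runs, columns are factor levels (0 or 1).
      L4 = np.array([
          [0, 0, 0],
          [0, 1, 1],
          [1, 0, 1],
          [1, 1, 0],
      ])
      factors = ["laminate constraint", "roll effectiveness constraint", "built-in twist"]

      def response(run):
          # Stand-in for an aeroelastically tailored skin objective (e.g. weight);
          # the coefficients are purely illustrative.
          a, b, c = run
          return 100 - 8 * a + 3 * b - 1 * c

      y = np.array([response(run) for run in L4])

      # Main effect of each factor: mean response at level 1 minus level 0.
      for j, name in enumerate(factors):
          effect = y[L4[:, j] == 1].mean() - y[L4[:, j] == 0].mean()
          print(f"main effect of {name:30s}: {effect:+.1f}")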

  2. Design of the HTGR for process heat applications

    SciTech Connect

    Vrable, D.L.; Quade, R.N.

    1980-05-01

    This paper discusses a design study of an advanced 842-MW(t) HTGR with a reactor outlet temperature of 850°C (1562°F), coupled with a chemical process whose product is hydrogen (or a mixture of hydrogen and carbon monoxide) generated by steam reforming of a light hydrocarbon mixture. It presents the plant layout and design for the major components of the primary and secondary heat transfer systems. Typical parametric system study results illustrate the capability of a computer code developed to model the plant performance and economics.

  3. Remote Maintenance Design Guide for Compact Processing Units

    SciTech Connect

    Draper, J.V.

    2000-07-13

    Oak Ridge National Laboratory (ORNL) Robotics and Process Systems (RPSD) personnel have extensive experience working with remotely operated and maintained systems. These systems require expert knowledge in teleoperation, human factors, telerobotics, and other robotic devices so that remote equipment may be manipulated, operated, serviced, surveyed, and moved about in a hazardous environment. The RPSD staff has a wealth of experience in this area, including knowledge in the broad topics of human factors, modular electronics, modular mechanical systems, hardware design, and specialized tooling. Examples of projects that illustrate and highlight RPSD's unique experience in remote systems design and application include the following: (1) design of remote shear and remote dissolver systems in support of U.S. Department of Energy (DOE) fuel recycling research and nuclear power missions; (2) building remotely operated mobile systems for metrology and characterizing hazardous facilities in support of remote operations within those facilities; (3) construction of modular robotic arms, including the Laboratory Telerobotic Manipulator, which was designed for the National Aeronautics and Space Administration (NASA), and the Advanced ServoManipulator, which was designed for the DOE; (4) design of remotely operated laboratories, including chemical analysis and biochemical processing laboratories; (5) construction of remote systems for environmental clean-up and characterization, including underwater, buried waste, underground storage tank (UST), and decontamination and dismantlement (D&D) applications. Remote maintenance has played a significant role in fuel reprocessing because of combined chemical and radiological contamination. Furthermore, remote maintenance is expected to play a strong role in future waste remediation. The compact processing units (CPUs) being designed for use in underground waste storage tank remediation are examples of improvements in systems processing

  4. A Review of the Design Process for Implantable Orthopedic Medical Devices

    PubMed Central

    Aitchison, G.A; Hukins, D.W.L; Parry, J.J; Shepherd, D.E.T; Trotman, S.G

    2009-01-01

    The design process for medical devices is highly regulated to ensure the safety of patients. This paper will present a review of the design process for implantable orthopedic medical devices. It will cover the main stages of feasibility, design reviews, design, design verification, manufacture, design validation, design transfer and design changes. PMID:19662153

  5. VLSI systems design for digital signal processing. Volume 1 - Signal processing and signal processors

    NASA Astrophysics Data System (ADS)

    Bowen, B. A.; Brown, W. R.

    This book is concerned with the design of digital signal processing systems which utilize VLSI (Very Large Scale Integration) components. The presented material is intended for use by electrical engineers at the senior undergraduate or introductory graduate level. It is the purpose of this volume to present an overview of the important elements of background theory, processing techniques, and hardware evolution. Digital signals are considered along with linear systems and digital filters, taking into account the transform analysis of deterministic signals, a statistical signal model, time domain representations of discrete-time linear systems, and digital filter design techniques and implementation issues. Attention is given to aspects of detection and estimation, digital signal processing algorithms and techniques, issues which must be resolved in a processor design methodology, the fundamental concepts of high performance processing in terms of two early super computers, and the extension of these concepts to more recent processors.

  6. Automating the packing heuristic design process with genetic programming.

    PubMed

    Burke, Edmund K; Hyde, Matthew R; Kendall, Graham; Woodward, John

    2012-01-01

    The literature shows that one-, two-, and three-dimensional bin packing and knapsack packing are difficult problems in operational research. Many techniques, including exact, heuristic, and metaheuristic approaches, have been investigated to solve these problems and it is often not clear which method to use when presented with a new instance. This paper presents an approach which is motivated by the goal of building computer systems which can design heuristic methods. The overall aim is to explore the possibilities for automating the heuristic design process. We present a genetic programming system to automatically generate a good quality heuristic for each instance. It is not necessary to change the methodology depending on the problem type (one-, two-, or three-dimensional knapsack and bin packing problems), and it therefore has a level of generality unmatched by other systems in the literature. We carry out an extensive suite of experiments and compare with the best human designed heuristics in the literature. Note that our heuristic design methodology uses the same parameters for all the experiments. The contribution of this paper is to present a more general packing methodology than those currently available, and to show that, by using this methodology, it is possible for a computer system to design heuristics which are competitive with the human designed heuristics from the literature. This represents the first packing algorithm in the literature able to claim human competitive results in such a wide variety of packing domains. PMID:21609273
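
    A minimal sketch of the underlying representation (not the authors' genetic programming system): a packing heuristic can be expressed as a scoring function over candidate bins, which is the kind of expression a genetic programming system could evolve; the two scorers below are hand-written baselines, and the item list is an assumed example.

      def pack(items, capacity, score):
          """Place each item into the feasible bin with the highest heuristic score."""
          bins = []  # residual capacity of each open bin
          for size in items:
              feasible = [i for i, free in enumerate(bins) if free >= size]
              if feasible:
                  best = max(feasible, key=lambda i: score(bins[i], size, capacity))
                  bins[best] -= size
              else:
                  bins.append(capacity - size)
          return len(bins)

      # Hand-written scoring functions standing in for evolved ones.
      first_fit_like = lambda free, size, cap: 0.0            # indifferent: keeps the first feasible bin
      best_fit_like = lambda free, size, cap: -(free - size)  # prefers the tightest fit

      items = [0.42, 0.55, 0.30, 0.70, 0.25, 0.60, 0.15, 0.48]
      print("bins used, first-fit-like scorer:", pack(items, 1.0, first_fit_like))
      print("bins used, best-fit-like scorer: ", pack(items, 1.0, best_fit_like))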

  7. Process Design Concepts for Stabilization of High Level Waste Calcine

    SciTech Connect

    T. R. Thomas; A. K. Herbst

    2005-06-01

    The current baseline assumption is that packaging "as is" and direct disposal of high level waste (HLW) calcine in a Monitored Geologic Repository will be allowed. The fallback position is to develop a stabilized waste form for the HLW calcine that will meet the repository waste acceptance criteria currently in place, in case regulatory initiatives are unsuccessful. A decision between direct disposal and a stabilization alternative is anticipated by June 2006. The purposes of this Engineering Design File (EDF) are to provide a pre-conceptual design of three low temperature processes under development for stabilization of HLW calcine (i.e., the grout, hydroceramic grout, and iron phosphate ceramic processes) and to support a down-selection among the three candidates. The key assumptions for the pre-conceptual design assessment are that a) a waste treatment plant would operate over eight years for 200 days a year, b) a design processing rate of 3.67 m3/day or 4670 kg/day of HLW calcine would be needed, c) the performance of the waste form would remove the HLW calcine from the hazardous waste category, and d) the waste form loadings would range from about 21-25 wt% calcine. The conclusions of this EDF study are that: (a) To date, the grout formulation appears to be the best candidate stabilizer among the three being tested for HLW calcine and appears to be the easiest to mix, pour, and cure. (b) Only minor differences would exist between the process steps of the grout and hydroceramic grout stabilization processes. If temperature control of the mixer at about 80°C is required, it would add a major level of complexity to the iron phosphate stabilization process. (c) It is too early in the development program to determine which stabilizer will produce the minimum amount of stabilized waste form for the entire HLW inventory, but the volume is assumed to be within the range of 12,250 to 14,470 m3. (d) The stacked vessel height of the hot process vessels
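
    The design-basis figures quoted above can be cross-checked with simple arithmetic; the sketch below uses only the stated numbers (8 years at 200 days/year, 3.67 m3/day or 4670 kg/day, and 21-25 wt% calcine loading) and assumes nothing further about the processes.

      operating_days = 8 * 200                   # 8 years at 200 days per year
      calcine_volume_m3 = operating_days * 3.67  # m3 of HLW calcine processed
      calcine_mass_kg = operating_days * 4670    # kg of HLW calcine processed

      print(f"Total operating days: {operating_days}")
      print(f"Total calcine processed: {calcine_volume_m3:,.0f} m3 ({calcine_mass_kg / 1000:,.0f} t)")

      # Waste-form mass implied by the quoted 21-25 wt% calcine loadings.
      for loading in (0.21, 0.25):
          print(f"Waste-form mass at {loading:.0%} calcine loading: {calcine_mass_kg / loading / 1000:,.0f} t")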

  8. Robust process design and springback compensation of a decklid inner

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaojing; Grimm, Peter; Carleer, Bart; Jin, Weimin; Liu, Gang; Cheng, Yingchao

    2013-12-01

    Springback compensation is one of the key topics in current die face engineering. The accuracy of the springback simulation, the robustness of the method planning, and the springback itself are considered to be the main factors that influence the effectiveness of springback compensation. In the present paper, the basic principles of springback compensation are presented first; these consist of an accurate full-cycle simulation with final validation settings. Robust process design and optimization are then discussed in detail via an industrial example, a decklid inner. Moreover, an effective compensation strategy is put forward based on the springback analysis, and simulation-based springback compensation is introduced in the process design phase. Finally, the verification and comparison in tryout and production are given, which confirmed that the robust springback compensation methodology is effective during die development.
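
    As a schematic sketch of the displacement-adjustment idea behind springback compensation (not the commercial die-face engineering workflow described above), the fragment below offsets a tool profile by the predicted springback deviation and iterates until the sprung part matches the design intent; the one-dimensional "simulation" and the 15% relaxation factor are stand-in assumptions for a full forming and springback FE run.

      import numpy as np

      design_shape = np.array([0.0, 2.0, 5.0, 2.0, 0.0])  # target part profile (assumed units)

      def simulate_sprung_shape(tool_shape):
          # Toy springback model: the part relaxes back by 15% of the tool depth.
          return 0.85 * tool_shape

      tool = design_shape.copy()
      for iteration in range(5):
          sprung = simulate_sprung_shape(tool)
          deviation = sprung - design_shape
          print(f"iteration {iteration}: max deviation = {np.abs(deviation).max():.3f}")
          # Compensate: offset the tool opposite to the deviation (scale factor 1.0 assumed).
          tool = tool - deviation

      print("compensated tool profile:", np.round(tool, 3))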

  9. Virtual Welded-Joint Design Integrating Advanced Materials and Processing Technology

    SciTech Connect

    Yang, Zhishang; Ludewig, Howard W.; Babu, S. Suresh

    2005-06-30

    Virtual Welded-Joint Design, a systematic modeling approach, has been developed in this project to predict the relationship of welding process, microstructure, properties, residual stress, and the ultimate weld fatigue strength. This systematic modeling approach was applied in the welding of high strength steel. A special welding wire was developed in this project to introduce compressive residual stress at the weld toe. The results from both modeling and experiments demonstrated that more than 10x fatigue life improvement can be achieved in high strength steel welds by the combination of compressive residual stress from the special welding wire and the desired weld bead shape from a unique welding process. The results indicate a technology breakthrough in the design of lightweight and high fatigue performance welded structures using high strength steels.

  10. Reliability-based design optimization under stationary stochastic process loads

    NASA Astrophysics Data System (ADS)

    Hu, Zhen; Du, Xiaoping

    2016-08-01

    Time-dependent reliability-based design ensures the satisfaction of reliability requirements for a given period of time, but with a high computational cost. This work improves the computational efficiency by extending the sequential optimization and reliability analysis (SORA) method to time-dependent problems with both stationary stochastic process loads and random variables. The challenge of the extension is the identification of the most probable point (MPP) associated with time-dependent reliability targets. Since a direct relationship between the MPP and reliability target does not exist, this work defines the concept of equivalent MPP, which is identified by the extreme value analysis and the inverse saddlepoint approximation. With the equivalent MPP, the time-dependent reliability-based design optimization is decomposed into two decoupled loops: deterministic design optimization and reliability analysis, and both are performed sequentially. Two numerical examples are used to show the efficiency of the proposed method.
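
    As a hedged toy sketch of the SORA decoupling idea in its simplest static, scalar form (not the equivalent-MPP method for stochastic-process loads proposed in the paper), the fragment below alternates a deterministic optimization with a reliability step that shifts the load to its inverse MPP; the limit state, distributions, and target reliability index are all assumed.

      from scipy.optimize import minimize_scalar
      from scipy.stats import norm

      beta_target = 3.0                 # target reliability index (assumed)
      mu_load, sigma_load = 5.0, 0.8    # normally distributed load (assumed)

      def limit_state(d, load):
          return d - load               # failure when the capacity d is below the load

      shift = 0.0                       # load offset at the current inverse MPP
      for cycle in range(3):
          # Deterministic loop: minimize capacity d with the limit state evaluated
          # at the shifted (MPP) load, handled here by a simple quadratic penalty.
          res = minimize_scalar(
              lambda d: d + 1e3 * max(0.0, -limit_state(d, mu_load + shift)) ** 2,
              bounds=(0.0, 20.0), method="bounded")
          d_opt = res.x
          # Reliability loop: for this linear, normal case the inverse MPP of the
          # load lies beta_target standard deviations above its mean.
          shift = beta_target * sigma_load
          pf = norm.cdf(-(d_opt - mu_load) / sigma_load)
          print(f"cycle {cycle}: d = {d_opt:.3f}, estimated P(failure) = {pf:.2e}")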