Science.gov

Sample records for reload design process

  1. Reload design process at Yankee Atomic Electric Company

    SciTech Connect

    Weader, R.J.

    1986-01-01

    Yankee Atomic Electric Company (YAEC) performs reload design and licensing for its nuclear power plants: Yankee Rowe, Maine Yankee, and Vermont Yankee. Significant savings in labor and computer costs have been achieved in the reload design process by using the SIMULATE nodal code, with input from the CASMO assembly burnup code or the LEOPARD pin cell burnup code, to replace the PDQ diffusion theory code in many of the required calculations for the Yankee Rowe and Maine Yankee pressurized water reactors (PWRs). An efficient process has evolved for the design of reloads for the Vermont Yankee boiling water reactor (BWR). Because of the major differences in the core designs of the three plants, a different reload design process has evolved for each plant.

  2. Modeling and design of a reload PWR core for a 48-month fuel cycle

    SciTech Connect

    McMahon, M.V.; Driscoll, M.J.; Todreas, N.E.

    1997-05-01

    The objective of this research was to use state-of-the-art nuclear and fuel performance packages to evaluate the feasibility and costs of a 48-calendar-month core in existing pressurized water reactor (PWR) designs, considering the full range of practical design and economic considerations. The driving force behind this research is the desire to make nuclear power more economically competitive with fossil fuel options by expanding the scope for achieving higher capacity factors. Using CASMO/SIMULATE, a core design with fuel enriched to 7 w/o U-235 for a single-batch-loaded, 48-month fuel cycle has been developed. This core achieves an ultra-long cycle length without exceeding current fuel burnup limits. The design uses two different types of burnable poisons. Gadolinium in the form of gadolinium oxide (Gd2O3) mixed with the UO2 of selected pins is used to hold down initial reactivity and to control flux peaking throughout the life of the core. A zirconium diboride (ZrB2) integral fuel burnable absorber (IFBA) coating on the Gd2O3-UO2 fuel pellets is added to reduce the critical soluble boron concentration in the reactor coolant to within acceptable limits. Fuel performance issues of concern to this design are also outlined, and areas that will require further research are highlighted.

  3. From Reload to ReCourse: Learning from IMS Learning Design Implementations

    ERIC Educational Resources Information Center

    Griffiths, David; Beauvoir, Phillip; Liber, Oleg; Barrett-Baxendale, Mark

    2009-01-01

    The use of the Web to deliver open, distance, and flexible learning has opened up the potential for social interaction and adaptive learning, but the usability, expressivity, and interoperability of the available tools leave much to be desired. This article explores these issues as they relate to teachers and learning designers through the case of…

  4. Whorf Reloaded: Language Effects on Nonverbal Number Processing in First Grade--A Trilingual Study

    ERIC Educational Resources Information Center

    Pixner, S.; Moeller, K.; Hermanova, V.; Nuerk, H. -C.; Kaufmann, L.

    2011-01-01

    The unit-decade compatibility effect is interpreted to reflect processes of place value integration in two-digit number magnitude comparisons. The current study aimed at elucidating the influence of language properties on the compatibility effect of Arabic two-digit numbers in Austrian, Italian, and Czech first graders. The number word systems of…

  5. Optimal reload strategies for identify-and-destroy missions

    NASA Astrophysics Data System (ADS)

    Hyland, John C.; Smith, Cheryl M.

    2004-09-01

    In this problem an identification vehicle must re-acquire a fixed set of suspected targets and determine whether each suspected target is a mine or a false alarm. If a target is determined to be a mine, the identification vehicle must neutralize it either by delivering one of a limited number of on-board bombs or by assigning the neutralization task to one of a limited number of single-shot suicide vehicles. The identification vehicle has the option to reload. The single-shot suicide vehicles, however, cannot be replenished. We have developed an optimal path planning and reload strategy for this identify-and-destroy mission that takes into account the probabilities that suspected targets are mines, the costs to move between targets, the costs to return to and from the reload point, and the cost to reload. The mission is modeled as a discrete multi-dimensional Markov process. At each target position the vehicle decides, based on the known costs, the probabilities, the number of bombs on board (r), and the number of remaining one-shot vehicles (s), whether to move directly on to the next target or to reload before continuing, and whether to destroy any mine with an on-board bomb or a one-shot suicide vehicle. The approach recursively calculates the minimum expected overall cost conditioned on all possible values of r and s. The recursion is similar to dynamic programming in that it starts at the last suspected target location and works its way backwards to the starting point. The approach also uses a suboptimal traveling-salesman strategy to search over candidate deployment locations to find the best initial deployment point, where the reloads will take place.
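
    The backward recursion sketched in this abstract can be illustrated with a short, self-contained example. The code below is only a schematic reconstruction under stated assumptions (placeholder costs, mine probabilities, and a fixed bomb capacity), not the authors' implementation; the state is the current target index k, the bombs on board r, and the remaining one-shot vehicles s.

```python
# Minimal dynamic-programming sketch of the reload/engage recursion
# (hypothetical costs, probabilities, and capacity; not the authors' code).
from functools import lru_cache

P_MINE = (0.6, 0.3, 0.8)     # probability that each suspected target is a mine
MOVE_COST = 1.0              # cost to move on to the next target
RELOAD_COST = 5.0            # cost to travel to the reload point and reload
BOMB_CAPACITY = 2            # bombs carried after a reload

@lru_cache(maxsize=None)
def expected_cost(k, r, s):
    """Minimum expected cost from target k onward with r bombs and s one-shot vehicles."""
    if k == len(P_MINE):
        return 0.0                        # all suspected targets have been visited
    p = P_MINE[k]
    options = []
    if r > 0:                             # engage: neutralize a mine with an on-board bomb
        options.append(MOVE_COST + p * expected_cost(k + 1, r - 1, s)
                       + (1 - p) * expected_cost(k + 1, r, s))
    if s > 0:                             # engage: hand the mine to a single-shot vehicle
        options.append(MOVE_COST + p * expected_cost(k + 1, r, s - 1)
                       + (1 - p) * expected_cost(k + 1, r, s))
    if r < BOMB_CAPACITY:                 # detour to the reload point before continuing
        options.append(RELOAD_COST + expected_cost(k, BOMB_CAPACITY, s))
    return min(options)

print(expected_cost(0, BOMB_CAPACITY, 1))
```

    In this toy version the reload option replenishes bombs only, mirroring the constraint stated in the abstract that the single-shot suicide vehicles cannot be replenished.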

  6. NASA reload program

    NASA Technical Reports Server (NTRS)

    Byington, Marshall

    1993-01-01

    Atlantic Research Corporation (ARC) contracted with NASA to manufacture and deliver thirteen small-scale Solid Rocket Motors (SRMs). These motors, containing five distinct propellant formulations, will be used for plume-induced radiation studies. The information contained herein summarizes and documents the program accomplishments and results. Several modifications were made to the scope of work during the course of the program. The effort was on hold from late 1991 through August 1992 while propellant formulation changes were developed. Modifications to the baseline program were completed in late August, and Modification No. 6 was received by ARC on September 14, 1992. The modifications include changes to the propellant formulation and the nozzle design. The required motor deliveries were completed in late December 1992. However, ARC agreed to perform an additional mix-and-cast effort at no cost to NASA, and another motor was delivered in March 1993.

  7. The Heliogyro Reloaded

    NASA Technical Reports Server (NTRS)

    Wilkie, William K.; Warren, Jerry E.; Thompson, M. W.; Lisman, P. D.; Walkemeyer, P. E.; Guerrant, D. V.; Lawrence, D. A.

    2011-01-01

    The heliogyro is a high-performance, spinning solar sail architecture that uses long (on the order of kilometers) reflective membrane strips to produce thrust from solar radiation pressure. The heliogyro's membrane blades spin about a central hub and are stiffened by centrifugal forces only, making the design exceedingly lightweight. Blades are also stowed and deployed from rolls, eliminating the deployment and packaging problems associated with handling the extremely large, and delicate, membrane sheets used with most traditional square-rigged or spinning-disk solar sail designs. The heliogyro solar sail concept was first advanced in the 1960s by MacNeal. A 15 km diameter version was later extensively studied in the 1970s by JPL for an ambitious Comet Halley rendezvous mission, but it was ultimately not selected because of the need for a risk-reduction flight demonstration. Demonstrating system-level feasibility of a large, spinning heliogyro solar sail on the ground is impossible; however, recent advances in microsatellite bus technologies, coupled with the successful flight demonstration of reflectance control technologies on the JAXA IKAROS solar sail, now make an affordable, small-scale heliogyro technology flight demonstration potentially feasible. In this paper, we present an overview of the history of the heliogyro solar sail concept, with particular attention paid to the MIT 200-meter-diameter heliogyro study of 1989, followed by a description of our updated, low-cost heliogyro flight demonstration concept. Our preliminary heliogyro concept (HELIOS) should be capable of demonstrating an order-of-magnitude improvement in characteristic acceleration over existing solar sail demonstrators (HELIOS target: 0.5 to 1.0 mm/s2 at 1.0 AU), placing heliogyro technology in the range required to enable a variety of science and human exploration support missions.

  8. Hybrid expert system implementation to determine core reload patterns

    SciTech Connect

    Greek, K.J.; Robinson, A.H.

    1989-01-01

    Determining reactor reload fuel patterns is a computationally intensive problem-solving process for which automation can be of significant benefit. Often much effort is expended in the search for an optimal loading. While any modern programming language could be used to automate the solution, the specialized tools of artificial intelligence (AI) are the most efficient means of introducing the fuel management expert's knowledge into the search for an optimum reload pattern. Prior research in pressurized water reactor refueling strategies developed FORTRAN programs that automated an expert's basic knowledge to direct a search for an acceptable minimum-peak-power loading. Dissatisfaction with maintaining compiled knowledge in FORTRAN programs motivated the development of the SHUFFLE expert system. SHUFFLE is written in Smalltalk, an object-oriented programming language, and evaluates loadings as it generates them using a two-group, two-dimensional nodal power calculation written in FORTRAN and compiled for a personal computer. This paper reviews the object-oriented representation developed to solve the core reload problem with an expert system tool and its operating prototype, SHUFFLE.
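
    The overall search pattern described here (generate candidate loadings, prune them with expert rules, and score the survivors with a nodal power calculation) might look schematically like the sketch below. The power evaluator, the pruning rule, and the burnup values are invented stand-ins, not the SHUFFLE system or its FORTRAN nodal solver.

```python
# Schematic reload-pattern search in the spirit of an expert-system prototype
# (the rule, the power proxy, and the burnup values are hypothetical).
import itertools

def peak_power(loading):
    """Stand-in for the nodal solver: fresher fuel (lower burnup) is assumed
    to run at higher relative power; return a crude peaking factor."""
    powers = [1.0 / (1.0 + burnup) for burnup in loading]
    return max(powers) / (sum(powers) / len(powers))

def acceptable(loading):
    """Example expert rule: never place the freshest assembly in position 0 (core centre)."""
    return loading[0] != min(loading)

def best_loading(assembly_burnups, max_candidates=5000):
    best, best_peak = None, float("inf")
    for i, candidate in enumerate(itertools.permutations(assembly_burnups)):
        if i >= max_candidates:
            break                      # keep the exhaustive search bounded
        if not acceptable(candidate):
            continue                   # expert rules prune candidates before evaluation
        peak = peak_power(candidate)
        if peak < best_peak:
            best, best_peak = candidate, peak
    return best, best_peak

print(best_loading((0.0, 10.0, 20.0, 30.0, 40.0)))   # burnups in GWd/tU, made up
```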

  9. Future integrated design process

    NASA Technical Reports Server (NTRS)

    Meyer, D. D.

    1980-01-01

    The design process is one of the sources used to produce requirements for a computer system to integrate and manage product design data, program management information, and technical computation and engineering data management activities of the aerospace design process. Design activities were grouped chronologically and explored for activity type, activity interface, data quantity, and data flow. The work was based on analysis of the design process of several typical aerospace products, including both conventional and supersonic airplanes and a hydrofoil design. Activities examined included research, preliminary design, detail design, manufacturing interface, product verification, and product support. The design process was then described in an IPAD environment--the future.

  10. Reloading Experiment for Aluminum at High Pressure

    NASA Astrophysics Data System (ADS)

    Rili, Hou; Jianxiang, Peng; Jianhua, Zhang; Mingwu, Tu; Ping, Zhou

    2009-06-01

    In the traditional AC method for measuring a material's dynamic strength, the combination flyer tends to delaminate because of shock waves produced in the projectile by the sudden application of the projectile driving pressure, which often causes the reloading experiment to fail. The maximum reshock experimental pressure for aluminum reported by Huang and Asay in 2005 is only 22 GPa. A technique for reloading experiments is described, with which reloading experiments were performed on 2A12 aluminum alloy shocked to 67.6 GPa. In our experiments, oxygen-free copper and TC4 titanium alloy impactors were used with ultrapure LiF interferometer windows, the 2A12 aluminum alloy samples were backed by PMMA buffers, and VISAR was used to measure the interface particle velocity. Using an approximate double-step-sample method (two shots with different sample thicknesses at the same impact velocity), the Lagrangian longitudinal velocities along the reloading path from the initial shock state were obtained; coupled with unloading experimental data, the bulk velocities were determined, as well as the dynamic yield strength of the 2A12 aluminum alloy.

  11. Bone Mineral Density of the Tarsals and Metatarsals With Reloading

    PubMed Central

    Hastings, Mary Kent; Gelber, Judy; Commean, Paul K; Prior, Fred; Sinacore, David R

    2008-01-01

    Background and Purpose: Bone mineral density (BMD) decreases rapidly with prolonged non–weight bearing (NWB). Maximizing the BMD response to reloading activities after NWB is critical to minimizing fracture risk. Methods for measuring individual tarsal and metatarsal BMD have not been available. This case report describes tarsal and metatarsal BMD with a reloading program, as revealed by quantitative computed tomography (QCT). Case Description: A 24-year-old woman was non–weight bearing for 6 weeks after right talocrural arthroscopy. Tarsal and metatarsal BMD were measured with QCT 9 weeks (before reloading) and 32 weeks (after reloading) after surgery. A 26-week progressive reloading program was completed. Change scores were calculated for BMD before reloading and BMD after reloading for the total foot (average of all tarsals and metatarsals), tarsals, metatarsals, bones of the medial column (calcaneus, navicular, cuneiforms 1 and 2, and metatarsal 1), and bones of the lateral column (calcaneus, cuboid, cuneiform 3, and metatarsals 2–5). The percent differences in BMD between the involved side and the uninvolved side were calculated. Outcomes: Before reloading, BMD of the involved total foot was 9% lower than that on the uninvolved side. After reloading, BMD increased 22% and 21% for the total foot, 16% and 14% for the tarsals, 29% and 30% for the metatarsals, 14% and 15% for the medial column bones, and 28% and 26% for the lateral column bones on the involved and uninvolved sides, respectively. After reloading, BMD of the involved total foot remained 8% lower than that on the uninvolved side. Discussion: The increase in BMD with reloading was not uniform across all pedal bones; the metatarsals showed a greater increase than the tarsals, and the lateral column bones showed a greater increase than the medial column bones. PMID:18388153
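
    The change scores and side-to-side comparisons described in this case report reduce to simple percent calculations; the sketch below shows that arithmetic with invented placeholder values rather than the published QCT data.

```python
# Toy calculation of BMD change scores and involved/uninvolved percent
# differences (values are invented placeholders, not the case-report data).
def percent_change(before, after):
    """Percent change in BMD from before-reloading to after-reloading."""
    return 100.0 * (after - before) / before

def side_difference(involved, uninvolved):
    """Percent by which the involved side differs from the uninvolved side."""
    return 100.0 * (involved - uninvolved) / uninvolved

bmd_involved = {"before": 0.41, "after": 0.50}    # g/cm^3, hypothetical QCT values
bmd_uninvolved = {"before": 0.45, "after": 0.54}

print("involved-side change: %.1f%%" % percent_change(**bmd_involved))
print("side difference after reloading: %.1f%%"
      % side_difference(bmd_involved["after"], bmd_uninvolved["after"]))
```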

  12. Signal Processing Suite Design

    NASA Technical Reports Server (NTRS)

    Sahr, John D.; Mir, Hasan; Morabito, Andrew; Grossman, Matthew

    2003-01-01

    Our role in this project was to participate in the design of the signal processing suite for analyzing plasma density measurements on board a small constellation (3 or 4 satellites) in low Earth orbit. As we are new to spacecraft experiments, one of the challenges was simply to gain an understanding of the quantity of data that would flow from the satellites, and possibly to interact with the design teams in generating optimal sampling patterns. For example, because the fleet of satellites was intended to fly through the same volume of space (displaced slightly in time and space), the bulk plasma structure should be common among the spacecraft; an optimal, limited-bandwidth data downlink would therefore take advantage of this commonality. Also, motivated by techniques in ionospheric radar, we hoped to investigate the possibility of employing aperiodic sampling in order to gain access to a wider spatial spectrum without suffering aliasing in k-space.

  13. A Process for Design Engineering

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2004-01-01

    The American Institute of Aeronautics and Astronautics Design Engineering Technical Committee has developed a draft Design Engineering Process with the participation of the technical community. This paper reviews similar engineering activities, lays out common terms for the life cycle and proposes a Design Engineering Process.

  14. Ethylene process design optimization

    SciTech Connect

    2001-09-01

    Integration of Advanced Technologies will Update Ethylene Plants. Nearly 93 million tons of ethylene are produced annually in chemical plants worldwide, using an energy intensive process that consumes 2.5 quadrillion Btu per year.

  15. DESIGNING PROCESSES FOR ENVIRONMENTAL PROBLEMS

    EPA Science Inventory

    Designing for the environment requires consideration of environmental impacts. The Generalized WAR Algorithm is the methodology that allows the user to evaluate the potential environmental impact of the design of a chemical process. In this methodology, chemicals are assigned val...

  16. Book Processing Facility Design.

    ERIC Educational Resources Information Center

    Sheahan (Drake)-Stewart Dougall, Marketing and Physical Distribution Consultants, New York, NY.

    The Association of New York Libraries for Technical Services (ANYLTS) is established to develop and run a centralized book processing facility for the public library systems in New York State. ANYLTS plans to receive book orders from the 22 library systems, transmit orders to publishers, receive the volumes from the publishers, print and attach…

  17. Teaching Process Design through Integrated Process Synthesis

    ERIC Educational Resources Information Center

    Metzger, Matthew J.; Glasser, Benjamin J.; Patel, Bilal; Hildebrandt, Diane; Glasser, David

    2012-01-01

    The design course is an integral part of chemical engineering education. A novel approach to the design course was recently introduced at the University of the Witwatersrand, Johannesburg, South Africa. The course aimed to introduce students to systematic tools and techniques for setting and evaluating performance targets for processes, as well as…

  18. Reengineering the Project Design Process

    NASA Technical Reports Server (NTRS)

    Casani, E.; Metzger, R.

    1994-01-01

    In response to NASA's goal of working faster, better and cheaper, JPL has developed extensive plans to minimize cost, maximize customer and employee satisfaction, and implement small- and moderate-size missions. These plans include improved management structures and processes, enhanced technical design processes, the incorporation of new technology, and the development of more economical space- and ground-system designs. The Laboratory's new Flight Projects Implementation Office has been chartered to oversee these innovations and the reengineering of JPL's project design process, including establishment of the Project Design Center and the Flight System Testbed. Reengineering at JPL implies a cultural change whereby the character of its design process will change from sequential to concurrent and from hierarchical to parallel. The Project Design Center will support missions offering high science return, design to cost, demonstrations of new technology, and rapid development. Its computer-supported environment will foster high-fidelity project life-cycle development and cost estimating.

  19. Optimization process in helicopter design

    NASA Technical Reports Server (NTRS)

    Logan, A. H.; Banerjee, D.

    1984-01-01

    In optimizing a helicopter configuration, Hughes Helicopters uses a program called Computer Aided Sizing of Helicopters (CASH), written and updated over the past ten years, and used as an important part of the preliminary design process of the AH-64. First, measures of effectiveness must be supplied to define the mission characteristics of the helicopter to be designed. Then CASH allows the designer to rapidly and automatically develop the basic size of the helicopter (or other rotorcraft) for the given mission. This enables the designer and management to assess the various tradeoffs and to quickly determine the optimum configuration.

  20. The Process Design Courses at Pennsylvania: Impact of Process Simulators.

    ERIC Educational Resources Information Center

    Seider, Warren D.

    1984-01-01

    Describes the use and impact of process design simulators in process design courses. Discusses topics covered, texts used, computer design simulations, and how they are integrated into the process survey course as well as in plant design projects. (JM)

  1. Hydroforming design and process advisor

    SciTech Connect

    Greer, J.T.; Ni, C.M.

    1996-10-10

    The hydroforming process involves hydraulically forming components by conforming them to the inner contours of a die. These contours can be complex and can often cause the material being formed to be stressed to rupture. Considerable process knowledge and materials modeling expertise are required to design hydroform dies and hydroformed parts that are readily formed without being overstressed. For this CRADA, materials properties for steel tubes subjected to hydraulic stresses were collected; algorithms were developed that combined the materials properties data with process knowledge; and a user-friendly graphical interface was used to make the system usable by a design engineer. A prototype hydroforming advisor was completed and delivered to GM. The technical objectives of the CRADA were met, allowing for the development of an intelligent design system, prediction of forming properties related to hydroforming, simulation and modeling of process execution, and design optimization. The design advisor provides a rapid and seamless approach to what is otherwise an enormous and onerous task of analysis and evaluation.
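
    A design advisor of this kind combines material limits with simple process rules. The sketch below is a purely illustrative rule check (the material data, the 80% warning threshold, and the strain measure are assumptions, not the CRADA advisor's actual logic).

```python
# Minimal sketch of a rule-based hydroforming feasibility check (illustrative
# only; material limits and the strain measure are hypothetical placeholders).
import math
from dataclasses import dataclass

@dataclass
class Material:
    name: str
    max_hoop_strain: float   # assumed forming limit for the tube material

def hoop_strain(tube_radius_mm: float, die_radius_mm: float) -> float:
    """True (logarithmic) circumferential strain when the tube expands into the die."""
    return math.log(die_radius_mm / tube_radius_mm)

def advise(material: Material, tube_radius_mm: float, die_radius_mm: float) -> str:
    strain = hoop_strain(tube_radius_mm, die_radius_mm)
    if strain <= 0.8 * material.max_hoop_strain:
        return f"{material.name}: OK (strain {strain:.3f})"
    if strain <= material.max_hoop_strain:
        return f"{material.name}: marginal (strain {strain:.3f}); review die contour"
    return f"{material.name}: likely rupture (strain {strain:.3f}); redesign required"

steel = Material("example steel tube", max_hoop_strain=0.25)   # hypothetical limit
print(advise(steel, tube_radius_mm=25.0, die_radius_mm=30.0))
```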

  2. DESIGNING ENVIRONMENTALLY FRIENDLY CHEMICAL PROCESSES

    EPA Science Inventory

    The design of a chemical process involves many aspects: from profitability, flexibility and reliability to safety to the environment. While each of these is important, in this work, the focus will be on profitability and the environment. Key to the study of these aspects is the ...

  3. ESS Cryogenic System Process Design

    NASA Astrophysics Data System (ADS)

    Arnold, P.; Hees, W.; Jurns, J.; Su, X. T.; Wang, X. L.; Weisend, J. G., II

    2015-12-01

    The European Spallation Source (ESS) is a neutron-scattering facility funded and supported in collaboration with 17 European countries in Lund, Sweden. Cryogenic cooling at ESS is vital, particularly for the linear accelerator, the hydrogen target moderators, a test stand for cryomodules, and the neutron instruments and their sample environments. The paper will focus on specific process design criteria, design decisions and their motivations for the helium cryoplants and auxiliary equipment. Key issues for all plants and their process concepts are energy efficiency, reliability, smooth turn-down behaviour and flexibility. The accelerator cryoplant (ACCP) and the target moderator cryoplant (TMCP) in particular need to be prepared for a range of refrigeration capacities due to the intrinsic uncertainties in heat load definitions. Furthermore, the paper addresses questions regarding process arrangement, 2 K cooling methodology, LN2 precooling, helium storage, helium purification and heat recovery.

  4. The maintenance of sex in bacteria is ensured by its potential to reload genes.

    PubMed

    Szöllosi, Gergely J; Derényi, Imre; Vellai, Tibor

    2006-12-01

    Why sex is maintained in nature is a fundamental question in biology. Natural genetic transformation (NGT) is a sexual process by which bacteria actively take up exogenous DNA and use it to replace homologous chromosomal sequences. As has been demonstrated, the role of NGT in repairing deleterious mutations under constant selection is insufficient for its survival, and the lack of other viable explanations has left no alternative except that DNA uptake provides nucleotides for food. Here we develop a novel simulation approach for the long-term dynamics of genome organization (involving the loss and acquisition of genes) in a bacterial species consisting of a large number of spatially distinct populations subject to independently fluctuating ecological conditions. Our results show that in the presence of weak interpopulation migration NGT is able to subsist as a mechanism to reload locally lost, intermittently selected genes from the collective gene pool of the species through DNA uptake from migrants. Reloading genes and combining them with those in locally adapted genomes allow individual cells to readapt faster to environmental changes. The machinery of transformation survives under a wide range of model parameters readily encompassing real-world biological conditions. These findings imply that the primary role of NGT is not to serve the cell with food, but to provide homologous sequences for restoring genes that have disappeared from or become degraded in the local population. PMID:17028325

  5. Automation of Design Engineering Processes

    NASA Technical Reports Server (NTRS)

    Torrey, Glenn; Sawasky, Gerald; Courey, Karim

    2004-01-01

    A method, and a computer program that helps to implement the method, have been developed to automate and systematize the retention and retrieval of all the written records generated during the process of designing a complex engineering system. It cannot be emphasized strongly enough that all the written records as used here is meant to be taken literally: it signifies not only final drawings and final engineering calculations but also such ancillary documents as minutes of meetings, memoranda, requests for design changes, approval and review documents, and reports of tests. One important purpose served by the method is to make the records readily available to all involved users via their computer workstations from one computer archive while eliminating the need for voluminous paper files stored in different places. Another important purpose served by the method is to facilitate the work of engineers who are charged with sustaining the system and were not involved in the original design decisions. The method helps the sustaining engineers to retrieve information that enables them to retrace the reasoning that led to the original design decisions, thereby helping them to understand the system better and to make informed engineering choices pertaining to maintenance and/or modifications of the system. The software used to implement the method is written in Microsoft Access. All of the documents pertaining to the design of a given system are stored in one relational database in such a manner that they can be related to each other via a single tracking number.

  6. Regulation of proteolysis during reloading of the unweighted soleus muscle.

    PubMed

    Taillandier, Daniel; Aurousseau, Eveline; Combaret, Lydie; Guezennec, Charles-Yannick; Attaix, Didier

    2003-05-01

    There is little information on the mechanisms responsible for muscle recovery following a catabolic condition. To address this point, we reloaded unweighted animals and investigated protein turnover during recovery from this highly catabolic state and the role of proteolysis in the reorganization of the soleus muscle. During early recovery (18 h of reloading) both muscle protein synthesis and breakdown were elevated (+65%, P<0.001 and +22%, P<0.05, respectively). However, only the activation of non-lysosomal and Ca(2+)-independent proteolysis was responsible for increased protein breakdown. Accordingly, mRNA levels for ubiquitin and 20S proteasome subunits C8 and C9 were markedly elevated (from +89 to +325%, P<0.03) and actively transcribed as shown by the analysis of polyribosomal profiles. In contrast, both cathepsin D and 14-kDa-ubiquitin conjugating enzyme E2 mRNA levels decreased, suggesting that the expression of such genes is an early marker of reversed muscle wasting. Following 7 days of reloading, protein synthesis was still elevated and there was no detectable change in protein breakdown rates. Accordingly, mRNA levels for all the proteolytic components tested were back to control values even though an accumulation of high molecular weight ubiquitin conjugates was still detectable. This suggests that soleus muscle remodeling was still going on. Taken together, our observations suggest that enhanced protein synthesis and breakdown are both necessary to recover from muscle atrophy and result in catch-up growth. The observed non-coordinate regulation of proteolytic systems is presumably required to target specific classes of substrates (atrophy-specific protein isoforms, damaged proteins) for replacement and/or elimination. PMID:12672458

  7. Myocardial Reloading after Extracorporeal Membrane Oxygenation Alters Substrate Metabolism While Promoting Protein Synthesis

    SciTech Connect

    Kajimoto, Masaki; Priddy, Colleen M.; Ledee, Dolena; Xu, Chun; Isern, Nancy G.; Olson, Aaron; Des Rosiers, Christine; Portman, Michael A.

    2013-08-19

    Extracorporeal membrane oxygenation (ECMO) unloads the heart, providing a bridge to recovery in children after myocardial stunning. Mortality after ECMO remains high. Cardiac substrate and amino acid requirements upon weaning are unknown and may impact recovery. We assessed the hypothesis that ventricular reloading modulates both substrate entry into the citric acid cycle (CAC) and myocardial protein synthesis. Fourteen immature piglets (7.8-15.6 kg) were separated into 2 groups based on ventricular loading status: 8-hour ECMO (UNLOAD) and post-wean from ECMO (RELOAD). We infused [2-13C]-pyruvate as an oxidative substrate and [13C6]-L-leucine as a tracer of amino acid oxidation and protein synthesis into the coronary artery. RELOAD showed marked elevations in myocardial oxygen consumption above baseline and UNLOAD. Pyruvate uptake was markedly increased, though RELOAD decreased the pyruvate contribution to oxidative CAC metabolism. RELOAD also increased absolute concentrations of all CAC intermediates, while maintaining or increasing 13C-molar percent enrichment. RELOAD also significantly increased cardiac fractional protein synthesis rates by >70% over UNLOAD. Conclusions: RELOAD produced high energy metabolic requirements and rebound protein synthesis. Relative pyruvate decarboxylation decreased with RELOAD while promoting anaplerotic pyruvate carboxylation and amino acid incorporation into protein rather than into the CAC for oxidation. These perturbations may serve as therapeutic targets to improve contractile function after ECMO.

  8. Reloading Continuous GPS in Northwest Mexico

    NASA Astrophysics Data System (ADS)

    Gonzalez-Garcia, J. J.; Suarez-Vidal, F.; Gonzalez-Ortega, J. A.

    2007-05-01

    For more than 10 years we have tried to follow in the steps of the Southern California Integrated GPS Network (SCIGN) and the Plate Boundary Observatory (PBO) in the USA; this has put us in a position to contribute to the development of a modern GPS network in Mexico. Between 1998 and 2001, three stations were deployed in Northwest Mexico in concert with the development of SCIGN: SPMX in north-central Baja California state at the National Astronomical Observatory (UNAM) in the Sierra San Pedro Martir; CORX on Isla Coronados Sur, offshore San Diego, Ca./Tijuana, Mexico; and GUAX on Guadalupe Island, 150 miles offshore the Baja California peninsula, which provides a unique site on the Pacific plate in the North America/Pacific boundary zone in Las Californias. The former IGS station at CICESE, Ensenada (CICE), installed in 1995, was replaced by CIC1 in 1999. In 2004 and 2005, with partial support from SCIGN and UNAVCO to the University of Arizona, a volunteer team from UNAVCO, Caltech, the U.S. Geological Survey, Universidad de la Sierra at Moctezuma, Sonora, and CICESE built two new shallow-braced GPS sites in northwest Mexico. The first site, USMX, is located in east-central Sonora, and the second, YESX, is located high in the Sierra Madre Occidental at Yecora, near the southern border of Sonora and Chihuahua. All data are openly available at SOPAC and/or UNAVCO. The existing information has been valuable for resolving the "total" plate motion between the Pacific plate (GUAX) and the North America plate (USMX and YESX) in the north-central Gulf of California. Since last year we have had the capability of GPS data processing using GAMIT/GLOBK, and after gaining some practice with survey-mode data processing we can become a GPS processing center in Mexico. Currently only 2 sites are operational: CIC1 and USMX. With new energy we are ready to contribute to the establishment of a modern GPS network in Mexico for science, hazard monitoring and infrastructure.

  9. Digital Earth reloaded - Beyond the next generation

    NASA Astrophysics Data System (ADS)

    Ehlers, M.; Woodgate, P.; Annoni, A.; Schade, S.

    2014-02-01

    Digital replicas (or 'mirror worlds') of complex entities and systems are now routine in many fields such as aerospace engineering, archaeology, medicine, and even fashion design. The Digital Earth (DE) concept, a digital replica of the entire planet, occurs in Al Gore's 1992 book Earth in the Balance and was popularized in his speech at the California Science Center in January 1998. It played a pivotal role in stimulating the development of a first generation of virtual globes, typified by Google Earth, that achieved many elements of this vision. Almost 15 years after Al Gore's speech, the concept of DE needs to be re-evaluated in the light of the many scientific and technical developments in the fields of information technology, data infrastructures, citizens' participation, and earth observation that have taken place since. This paper intends to look beyond the next generation, drawing predominantly on developments in fields outside the spatial sciences, where concepts, software, and hardware with strong relationships to DE are being developed without reference to this term. It also presents a number of guiding criteria for future DE developments.

  10. The Snark was a Boojum - reloaded.

    PubMed

    Macrì, Simone; Richter, S Helene

    2015-01-01

    In this article, we refer to an original opinion paper written by Prof. Frank Beach in 1950 ("The Snark was a Boojum"). In his manuscript, Beach explicitly criticised the field of comparative psychology because of the disparity between the original understanding of comparativeness and its practical overly specialised implementation. Specialisation encompassed both experimental species (rats accounted for 70% of all subjects) and test paradigms (dominated by conditioning/learning experiments). Herein, we attempt to evaluate the extent to which these considerations apply to current behavioural neuroscience. Such evaluation is particularly interesting in the context of "translational research" that has recently gained growing attention. As a community, we believe that preclinical findings are intended to inform clinical practice at the level of therapies and knowledge advancements. Yet, limited reproducibility of experimental results and failures to translate preclinical research into clinical trials indicate that these expectations are not entirely fulfilled. Theoretical considerations suggest that, before concluding that a given phenomenon is of relevance to our species, it should be observed in more than a single experimental model (be it an animal strain or species) and tested in more than a single standardized test battery. Yet, current approaches appear limited in terms of variability and overspecialised in terms of operative procedures. Specifically, as in 1950, rodents (mice instead of rats) still constitute the vast majority of animal species investigated. Additionally, the scientific community strives to homogenise experimental test strategies, thereby not only limiting the generalizability of the findings, but also working against the design of innovative approaches. Finally, we discuss the importance of evolutionary-adaptive considerations within the field of laboratory research. Specifically, resting upon empirical evidence indicating that developing

  11. The Snark was a Boojum - reloaded

    PubMed Central

    2015-01-01

    In this article, we refer to an original opinion paper written by Prof. Frank Beach in 1950 (“The Snark was a Boojum”). In his manuscript, Beach explicitly criticised the field of comparative psychology because of the disparity between the original understanding of comparativeness and its practical overly specialised implementation. Specialisation encompassed both experimental species (rats accounted for 70% of all subjects) and test paradigms (dominated by conditioning/learning experiments). Herein, we attempt to evaluate the extent to which these considerations apply to current behavioural neuroscience. Such evaluation is particularly interesting in the context of “translational research” that has recently gained growing attention. As a community, we believe that preclinical findings are intended to inform clinical practice at the level of therapies and knowledge advancements. Yet, limited reproducibility of experimental results and failures to translate preclinical research into clinical trials indicate that these expectations are not entirely fulfilled. Theoretical considerations suggest that, before concluding that a given phenomenon is of relevance to our species, it should be observed in more than a single experimental model (be it an animal strain or species) and tested in more than a single standardized test battery. Yet, current approaches appear limited in terms of variability and overspecialised in terms of operative procedures. Specifically, as in 1950, rodents (mice instead of rats) still constitute the vast majority of animal species investigated. Additionally, the scientific community strives to homogenise experimental test strategies, thereby not only limiting the generalizability of the findings, but also working against the design of innovative approaches. Finally, we discuss the importance of evolutionary-adaptive considerations within the field of laboratory research. Specifically, resting upon empirical evidence indicating that

  12. GAX absorption cycle design process

    SciTech Connect

    Priedeman, D.K.; Christensen, R.N.

    1999-07-01

    This paper presents an absorption system design process that relies on computer simulations that are validated by experimental findings. An ammonia-water absorption heat pump cycle at 3 refrigeration tons (RT) and chillers at 3.3 RT and 5 RT (10.5 kW, 11.6 kW, and 17.6 kW) were initially modeled and then built and tested. The experimental results were used to calibrate both the cycle simulation and the component simulations, yielding computer design routines that could accurately predict component and cycle performance. Each system was a generator-absorber heat exchange (GAX) cycle, and all were sized for residential and light commercial use, where very little absorption equipment is currently used. The specific findings of the 5 RT (17.6 kW) chiller are presented. Modeling incorporated a heat loss from the gas-fired generator and pressure drops in both the evaporator and absorber. Simulation results and experimental findings agreed closely and validated the modeling method and simulation software.

  13. Design Thinking in Elementary Students' Collaborative Lamp Designing Process

    ERIC Educational Resources Information Center

    Kangas, Kaiju; Seitamaa-Hakkarainen, Pirita; Hakkarainen, Kai

    2013-01-01

    Design and Technology education is potentially a rich environment for successful learning, if the management of the whole design process is emphasised, and students' design thinking is promoted. The aim of the present study was to unfold the collaborative design process of one team of elementary students, in order to understand their multimodal…

  14. ESS Accelerator Cryoplant Process Design

    NASA Astrophysics Data System (ADS)

    Wang, X. L.; Arnold, P.; Hees, W.; Hildenbeutel, J.; Weisend, J. G., II

    2015-12-01

    The European Spallation Source (ESS) is a neutron-scattering facility being built with extensive international collaboration in Lund, Sweden. The ESS accelerator will deliver protons with 5 MW of power to the target at 2.0 GeV, with a nominal current of 62.5 mA. The superconducting part of the accelerator is about 300 meters long and contains 43 cryomodules. The ESS accelerator cryoplant (ACCP) will provide the cooling for the cryomodules and the cryogenic distribution system that delivers the helium to the cryomodules. The ACCP will cover three cryogenic circuits: Bath cooling for the cavities at 2 K, the thermal shields at around 40 K and the power couplers thermalisation with 4.5 K forced helium cooling. The open competitive bid for the ACCP took place in 2014 with Linde Kryotechnik AG being selected as the vendor. This paper summarizes the progress in the ACCP development and engineering. Current status including final cooling requirements, preliminary process design, system configuration, machine concept and layout, main parameters and features, solution for the acceptance tests, exergy analysis and efficiency is presented.

  15. Optimal design of solidification processes

    NASA Technical Reports Server (NTRS)

    Dantzig, Jonathan A.; Tortorelli, Daniel A.

    1991-01-01

    An optimal design algorithm is presented for the analysis of general solidification processes, and is demonstrated for the growth of GaAs crystals in a Bridgman furnace. The system is optimal in the sense that the prespecified temperature distribution in the solidifying materials is obtained to maximize product quality. The optimization uses traditional numerical programming techniques which require the evaluation of cost and constraint functions and their sensitivities. The finite element method is incorporated to analyze the crystal solidification problem, evaluate the cost and constraint functions, and compute the sensitivities. These techniques are demonstrated in the crystal growth application by determining an optimal furnace wall temperature distribution to obtain the desired temperature profile in the crystal, and hence to maximize the crystal's quality. Several numerical optimization algorithms are studied to determine the proper convergence criteria, effective 1-D search strategies, appropriate forms of the cost and constraint functions, etc. In particular, we incorporate the conjugate gradient and quasi-Newton methods for unconstrained problems. The efficiency and effectiveness of each algorithm is presented in the example problem.
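
    The optimization pattern described in this abstract (a cost function, analytical sensitivities, and a gradient-based optimizer) can be illustrated with a toy problem. The forward model below is a crude smoothing matrix standing in for the finite element thermal analysis; the target profile, dimensions, and starting guess are arbitrary.

```python
# Toy illustration of the optimization pattern described above (not the authors'
# finite-element model): fit a furnace-wall temperature profile so a simple
# smoothing "forward model" reproduces a desired crystal temperature profile.
import numpy as np
from scipy.optimize import minimize

n = 20
x = np.linspace(0.0, 1.0, n)
T_target = 1500.0 - 400.0 * x              # desired axial temperature profile (K)

# Crude stand-in for the thermal model: each interior temperature is a local
# average of neighbouring wall temperatures.
A = np.zeros((n, n))
for i in range(n):
    lo, hi = max(0, i - 2), min(n, i + 3)
    A[i, lo:hi] = 1.0 / (hi - lo)

def cost(T_wall):
    residual = A @ T_wall - T_target
    return float(residual @ residual)

def grad(T_wall):
    return 2.0 * A.T @ (A @ T_wall - T_target)   # analytical sensitivity

result = minimize(cost, x0=np.full(n, 1300.0), jac=grad, method="BFGS")
print("converged:", result.success, " final cost:", round(result.fun, 3))
```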

  16. Development of Innovative Design Processor

    SciTech Connect

    Park, Y.S.; Park, C.O.

    2004-07-01

    The nuclear design analysis requires time-consuming and error-prone model-input preparation, code runs, output analysis and quality assurance processes. To reduce human effort and improve design quality and productivity, the Innovative Design Processor (IDP) is being developed. The two basic principles of IDP are document-oriented design and web-based design. In document-oriented design, the designer writes a design document called an active document and feeds it to a special program, and the final document with the complete analysis, tables and plots is produced automatically. Active documents can be written with ordinary HTML editors or created automatically on the web, which is the other framework of IDP. Using a proper mix of server-side and client-side programming under the LAMP (Linux/Apache/MySQL/PHP) environment, the design process on the web is modeled in a design-wizard style so that even a novice designer can easily produce the design document. This automation using the IDP is now being implemented for all reload designs of Korea Standard Nuclear Power Plant (KSNP)-type PWRs. The introduction of this process will allow a large reduction in all KSNP reload design efforts and provide a platform for design and R&D tasks of KNFC. (authors)
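
    The document-oriented idea can be pictured as a template processor that replaces embedded directives with computed results. The directive syntax and the calculations in the sketch below are invented for illustration and are not the IDP's actual format.

```python
# Conceptual sketch of the "active document" idea (illustrative only; the
# {{name:arg}} directive syntax and calculations are invented, not the IDP format).
import re

def run_directive(name, arg):
    """Pretend design calculations keyed by directive name."""
    if name == "enrichment":
        return f"{float(arg):.2f} w/o U-235"
    if name == "table":
        rows = arg.split(";")
        return "\n".join(f"| {r} |" for r in rows)
    return f"[unknown directive: {name}]"

def render_active_document(text):
    """Replace {{name:arg}} markers with computed results."""
    return re.sub(r"\{\{(\w+):([^}]*)\}\}",
                  lambda m: run_directive(m.group(1), m.group(2)),
                  text)

draft = "Reload enrichment: {{enrichment:4.5}}\n{{table:cycle 18;burnup 45000}}"
print(render_active_document(draft))
```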

  17. Improvement of characteristic statistic algorithm and its application on equilibrium cycle reloading optimization

    SciTech Connect

    Hu, Y.; Liu, Z.; Shi, X.; Wang, B.

    2006-07-01

    A brief introduction to the characteristic statistic algorithm (CSA), a new global optimization algorithm for solving the PWR in-core fuel management optimization problem, is given in this paper. CSA is modified by adopting a back-propagation neural network and fast local adjustment. The modified CSA is then applied to PWR equilibrium cycle reloading optimization, and the corresponding optimization code, CSA-DYW, is developed. CSA-DYW is used to optimize the 18-month equilibrium reload cycle of the Daya Bay nuclear plant Unit 1 reactor. The results show that CSA-DYW has high efficiency and good global performance for PWR equilibrium cycle reloading optimization. (authors)

  18. Managing Analysis Models in the Design Process

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2006-01-01

    Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.

  19. 76 FR 70368 - Disaster Designation Process

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-14

    ... USDA Secretarial disaster designation process. FSA proposes to simplify the processes and delegate them... rule would update the language to reflect current practice. The current regulations require that a... proposes to simplify the USDA Secretarial designation process from a six-step process to a two-step...

  20. Photonic IC design software and process design kits

    NASA Astrophysics Data System (ADS)

    Korthorst, Twan; Stoffer, Remco; Bakker, Arjen

    2015-04-01

    This review discusses photonic IC design software tools, examines existing design flows for photonics design and how these fit different design styles and describes the activities in collaboration and standardization within the silicon photonics group from Si2 and by members of the PDAFlow Foundation to improve design flows. Moreover, it will address the lowering of access barriers to the technology by providing qualified process design kits (PDKs) and improved integration of photonic integrated circuit simulations, physical simulations, mask layout, and verification.

  1. Instructional Design Processes and Traditional Colleges

    ERIC Educational Resources Information Center

    Vasser, Nichole

    2010-01-01

    Traditional colleges who have implemented distance education programs would benefit from using instructional design processes to develop their courses. Instructional design processes provide the framework for designing and delivering quality online learning programs in a highly-competitive educational market. Traditional college leaders play a…

  2. Graphic Design in Libraries: A Conceptual Process

    ERIC Educational Resources Information Center

    Ruiz, Miguel

    2014-01-01

    Providing successful library services requires efficient and effective communication with users; therefore, it is important that content creators who develop visual materials understand key components of design and, specifically, develop a holistic graphic design process. Graphic design, as a form of visual communication, is the process of…

  3. 77 FR 41248 - Disaster Designation Process

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-13

    ... designation regulations to provide for changes in the designation process (76 FR 70368-70374). In general, that rule proposed to simplify the disaster designation process and to delegate the authority for... 759.6 has also been changed from the proposed rule to remove proposed language referring to a...

  4. Reengineering the JPL Spacecraft Design Process

    NASA Technical Reports Server (NTRS)

    Briggs, C.

    1995-01-01

    This presentation describes the factors that have emerged in the evolved process of reengineering the unmanned spacecraft design process at the Jet Propulsion Laboratory in Pasadena, California. Topics discussed include: New facilities, new design factors, new system-level tools, complex performance objectives, changing behaviors, design integration, leadership styles, and optimization.

  5. NASA System Engineering Design Process

    NASA Technical Reports Server (NTRS)

    Roman, Jose

    2011-01-01

    This slide presentation reviews NASA's use of systems engineering for the complete life cycle of a project. Systems engineering is a methodical, disciplined approach for the design, realization, technical management, operations, and retirement of a system. Each phase of a NASA project is terminated with a key decision point (KDP), which is supported by major reviews.

  6. Information-Processing Models and Curriculum Design

    ERIC Educational Resources Information Center

    Calfee, Robert C.

    1970-01-01

    "This paper consists of three sections--(a) the relation of theoretical analyses of learning to curriculum design, (b) the role of information-processing models in analyses of learning processes, and (c) selected examples of the application of information-processing models to curriculum design problems." (Author)

  7. NANEX: Process design and optimization.

    PubMed

    Baumgartner, Ramona; Matić, Josip; Schrank, Simone; Laske, Stephan; Khinast, Johannes; Roblegg, Eva

    2016-06-15

    Previously, we introduced a one-step nano-extrusion (NANEX) process for transferring aqueous nano-suspensions into solid formulations directly in the liquid phase. Nano-suspensions were fed into molten polymers via a side-feeding device and excess water was eliminated via devolatilization. However, the drug content in nano-suspensions is restricted to 30% (w/w), and obtaining sufficiently high drug loadings in the final formulation requires the processing of large amounts of water and thus a fundamental process understanding. To this end, we investigated four polymers with different physicochemical characteristics (Kollidon(®) VA64, Eudragit(®) E PO, HPMCAS and PEG 20000) in terms of their maximum water uptake/removal capacity. Process parameters such as throughput and screw speed were adapted and their effect on the mean residence time and filling degree was studied. Additionally, one-dimensional discretization modeling was performed to examine the complex interactions between the screw geometry and the process parameters during water addition/removal. It was established that polymers with a certain water miscibility/solubility can be manufactured via NANEX. Long residence times of the molten polymer in the extruder and low filling degrees in the degassing zone favored the addition/removal of significant amounts of water. The residual moisture content in the final extrudates was comparable to that of extrudates manufactured without water. PMID:27090153
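
    To get a feel for the water-handling burden mentioned above, a back-of-the-envelope balance is shown below. It assumes the 30% (w/w) figure is the drug mass fraction of the nano-suspension, the remainder being essentially water that must be devolatilized; the batch size and loadings are arbitrary illustrations.

```python
# Rough water-balance estimate for a NANEX-style run (illustrative assumptions:
# the nano-suspension is 30% drug / 70% water by weight, the rest of the
# formulation is polymer, and essentially all water must be removed).
def water_to_remove(batch_mass_g, target_drug_fraction, suspension_drug_fraction=0.30):
    drug_mass = batch_mass_g * target_drug_fraction
    suspension_mass = drug_mass / suspension_drug_fraction
    return suspension_mass - drug_mass          # grams of water fed along with the drug

for loading in (0.10, 0.20, 0.30):
    print(f"{loading:.0%} drug loading in a 1 kg extrudate -> "
          f"{water_to_remove(1000.0, loading):.0f} g of water to remove")
```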

  8. The Architectural and Interior Design Planning Process.

    ERIC Educational Resources Information Center

    Cohen, Elaine

    1994-01-01

    Explains the planning process in designing effective library facilities and discusses library building requirements that result from electronic information technologies. Highlights include historical structures; Americans with Disabilities Act; resource allocation; electrical power; interior spaces; lighting; design development; the roles of…

  9. Reinventing The Design Process: Teams and Models

    NASA Technical Reports Server (NTRS)

    Wall, Stephen D.

    1999-01-01

    The future of space mission designing will be dramatically different from the past. Formerly, performance-driven paradigms emphasized data return, with cost and schedule being secondary issues. Now and in the future, costs are capped and schedules fixed; these two variables must be treated as independent in the design process. Accordingly, JPL has redesigned its design process. At the conceptual level, design times have been reduced by properly defining the required design depth, improving the linkages between tools, and managing team dynamics. In implementation-phase design, system requirements will be held in crosscutting models, linked to subsystem design tools through a central database that captures the design and supplies needed configuration management and control. Mission goals will then be captured in timelining software that drives the models, testing their capability to execute the goals. Metrics are used to measure and control both processes and to ensure that design parameters converge through the design process within schedule constraints. This methodology manages margins controlled by acceptable risk levels. Thus, teams can evolve risk tolerance (and cost) as they would any engineering parameter. This new approach allows more design freedom for a longer time, which tends to encourage revolutionary and unexpected improvements in design.

  10. Process Design Manual for Nitrogen Control.

    ERIC Educational Resources Information Center

    Parker, Denny S.; And Others

    This manual presents theoretical and process design criteria for the implementation of nitrogen control technology in municipal wastewater treatment facilities. Design concepts are emphasized through examination of data from full-scale and pilot installations. Design data are included on biological nitrification and denitrification, breakpoint…

  11. Hydrocarbon Processing's process design and optimization '96

    SciTech Connect

    1996-06-01

    This paper compiles information on hydrocarbon processes, describing the application, objective, economics, commercial installations, and licensor. Processes include: alkylation, ammonia, catalytic reformer, crude fractionator, crude unit, vacuum unit, dehydration, delayed coker, distillation, ethylene furnace, FCCU, polymerization, gas sweetening, hydrocracking, hydrogen, hydrotreating (naphtha, distillate, and resid desulfurization), natural gas processing, olefins, polyethylene terephthalate, refinery, styrene, sulfur recovery, and VCM furnace.

  12. Reloading partly recovers bone mineral density and mechanical properties in hind limb unloaded rats

    NASA Astrophysics Data System (ADS)

    Zhao, Fan; Li, Dijie; Arfat, Yasir; Chen, Zhihao; Liu, Zonglin; Lin, Yu; Ding, Chong; Sun, Yulong; Hu, Lifang; Shang, Peng; Qian, Airong

    2014-12-01

    Skeletal unloading results in decreased bone formation and bone mass. During long-term space flight, the lost bone mass cannot be fully recovered. Therefore, it is necessary to develop effective countermeasures to prevent spaceflight-induced bone loss. Hindlimb unloading (HLU) simulates the effects of weightlessness and is utilized extensively to examine the response of musculoskeletal systems to certain aspects of space flight. The purpose of this study was to investigate the effects of 4 weeks of HLU in rats and subsequent reloading on the bone mineral density (BMD) and mechanical properties of load-bearing bones. After HLU for 4 weeks, the rats were subjected to reloading for 1 week, 2 weeks and 3 weeks, and the BMD of the femur, tibia and lumbar spine was assessed by dual-energy X-ray absorptiometry (DXA) every week. The mechanical properties of the femur were determined by a three-point bending test. Dry bone and bone ash of the femur were obtained by an oven-drying method and weighed. Serum alkaline phosphatase (ALP) and serum calcium were examined by ELISA and atomic absorption spectrometry. The results showed that 4 weeks of HLU significantly decreased the body weight of the rats, and reloading for 1 week, 2 weeks or 3 weeks did not recover the weight loss induced by HLU. However, after 2 weeks of reloading, the BMD of the femur and tibia of HLU rats partly recovered (+10.4%, +2.3%). After 3 weeks of reloading, the reductions in BMD, energy absorption, bone mass and mechanical properties of bone induced by HLU recovered to some extent. The changes in serum ALP and serum calcium induced by HLU were also recovered after reloading. Our results indicate that a short period of reloading cannot completely recover bone after a period of unloading; thus interventions such as mechanical vibration or pharmaceuticals are necessary to help bone recovery.

  13. Muscle regeneration during hindlimb unloading results in a reduction in muscle size after reloading

    NASA Technical Reports Server (NTRS)

    Mozdziak, P. E.; Pulvermacher, P. M.; Schultz, E.

    2001-01-01

    The hindlimb-unloading model was used to study the ability of muscle injured in a weightless environment to recover after reloading. Satellite cell mitotic activity and DNA unit size were determined in injured and intact soleus muscles from hindlimb-unloaded and age-matched weight-bearing rats at the conclusion of 28 days of hindlimb unloading, 2 wk after reloading, and 9 wk after reloading. The body weights of hindlimb-unloaded rats were significantly (P < 0.05) less than those of weight-bearing rats at the conclusion of hindlimb unloading, but they were the same (P > 0.05) as those of weight-bearing rats 2 and 9 wk after reloading. The soleus muscle weight, soleus muscle weight-to-body weight ratio, myofiber diameter, number of nuclei per millimeter, and DNA unit size were significantly (P < 0.05) smaller for the injured soleus muscles from hindlimb-unloaded rats than for the soleus muscles from weight-bearing rats at each recovery time. Satellite cell mitotic activity was significantly (P < 0.05) higher in the injured soleus muscles from hindlimb-unloaded rats than from weight-bearing rats 2 wk after reloading, but it was the same (P > 0.05) as in the injured soleus muscles from weight-bearing rats 9 wk after reloading. The injured soleus muscles from hindlimb-unloaded rats failed to achieve weight-bearing muscle size 9 wk after reloading, because incomplete compensation for the decrease in myonuclear accretion and DNA unit size expansion occurred during the unloading period.

  14. WORKSHOP ON ENVIRONMENTALLY CONSCIOUS CHEMICAL PROCESS DESIGN

    EPA Science Inventory

    To encourage the consideration of environmental issues during chemical process design, the USEPA has developed techniques and software tools to evaluate the relative environmental impact of a chemical process. These techniques and tools aid in the risk management process by focus...

  15. Engineering design: A cognitive process approach

    NASA Astrophysics Data System (ADS)

    Strimel, Greg Joseph

    The intent of this dissertation was to identify the cognitive processes used by advanced pre-engineering students to solve complex engineering design problems. Students in technology and engineering education classrooms are often taught to use an ideal engineering design process that has been generated mostly by educators and curriculum developers. However, the review of literature showed that it is unclear as to how advanced pre-engineering students cognitively navigate solving a complex and multifaceted problem from beginning to end. Additionally, it was unclear how a student thinks and acts throughout their design process and how this affects the viability of their solution. Therefore, Research Objective 1 was to identify the fundamental cognitive processes students use to design, construct, and evaluate operational solutions to engineering design problems. Research Objective 2 was to determine identifiers within student cognitive processes for monitoring aptitude to successfully design, construct, and evaluate technological solutions. Lastly, Research Objective 3 was to create a conceptual technological and engineering problem-solving model integrating student cognitive processes for the improved development of problem-solving abilities. The methodology of this study included multiple forms of data collection. The participants were first given a survey to determine their prior experience with engineering and to provide a description of the subjects being studied. The participants were then presented an engineering design challenge to solve individually. While they completed the challenge, the participants verbalized their thoughts using an established "think aloud" method. These verbalizations were captured along with participant observational recordings using point-of-view camera technology. Additionally, the participant design journals, design artifacts, solution effectiveness data, and teacher evaluations were collected for analysis to help achieve the

  16. Chemical Process Design: An Integrated Teaching Approach.

    ERIC Educational Resources Information Center

    Debelak, Kenneth A.; Roth, John A.

    1982-01-01

    Reviews a one-semester senior plant design/laboratory course, focusing on course structure, student projects, laboratory assignments, and course evaluation. Includes discussion of laboratory exercises related to process waste water and sludge. (SK)

  17. Molecular Thermodynamics for Chemical Process Design

    ERIC Educational Resources Information Center

    Prausnitz, J. M.

    1976-01-01

    Discusses that aspect of thermodynamics which is particularly important in chemical process design: the calculation of the equilibrium properties of fluid mixtures, especially as required in phase-separation operations. (MLH)

  18. Numerical simulations supporting the process design of ring rolling processes

    NASA Astrophysics Data System (ADS)

    Jenkouk, V.; Hirt, G.; Seitz, J.

    2013-05-01

    In conventional Finite Element Analysis (FEA) of radial-axial ring rolling (RAR), the motions of all tools are usually defined prior to simulation in the preprocessing step. However, the real process involves up to 8 degrees of freedom (DOF) that are controlled by industrial control systems according to actual sensor values and preselected control strategies. Since the motion histories are unknown before the experiment and depend on sensor data, conventional FEA cannot represent the process in advance of the experiment. To enable the use of FEA in the process design stage, this approach integrates the industrially applied control algorithms of the real process, including all relevant sensors and actuators, into the FE model of ring rolling. Additionally, the design of a novel process, axial profiling, in which a profiled roll is used to roll axially profiled rings, is supported by FEA. Using this approach, suitable control strategies can be tested in a virtual environment before processing.
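    As an aside for readers unfamiliar with this kind of coupling, the toy Python sketch below shows the general shape of a control loop exercised against a process model in a virtual environment. It is not the cited FE model: the proportional controller, the single-state ring-growth surrogate, and all names are illustrative assumptions.

      # Toy closed-loop "virtual process" sketch (illustrative only; an FE solver
      # would replace the one-line process update below).
      def simulate(d_target=2.0, d=1.0, dt=0.1, t_end=60.0, kp=0.4, feed_max=0.02):
          t = 0.0
          while t < t_end:
              error = d_target - d                        # controlled variable: outer diameter [m]
              feed = max(0.0, min(kp * error, feed_max))  # controller output with actuator limit [m/s]
              d += feed * dt                              # process update: diameter grows with radial feed
              t += dt
          return d

      print(f"final outer diameter: {simulate():.3f} m")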

  19. The Engineering Process in Construction & Design

    ERIC Educational Resources Information Center

    Stoner, Melissa A.; Stuby, Kristin T.; Szczepanski, Susan

    2013-01-01

    Recent research suggests that high-impact activities in science and math classes promote positive attitudinal shifts in students. By implementing high-impact activities, such as designing a school and a skate park, mathematical thinking can be linked to the engineering design process. This hands-on approach, when possible, to demonstrate or…

  20. Distributed Group Design Process: Lessons Learned.

    ERIC Educational Resources Information Center

    Eseryel, Deniz; Ganesan, Radha

    A typical Web-based training development team consists of a project manager, an instructional designer, a subject-matter expert, a graphic artist, and a Web programmer. The typical scenario involves team members working together in the same setting during the entire design and development process. What happens when the team is distributed, that is…

  1. HYNOL PROCESS ENGINEERING: PROCESS CONFIGURATION, SITE PLAN, AND EQUIPMENT DESIGN

    EPA Science Inventory

    The report describes the design of the hydropyrolysis reactor system of the Hynol process. (NOTE: A bench scale methanol production facility is being constructed to demonstrate the technical feasibility of producing methanol from biomass using the Hynol process. The plant is bein...

  2. Chemical kinetics and oil shale process design

    SciTech Connect

    Burnham, A.K.

    1993-07-01

    Oil shale processes are reviewed with the goal of showing how chemical kinetics influences the design and operation of different processes for different types of oil shale. Reaction kinetics are presented for organic pyrolysis, carbon combustion, carbonate decomposition, and sulfur and nitrogen reactions.
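    For orientation, the organic pyrolysis step in such models is commonly described by Arrhenius-type first-order kinetics of the general form (the report's specific rate expressions and parameter values are not reproduced here):

      \frac{d\alpha}{dt} = A \exp\!\left(-\frac{E}{RT}\right)(1-\alpha)

    where \alpha is the fraction of kerogen converted, A the frequency factor, E the activation energy, R the gas constant, and T the absolute temperature.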

  3. Planar Inlet Design and Analysis Process (PINDAP)

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Gruber, Christopher R.

    2005-01-01

    The Planar Inlet Design and Analysis Process (PINDAP) is a collection of software tools that allow the efficient aerodynamic design and analysis of planar (two-dimensional and axisymmetric) inlets. The aerodynamic analysis is performed using the Wind-US computational fluid dynamics (CFD) program. A major element in PINDAP is a Fortran 90 code named PINDAP that can establish the parametric design of the inlet and efficiently model the geometry and generate the grid for CFD analysis with design changes to those parameters. The use of PINDAP is demonstrated for subsonic, supersonic, and hypersonic inlets.
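    As a loose illustration of what a parametric planar-inlet definition can look like (the actual PINDAP parameterization is not reproduced here; the function, parameters, and geometry below are hypothetical), a small Python sketch:

      import math

      # Hypothetical two-ramp planar compression surface defined by a handful of
      # parameters; a parametric tool regenerates geometry (and then grids) from
      # parameters like these after every design change. Illustration only.
      def ramp_contour(theta1_deg=8.0, theta2_deg=14.0, l1=1.0, l2=0.8, n=25):
          """Return (x, y) points along a two-ramp external-compression surface."""
          t1, t2 = math.radians(theta1_deg), math.radians(theta2_deg)
          corners = [(0.0, 0.0), (l1 * math.cos(t1), l1 * math.sin(t1))]
          corners.append((corners[1][0] + l2 * math.cos(t2),
                          corners[1][1] + l2 * math.sin(t2)))
          pts = []
          for (x0, y0), (x1, y1) in zip(corners[:-1], corners[1:]):
              pts += [(x0 + (x1 - x0) * i / n, y0 + (y1 - y0) * i / n) for i in range(n)]
          pts.append(corners[-1])
          return pts

      for x, y in ramp_contour()[::10]:
          print(f"{x:7.3f} {y:7.3f}")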

  4. 9 CFR 325.18 - Diverting of shipments, breaking of seals, and reloading by carrier in emergency; reporting to...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... seals, and reloading by carrier in emergency; reporting to Regional Director. 325.18 Section 325.18... CERTIFICATION TRANSPORTATION § 325.18 Diverting of shipments, breaking of seals, and reloading by carrier in...) In case of wreck or similar extraordinary emergency, the Department seals on a railroad car or...

  5. 9 CFR 325.18 - Diverting of shipments, breaking of seals, and reloading by carrier in emergency; reporting to...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... seals, and reloading by carrier in emergency; reporting to Regional Director. 325.18 Section 325.18... CERTIFICATION TRANSPORTATION § 325.18 Diverting of shipments, breaking of seals, and reloading by carrier in...) In case of wreck or similar extraordinary emergency, the Department seals on a railroad car or...

  6. Temporal changes in sarcomere lesions of rat adductor longus muscles during hindlimb reloading

    NASA Technical Reports Server (NTRS)

    Krippendorf, B. B.; Riley, D. A.

    1994-01-01

    Focal sarcomere disruptions were previously observed in adductor longus muscles of rats flown approximately two weeks aboard the Cosmos 1887 and 2044 biosatellite flights. These lesions, characterized by breakage and loss of myofilaments and Z-line streaming, resembled damage induced by unaccustomed exercise that includes eccentric contractions in which muscles lengthen as they develop tension. We hypothesized that sarcomere lesions in atrophied muscles of space-flown rats were not produced in microgravity by muscle unloading but resulted from muscle reloading upon re-exposure to terrestrial gravity. To test this hypothesis, we examined temporal changes in sarcomere integrity of adductor longus muscles from rats subjected to 12.5 days of hindlimb suspension unloading and subsequent reloading by return to vivarium cages for 0, 6, 12, or 48 hours of normal weightbearing. Our ultrastructural observations suggested that muscle unloading (0 h reloading) induced myofibril misalignment associated with myofiber atrophy. Muscle reloading for 6 hours induced focal sarcomere lesions in which cross striations were abnormally widened. Such lesions were electron lucent due to extensive myofilament loss. Lesions in reloaded muscles showed rapid restructuring. By 12 hours of reloading, lesions were moderately stained foci and by 48 hours darkly stained foci in which the pattern of cross striations was indistinct at the light and electron microscopic levels. These lesions were spanned by Z-line-like electron dense filamentous material. Our findings suggest a new role for Z-line streaming in lesion restructuring: rather than an antecedent to damage, this type of Z-line streaming may be indicative of rapid, early sarcomere repair.

  7. Intracellular Ca2+ transients in mouse soleus muscle after hindlimb unloading and reloading

    NASA Technical Reports Server (NTRS)

    Ingalls, C. P.; Warren, G. L.; Armstrong, R. B.; Hamilton, S. L. (Principal Investigator)

    1999-01-01

    The objective of this study was to determine whether altered intracellular Ca(2+) handling contributes to the specific force loss in the soleus muscle after unloading and/or subsequent reloading of mouse hindlimbs. Three groups of female ICR mice were studied: 1) unloaded mice (n = 11) that were hindlimb suspended for 14 days, 2) reloaded mice (n = 10) that were returned to their cages for 1 day after 14 days of hindlimb suspension, and 3) control mice (n = 10) that had normal cage activity. Maximum isometric tetanic force (P(o)) was determined in the soleus muscle from the left hindlimb, and resting free cytosolic Ca(2+) concentration ([Ca(2+)](i)), tetanic [Ca(2+)](i), and 4-chloro-m-cresol-induced [Ca(2+)](i) were measured in the contralateral soleus muscle by confocal laser scanning microscopy. Unloading and reloading increased resting [Ca(2+)](i) above control by 36% and 24%, respectively. Although unloading reduced P(o) and specific force by 58% and 24%, respectively, compared with control mice, there was no difference in tetanic [Ca(2+)](i). P(o), specific force, and tetanic [Ca(2+)](i) were reduced by 58%, 23%, and 23%, respectively, in the reloaded animals compared with control mice; however, tetanic [Ca(2+)](i) was not different between unloaded and reloaded mice. These data indicate that although hindlimb suspension results in disturbed intracellular Ca(2+) homeostasis, changes in tetanic [Ca(2+)](i) do not contribute to force deficits. Compared with unloading, 24 h of physiological reloading in the mouse does not result in further changes in maximal strength or tetanic [Ca(2+)](i).

  8. Myocardial Reloading After Extracorporeal Membrane Oxygenation Alters Substrate Metabolism While Promoting Protein Synthesis

    PubMed Central

    Kajimoto, Masaki; O'Kelly Priddy, Colleen M.; Ledee, Dolena R.; Xu, Chun; Isern, Nancy; Olson, Aaron K.; Rosiers, Christine Des; Portman, Michael A.

    2013-01-01

    Background Extracorporeal membrane oxygenation (ECMO) unloads the heart, providing a bridge to recovery in children after myocardial stunning. ECMO also induces stress, which can adversely affect the ability to reload or wean the heart from the circuit. Metabolic impairments induced by altered loading and/or stress conditions may impact weaning. However, cardiac substrate and amino acid requirements upon weaning are unknown. We assessed the hypothesis that ventricular reloading with ECMO modulates both substrate entry into the citric acid cycle (CAC) and myocardial protein synthesis. Methods and Results Sixteen immature piglets (7.8 to 15.6 kg) were separated into 2 groups based on ventricular loading status: 8-hour ECMO (UNLOAD) and postwean from ECMO (RELOAD). We infused into the coronary artery [2-13C]-pyruvate as an oxidative substrate and [13C6]-L-leucine as an indicator for amino acid oxidation and protein synthesis. Upon RELOAD, each functional parameter, which had been decreased substantially by ECMO, recovered to near-baseline level with the exception of minimum dP/dt. Accordingly, myocardial oxygen consumption was also increased, indicating that overall mitochondrial metabolism was reestablished. At the metabolic level, when compared to UNLOAD, RELOAD altered the contribution of various substrates/pathways to tissue pyruvate formation, favoring exogenous pyruvate versus glycolysis, and to acetyl-CoA formation, shifting away from pyruvate decarboxylation to endogenous substrate, presumably fatty acids. Furthermore, there was also a significant increase in tissue concentrations of all CAC intermediates (≈80%), suggesting enhanced anaplerosis, and in fractional protein synthesis rates (>70%). Conclusions RELOAD alters both cytosolic and mitochondrial energy substrate metabolism, while favoring leucine incorporation into protein synthesis rather than oxidation in the CAC. Improved understanding of factors governing these metabolic perturbations may

  9. Macrocell design for concurrent signal processing

    SciTech Connect

    Pope, S.P.; Brodersen, R.W.

    1983-01-01

    Macrocells serve as subsystems at the top level of the hardware design hierarchy. The authors present the macrocell design technique as applied to the implementation of real-time, sampled-data signal processing functions. The design of such circuits is particularly challenging due to the computationally intensive nature of signal-processing algorithms and the constraints of real-time operation. The most efficient designs make use of a high degree of concurrency, a property facilitated by the macrocell approach. Two circuit projects whose development resulted largely from the macrocell methodology described are used as examples throughout the report: a linear-predictive vocoder circuit, and a front-end filter-bank chip for a speech recognition system. Both are monolithic multiprocessor implementations: the LPC vocoder circuit contains three processors, the filter-bank chip two processors. 10 references.

  10. Design of penicillin fermentation process simulation system

    NASA Astrophysics Data System (ADS)

    Qi, Xiaoyu; Yuan, Zhonghu; Qi, Xiaoxuan; Zhang, Wenqi

    2011-10-01

    Real-time monitoring of batch processes attracts increasing attention. It can ensure safety and provide products with consistent quality. The design of a simulation system for batch process fault diagnosis is therefore of great significance. In this paper, penicillin fermentation, a typical non-linear, dynamic, multi-stage batch production process, is taken as the research object. A visual human-machine interactive simulation software system based on the Windows operating system is developed. The simulation system can provide an effective platform for research on batch process fault diagnosis.

  11. Teaching sustainable design: A collaborative process

    SciTech Connect

    Theis, C.C.

    1997-12-31

    This paper describes a collaborative educational experience in the Schools of Architecture and Landscape Architecture at Louisiana State University. During the Fall Semester of 1996 an upper-level architectural design studio worked with a peer group of landscape architecture students on the design of a master plan for an environmentally sensitive residential development on Cat Island, a barrier island located approximately eight miles south of Gulfport, Mississippi. This paper presents the methodology and results of the project, describes the collaborative process, and assesses both the viability of the design solutions and the value of the educational experience.

  12. Process design for Al backside contacts

    SciTech Connect

    Chalfoun, L.L.; Kimerling, L.C.

    1995-08-01

    It is known that properly alloyed aluminum backside contacts can improve silicon solar cell efficiency. To use this knowledge to the fullest advantage, we have studied the gettering process that occurs during contact formation and the microstructure of the contact and backside junction region. With an understanding of the alloying step, optimized fabrication processes can be designed. To study gettering, single crystal silicon wafers were coated with aluminum on both sides and subjected to heat treatments. Results are described.

  13. Dynamic Process Simulation for Analysis and Design.

    ERIC Educational Resources Information Center

    Nuttall, Herbert E., Jr.; Himmelblau, David M.

    A computer program for the simulation of complex continuous processes in real time in an interactive mode is described. The program is user oriented, flexible, and provides both numerical and graphic output. The program has been used in classroom teaching and computer aided design. Typical input and output are illustrated for a sample problem to…

  14. Flexible Processing and the Design of Grammar

    ERIC Educational Resources Information Center

    Sag, Ivan A.; Wasow, Thomas

    2015-01-01

    We explore the consequences of letting the incremental and integrative nature of language processing inform the design of competence grammar. What emerges is a view of grammar as a system of local monotonic constraints that provide a direct characterization of the signs (the form-meaning correspondences) of a given language. This…

  15. Interface design in the process industries

    NASA Technical Reports Server (NTRS)

    Beaverstock, M. C.; Stassen, H. G.; Williamson, R. A.

    1977-01-01

    Every operator runs his plant in accord with his own mental model of the process. In this sense, one characteristic of an ideal man-machine interface is that it be in harmony with that model. With this theme in mind, the paper first reviews the functions of the process operator and compares them with those of human operators involved in control situations previously studied outside the industrial environment (pilots, air traffic controllers, helmsmen, etc.). A brief history of the operator interface in the process industry and the traditional methodology employed in its design is then presented. Finally, a much more fundamental approach utilizing a model definition of the human operator's behavior is presented.

  16. Using scoping as a design process

    SciTech Connect

    Mulvihill, P.R. ); Jacobs, P. )

    1998-07-01

    Skillful use of the scoping phase of environmental assessment (EA) is critical in cases involving a wide diversity of stakeholders and perspectives. Scoping can exert a strong influence in shaping a relevant impact assessment and increasing the probability of a process that satisfies stakeholders. This article explores key challenges facing scoping processes conducted in highly pluralistic settings. Elements of a notable case study--the scoping process conducted in 1992 for the proposed Great Whale Hydroelectric project in Northern Quebec--are discussed to illustrate innovative approaches. When used as a design process, scoping can ensure that EA reflects the different value sets and cultures that are at play, particularly where diverse knowledge systems and ways of describing environmental components and impacts exist. As it sets the stage for subsequent steps in the EA process, scoping needs to be a sufficiently broad umbrella that accommodates diverse approaches to identifying, classifying, and assessing impacts.

  17. Design of intelligent controllers for exothermal processes

    NASA Astrophysics Data System (ADS)

    Nagarajan, Ramachandran; Yaacob, Sazali

    2001-10-01

    Chemical industries, such as resin or soap manufacturing, have reaction systems that work with at least two chemicals. Mixing of chemicals, even at room temperature, can initiate an exothermic reaction. This process produces a sudden increase of heat energy within the mixture. The quantity of heat and the dynamics of heat generation are unknown, unpredictable and time varying. Proper control of heat has to be accomplished in order to achieve a high-quality product. Uncontrolled or poorly controlled heat results in an unusable product, and the process may damage materials and systems; even human beings may be harmed. Heat due to an exothermic reaction cannot be controlled using conventional methods such as PID control or identification-based control, since all of these methods require at least an approximate mathematical model of the exothermic process, and modeling an exothermal process is yet to be properly conceived. This paper discusses a design methodology for controlling such a process. A pilot plant of a reaction system has been constructed and utilized for designing and incorporating the proposed fuzzy logic based intelligent controller. Both a conventional and an adaptive form of fuzzy logic control were used in testing the performance. The test results confirm the effectiveness of the controllers in controlling exothermic heat.

  18. DESIGNING ENVIRONMENTAL, ECONOMIC AND ENERGY EFFICIENT CHEMICAL PROCESSES

    EPA Science Inventory

    The design and improvement of chemical processes can be very challenging. The earlier energy conservation, process economics and environmental aspects are incorporated into the process development, the easier and less expensive it is to alter the process design. Process emissio...

  19. Composting process design criteria. II. Detention time

    SciTech Connect

    Haug, R.T.

    1986-09-01

    Attention has always been directed to detention time as a criterion for the design and operation of composting systems. Perhaps this is a logical outgrowth of work on liquid phase systems, where detention time is a fundamental parameter of design. Unlike liquid phase systems, however, the interpretation of detention time and the actual values required for design have not been universally accepted in the case of composting. As a case in point, most compost systems incorporate facilities for curing the compost product. However, curing is often considered after the fact or as an add-on with little relationship to the first-stage, high-rate phase, whether reactor (in-vessel), static pile, or windrow. Design criteria for curing and the relationships between the first-stage, high-rate and second-stage, curing phases of a composting system have been unclear. In Part 2 of this paper, the concepts of hydraulic retention time (HRT) and solids residence time (SRT) are applied to the composting process. Definitions and design criteria for each are proposed. Based on these criteria, the first and second stages can be designed and integrated into a complete composting system.
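    For reference, the two residence-time measures carried over from liquid-phase practice are commonly defined as follows (generic definitions in the editor's notation, not necessarily the paper's):

      \mathrm{HRT} = \frac{V}{Q}, \qquad \mathrm{SRT} = \frac{M_{\mathrm{solids}}}{\dot m_{\mathrm{wasted}}}

    where V is the working volume of the reactor or pile, Q the volumetric feed rate, M_solids the mass of solids held in the system, and \dot m_wasted the rate at which solids are withdrawn. Where composted product is recycled, the effective SRT of the solids exceeds the single-pass HRT.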

  20. Multiwavelet design for cardiac signal processing.

    PubMed

    Peeters, R L M; Karel, J M H; Westra, R L; Haddad, S A P; Serdijn, W A

    2006-01-01

    An approach for designing multiwavelets is introduced, for use in cardiac signal processing. The parameterization of the class of multiwavelets is in terms of associated FIR polyphase all-pass filters. Orthogonality and a balanced vanishing moment of order 1 are built into the parameterization. An optimization criterion is developed to associate the wavelets with different meaningful segments of a signal. This approach is demonstrated on the simultaneous detection of QRS-complexes and T-peaks in ECG signals. PMID:17946917
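    As generic background (not specific to this paper): a vanishing moment of order 1 means that the wavelet branch of the filter bank annihilates constant signals, i.e. \int_{-\infty}^{\infty} \psi(t)\,dt = 0; for multiwavelets, "balancing" refers to imposing this property jointly on the vector-valued filter bank acting on scalar input sequences rather than on each component wavelet in isolation.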

  1. Chip Design Process Optimization Based on Design Quality Assessment

    NASA Astrophysics Data System (ADS)

    Häusler, Stefan; Blaschke, Jana; Sebeke, Christian; Rosenstiel, Wolfgang; Hahn, Axel

    2010-06-01

    Nowadays, the managing of product development projects is increasingly challenging. Especially the IC design of ASICs with both analog and digital components (mixed-signal design) is becoming more and more complex, while the time-to-market window narrows at the same time. Still, high quality standards must be fulfilled. Projects and their status are becoming less transparent due to this complexity. This makes the planning and execution of projects rather difficult. Therefore, there is a need for efficient project control. A main challenge is the objective evaluation of the current development status. Are all requirements successfully verified? Are all intermediate goals achieved? Companies often develop special solutions that are not reusable in other projects. This makes the quality measurement process itself less efficient and produces too much overhead. The method proposed in this paper is a contribution to solve these issues. It is applied at a German design house for analog mixed-signal IC design. This paper presents the results of a case study and introduces an optimized project scheduling on the basis of quality assessment results.

  2. Thinking and the Design Process. DIUL-RR-8414.

    ERIC Educational Resources Information Center

    Moulin, Bernard

    Designed to focus attention on the design process in such computer science activities as information systems design, database design, and expert systems design, this paper examines three main phases of the design process: understanding the context of the problem, identifying the problem, and finding a solution. The processes that these phases…

  3. Computer-aided software development process design

    NASA Technical Reports Server (NTRS)

    Lin, Chi Y.; Levary, Reuven R.

    1989-01-01

    The authors describe an intelligent tool designed to aid managers of software development projects in planning, managing, and controlling the development process of medium- to large-scale software projects. Its purpose is to reduce uncertainties in the budget, personnel, and schedule planning of software development projects. It is based on a dynamic model of the software development and maintenance life-cycle process. This dynamic process is composed of a number of time-varying, interacting developmental phases, each characterized by its intended functions and requirements. System dynamics is used as a modeling methodology. The resulting Software LIfe-Cycle Simulator (SLICS) and the hybrid expert simulation system of which it is a subsystem are described.

  4. Forging process design for risk reduction

    NASA Astrophysics Data System (ADS)

    Mao, Yongning

    In this dissertation, forging process design has been investigated with a primary concern for risk reduction. Different forged components have been studied, especially those that could cause catastrophic loss if failure occurs. As an effective modeling methodology, finite element analysis is applied extensively in this work. Three examples, a titanium compressor disk, a superalloy turbine disk, and a titanium hip prosthesis, are discussed to demonstrate this approach. Discrete defects such as hard alpha anomalies are known to cause disastrous failure if they are present in these stress-critical components. In this research, hard-alpha inclusion movement during forging of a titanium compressor disk is studied by finite element analysis. By combining results from the finite element method (FEM), regression modeling and Monte Carlo simulation, it is shown that changing the forging path can mitigate the failure risk of the components during service. The second example concerns a turbine disk made of the superalloy IN 718. The effect of forging on microstructure is the main consideration in this study. Microstructure defines the as-forged disk properties. For specific forging conditions, the preform has its own effect on the microstructure. Through a sensitivity study it is found that forging temperature and speed have a significant influence on the microstructure. In order to choose processing parameters that optimize the microstructure, the dependence of microstructure on die speed and temperature is thoroughly studied using design of numerical experiments. For various desired goals, optimal solutions are determined. The narrow processing window of titanium alloys makes isothermal forging a preferred way to produce forged parts without forging defects. However, the cost of isothermal forging (dies at the same temperature as the workpiece) limits its wide application. In this research, it has been demonstrated that with proper process design, the die

  5. Affective Norms for 4900 Polish Words Reload (ANPW_R): Assessments for Valence, Arousal, Dominance, Origin, Significance, Concreteness, Imageability and, Age of Acquisition

    PubMed Central

    Imbir, Kamil K.

    2016-01-01

    In studies that combine understanding of emotions and language, there is growing demand for good-quality experimental materials. To meet this expectation, a total of 4905 Polish words were assessed by 400 participants in order to provide a well-established research method for everyone interested in emotional word processing. The Affective Norms for Polish Words Reloaded (ANPW_R) is designed as an extension of the previously introduced ANPW dataset and provides assessments for eight different affective and psycholinguistic measures: Valence, Arousal, Dominance, Origin, Significance, Concreteness, Imageability, and subjective Age of Acquisition. The ANPW_R is now the largest available dataset of affective words for Polish, including affective scores that have not been measured in any other dataset (concreteness and age of acquisition scales). Additionally, the ANPW_R allows for testing hypotheses concerning dual-mind models of emotion and activation (origin and subjective significance scales). Participants in the current study assessed all 4905 words in the list within 1 week, at their own pace in home sessions, using eight different Self-Assessment Manikin (SAM) scales. Each measured dimension was evaluated by 25 women and 25 men. The ANPW_R norms appeared to be reliable in split-half estimation and congruent with previous normative studies in Polish. The quadratic relation between valence and arousal was found to be in line with previous findings. In addition, nine other relations appeared to be better described by a quadratic rather than a linear function. The ANPW_R provides well-established research materials for use in psycholinguistic and affective studies in Polish-speaking samples. PMID:27486423

  6. Reliability Methods for Shield Design Process

    NASA Technical Reports Server (NTRS)

    Tripathi, R. K.; Wilson, J. W.

    2002-01-01

    Providing protection against the hazards of space radiation is a major challenge to the exploration and development of space. The great cost of added radiation shielding is a potential limiting factor in deep space operations. In this enabling technology, we have developed methods for optimized shield design over multi-segmented missions involving multiple work and living areas in the transport and duty phases of space missions. The total shield mass over all pieces of equipment and habitats is optimized subject to career dose and dose rate constraints. An important component of this technology is the estimation of the two most commonly identified uncertainties in radiation shield design: the shielding properties of the materials used and the understanding of the biological response of the astronaut to the radiation leaking through the materials into the living space. The largest uncertainty, of course, is in the biological response to especially high charge and energy (HZE) ions of the galactic cosmic rays. These uncertainties are blended with the optimization design procedure to formulate reliability-based methods for shield design processes. The details of the methods will be discussed.
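    Schematically (the abstract does not give the exact formulation; this is a generic restatement), the optimization described is a constrained mass minimization over the shield thicknesses t_i of the individual equipment items and habitats:

      \min_{t_1,\dots,t_n} \; \sum_i m_i(t_i) \quad \text{subject to} \quad D_{\mathrm{career}}(t_1,\dots,t_n) \le D_{\mathrm{career}}^{\mathrm{limit}}, \qquad \dot D(t_1,\dots,t_n) \le \dot D^{\mathrm{limit}}

    with the uncertainties in material shielding properties and biological response entering through the dose model D.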

  7. Innovative machine designs for radiation processing

    NASA Astrophysics Data System (ADS)

    Vroom, David

    2007-12-01

    In the 1990s Raychem Corporation established a program to investigate the commercialization of several promising applications involving the combined use of its core competencies in materials science, radiation chemistry and e-beam radiation technology. The applications investigated included those that would extend Raychem's well-known heat-recoverable polymer and wire and cable product lines as well as new potential applications such as remediation of contaminated aqueous streams. A central part of the program was the development of new accelerator technology designed to improve quality, lower processing costs and efficiently process conformable materials such as liquids. A major emphasis with this new irradiation technology was to look at the accelerator and product handling systems as one integrated system, not as two complementary systems.

  8. Designer cell signal processing circuits for biotechnology

    PubMed Central

    Bradley, Robert W.; Wang, Baojun

    2015-01-01

    Microorganisms are able to respond effectively to diverse signals from their environment and internal metabolism owing to their inherent sophisticated information processing capacity. A central aim of synthetic biology is to control and reprogramme the signal processing pathways within living cells so as to realise repurposed, beneficial applications ranging from disease diagnosis and environmental sensing to chemical bioproduction. To date most examples of synthetic biological signal processing have been built based on digital information flow, though analogue computing is being developed to cope with more complex operations and larger sets of variables. Great progress has been made in expanding the categories of characterised biological components that can be used for cellular signal manipulation, thereby allowing synthetic biologists to more rationally programme increasingly complex behaviours into living cells. Here we present a current overview of the components and strategies that exist for designer cell signal processing and decision making, discuss how these have been implemented in prototype systems for therapeutic, environmental, and industrial biotechnological applications, and examine emerging challenges in this promising field. PMID:25579192

  9. Designer cell signal processing circuits for biotechnology.

    PubMed

    Bradley, Robert W; Wang, Baojun

    2015-12-25

    Microorganisms are able to respond effectively to diverse signals from their environment and internal metabolism owing to their inherent sophisticated information processing capacity. A central aim of synthetic biology is to control and reprogramme the signal processing pathways within living cells so as to realise repurposed, beneficial applications ranging from disease diagnosis and environmental sensing to chemical bioproduction. To date most examples of synthetic biological signal processing have been built based on digital information flow, though analogue computing is being developed to cope with more complex operations and larger sets of variables. Great progress has been made in expanding the categories of characterised biological components that can be used for cellular signal manipulation, thereby allowing synthetic biologists to more rationally programme increasingly complex behaviours into living cells. Here we present a current overview of the components and strategies that exist for designer cell signal processing and decision making, discuss how these have been implemented in prototype systems for therapeutic, environmental, and industrial biotechnological applications, and examine emerging challenges in this promising field. PMID:25579192

  10. Effects of Unloading and Reloading on Expressions of Skeletal Muscle Membrane Proteins in Mice

    NASA Astrophysics Data System (ADS)

    Ohno, Y.; Ikuta, A.; Goto, A.; Sugiura, T.; Ohira, Y.; Yoshioka, T.; Goto, K.

    2013-02-01

    Effects of unloading and reloading on the expression levels of tripartite motif-containing 72 (TRIM72) and caveolin-3 (Cav-3) in the soleus muscle of mice were investigated. Male C57BL/6J mice (11 weeks old) were randomly assigned to control and hindlimb-suspended groups. Some of the mice in the hindlimb-suspended group were subjected to continuous hindlimb suspension (HS) for 2 weeks with or without 7 days of ambulation recovery. Following HS, the muscle weight and the protein expression levels of TRIM72 and Cav-3 in the soleus were decreased. On the other hand, gradual increases in muscle mass, TRIM72 and Cav-3 were observed after reloading following HS. These results suggest that mechanical loading plays a key role in the regulatory system for TRIM72 and Cav-3 protein expression.

  11. Design of Nanomaterial Synthesis by Aerosol Processes

    PubMed Central

    Buesser, Beat; Pratsinis, Sotiris E.

    2013-01-01

    Aerosol synthesis of materials is a vibrant field of particle technology and chemical reaction engineering. Examples include the manufacture of carbon blacks, fumed SiO2, pigmentary TiO2, ZnO vulcanizing catalysts, filamentary Ni, and optical fibers, materials that impact transportation, construction, pharmaceuticals, energy, and communications. Parallel to this, development of novel, scalable aerosol processes has enabled synthesis of new functional nanomaterials (e.g., catalysts, biomaterials, electroceramics) and devices (e.g., gas sensors). This review provides an access point for engineers to the multiscale design of aerosol reactors for the synthesis of nanomaterials using continuum, mesoscale, molecular dynamics, and quantum mechanics models spanning 10 and 15 orders of magnitude in length and time, respectively. Key design features are the rapid chemistry; the high particle concentrations but low volume fractions; the attainment of a self-preserving particle size distribution by coagulation; the ratio of the characteristic times of coagulation and sintering, which controls the extent of particle aggregation; and the narrowing of the aggregate primary particle size distribution by sintering. PMID:22468598

  12. Parametric Design within an Atomic Design Process (ADP) applied to Spacecraft Design

    NASA Astrophysics Data System (ADS)

    Ramos Alarcon, Rafael

    This thesis describes research investigating the development of a model for the initial design of complex systems, with application to spacecraft design. The design model is called an atomic design process (ADP) and contains four fundamental stages (specifications, configurations, trade studies and drivers) that constitute the minimum steps of an iterative process that helps designers find a feasible solution. Representative design models from the aerospace industry are reviewed and are compared with the proposed model. The design model's relevance, adaptability and scalability features are evaluated through a focused design task exercise with two undergraduate teams and a long-term design exercise performed by a spacecraft payload team. The implementation of the design model is explained in the context in which the model has been researched. This context includes the organization (a student-run research laboratory at the University of Michigan), its culture (academically oriented), members that have used the design model and the description of the information technology elements meant to provide support while using the model. This support includes a custom-built information management system that consolidates relevant information that is currently being used in the organization. The information is divided in three domains: personnel development history, technical knowledge base and laboratory operations. The focused study with teams making use of the design model to complete an engineering design exercise consists of the conceptual design of an autonomous system, including a carrier and a deployable lander that form the payload of a rocket with an altitude range of over 1000 meters. Detailed results from each of the stages of the design process while implementing the model are presented, and an increase in awareness of good design practices in the teams while using the model are explained. A long-term investigation using the design model consisting of the

  13. High Lifetime Solar Cell Processing and Design

    NASA Technical Reports Server (NTRS)

    Swanson, R. M.

    1985-01-01

    In order to maximize efficiency a solar cell must: (1) absorb as much light as possible in electron-hole production, (2) transport as large a fraction as possible of the electrons to the n-type terminal and holes to the p-type terminal without their first recombining, and (3) produce as high a terminal voltage as possible. Step (1) is largely fixed by the spectrum of sunlight and the fundamental absorption characteristics of silicon, although some improvements are possible through texturizing-induced light trapping and back surface reflectors. Steps (2) and (3) are, however, dependent on the recombination mechanisms of the cell. The recombination, in turn, is strongly influenced by cell processing and design. Some of the lessons learned during the development of the point-contact cell are discussed. Cell dependence on recombination, surface recombination, and contact recombination are discussed. Results show the overwhelming influence of contact recombination on the operation of the cell when the other sources of recombination are reduced by careful processing.

  14. Universal Design: Process, Principles, and Applications

    ERIC Educational Resources Information Center

    Burgstahler, Sheryl

    2009-01-01

    Designing any product or environment involves the consideration of many factors, including aesthetics, engineering options, environmental issues, safety concerns, industry standards, and cost. Typically, designers focus their attention on the average user. In contrast, universal design (UD), according to the Center for Universal Design," is the…

  15. Feasibility study of an Integrated Program for Aerospace vehicle Design (IPAD). Volume 2: The design process

    NASA Technical Reports Server (NTRS)

    Gillette, W. B.; Turner, M. J.; Southall, J. W.; Whitener, P. C.; Kowalik, J. S.

    1973-01-01

    The extent to which IPAD is to support the design process is identified. Case studies of representative aerospace products were developed as models to characterize the design process and to provide design requirements for the IPAD computing system.

  16. PROCESS DESIGN MANUAL FOR STRIPPING OF ORGANICS

    EPA Science Inventory

    Procedures and correlations for designing and costing stripping towers for the removal of organics from aqueous streams are presented. The emphasis is on practical methods suitable for engineering estimates. The designs cover steam strippers with and without condensers and reflux...
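    One of the basic quantities in such designs is the stripping factor, which compares the capacity of the gas (air or steam) stream to that of the liquid stream; in generic form (the manual's own correlations are not reproduced here):

      S = \frac{H\,Q_G}{Q_L}

    where H is the dimensionless Henry's law constant of the organic, Q_G the gas flow rate, and Q_L the liquid flow rate. For countercurrent strippers, S must exceed 1 for essentially complete removal to be achievable regardless of tower height.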

  17. Creativity Processes of Students in the Design Studio

    ERIC Educational Resources Information Center

    Huber, Amy Mattingly; Leigh, Katharine E.; Tremblay, Kenneth R., Jr.

    2012-01-01

    The creative process is a multifaceted and dynamic path of thinking required to execute a project in design-based disciplines. The goal of this research was to test a model outlining the creative design process by investigating student experiences in a design project assignment. The study used an exploratory design to collect data from student…

  18. Memory reloaded: memory load effects in the attentional blink.

    PubMed

    Visser, Troy A W

    2010-06-01

    When two targets are presented in rapid succession, identification of the first is nearly perfect, while identification of the second is impaired when it follows the first by less than about 700 ms. According to bottleneck models, this attentional blink (AB) occurs because the second target is unable to gain access to capacity-limited working memory processes already occupied by the first target. Evidence for this hypothesis, however, has been mixed, with recent reports suggesting that increasing working memory load does not affect the AB. The present paper explores possible reasons for failures to find a link between memory load and the AB and shows that a reliable effect of load can be obtained when the item directly after T1 (Target 1) is omitted. This finding provides initial evidence that working memory load can influence the AB and additional evidence for a link between T1 processing time and the AB predicted by bottleneck models. PMID:19787551

  19. A survey of the Oyster Creek reload licensing model

    SciTech Connect

    Alammar, M.A. )

    1991-01-01

    The Oyster Creek RETRAN licensing model was submitted to the U.S. Nuclear Regulatory Commission for approval in September 1987. This paper discusses the technical issues and concerns that were raised during the review process and how they were resolved. The technical issues are grouped into three major categories: the adequacy of the model benchmark against plant data; uncertainty analysis and model convergence with respect to various critical parameters (code correlations, nodalization, time step, etc.); and model application and usage.

  20. Launch Vehicle Design Process: Characterization, Technical Integration, and Lessons Learned

    NASA Technical Reports Server (NTRS)

    Blair, J. C.; Ryan, R. S.; Schutzenhofer, L. A.; Humphries, W. R.

    2001-01-01

    Engineering design is a challenging activity for any product. Since launch vehicles are highly complex and interconnected and have extreme energy densities, their design represents a challenge of the highest order. The purpose of this document is to delineate and clarify the design process associated with the launch vehicle for space flight transportation. The goal is to define and characterize a baseline for the space transportation design process. This baseline can be used as a basis for improving effectiveness and efficiency of the design process. The baseline characterization is achieved via compartmentalization and technical integration of subsystems, design functions, and discipline functions. First, a global design process overview is provided in order to show responsibility, interactions, and connectivity of overall aspects of the design process. Then design essentials are delineated in order to emphasize necessary features of the design process that are sometimes overlooked. Finally the design process characterization is presented. This is accomplished by considering project technical framework, technical integration, process description (technical integration model, subsystem tree, design/discipline planes, decision gates, and tasks), and the design sequence. Also included in the document are a snapshot relating to process improvements, illustrations of the process, a survey of recommendations from experienced practitioners in aerospace, lessons learned, references, and a bibliography.

  1. Shock/reload response of water and aqueous solutions of ammonium nitrate

    NASA Astrophysics Data System (ADS)

    Morley, Mike; Williamson, David

    2011-06-01

    The response of water and aqueous solutions of ammonium nitrate to shock loading, below 10 GPa, has been experimentally investigated. In addition to determination of the principal Hugoniot, equation of state data have been measured through "shock/reload" experiments using gas-gun-driven plate impact. A Mie-Grüneisen type equation of state has been applied to the liquids under investigation. The effects of initial temperature, and of weight-percentage of ammonium nitrate, on the volume-dependent Grüneisen parameter are reported.
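    The Mie-Grüneisen form referred to here relates pressure and specific internal energy to a reference curve, typically the principal Hugoniot (generic form; the paper's fitted parameters are not reproduced):

      p - p_{\mathrm{ref}}(v) = \frac{\Gamma(v)}{v}\left[e - e_{\mathrm{ref}}(v)\right]

    where v is the specific volume and \Gamma(v) the volume-dependent Grüneisen parameter; the Hugoniot itself is commonly characterized through the linear shock-velocity relation U_s = c_0 + s\,u_p.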

  2. The influence of peak shock stress on the quasi-static reload response of HCP metals

    SciTech Connect

    Cerreta, E. K.; Gray, G. T. III; Trujillo, C. P.; Brown, D. W.; Tome, C. N.

    2007-12-12

    Textured, high-purity hafnium has been shock loaded at 5 and 11 GPa, below the pressure reported for the α→ω phase transformation, 23 GPa. The specimens were 'soft caught' for post-shock characterization. Substructure of the shocked materials was investigated through transmission electron microscopy, and texture evolution due to shock loading was probed with neutron diffraction. The deformation behavior of as-annealed hafnium under quasi-static conditions was compared to its response following shock prestraining. Reload response was correlated to defect generation and storage due to shock loading and compared with observations in other HCP metals such as Ti and Zr.

  3. Computer Applications in the Design Process.

    ERIC Educational Resources Information Center

    Winchip, Susan

    Computer Assisted Design (CAD) and Computer Assisted Manufacturing (CAM) are emerging technologies now being used in home economics and interior design applications. A microcomputer in a computer network system is capable of executing computer graphic functions such as three-dimensional modeling, as well as utilizing office automation packages to…

  4. Lunar fiberglass: Properties and process design

    NASA Technical Reports Server (NTRS)

    Dalton, Robert; Nichols, Todd

    1987-01-01

    A Clemson University ceramic engineering design for a lunar fiberglass plant is presented. The properties of glass fibers and metal-matrix composites are examined. Lunar geology is also discussed. A raw material and site are selected based on this information. A detailed plant design is presented, and summer experiments to be carried out at Johnson Space Center are reviewed.

  5. Space bioreactor: Design/process flow

    NASA Technical Reports Server (NTRS)

    Cross, John H.

    1987-01-01

    The design of the space bioreactor stems from three considerations. First, and foremost, it must sustain cells in microgravity. Closely related is the ability to take advantage of the weightlessness and microgravity. Lastly, it should fit into a bioprocess. The design of the space bioreactor is described in view of these considerations. A flow chart of the bioreactor is presented and discussed.

  6. Instructional Design and Directed Cognitive Processing.

    ERIC Educational Resources Information Center

    Bovy, Ruth Colvin

    This paper argues that the information processing model provides a promising basis on which to build a comprehensive theory of instruction. Characteristics of the major information processing constructs are outlined, including attention, encoding and rehearsal, working memory, long-term memory, retrieval, and metacognitive processes, and a unifying…

  7. Moral judgment reloaded: a moral dilemma validation study

    PubMed Central

    Christensen, Julia F.; Flexas, Albert; Calabrese, Margareta; Gut, Nadine K.; Gomila, Antoni

    2014-01-01

    We propose a revised set of moral dilemmas for studies on moral judgment. We selected a total of 46 moral dilemmas available in the literature and fine-tuned them in terms of four conceptual factors (Personal Force, Benefit Recipient, Evitability, and Intention) and methodological aspects of the dilemma formulation (word count, expression style, question formats) that have been shown to influence moral judgment. Second, we obtained normative codings of arousal and valence for each dilemma, showing that emotional arousal in response to moral dilemmas depends crucially on the factors Personal Force, Benefit Recipient, and Intentionality. Third, we validated the dilemma set, confirming that people's moral judgment is sensitive to all four conceptual factors and to their interactions. Results are discussed in the context of this field of research, outlining also the relevance of our RT effects for the Dual Process account of moral judgment. Finally, we suggest tentative theoretical avenues for future testing, particularly stressing the importance of the factor Intentionality in moral judgment. Additionally, due to the importance of cross-cultural studies in the quest for universals in human moral cognition, we provide the new set of dilemmas in six languages (English, French, German, Spanish, Catalan, and Danish). The norming values provided here refer to the Spanish dilemma set. PMID:25071621

  8. SETI reloaded: Next generation radio telescopes, transients and cognitive computing

    NASA Astrophysics Data System (ADS)

    Garrett, Michael A.

    2015-08-01

    The Search for Extra-terrestrial Intelligence (SETI) using radio telescopes is an area of research that is now more than 50 years old. Thus far, both targeted and wide-area surveys have yet to detect artificial signals from intelligent civilisations. In this paper, I argue that the incidence of co-existing intelligent and communicating civilisations is probably small in the Milky Way. While this makes successful SETI searches a very difficult pursuit indeed, the huge impact of even a single detection requires us to continue the search. A substantial increase in the overall performance of radio telescopes (and in particular future wide-field instruments such as the Square Kilometre Array - SKA) provides renewed optimism in the field. Evidence for this is already to be seen in the success of SETI researchers in acquiring observations on some of the world's most sensitive radio telescope facilities via open, peer-reviewed processes. The increasing interest in the dynamic radio sky and our ability to detect new and rapid transient phenomena such as Fast Radio Bursts (FRBs) are also greatly encouraging. While the nature of FRBs is not yet fully understood, I argue that they are unlikely to be the signature of distant extra-terrestrial civilisations. As astronomers face a data avalanche on all sides, advances made in related areas such as advanced Big Data analytics and cognitive computing are crucial to enable serendipitous discoveries to be made. In any case, as the era of the SKA fast approaches, the prospects of a SETI detection have never been better.

  9. Study on Product Innovative Design Process Driven by Ideal Solution

    NASA Astrophysics Data System (ADS)

    Zhang, Fuying; Lu, Ximei; Wang, Ping; Liu, Hui

    Product innovative design in companies today relies heavily on individual members’ experience and creative ideation, as well as their skill in agilely integrating creativity and innovation tools with design methods. Creative ideation and inventive idea generation are two crucial stages in the product innovative design process. The ideal solution is the desired final idea for a given problem and the target toward which product design strives. In this paper, a product innovative design process driven by the ideal solution is proposed. This design process encourages designers to overcome their psychological inertia and to foster creativity in a systematic way, acquiring breakthrough creative and innovative solutions within a progressively narrowing solution-seeking space, and it leads rapidly to effective product innovative design. A case study example is also presented to illustrate the effectiveness of the proposed design process.

  10. Automating the design process - Progress, problems, prospects, potential.

    NASA Technical Reports Server (NTRS)

    Heldenfels, R. R.

    1973-01-01

    The design process for large aerospace vehicles is discussed, with particular emphasis on structural design. Problems with current procedures are identified. Then, the contributions possible from automating the design process (defined as the best combination of men and computers) are considered. Progress toward automated design in the aerospace and other communities is reviewed, including NASA studies of the potential development of Integrated Programs for Aerospace-Vehicle Design (IPAD). The need for and suggested directions of future research on the design process, both technical and social, are discussed. Although much progress has been made to exploit the computer in design, it is concluded that technology is available to begin using the computer to speed communications and management as well as calculations in the design process and thus build man-computer teams that can design better, faster and cheaper.

  11. VCM Process Design: An ABET 2000 Fully Compliant Project

    ERIC Educational Resources Information Center

    Benyahia, Farid

    2005-01-01

    A long experience in undergraduate vinyl chloride monomer (VCM) process design projects is shared in this paper. The VCM process design is shown to be fully compliant with ABET 2000 criteria by virtue of its abundance in chemical engineering principles, integration of interpersonal and interdisciplinary skills in design, safety, economics, and…

  12. Designing Educative Curriculum Materials: A Theoretically and Empirically Driven Process

    ERIC Educational Resources Information Center

    Davis, Elizabeth A.; Palincsar, Annemarie Sullivan; Arias, Anna Maria; Bismack, Amber Schultz; Marulis, Loren M.; Iwashyna, Stefanie K.

    2014-01-01

    In this article, the authors argue for a design process in the development of educative curriculum materials that is theoretically and empirically driven. Using a design-based research approach, they describe their design process for incorporating educative features intended to promote teacher learning into existing, high-quality curriculum…

  13. Information Architecture without Internal Theory: An Inductive Design Process.

    ERIC Educational Resources Information Center

    Haverty, Marsha

    2002-01-01

    Suggests that information architecture design is primarily an inductive process, partly because it lacks internal theory and partly because it is an activity that supports emergent phenomena (user experiences) from basic design components. Suggests a resemblance to Constructive Induction, a design process that locates the best representational…

  14. Learning Objects: A User-Centered Design Process

    ERIC Educational Resources Information Center

    Branon, Rovy F., III

    2011-01-01

    Design research systematically creates or improves processes, products, and programs through an iterative progression connecting practice and theory (Reinking, 2008; van den Akker, 2006). Developing new instructional systems design (ISD) processes through design research is necessary when new technologies emerge that challenge existing practices…

  15. Optimality criteria design and stress constraint processing

    NASA Technical Reports Server (NTRS)

    Levy, R.

    1982-01-01

    Methods for pre-screening stress constraints into either primary or side-constraint categories are reviewed; a projection method, which is developed from prior cycle stress resultant history, is introduced as an additional screening parameter. Stress resultant projections are also employed to modify the traditional stress-ratio, side-constraint boundary. A special application of structural modification reanalysis is applied to the critical stress constraints to provide feasible designs that are preferable to those obtained by conventional scaling. Sample problem executions show relatively short run times and fewer design cycle iterations to achieve low structural weights; those attained are comparable to the minimum values developed elsewhere.
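    For readers unfamiliar with the stress-ratio side constraint mentioned above, the classical resizing rule of fully stressed design can be written as (a standard textbook form, not necessarily the exact variant used in this work)

        A_i^{(k+1)} = A_i^{(k)} \, \frac{\sigma_i^{(k)}}{\bar{\sigma}_i},

    where A_i is the sizing variable of member i at design cycle k, \sigma_i^{(k)} the computed stress, and \bar{\sigma}_i the allowable stress; constraints whose ratio remains well below unity are natural candidates for treatment as side constraints rather than active stress constraints.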

  16. Erlang Behaviours: Programming with Process Design Patterns

    NASA Astrophysics Data System (ADS)

    Cesarini, Francesco; Thompson, Simon

    Erlang processes run independently of each other, each using separate memory and communicating with each other by message passing. These processes, while executing different code, do so following a number of common patterns. By examining different examples of Erlang-style concurrency in client/server architectures, we identify the generic and specific parts of the code and extract the generic code to form a process skeleton. In Erlang, the most commonly used patterns have been implemented in library modules, commonly referred to as OTP behaviours. They contain the generic code framework for concurrency and error handling, simplifying the complexity of concurrent programming and protecting the developer from many common pitfalls.
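    The generic/specific split described above can be sketched outside Erlang as well; the following Python analogue (the names and callbacks are invented for illustration and only loosely mirror an OTP gen_server) separates a reusable receive loop from a plug-in callback module:

    import queue
    import threading

    class Skeleton:
        """Generic part: owns the mailbox and the receive loop (the 'behaviour')."""
        def __init__(self, callbacks, state):
            self.mailbox = queue.Queue()
            self.callbacks = callbacks
            self.state = state
            threading.Thread(target=self._loop, daemon=True).start()

        def call(self, request):
            reply_box = queue.Queue()
            self.mailbox.put((request, reply_box))
            return reply_box.get()              # synchronous request/reply by message passing

        def _loop(self):
            while True:
                request, reply_box = self.mailbox.get()
                reply, self.state = self.callbacks.handle_call(request, self.state)
                reply_box.put(reply)

    class Counter:
        """Specific part: only the domain logic, supplied as a callback module."""
        @staticmethod
        def handle_call(request, state):
            if request == "increment":
                return state + 1, state + 1     # (reply, new_state)
            return state, state                 # any other request just reads the value

    server = Skeleton(Counter, 0)
    print(server.call("increment"))             # 1
    print(server.call("increment"))             # 2

    The supervision and error-handling concerns that OTP behaviours also cover are deliberately left out of this sketch.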

  17. Design Science Research for Business Process Design: Organizational Transition at Intersport Sweden

    NASA Astrophysics Data System (ADS)

    Lind, Mikael; Rudmark, Daniel; Seigerroth, Ulf

    Business processes need to be aligned with business strategies. This paper elaborates on experiences from a business process design effort in an action research project performed at Intersport Sweden. The purpose of this project was to create a solid base for taking the retail chain Intersport into a new organizational state where the new process design is aligned with strategic goals. Although business process modeling is concerned with creating artifacts, traditionally information systems design science research has had little impact on research on business process models. In this paper, we address the question of how design science research can contribute to business process design. Three heuristic guidelines for creating organizational commitment and strategic alignment in process design are presented. The guidelines are derived from the successful actions taken in the research project. The development of these guidelines is used as a basis to reflect upon the contribution of design science research to business process design.

  18. Processes and Knowledge in Designing Instruction.

    ERIC Educational Resources Information Center

    Greeno, James G.; And Others

    Results from a study of problem solving in the domain of instructional design are presented. Subjects were eight teacher trainees who were recent graduates of or were enrolled in the Stanford Teacher Education Program at Stanford University (California). Subjects studied a computer-based tutorial--the VST2000--about a fictitious vehicle. The…

  19. Biochemical Engineering. Part II: Process Design

    ERIC Educational Resources Information Center

    Atkinson, B.

    1972-01-01

    Describes types of industrial techniques involving biochemical products, specifying the advantages and disadvantages of batch and continuous processes, and contrasting biochemical and chemical engineering. See SE 506 318 for Part I. (AL)

  20. Understanding the Processes behind Student Designing: Cases from Singapore

    ERIC Educational Resources Information Center

    Lim, Susan Siok Hiang; Lim-Ratnam, Christina; Atencio, Matthew

    2013-01-01

    A common perception of designing is that it represents a highly complex activity that is manageable by only a few. However it has also been argued that all individuals are innately capable of designing. Taking up this latter view, we explored the processes behind student designing in the context of Design and Technology (D&T), a subject taught at…

  1. Reducing Design Cycle Time and Cost Through Process Resequencing

    NASA Technical Reports Server (NTRS)

    Rogers, James L.

    2004-01-01

    In today's competitive environment, companies are under enormous pressure to reduce the time and cost of their design cycle. One method for reducing both time and cost is to develop an understanding of the flow of the design processes and the effects of the iterative subcycles that are found in complex design projects. Once these aspects are understood, the design manager can make decisions that take advantage of decomposition, concurrent engineering, and parallel processing techniques to reduce the total time and the total cost of the design cycle. One software tool that can aid in this decision-making process is the Design Manager's Aid for Intelligent Decomposition (DeMAID). The DeMAID software minimizes the feedback couplings that create iterative subcycles, groups processes into iterative subcycles, and decomposes the subcycles into a hierarchical structure. The real benefits of producing the best design in the least time and at a minimum cost are obtained from sequencing the processes in the subcycles.
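    The resequencing idea can be illustrated with a toy example (a brute-force sketch with hypothetical coupling data, not the actual DeMAID algorithm): treat the processes as rows and columns of a design structure matrix and count, for a given ordering, how many dependencies point upstream, since each such feedback creates an iterative subcycle.

    from itertools import permutations

    # deps[process] = set of processes whose output this process needs (hypothetical couplings)
    deps = {
        "aerodynamics": {"structures"},
        "structures": {"loads"},
        "loads": {"aerodynamics"},
        "performance": {"aerodynamics", "structures"},
    }

    def feedback_count(order):
        """Number of dependencies that point upstream (feedback couplings) for this sequence."""
        position = {p: i for i, p in enumerate(order)}
        return sum(1 for p in order for d in deps[p] if position[d] > position[p])

    # Brute force is fine for a toy problem; DeMAID applies heuristics at realistic problem sizes.
    best = min(permutations(deps), key=feedback_count)
    print(best, feedback_count(best))   # one unavoidable feedback remains, from the aero-structures-loads loop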

  2. NASA Now: Engineering Design Process: Hubble Space Telescope

    NASA Video Gallery

    In this episode of NASA Now, NASA engineer Russ Werneth discusses the continuous nature of the engineering design process and shares what it was like to design and plan the spacewalks that were key...

  3. Integrating Thermal Tools Into the Mechanical Design Process

    NASA Technical Reports Server (NTRS)

    Tsuyuki, Glenn T.; Siebes, Georg; Novak, Keith S.; Kinsella, Gary M.

    1999-01-01

    The intent of mechanical design is to deliver a hardware product that meets or exceeds customer expectations, while reducing cycle time and cost. To this end, an integrated mechanical design process enables the idea of parallel development (concurrent engineering). This represents a shift from the traditional mechanical design process. With such a concurrent process, there are significant issues that have to be identified and addressed before re-engineering the mechanical design process to facilitate concurrent engineering. These issues also assist in the integration and re-engineering of the thermal design sub-process since it resides within the entire mechanical design process. With these issues in mind, a thermal design sub-process can be re-defined in a manner that has a higher probability of acceptance, thus enabling an integrated mechanical design process. However, the actual implementation is not always problem-free. Experience in applying the thermal design sub-process to actual situations provides the evidence for improvement, but more importantly, for judging the viability and feasibility of the sub-process.

  4. The application of image processing software: Photoshop in environmental design

    NASA Astrophysics Data System (ADS)

    Dong, Baohua; Zhang, Chunmi; Zhuo, Chen

    2011-02-01

    In the process of environmental design and creation, the design sketch holds a very important position: it not only illuminates the design's idea and concept but also shows the design's visual effects to the client. In the field of environmental design, computer-aided design has brought significant improvement. Many types of specialized design software for rendering environmental drawings and for artistic post-processing have been implemented, and with the use of this software, working efficiency has greatly increased and drawings have become more specific and more specialized. By analyzing the application of the Photoshop image processing software in environmental design, and by comparing and contrasting traditional hand drawing with drawing supported by modern technology, this essay explores how computer technology can play a bigger role in environmental design.

  5. INTEGRATION OF SYSTEMS ENGINEERING AND PROCESS INTENSIFICATION IN THE DESIGN OF PROCESSES FOR UTILIZING BIOBASED GLYCEROL

    EPA Science Inventory

    The expected results include an integrated process and mechanical design including a fabrication plan for the glycerol dehydration reactor, comprehensive heat and material balance, environmental impact assessment and comprehensive safety review. The resulting process design w...

  6. Algorithmic Processes for Increasing Design Efficiency.

    ERIC Educational Resources Information Center

    Terrell, William R.

    1983-01-01

    Discusses the role of algorithmic processes as a supplementary method for producing cost-effective and efficient instructional materials. Examines three approaches to problem solving in the context of developing training materials for the Naval Training Command: application of algorithms, quasi-algorithms, and heuristics. (EAO)

  7. Adding Users to the Website Design Process

    ERIC Educational Resources Information Center

    Tomeo, Megan L.

    2012-01-01

    Alden Library began redesigning its website over a year ago. Throughout the redesign process the students, faculty, and staff that make up the user base were added to the conversation by utilizing several usability test methods. This article focuses on the usability testing conducted at Alden Library and delves into future usability testing, which…

  8. Jemboss reloaded.

    PubMed

    Mullan, Lisa

    2004-06-01

    Bioinformatics tools are freely available from websites all over the world. Often they are presented as web services, although there are many tools for download and use on a local machine. This tutorial section looks at Jemboss, a Java-based graphical user interface (GUI) for the EMBOSS bioinformatics suite, which combines the advantages of both web service and downloaded software. PMID:15260898

  9. Rolling Reloaded

    ERIC Educational Resources Information Center

    Jones, Simon A.; Nieminen, John M.

    2008-01-01

    Not so long ago a new observation about rolling motion was described: for a rolling wheel, there is a set of points with instantaneous velocities directed at or away from the centre of the wheel; these points form a circle whose diameter connects the centre of the wheel to the wheel's point of contact with the ground (Sharma 1996 "Eur. J. Phys."…

  10. A design optimization process for Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Chamberlain, Robert G.; Fox, George; Duquette, William H.

    1990-01-01

    The Space Station Freedom Program is used to develop and implement a process for design optimization. Because the relative worth of arbitrary design concepts cannot be assessed directly, comparisons must be based on designs that provide the same performance from the point of view of station users; such designs can be compared in terms of life cycle cost. Since the technology required to produce a space station is widely dispersed, a decentralized optimization process is essential. A formulation of the optimization process is provided and the mathematical models designed to facilitate its implementation are described.

  11. CAD tool environment for MEMS process design support

    NASA Astrophysics Data System (ADS)

    Schmidt, T.; Wagener, A.; Popp, J.; Hahn, K.; Bruck, R.

    2005-07-01

    MEMS fabrication processes are characterized by numerous usable process steps, materials, and effects for fabricating the intended microstructure. Up to now, CAD support in this domain has concentrated mainly on structural design (e.g., FEM-based simulation programs). These tools often assume fixed interfaces to the fabrication process, such as material parameters or design rules. Because MEMS design requires structural design (defining the lateral two-dimensional shapes) concurrently with process design (responsible for the third dimension), technology interfaces consisting only of sets of static data are no longer sufficient. Successful design flows in these areas must incorporate a higher degree of process-related data, so a broader interface between process configuration on one side and application design on the other is needed. This paper proposes a novel approach: a process management system that allows the specification of processes for specific applications. The system is based on a dedicated database environment able to store and manage all process-related design constraints linked to the fabrication process data itself. The interdependencies between application-specific processes and all stages of the design flow are discussed, and the complete software system PRINCE is introduced, meeting the requirements of this new approach. Based on the concurrent design methodology presented at the beginning of the paper, a system is presented that supports application-specific process design. The paper highlights the incorporated tools and the present status of the software system. A complete configuration of a Si thin-film process example demonstrates the usage of PRINCE.

  12. Launch Vehicle Design Process Description and Training Formulation

    NASA Technical Reports Server (NTRS)

    Atherton, James; Morris, Charles; Settle, Gray; Teal, Marion; Schuerer, Paul; Blair, James; Ryan, Robert; Schutzenhofer, Luke

    1999-01-01

    A primary NASA priority is to reduce the cost and improve the effectiveness of launching payloads into space. As a consequence, significant improvements are being sought in the effectiveness, cost, and schedule of the launch vehicle design process. In order to provide a basis for understanding and improving the current design process, a model has been developed for this complex, interactive process, as reported in the references. This model requires further expansion in some specific design functions. Also, a training course for less-experienced engineers is needed to provide understanding of the process, to provide guidance for its effective implementation, and to provide a basis for major improvements in launch vehicle design process technology. The objective of this activity is to expand the description of the design process to include all pertinent design functions, and to develop a detailed outline of a training course on the design process for launch vehicles for use in educating engineers whose experience with the process has been minimal. Building on a previously-developed partial design process description, parallel sections have been written for the Avionics Design Function, the Materials Design Function, and the Manufacturing Design Function. Upon inclusion of these results, the total process description will be released as a NASA TP. The design function sections herein include descriptions of the design function responsibilities, interfaces, interactive processes, decisions (gates), and tasks. Associated figures include design function planes, gates, and tasks, along with other pertinent graphics. Also included is an expanded discussion of how the design process is divided, or compartmentalized, into manageable parts to achieve efficient and effective design. A detailed outline for an intensive two-day course on the launch vehicle design process has been developed herein, and is available for further expansion. The course is in an interactive lecture

  13. MIDAS: a framework for integrated design and manufacturing process

    NASA Astrophysics Data System (ADS)

    Chung, Moon Jung; Kwon, Patrick; Pentland, Brian

    2000-10-01

    In this paper, we present the development of a framework for managing design and manufacturing processes in a distributed environment. The framework offers the following facilities: (1) representing complicated engineering design processes, (2) coordinating design activities and executing the process in a distributed environment, and (3) supporting collaborative design by sharing data and processes. Process flow graphs, which consist of tasks and their corresponding input and output data, are used to depict the engineering design process on a process modeling browser. The engineering activities in the represented processes can be executed in a distributed environment through the cockpit of the framework. Communication among the engineers involved, to support collaborative design, takes place on the collaborative design browser, with SML underlying data structures for representing process information that make the framework extensible and platform-independent. The formal and flexible approach of the proposed framework to integrating the engineering design processes can also be effectively applied to coordinate concurrent engineering activities in a distributed environment.
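    The process flow graph described above (tasks plus their input and output data) can be captured with a very small data structure; the sketch below is purely illustrative and does not reflect MIDAS internals.

    from dataclasses import dataclass, field

    @dataclass
    class Task:
        name: str
        inputs: set = field(default_factory=set)    # data items the task consumes
        outputs: set = field(default_factory=set)   # data items the task produces

    tasks = [
        Task("concept_design", outputs={"geometry"}),
        Task("fea_analysis", inputs={"geometry"}, outputs={"stress_report"}),
        Task("process_planning", inputs={"geometry", "stress_report"}, outputs={"nc_program"}),
    ]

    available, pending = set(), list(tasks)
    while pending:                                  # simple coordinator loop
        ready = [t for t in pending if t.inputs <= available]
        if not ready:
            raise RuntimeError("unresolved data dependency")
        for t in ready:
            print("dispatch:", t.name)
            available |= t.outputs
            pending.remove(t)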

  14. Ceramic processing: Experimental design and optimization

    NASA Technical Reports Server (NTRS)

    Weiser, Martin W.; Lauben, David N.; Madrid, Philip

    1992-01-01

    The objectives of this paper are to: (1) gain insight into the processing of ceramics and how green processing can affect the properties of ceramics; (2) investigate the technique of slip casting; (3) learn how heat treatment and temperature contribute to density, strength, and effects of under and over firing to ceramic properties; (4) experience some of the problems inherent in testing brittle materials and learn about the statistical nature of the strength of ceramics; (5) investigate orthogonal arrays as tools to examine the effect of many experimental parameters using a minimum number of experiments; (6) recognize appropriate uses for clay based ceramics; and (7) measure several different properties important to ceramic use and optimize them for a given application.

  15. Clutter suppression interferometry system design and processing

    NASA Astrophysics Data System (ADS)

    Knight, Chad; Deming, Ross; Gunther, Jake

    2015-05-01

    Clutter suppression interferometry (CSI) has received extensive attention due to its multi-modal capability to detect slow-moving targets and concurrently form high-resolution synthetic aperture radar (SAR) images from the same data. The ability to continuously augment SAR images with geo-located ground moving target indicators (GMTI) provides valuable real-time situational awareness that is important for many applications. CSI can be accomplished with minimal hardware and processing resources. This makes CSI a natural candidate for applications where size, weight and power (SWaP) are constrained, such as unmanned aerial vehicles (UAVs) and small satellites. This paper discusses the theory for optimal CSI system configuration, focusing on a sparse, time-varying transmit and receive array manifold driven by SWaP considerations. The underlying signal model is presented and discussed, as well as the potential benefits that a sparse time-varying transmit-receive manifold provides. The high-level processing objectives are detailed and examined on simulated data. Then actual SAR data collected with the Space Dynamics Laboratory (SDL) FlexSAR radar system are analyzed. The simulated data, contrasted with actual SAR data, help illustrate the challenges and limitations found in practice versus theory. A novel approach incorporating sparse signal processing is discussed that has the potential to reduce false-alarm rates and improve detections.

  16. H-Coal process and plant design

    DOEpatents

    Kydd, Paul H.; Chervenak, Michael C.; DeVaux, George R.

    1983-01-01

    A process for converting coal and other hydrocarbonaceous materials into useful and more valuable liquid products. The process comprises: feeding coal and/or other hydrocarbonaceous materials with a hydrogen-containing gas into an ebullated catalyst bed reactor; passing the reaction products from the reactor to a hot separator where the vaporous and distillate products are separated from the residuals; introducing the vaporous and distillate products from the separator directly into a hydrotreater where they are further hydrogenated; passing the residuals from the separator successively through flash vessels at reduced pressures where distillates are flashed off and combined with the vaporous and distillate products to be hydrogenated; transferring the unseparated residuals to a solids concentrating and removal means to remove a substantial portion of solids therefrom and recycling the remaining residual oil to the reactor; and passing the hydrogenated vaporous and distillate products to an atmospheric fractionator where the combined products are fractionated into separate valuable liquid products. The hydrogen-containing gas is generated from sources within the process.

  17. 9 CFR 325.18 - Diverting of shipments, breaking of seals, and reloading by carrier in emergency; reporting to...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    § 325.18 Diverting of shipments, breaking of seals, and reloading by carrier in emergency; reporting to Regional Director. (a) Shipments of inspected and passed product that bear...

  18. DESIGNING ENVIRONMENTALLY FRIENDLY CHEMICAL PROCESSES WITH FUGITIVE AND OPEN EMISSIONS

    EPA Science Inventory

    Designing a chemical process normally includes aspects of economic and environmental disciplines. In this work we describe methods to quickly and easily evaluate the economics and potential environmental impacts of a process, with the hydrodealkylation of toluene as an example. ...

  19. Experimental design for improved ceramic processing, emphasizing the Taguchi Method

    SciTech Connect

    Weiser, M.W. . Mechanical Engineering Dept.); Fong, K.B. )

    1993-12-01

    Ceramic processing often requires substantial experimentation to produce acceptable product quality and performance. This is a consequence of ceramic processes depending upon a multitude of factors, some of which can be controlled and others that are beyond the control of the manufacturer. Statistical design of experiments is a procedure that allows quick, economical, and accurate evaluation of processes and products that depend upon several variables. Designed experiments are sets of tests in which the variables are adjusted methodically. A well-designed experiment yields unambiguous results at minimal cost. A poorly designed experiment may reveal little information of value even with complex analysis, wasting valuable time and resources. This article will review the most common experimental designs. This will include both nonstatistical designs and the much more powerful statistical experimental designs. The Taguchi Method developed by Genichi Taguchi will be discussed in some detail. The Taguchi method, based upon fractional factorial experiments, is a powerful tool for optimizing product and process performance.
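    To make the fractional factorial idea concrete, the short sketch below estimates main effects from a Taguchi L4(2^3) orthogonal array; the factor names and response values are invented for illustration only.

    # Standard L4 orthogonal array: 4 runs, 3 two-level factors (levels coded 1 and 2).
    L4 = [(1, 1, 1), (1, 2, 2), (2, 1, 2), (2, 2, 1)]
    responses = [92.1, 94.3, 90.8, 95.0]            # hypothetical fired densities for the 4 runs

    for j, name in enumerate(["binder content", "sintering temperature", "hold time"]):
        means = {}
        for level in (1, 2):
            vals = [y for row, y in zip(L4, responses) if row[j] == level]
            means[level] = sum(vals) / len(vals)
        print(f"{name}: mean@level1={means[1]:.2f}  mean@level2={means[2]:.2f}  "
              f"effect={means[2] - means[1]:+.2f}")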

  20. Development of Integrated Programs for Aerospace-vehicle design (IPAD): Reference design process

    NASA Technical Reports Server (NTRS)

    Meyer, D. D.

    1979-01-01

    The airplane design process and its interfaces with manufacturing and customer operations are documented to be used as criteria for the development of integrated programs for the analysis, design, and testing of aerospace vehicles. Topics cover: design process management, general purpose support requirements, design networks, and technical program elements. Design activity sequences are given for both supersonic and subsonic commercial transports, naval hydrofoils, and military aircraft.

  1. New Vistas in Chemical Product and Process Design.

    PubMed

    Zhang, Lei; Babi, Deenesh K; Gani, Rafiqul

    2016-06-01

    Design of chemicals-based products is broadly classified into those that are process centered and those that are product centered. In this article, the designs of both classes of products are reviewed from a process systems point of view; developments related to the design of the chemical product, its corresponding process, and its integration are highlighted. Although significant advances have been made in the development of systematic model-based techniques for process design (also for optimization, operation, and control), much work is needed to reach the same level for product design. Timeline diagrams illustrating key contributions in product design, process design, and integrated product-process design are presented. The search for novel, innovative, and sustainable solutions must be matched by consideration of issues related to the multidisciplinary nature of problems, the lack of data needed for model development, solution strategies that incorporate multiscale options, and reliability versus predictive power. The need for an integrated model-experiment-based design approach is discussed together with benefits of employing a systematic computer-aided framework with built-in design templates. PMID:27088667

  2. Debating Professional Designations for Evaluators: Reflections on the Canadian Process

    ERIC Educational Resources Information Center

    Cousins, J. Bradley; Cullen, Jim; Malik, Sumbal; Maicher, Brigitte

    2009-01-01

    This paper provides a reflective account of a consultation process on professional designations for evaluators initiated and coordinated by the Canadian Evaluation Society (CES). Described are: (1) the forces leading CES to generate discussion and debate about professional designations for Canadian evaluators, (2) the process of developing and…

  3. Process Design Manual for Land Treatment of Municipal Wastewater.

    ERIC Educational Resources Information Center

    Crites, R.; And Others

    This manual presents a procedure for the design of land treatment systems. Slow rate, rapid infiltration, and overland flow processes for the treatment of municipal wastewaters are given emphasis. The basic unit operations and unit processes are discussed in detail, and the design concepts and criteria are presented. The manual includes design…

  4. Laser processing with specially designed laser beam

    NASA Astrophysics Data System (ADS)

    Asratyan, A. A.; Bulychev, N. A.; Feofanov, I. N.; Kazaryan, M. A.; Krasovskii, V. I.; Lyabin, N. A.; Pogosyan, L. A.; Sachkov, V. I.; Zakharyan, R. A.

    2016-04-01

    The possibility of using laser systems to form beams with special spatial configurations has been studied. The laser systems applied had a self-conjugate cavity based on the elements of copper vapor lasers (LT-5Cu, LT-10Cu, LT-30Cu) with an average power of 5, 10, or 30 W. The active elements were pumped by current pulses of duration 80-100 ns. The duration of laser generation pulses was up to 25 ns. The generator unit included an unstable cavity, where one reflector was a special mirror with a reflecting coating. Various original optical schemes used were capable of exploring spatial configurations and energy characteristics of output laser beams in their interaction with micro- and nanoparticles fabricated from various materials. In these experiments, the beam dimensions of the obtained zones varied from 0.3 to 5 µm, which is comparable with the minimum permissible dimensions determined by the optical elements applied. This method is useful in transforming a large amount of information at the laser pulse repetition rate of 10-30 kHz. It was possible to realize the high-precision micromachining and microfabrication of microscale details by direct writing, cutting and drilling (with the cutting width and through-hole diameters ranging from 3 to 100 µm) and produce microscale, deep, intricate and narrow grooves on substrate surfaces of metals and nonmetal materials. This system is used for producing high-quality microscale details without moving the object under treatment. It can also be used for microcutting and microdrilling in a variety of metals such as molybdenum, copper and stainless steel, with a thickness of up to 300 µm, and in nonmetals such as silicon, sapphire and diamond with a thickness ranging from 10 µm to 1 mm with different thermal parameters and specially designed laser beam.

  5. Solid propellant processing factor in rocket motor design

    NASA Technical Reports Server (NTRS)

    1971-01-01

    The ways are described by which propellant processing is affected by choices made in designing rocket engines. Tradeoff studies, design proof or scaleup studies, and special design features are presented that are required to obtain high product quality, and optimum processing costs. Processing is considered to include the operational steps involved with the lining and preparation of the motor case for the grain; the procurement of propellant raw materials; and propellant mixing, casting or extrusion, curing, machining, and finishing. The design criteria, recommended practices, and propellant formulations are included.

  6. Design of smart imagers with image processing

    NASA Astrophysics Data System (ADS)

    Serova, Evgeniya N.; Shiryaev, Yury A.; Udovichenko, Anton O.

    2005-06-01

    This paper is devoted to the creation of novel CMOS APS imagers with focal-plane parallel image preprocessing for smart technical vision and electro-optical systems based on neural implementation. From an analysis of the main features of biological vision, the desired artificial vision characteristics are defined, and the image processing tasks that can be implemented by smart focal-plane preprocessing CMOS imagers with neural networks are determined. The eventual results are important for medicine, aerospace and ecological monitoring, and for determining the complexity of, and ways for, CMOS APS neural net implementation. To reduce real image preprocessing time, special methods based on edge detection and neighboring-frame subtraction are considered and simulated. To select optimal methods and mathematical operators for edge detection, various medical, technical and aerospace images are tested. An important research direction is the analogue implementation of the main preprocessing operations (addition, subtraction, neighboring-frame subtraction, modulus, and edge detection of pixel signals) in the focal plane of CMOS APS imagers. We present the following results: an edge detection algorithm suited to analog realization, and patented focal-plane circuits for analog image preprocessing (edge detection and motion detection).

  7. The concepts of energy, environment, and cost for process design

    SciTech Connect

    Abu-Khader, M.M.; Speight, J.G.

    2004-05-01

    The process industries (specifically, energy and chemicals) are characterized by a variety of reactors and reactions to bring about successful process operations. The design of energy-related and chemical processes and their evolution is a complex process that determines the competitiveness of these industries, as well as their environmental impact. Thus, we have developed an Enviro-Energy Concept designed to facilitate sustainable industrial development. The Complete Onion Model represents a complete methodology for chemical process design and illustrates all of the requirements to achieve the best possible design within the accepted environmental standards. Currently, NOx emissions from industrial processes continue to receive maximum attention; therefore, the problem of NOx emissions from industrial sources such as power stations and nitric acid plants is considered. Selective Catalytic Reduction (SCR) is one of the most promising and effective commercial technologies and is considered the Best Available Control Technology (BACT) for NOx reduction. The NOx emissions problem is solved either by modifying the chemical process design and/or by installing an end-of-pipe technology. The degree of integration between the process design and the installed technology plays a critical role in the capital cost evaluation. Therefore, integrating process units and then optimizing the design has a vital effect on the total cost. Both the environmental regulations and the cost evaluation are the boundary constraints of the optimum solution.

  8. XML-based product information processing method for product design

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen Yu

    2012-01-01

    Design knowledge for modern mechatronic products centers on information processing within knowledge-intensive engineering; product design innovation is therefore essentially innovation in knowledge and information processing. Based on an analysis of the role of mechatronic product design knowledge and its information management features, a unified XML-based product information processing method is proposed. The information processing model of product design includes functional knowledge, structural knowledge, and their relationships. XML-based representations are proposed for product function elements, product structure elements, and the mapping relationship between function and structure. The information processing of a parallel friction roller is given as an example, which demonstrates that this method is clearly helpful for knowledge-based design systems and product innovation.
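    A minimal sketch of what such an XML function-structure mapping might look like is given below; the element and attribute names are invented for illustration and are not taken from the paper.

    import xml.etree.ElementTree as ET

    product = ET.Element("product", name="parallel_friction_roller")

    functions = ET.SubElement(product, "functions")               # functional knowledge
    ET.SubElement(functions, "function", id="F1", description="transmit torque")

    structures = ET.SubElement(product, "structures")             # structural knowledge
    ET.SubElement(structures, "structure", id="S1", description="roller pair")

    mappings = ET.SubElement(product, "mappings")                 # function-structure relationships
    ET.SubElement(mappings, "map", function="F1", structure="S1")

    print(ET.tostring(product, encoding="unicode"))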

  9. XML-based product information processing method for product design

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen Yu

    2011-12-01

    Design knowledge for modern mechatronic products centers on information processing within knowledge-intensive engineering; product design innovation is therefore essentially innovation in knowledge and information processing. Based on an analysis of the role of mechatronic product design knowledge and its information management features, a unified XML-based product information processing method is proposed. The information processing model of product design includes functional knowledge, structural knowledge, and their relationships. XML-based representations are proposed for product function elements, product structure elements, and the mapping relationship between function and structure. The information processing of a parallel friction roller is given as an example, which demonstrates that this method is clearly helpful for knowledge-based design systems and product innovation.

  10. Integration of MGDS design into the licensing process

    SciTech Connect

    1997-12-01

    This paper presents an overview of how the Mined Geologic Disposal System (MGDS) design for a potential repository is integrated into the licensing process. The integration process employs a two-fold approach: (1) ensure that the MGDS design complies with applicable Nuclear Regulatory Commission (NRC) licensing requirements, and (2) ensure that the MGDS design is appropriately reflected in a license application that is acceptable to the NRC for performing acceptance and compliance reviews.

  11. 32nm design rule and process exploration flow

    NASA Astrophysics Data System (ADS)

    Zhang, Yunqiang; Cobb, Jonathan; Yang, Amy; Li, Ji; Lucas, Kevin; Sethi, Satyendra

    2008-10-01

    Semiconductor manufacturers spend hundreds of millions of dollars and years of development time to create a new manufacturing process and to design frontrunner products to work on the new process. A considerable percentage of this large investment is aimed at producing the process design rules and related lithography technology to pattern the new products successfully. Significant additional cost and time is needed in both process and design development if the design rules or lithography strategy must be modified. Therefore, early and accurate prediction of both process design rules and lithography options is necessary for minimizing cost and timing in semiconductor development. This paper describes a methodology to determine the optimum design rules and lithography conditions with high accuracy early in the development lifecycle. We present results from the 32nm logic node, but the methodology can be extended to the 22nm node or any other node. This work involves: automated generation of extended realistic logic test layouts utilizing programmed test structures for a variety of design rules; determining a range of optical illumination and process conditions to test for each critical design layer; using these illumination conditions to create an extrapolatable process window OPC model which is matched to rigorous TCAD lithography focus-exposure full chemically amplified resist models; creating reticle enhancement technique (RET) and OPC recipes which are flexible enough to be used over a variety of design rule and illumination conditions; and OPC verification to find, categorize and report all patterning issues found in the different design and illumination variations. In this work we describe in detail the individual steps in the methodology, and provide results of its use for 32nm node design rule and process optimization.

  12. Defining process design space for monoclonal antibody cell culture.

    PubMed

    Abu-Absi, Susan Fugett; Yang, LiYing; Thompson, Patrick; Jiang, Canping; Kandula, Sunitha; Schilling, Bernhard; Shukla, Abhinav A

    2010-08-15

    The concept of design space has been taking root as a foundation of in-process control strategies for biopharmaceutical manufacturing processes. During mapping of the process design space, the multidimensional combination of operational variables is studied to quantify the impact on process performance in terms of productivity and product quality. An efficient methodology to map the design space for a monoclonal antibody cell culture process is described. A failure modes and effects analysis (FMEA) was used as the basis for the process characterization exercise. This was followed by an integrated study of the inoculum stage of the process which includes progressive shake flask and seed bioreactor steps. The operating conditions for the seed bioreactor were studied in an integrated fashion with the production bioreactor using a two stage design of experiments (DOE) methodology to enable optimization of operating conditions. A two level Resolution IV design was followed by a central composite design (CCD). These experiments enabled identification of the edge of failure and classification of the operational parameters as non-key, key or critical. In addition, the models generated from the data provide further insight into balancing productivity of the cell culture process with product quality considerations. Finally, process and product-related impurity clearance was evaluated by studies linking the upstream process with downstream purification. Production bioreactor parameters that directly influence antibody charge variants and glycosylation in CHO systems were identified. PMID:20589669
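    For readers unfamiliar with the designs named above, the sketch below simply enumerates the coded and real-valued points of a two-level factorial plus a face-centered central composite design for two hypothetical bioreactor parameters; the parameter names and ranges are invented, and the actual study screened more parameters with a Resolution IV fraction before the CCD.

    from itertools import product

    ranges = {"pH": (6.8, 7.2), "temperature_C": (34.0, 37.0)}   # hypothetical operating ranges

    def scale(coded, lo, hi):
        """Map a coded level in [-1, +1] onto the real operating range."""
        return lo + (coded + 1) / 2 * (hi - lo)

    names = list(ranges)
    corners = list(product((-1, +1), repeat=len(names)))                   # two-level factorial
    axial = [tuple(a if i == j else 0 for j in range(len(names)))          # face-centered axial points
             for i in range(len(names)) for a in (-1, +1)]
    center = [(0,) * len(names)]

    for coded in corners + axial + center:
        real = {n: round(scale(c, *ranges[n]), 2) for n, c in zip(names, coded)}
        print(coded, real)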

  13. Perspectives on the design of safer nanomaterials and manufacturing processes

    PubMed Central

    Geraci, Charles; Heidel, Donna; Sayes, Christie; Hodson, Laura; Schulte, Paul; Eastlake, Adrienne; Brenner, Sara

    2015-01-01

    A concerted effort is being made to insert Prevention through Design principles into discussions of sustainability, occupational safety and health, and green chemistry related to nanotechnology. Prevention through Design is a set of principles that includes solutions to design out potential hazards in nanomanufacturing including the design of nanomaterials, and strategies to eliminate exposures and minimize risks that may be related to the manufacturing processes and equipment at various stages of the lifecycle of an engineered nanomaterial. PMID:26435688

  14. Designing a process for executing projects under an international agreement

    NASA Technical Reports Server (NTRS)

    Mohan, S. N.

    2003-01-01

    Projects executed under an international agreement require special arrangements in order to operate within confines of regulations issued by the State Department and the Commerce Department. In order to communicate enterprise-level guidance and procedural information uniformly to projects based on interpretations that carry the weight of institutional authority, a process was developed. This paper provides a script for designing processes in general, using this particular process for context. While the context is incidental, the method described is applicable to any process in general. The paper will expound on novel features utilized for dissemination of the procedural details over the Internet following such process design.

  15. Evaluating two process scale chromatography column header designs using CFD.

    PubMed

    Johnson, Chris; Natarajan, Venkatesh; Antoniou, Chris

    2014-01-01

    Chromatography is an indispensable unit operation in the downstream processing of biomolecules. Scaling of chromatographic operations typically involves a significant increase in the column diameter. At this scale, the flow distribution within a packed bed could be severely affected by the distributor design in process scale columns. Different vendors offer process scale columns with varying design features. The effect of these design features on the flow distribution in packed beds, and the resultant effect on column efficiency and cleanability, need to be properly understood in order to prevent unpleasant surprises on scale-up. Computational Fluid Dynamics (CFD) provides a cost-effective means to explore the effect of various distributor designs on process scale performance. In this work, we present a CFD tool that was developed and validated against experimental dye traces and tracer injections. Subsequently, the tool was employed to compare and contrast two commercially available header designs. PMID:24616438

  16. Context-Aware Design for Process Flexibility and Adaptation

    ERIC Educational Resources Information Center

    Yao, Wen

    2012-01-01

    Today's organizations face continuous and unprecedented changes in their business environment. Traditional process design tools tend to be inflexible and can only support rigidly defined processes (e.g., order processing in the supply chain). This considerably restricts their real-world applications value, especially in the dynamic and…

  17. Concurrent materials and process selection in conceptual design

    SciTech Connect

    Kleban, S.D.

    1998-07-01

    The sequential manner in which materials and processes for a manufactured product are selected is inherently less than optimal. Designers' tendency to choose processes and materials with which they are familiar exacerbates this problem. A method for concurrent selection of materials and a joining process based on product requirements using a knowledge-based, constraint satisfaction approach is presented.

  18. Dynamic Characteristics Analysis of Analogue Networks Design Process

    NASA Astrophysics Data System (ADS)

    Zemliak, Alexander M.

    The process of designing analogue circuits is formulated as a controlled dynamic system. To analyze the properties of such a system, the concept of a Lyapunov function for a dynamic system is suggested, and various forms of the Lyapunov function are proposed. Analyzing the behavior of the Lyapunov function and its first derivative reveals a significant correlation between the function's properties and the processor time used to design the circuit. Numerical results demonstrate the possibility of forecasting the behavior of various design strategies, and the processor time they require, based on the properties of the Lyapunov function for the circuit design process.
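    As a hedged sketch of the underlying idea (standard Lyapunov notation, not necessarily the exact form used by the author), the design trajectory \mathbf{x}(t) in the space of circuit parameters can be assigned a candidate function such as

        V(\mathbf{x}) = \tfrac{1}{2}\,\lVert \mathbf{x} - \mathbf{x}^{*} \rVert^{2}, \qquad \dot{V}(\mathbf{x}) = (\mathbf{x} - \mathbf{x}^{*})^{\mathsf{T}}\,\dot{\mathbf{x}},

    where \mathbf{x}^{*} is the final design; a strategy for which \dot{V} remains negative and large in magnitude converges quickly, which is the kind of property correlated with processor time in the paper.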

  19. Integrating rock mechanics issues with repository design through design process principles and methodology

    SciTech Connect

    Bieniawski, Z.T.

    1996-04-01

    A good designer needs not only knowledge for designing (technical know-how that is used to generate alternative design solutions) but also must have knowledge about designing (appropriate principles and systematic methodology to follow). Concepts such as "design for manufacture" or "concurrent engineering" are widely used in industry. In the field of rock engineering, only limited attention has been paid to the design process, because design of structures in rock masses presents unique challenges to the designers as a result of the uncertainties inherent in characterization of geologic media. However, a stage has now been reached where we are able to sufficiently characterize rock masses for engineering purposes and identify the rock mechanics issues involved, but we are still lacking the engineering design principles and methodology needed to maximize our design performance. This paper discusses the principles and methodology of the engineering design process directed to integrating site characterization activities with design, construction and performance of an underground repository. Using the latest information from the Yucca Mountain Project on geology, rock mechanics and starter tunnel design, the current lack of integration is pointed out and it is shown how rock mechanics issues can be effectively interwoven with repository design through a systematic design process methodology leading to improved repository performance. In essence, the design process is seen as the use of design principles within an integrating design methodology, leading to innovative problem solving. In particular, a new concept of "Design for Constructibility and Performance" is introduced. This is discussed with respect to ten rock mechanics issues identified for repository design and performance.

  20. Case study: Lockheed-Georgia Company integrated design process

    NASA Technical Reports Server (NTRS)

    Waldrop, C. T.

    1980-01-01

    A case study of the development of an Integrated Design Process is presented. The approach taken in preparing for the development of an integrated design process includes some of the IPAD approaches such as developing a Design Process Model, cataloging Technical Program Elements (TPE's), and examining data characteristics and interfaces between contiguous TPE's. The implementation plan is based on an incremental development of capabilities over a period of time with each step directed toward, and consistent with, the final architecture of a total integrated system. Because of time schedules and different computer hardware, this system will not be the same as the final IPAD release; however, many IPAD concepts will no doubt prove applicable as the best approach. Full advantage will be taken of the IPAD development experience. A scenario that could be typical for many companies, even outside the aerospace industry, in developing an integrated design process for an IPAD-type environment is represented.

  1. SYSTEMATIC PROCEDURE FOR DESIGNING PROCESSES WITH MULTIPLE ENVIRONMENTAL OBJECTIVES

    EPA Science Inventory

    Evaluation of multiple objectives is very important in designing environmentally benign processes. It requires a systematic procedure for solving multiobjective decision-making problems, due to the complex nature of the problems, the need for complex assessments, and complicated ...

  2. Using GREENSCOPE for Sustainable Process Design: An Educational Opportunity

    EPA Science Inventory

    Increasing sustainability can be approached through the education of those who design, construct, and operate facilities. As chemical engineers learn elements of process systems engineering, they can be introduced to sustainability concepts. The EPA’s GREENSCOPE methodology and...

  3. An investigation of radiometer design using digital processing techniques

    NASA Technical Reports Server (NTRS)

    Lawrence, R. W.

    1981-01-01

    The use of digital signal processing techniques in Dicke switching radiometer design was investigated. The general approach was to develop an analytical model of the existing analog radiometer and identify factors which adversely affect its performance. A digital processor was then proposed to verify the feasibility of using digital techniques to minimize these adverse effects and improve radiometer performance. Analysis and preliminary test results comparing the digital and analog processing approaches to radiometer design are presented.
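    For context, the standard sensitivity relations for an ideal total-power radiometer and a Dicke-switched radiometer (textbook results, not specific to this report) are

        \Delta T_{\min} = \frac{T_{\mathrm{sys}}}{\sqrt{B\,\tau}} \quad (\text{total power}), \qquad \Delta T_{\min} = \frac{2\,T_{\mathrm{sys}}}{\sqrt{B\,\tau}} \quad (\text{Dicke switched}),

    where B is the predetection bandwidth and \tau the integration time; the factor of two is the price paid for switching against a reference load to suppress gain fluctuations, and it is this switching and differencing behavior that a digital processor must reproduce.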

  4. PROCESS DESIGN MANUAL: LAND TREATMENT OF MUNICIPAL WASTEWATER

    EPA Science Inventory

    The manual presents a rational procedure for the design of land treatment systems. Slow rate, rapid infiltration, and overland flow processes for the treatment of municipal wastewaters are discussed in detail, and the design concepts and criteria are presented. A two-phased plann...

  5. Relating Right Brain Studies to the Design Process.

    ERIC Educational Resources Information Center

    Hofland, John

    Intended for teachers of theatrical design who need to describe a design process for their students, this paper begins by giving a brief overview of recent research that has described the different functions of the right and left cerebral hemispheres. It then notes that although the left hemisphere tends to dominate the right hemisphere, it is the…

  6. Risk Informed Design as Part of the Systems Engineering Process

    NASA Technical Reports Server (NTRS)

    Deckert, George

    2010-01-01

    This slide presentation reviews the importance of Risk Informed Design (RID) as an important feature of the systems engineering process. RID is based on the principle that risk is a design commodity such as mass, volume, cost or power. It also reviews Probabilistic Risk Assessment (PRA) as it is used in the product life cycle in the development of NASA's Constellation Program.

  7. METHODS FOR INTEGRATING ENVIRONMENTAL CONSIDERATIONS INTO CHEMICAL PROCESS DESIGN DECISIONS

    EPA Science Inventory

    The objective of this cooperative agreement was to postulate a means by which an engineer could routinely include environmental considerations in day-to-day conceptual design problems; a means that could easily integrate with existing design processes, and thus avoid massive retr...

  8. Design, Control and in Situ Visualization of Gas Nitriding Processes

    PubMed Central

    Ratajski, Jerzy; Olik, Roman; Suszko, Tomasz; Dobrodziej, Jerzy; Michalski, Jerzy

    2010-01-01

    The article presents a complex system for the design, in situ visualization, and control of a commonly used surface treatment process: gas nitriding. The computer-aided design concept uses analytical mathematical models and artificial intelligence methods. As a result, poly-optimization and poly-parametric simulations of the course of the process become possible, combined with visualization of changes in the process parameter values as a function of time, as well as prediction of the properties of the nitrided layers. For in situ visualization of nitrided layer growth, computer procedures were developed that correlate the direct and differential voltage-time curves of the process result sensor (a magnetic sensor) with the corresponding layer growth stage. These procedures make it possible to combine, during the process, the registered voltage-time curves with the models of the process. PMID:22315536

  9. Design, control and in situ visualization of gas nitriding processes.

    PubMed

    Ratajski, Jerzy; Olik, Roman; Suszko, Tomasz; Dobrodziej, Jerzy; Michalski, Jerzy

    2010-01-01

    The article presents a complex system for the design, in situ visualization, and control of a commonly used surface treatment process: gas nitriding. The computer-aided design concept uses analytical mathematical models and artificial intelligence methods. As a result, poly-optimization and poly-parametric simulations of the course of the process become possible, combined with visualization of changes in the process parameter values as a function of time, as well as prediction of the properties of the nitrided layers. For in situ visualization of nitrided layer growth, computer procedures were developed that correlate the direct and differential voltage-time curves of the process result sensor (a magnetic sensor) with the corresponding layer growth stage. These procedures make it possible to combine, during the process, the registered voltage-time curves with the models of the process. PMID:22315536

  10. Sketching in Design Journals: An Analysis of Visual Representations in the Product Design Process

    ERIC Educational Resources Information Center

    Lau, Kimberly; Oehlberg, Lora; Agogino, Alice

    2009-01-01

    This paper explores the sketching behavior of designers and the role of sketching in the design process. Observations from a descriptive study of sketches provided in design journals, characterized by a protocol measuring sketching activities, are presented. A distinction is made between journals that are entirely tangible and those that contain…

  11. Reducing the complexity of the software design process with object-oriented design

    NASA Technical Reports Server (NTRS)

    Schuler, M. P.

    1991-01-01

    Designing software is a complex process. How object-oriented design (OOD), coupled with formalized documentation and tailored object diagramming techniques, can reduce the complexity of the software design process is described and illustrated. The described OOD methodology uses a hierarchical decomposition approach in which parent objects are decomposed into layers of lower level child objects. A method of tracking the assignment of requirements to design components is also included. Increases in the reusability, portability, and maintainability of the resulting products are also discussed. This method was built on a combination of existing technology, teaching experience, consulting experience, and feedback from design method users. The discussed concepts are applicable to hierarchical OOD processes in general. Emphasis is placed on improving the design process by documenting the details of the procedures involved and incorporating improvements into those procedures as they are developed.
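
    A minimal illustration of the kind of hierarchical decomposition and requirement tracking described above (not the paper's methodology or notation): parent objects are decomposed into child objects, requirements are assigned to components, and coverage is rolled up the hierarchy.

```python
# Illustrative sketch (not the paper's method): a parent object is decomposed
# into child objects, requirements are assigned to design components, and
# requirement coverage can be checked by rolling assignments up the hierarchy.
class DesignObject:
    def __init__(self, name):
        self.name = name
        self.children = []
        self.requirements = set()

    def decompose(self, *child_names):
        self.children = [DesignObject(n) for n in child_names]
        return self.children

    def assign(self, *req_ids):
        self.requirements.update(req_ids)

    def covered_requirements(self):
        covered = set(self.requirements)
        for child in self.children:
            covered |= child.covered_requirements()
        return covered

# Hypothetical example: a flight-software parent object and two child objects.
fsw = DesignObject("FlightSoftware")
guidance, telemetry = fsw.decompose("Guidance", "Telemetry")
guidance.assign("REQ-001", "REQ-002")
telemetry.assign("REQ-003")

all_reqs = {"REQ-001", "REQ-002", "REQ-003", "REQ-004"}
print("unassigned:", sorted(all_reqs - fsw.covered_requirements()))  # ['REQ-004']
```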

  12. Review of primary spaceflight-induced and secondary reloading-induced changes in slow antigravity muscles of rats

    NASA Astrophysics Data System (ADS)

    Riley, D. A.

    We have examined the light and electron microscopic properties of hindlimb muscles of rats flown in space for 1-2 weeks on Cosmos biosatellite flights 1887 and 2044 and Space Shuttle missions Spacelab-3, Spacelab Life Sciences-1 and Spacelab Life Sciences-2. Tissues were obtained both inflight and postflight permitting definition of primary microgravity-induced changes and secondary reentry and gravity reloading-induced alterations. Spaceflight causes atrophy and expression of fast fiber characteristics in slow antigravity muscles. The stresses of reentry and reloading reveal that atrophic muscles show increased susceptibility to interstitial edema and ischemic-anoxic necrosis as well as muscle fiber tearing with disruption of contractile proteins. These results demonstrate that the effects of spaceflight on skeletal muscle are multifaceted, and major changes occur both inflight and following return to Earth's gravity.

  13. Theory and Practice Meets in Industrial Process Design -Educational Perspective-

    NASA Astrophysics Data System (ADS)

    Aramo-Immonen, Heli; Toikka, Tarja

    A software engineer should see himself as a business process designer in an enterprise resource planning system (ERP) re-engineering project, and software engineers and managers should engage in design dialogue. The objective of this paper is to discuss the motives for studying design research in connection with management education, in order to envision and understand the soft human issues in the management context. A second goal is to develop means of practicing social skills between designers and managers. The article explores the affective components of design thinking in the industrial management domain. The conceptual part of the paper discusses the concepts of network and project economy, creativity, communication, the use of metaphors, and design thinking. Finally, the empirical research plan and the first empirical results from design-method experiments among multi-disciplinary groups of master's-level students of industrial engineering and management and software engineering are introduced.

  14. Application of hazard assessment techniques in the CISF design process

    SciTech Connect

    Thornton, J.R.; Henry, T.

    1997-10-29

    The Department of Energy has submitted to the NRC staff for review a topical safety analysis report (TSAR) for a Centralized Interim Storage Facility (CISF). The TSAR will be used in licensing the CISF when and if a site is designated. CISF design events are identified based on a thorough review of design basis events (DBEs) previously identified by dry storage system suppliers and licensees and through the application of hazard assessment techniques. A Preliminary Hazards Assessment (PHA) is performed to identify design events applicable to a Phase 1, non-site-specific CISF. A PHA is deemed necessary since the Phase 1 CISF is distinguishable from previous dry storage applications in several significant operational scope and design basis aspects. In addition to assuring that all design events applicable to the Phase 1 CISF are identified, the PHA served as an integral part of the CISF design process by identifying potential important-to-safety and defense-in-depth facility design and administrative control features. This paper describes the Phase 1 CISF design event identification process and summarizes significant PHA contributions to the CISF design.

  15. COMPUTER ASSISTED PRELIMINARY DESIGN FOR DRINKING WATER TREATMENT PROCESS SYSTEMS

    EPA Science Inventory

    The purpose of the study was to develop an interactive computer program to aid the design engineer in evaluating the performance and cost for any proposed drinking water treatment system consisting of individual unit processes. The 25 unit process models currently in the program ...

  16. DESIGNING CHEMICAL PROCESSES WITH OPEN AND FUGITIVE EMISSIONS

    EPA Science Inventory

    Designing a chemical process normally includes aspects of economic and environmental disciplines. In this work we describe methods to quickly and easily evaluate the economics and potential environmental impacts of a process, with the hydrodealkylation of toluene as an example. Th...

  17. Design requirements for operational earth resources ground data processing

    NASA Technical Reports Server (NTRS)

    Baldwin, C. J.; Bradford, L. H.; Burnett, E. S.; Hutson, D. E.; Kinsler, B. A.; Kugle, D. R.; Webber, D. S.

    1972-01-01

    Realistic tradeoff data and evaluation techniques were studied that permit conceptual design of operational earth resources ground processing systems. Methodology for determining user requirements that utilize the limited information available from users is presented along with definitions of sensor capabilities projected into the shuttle/station era. A tentative method is presented for synthesizing candidate ground processing concepts.

  18. A computer-assisted process for supersonic aircraft conceptual design

    NASA Technical Reports Server (NTRS)

    Johnson, V. S.

    1985-01-01

    Design methodology was developed and existing major computer codes were selected to carry out the conceptual design of supersonic aircraft. A computer-assisted design process resulted from linking the codes together in a logical manner to implement the design methodology. The process does not perform the conceptual design of a supersonic aircraft but it does provide the designer with increased flexibility, especially in geometry generation and manipulation. Use of the computer-assisted process for the conceptual design of an advanced technology Mach 3.5 interceptor showed the principal benefit of the process to be the ability to use a computerized geometry generator and then directly convert the geometry between formats used in the geometry code and the aerodynamics codes. Results from the interceptor study showed that a Mach 3.5 standoff interceptor with a 1000 nautical-mile mission radius and a payload of eight Phoenix missiles appears to be feasible with the advanced technologies considered. A sensitivity study showed that technologies affecting the empty weight and propulsion system would be critical in the final configuration characteristics with aerodynamics having a lesser effect for small perturbations around the baseline.

  19. Incorporating manufacturability constraints into the design process of heterogeneous objects

    NASA Astrophysics Data System (ADS)

    Hu, Yuna; Blouin, Vincent Y.; Fadel, Georges M.

    2004-11-01

    Rapid prototyping (RP) technology, such as Laser Engineering Net Shaping (LENS(TM)), can be used to fabricate heterogeneous objects with gradient variations in material composition. These objects are generally characterized by enhanced functional performance. Past research on the design of such objects has focused on representation, modeling, and functional performance. However, the inherent constraints in RP processes, such as system capability and processing time, lead to heterogeneous objects that may not meet the designer's original intent. To overcome this situation, the research presented in this paper focuses on the identification and implementation of manufacturing constraints into the design process. A node-based finite element modeling technique is used for the representation and analysis, and the multicriteria design problem corresponds to finding the nodal material compositions that minimize structural weight and maximize thermal performance. The optimizer used in this research is a real-valued Evolution Strategy (ES), which is well suited for this type of multi-modal problem. Two limitations of the LENS manufacturing process, which have an impact on the design process, are identified and implemented. One of them is related to the manufacturing time, which is considered as an additional criterion to be minimized in the design problem for a preselected tool path. A brake disc rotor made of two materials, aluminum for light weight and steel for superior thermal characteristics, is used to illustrate the tradeoff between manufacturability and functionality.
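
    The optimization step can be sketched with a generic real-valued (mu+lambda) evolution strategy over nodal material fractions; the weight, thermal, and build-time terms below are crude stand-ins for the paper's finite-element-based objectives and LENS time model.

```python
# Minimal real-valued (mu+lambda) evolution strategy over nodal material
# fractions x in [0, 1] (0 = aluminum, 1 = steel at that node). The three
# objective terms below are crude stand-ins, not the paper's FE-based models.
import numpy as np

rng = np.random.default_rng(0)
n_nodes, mu, lam, sigma, generations = 20, 10, 40, 0.1, 200

def cost(x):
    weight    = np.mean(x)                    # more steel -> heavier
    thermal   = np.mean((1.0 - x) ** 2)       # more aluminum -> worse thermal term
    buildtime = np.mean(np.abs(np.diff(x)))   # composition changes slow the tool path
    return 1.0 * weight + 1.0 * thermal + 0.5 * buildtime

parents = rng.random((mu, n_nodes))
for _ in range(generations):
    idx = rng.integers(0, mu, lam)
    offspring = np.clip(parents[idx] + sigma * rng.standard_normal((lam, n_nodes)), 0.0, 1.0)
    pool = np.vstack([parents, offspring])
    parents = pool[np.argsort([cost(x) for x in pool])[:mu]]   # (mu+lambda) selection

best = parents[0]
print("best cost:", round(cost(best), 4))
```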

  20. Evaluation of the FCHART/SLR solar design process

    SciTech Connect

    Fanning, M.W.

    1982-01-01

    The actual heating requirements of 137 passive solar houses were compared to the requirements predicted by a typical design process that used the FCHART/SLR design tool. The calculation of the actual space-heating auxiliary energy needed by the houses during the 1979-1980 heating season was based on fuel bills and appliance use information. The prediction for each residence relied on site-specific weather data for that period, on owner-estimated occupancy patterns, and on measured building characteristics. FCHART/SLR was used with this information to predict the space-heating auxiliary. A statistical comparison of the actual and predicted auxiliaries showed that the design process overpredicted the auxiliary requirement by 60% with 112% standard deviation. A simple heat-loss calculation that ignored the solar contribution proved as accurate a predictor of the heating requirement as the solar design process in some cases.

  1. Natural gas operations: considerations on process transients, design, and control.

    PubMed

    Manenti, Flavio

    2012-03-01

    This manuscript highlights tangible benefits deriving from the dynamic simulation and control of operational transients of natural gas processing plants. Relevant improvements in safety, controllability, operability, and flexibility are obtained not only in the traditional applications, i.e. plant start-up and shutdown, but also in certain apparently time-independent areas such as feasibility studies of gas processing plant layout and process design. Specifically, the paper contrasts the myopic steady-state approach and its main shortcomings with more detailed studies that take non-steady-state behavior into consideration. A portion of a gas processing facility is considered as a case study. Process transient, design, and control solutions that appear more appealing from a steady-state viewpoint are compared to the corresponding dynamic simulation solutions. PMID:22056010

  2. The shielding design process--new plants to decommissioning.

    PubMed

    Jeffries, Graham; Cooper, Andrew; Hobson, John

    2005-01-01

    BNFL have over 25 years experience of designing nuclear plant for the whole-fuel cycle. In the UK, a Nuclear Decommissioning Authority (NDA) is to be set up to ensure that Britain's nuclear legacy is cleaned up safely, securely and cost effectively. The resulting challenges and opportunities for shielding design will be substantial as the shielding design process was originally devised for the design of new plants. Although its underlying principles are equally applicable to decommissioning and remediation of old plants, there are many aspects of detailed application that need to adapt to this radically different operating environment. The paper describes both the common issues and the different challenges of shielding design at different operational phases. Sample applications will be presented of both new plant and decommissioning projects that illustrate not only the robust nature of the processes being used, but also how they lead to cost-effective solutions making a substantive and appropriate contribution to radiological protection goals. PMID:16604700

  3. Computer-aided design tools for economical MEMS fabrication processes

    NASA Astrophysics Data System (ADS)

    Schneider, Christian; Priebe, Andreas; Brueck, Rainer; Hahn, Kai

    1999-03-01

    Since the early 1970s, when microsystem technology was first introduced, an enormous market for MST products has developed: airbag sensors, micro pumps, ink-jet nozzles, etc., and the market is still just starting up. Establishing these products at a reasonable price requires mass production. Meanwhile, computer-based design tools have also been developed in order to reduce the expense of MST design. In contrast to other physical design processes, e.g. in microelectronics, MEMS physical design is characterized by the fact that each product requires a tailored sequence of fabrication steps, usually selected from a variety of processing alternatives. The selection among these alternatives is based on economic constraints. Therefore, the design has a strong influence on the money and time spent to take an MST product to market.

  4. Information Flow in the Launch Vehicle Design/Analysis Process

    NASA Technical Reports Server (NTRS)

    Humphries, W. R., Sr.; Holland, W.; Bishop, R.

    1999-01-01

    This paper describes the results of a team effort aimed at defining the information flow between disciplines at the Marshall Space Flight Center (MSFC) engaged in the design of space launch vehicles. The information flow is modeled at a first level and is described using three types of templates: an N x N diagram, discipline flow diagrams, and discipline task descriptions. It is intended to provide engineers with an understanding of the connections between what they do and where it fits in the overall design process of the project. It is also intended to provide design managers with a better understanding of information flow in the launch vehicle design cycle.

  5. A new design concept for an automated peanut processing facility

    SciTech Connect

    Ertas, A.; Tanju, B.T.; Fair, W.T.; Butts, C.

    1996-12-31

    Peanut quality is a major concern in all phases of the peanut industry from production to manufacturing. Postharvest processing of peanuts can have profound effects on the quality and safety of peanut food products. Curing is a key step in postharvest processing. Curing peanuts improperly can significantly reduce quality and result in significant losses to both farmers and processors. The conventional drying system designed in the 1960s is still being used in the processing of peanuts today. The objective of this paper is to design and develop a new automated peanut drying system for dry climates capable of handling approximately 20 million lbm of peanuts per harvest season.

  6. Concurrent materials and process selection in conceptual design

    SciTech Connect

    Kleban, Stephen D.; Knorovsky, Gerald A.

    2000-08-16

    A method for concurrent selection of materials and a joining process based on product requirements using a knowledge-based, constraint satisfaction approach facilitates the product design and manufacturing process. Using a Windows-based computer video display and a data base of materials and their properties, the designer can ascertain the preferred composition of two parts based on various operating/environmental constraints such as load, temperature, lifetime, etc. Optimum joinder of the two parts may simultaneously be determined using a joining process data base based upon the selected composition of the components as well as the operating/environmental constraints.
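
    A toy constraint-satisfaction sketch in the spirit of the approach (the material names, property values, and joining processes are illustrative, not the system's data base): candidate materials and joining processes are filtered against operating constraints, and only mutually compatible material/process pairs are kept.

```python
# Toy constraint-satisfaction sketch: filter a (hypothetical) materials table
# and a joining-process table against operating constraints, then keep only
# material/process pairs that are mutually compatible. Property values are
# illustrative, not engineering data.
MATERIALS = {
    "Al 6061":   {"max_temp_C": 170, "yield_MPa": 275},
    "Ti-6Al-4V": {"max_temp_C": 400, "yield_MPa": 880},
    "304 SS":    {"max_temp_C": 500, "yield_MPa": 215},
}
JOINING = {
    "GTAW weld":  {"compatible": {"Ti-6Al-4V", "304 SS", "Al 6061"}, "max_temp_C": 500},
    "Epoxy bond": {"compatible": {"Al 6061", "304 SS"},              "max_temp_C": 120},
    "Laser weld": {"compatible": {"Ti-6Al-4V", "304 SS"},            "max_temp_C": 450},
}

def select(required_yield_MPa, service_temp_C):
    ok_materials = {m for m, p in MATERIALS.items()
                    if p["yield_MPa"] >= required_yield_MPa and p["max_temp_C"] >= service_temp_C}
    return [(m, j) for j, p in JOINING.items() if p["max_temp_C"] >= service_temp_C
            for m in ok_materials & p["compatible"]]

print(select(required_yield_MPa=250, service_temp_C=200))
# e.g. [('Ti-6Al-4V', 'GTAW weld'), ('Ti-6Al-4V', 'Laser weld')]
```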

  7. POLLUTION PREVENTION IN THE DESIGN OF CHEMICAL PROCESSES USING HIERARCHICAL DESIGN AND SIMULATION

    EPA Science Inventory

    The design of chemical processes is normally an interactive process of synthesis and analysis. When one also desires or needs to limit the amount of pollution generated by the process the difficulty of the task can increase substantially. In this work, we show how combining hier...

  8. Design considerations for fume hoods for process plants.

    PubMed

    Goodfellow, H D; Bender, M

    1980-07-01

    Proper design of fume hoods is a necessary requisite for a clean working environment for many industrial processes. Until recently, the design of these hoods has been rather a trial-and-error approach and not based on sound engineering design principles. Hatch Associates have developed and applied new techniques to establish hood parameters for different industrial processes. The paper reviews the developed techniques and illustrates their practical application to the solving of difficult and complex fume hood design and operating performance problems. The scope of the paper covers the following subject areas: definitions and general considerations; evaluation of volume and heat flow rates for emission sources; local capture of process emissions; remote capture of process emissions; and case studies of fume hood applications. The purpose of the paper is to detail a coherent approach to the analysis of emission problems which will result in the development of an efficient design of a fume capture hood. An efficient fume hood can provide a safe working place as well as a clean external environment. Although the techniques can be applied to smaller sources, the case studies examined are for fume hoods in the flow design range of 50,000 CFM to over 1,000,000 CFM. PMID:7415967

  9. Parametric design methodology for chemical processes using a simulator

    SciTech Connect

    Diwekar, U.M.; Rubin, E.S. )

    1994-02-01

    Parameter design is a method popularized by the Japanese quality expert G. Taguchi, for designing products and manufacturing processes that are robust in the face of uncontrollable variations. At the design stage, the goal of parameter design is to identify design settings that make the product performance less sensitive to the effects of manufacturing and environmental variations and deterioration. Because parameter design reduces performance variation by reducing the influence of the sources of variation rather than by controlling them, it is a cost-effective technique for improving quality. A recent study on the application of parameter design methodology for chemical processes reported that the use of Taguchi's method was not justified and a method based on Monte Carlo simulation combined with optimization was shown to be more effective. However, this method is computationally intensive as a large number of samples are necessary to achieve the given accuracy. Additionally, determination of the number of sample runs required is based on experimentation due to a lack of systematic sampling methods. In an attempt to overcome these problems, the use of a stochastic modeling capability combined with an optimizer is presented in this paper. The objective is that of providing an effective means for application of parameter design methodologies to chemical processes using the ASPEN simulator. This implementation not only presents a generalized tool for use by chemical engineers at large but also provides systematic estimates of the number of sample runs required to attain the specified accuracy. The stochastic model employs the technique of Latin hypercube sampling instead of the traditional Monte Carlo technique and hence has a great potential to reduce the required number of samples. The methodology is illustrated via an example problem of designing a chemical process.
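
    A minimal Latin hypercube sampler, shown next to plain Monte Carlo sampling for a toy "process model"; the model and variable ranges are stand-ins, not an ASPEN flowsheet or the paper's implementation.

```python
# Minimal Latin hypercube sampling (LHS) versus plain Monte Carlo for
# propagating uncertainty through a toy "process model". The model is a
# stand-in, not an ASPEN flowsheet.
import numpy as np

def latin_hypercube(n_samples, n_vars, rng):
    # one stratified sample per row, independently permuted per variable
    u = (rng.random((n_samples, n_vars)) + np.arange(n_samples)[:, None]) / n_samples
    for j in range(n_vars):
        u[:, j] = rng.permutation(u[:, j])
    return u  # uniform on [0, 1); transform to the desired distributions as needed

def toy_process_cost(u):
    feed_rate  = 80 + 40 * u[:, 0]    # hypothetical uncertain inputs
    efficiency = 0.7 + 0.2 * u[:, 1]
    return feed_rate / efficiency

rng = np.random.default_rng(1)
n = 50
lhs_cost = toy_process_cost(latin_hypercube(n, 2, rng))
mc_cost  = toy_process_cost(rng.random((n, 2)))
print(f"LHS mean {lhs_cost.mean():.2f}, MC mean {mc_cost.mean():.2f}")
```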

  10. Design of a Pu-238 Waste Incineration Process

    SciTech Connect

    Charlesworth, D.L.

    2001-05-29

    Combustible Pu-238 waste is generated as a result of normal operation and decommissioning activity at the Savannah River Plant and is being retrievably stored there. As part of the long-term plan to process the stored waste and current waste in preparation for future disposition, a Pu-238 incineration process is being cold-tested at Savannah River Laboratory (SRL). The incineration process consists of a continuous-feed preparation system, a two-stage, electrically fired incinerator, and a filtration off-gas system. Process equipment has been designed, fabricated, and installed for nonradioactive testing and cold run-in. Design features to maximize the ability to remotely maintain the equipment were incorporated into the process. Interlock, alarm, and control functions are provided by a programmable controller. Cold testing is scheduled to be completed in 1986.

  11. Glucose uptake in rat soleus - Effect of acute unloading and subsequent reloading

    NASA Technical Reports Server (NTRS)

    Henriksen, Eric J.; Tischler, Marc E.

    1988-01-01

    The effect of acutely reduced weight bearing (unloading) on the in vitro uptake of 2-[1,2-3H]deoxy-D-glucose was studied in the soleus muscle by tail casting and suspending rats. After just 4 h, the uptake of 2-deoxy-D-glucose fell (-19 percent) and declined further after an additional 20 h of unloading. This diminution at 24 h was associated with slower oxidation of C-14-glucose and incorporation of C-14-glucose into glycogen. At 3 days of unloading, basal uptake of 2-deoxy-D-glucose did not differ from control. Reloading of the soleus after 1 or 3 days of unloading increased uptake of 2-deoxy-D-glucose above control and returned it to normal within 6 h and 4 days, respectively. These effects of unloading and recovery were caused by local changes in the soleus, because the extensor digitorum longus from the same hindlimbs did not display any alterations in uptake of 2-deoxy-D-glucose or metabolism of glucose.

  12. Expertise in professional software design: a process study.

    PubMed

    Sonnentag, S

    1998-10-01

    Forty professional software designers participated in a study in which they worked on a software design task and reported strategies for accomplishing that task. High performers were identified by a peer-nomination method and performance on a design. Verbal protocol analysis based on a comparison of 12 high and 12 moderate performers indicated that high performers structured their design process by local planning and showed more feedback processing, whereas moderate performers were more engaged in analyzing requirements and verbalizing task-irrelevant cognitions. High performers more often described problem comprehension and cooperation with colleagues as useful strategies. High and moderate performers did not differ with respect to length of experience. None of the differences between the two performance groups could be explained by length of experience. PMID:9806013

  13. Design of launch systems using continuous improvement process

    NASA Technical Reports Server (NTRS)

    Brown, Richard W.

    1995-01-01

    The purpose of this paper is to identify a systematic process for improving ground operations for future launch systems. This approach is based on the Total Quality Management (TQM) continuous improvement process. While the continuous improvement process is normally identified with making incremental changes to an existing system, it can be used on new systems if they use past experience as a knowledge base. In the case of the Reusable Launch Vehicle (RLV), the Space Shuttle operations provide many lessons. The TQM methodology used for this paper will be borrowed from the United States Air Force 'Quality Air Force' Program. There is a general overview of the continuous improvement process, with concentration on the formulation phase. During this phase critical analyses are conducted to determine the strategy and goals for the remaining development process. These analyses include analyzing the mission from the customer's point of view, developing an operations concept for the future, assessing current capabilities, and determining the gap to be closed between current capabilities and future needs and requirements. A brief analysis of the RLV, relative to the Space Shuttle, will be used to illustrate the concept. Using the continuous improvement design concept has many advantages. These include a customer-oriented process which will develop a more marketable product and a better integration of operations and systems during the design phase. But the use of TQM techniques will require changes, including more discipline in the design process and more emphasis on data gathering for operational systems. The benefits will far outweigh the additional effort.

  14. Product and Process Improvement Using Mixture-Process Variable Designs and Robust Optimization Techniques

    SciTech Connect

    Sahni, Narinder S.; Piepel, Gregory F.; Naes, Tormod

    2009-04-01

    The quality of an industrial product depends on the raw material proportions and the process variable levels, both of which need to be taken into account in designing a product. This article presents a case study from the food industry in which both kinds of variables were studied by combining a constrained mixture experiment design and a central composite process variable design. Based on the natural structure of the situation, a split-plot experiment was designed and models involving the raw material proportions and process variable levels (separately and combined) were fitted. Combined models were used to study: (i) the robustness of the process to variations in raw material proportions, and (ii) the robustness of the raw material recipes with respect to fluctuations in the process variable levels. Further, the expected variability in the robust settings was studied using the bootstrap.
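
    The combined design can be sketched by crossing a small simplex-lattice mixture design (three ingredient proportions summing to one) with a two-level factorial in two process variables; the specific points are illustrative, not the constrained split-plot design used in the case study.

```python
# Sketch of crossing a small mixture design (three ingredient proportions that
# sum to 1) with a two-level factorial in two process variables. The specific
# points are illustrative, not the constrained design used in the case study.
from itertools import product

# {3,2} simplex-lattice mixture points: proportions in steps of 1/2
mixture_points = [
    (1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0),
    (0.5, 0.5, 0.0), (0.5, 0.0, 0.5), (0.0, 0.5, 0.5),
]
process_levels = list(product([-1, +1], repeat=2))  # coded low/high for two process variables

design = [(mix, proc) for mix in mixture_points for proc in process_levels]
print(len(design), "runs")   # 6 mixture blends x 4 process settings = 24 runs
print(design[0])             # ((1.0, 0.0, 0.0), (-1, -1))
```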

  15. Bates solar industrial process-steam application: preliminary design review

    SciTech Connect

    Not Available

    1980-01-07

    The design is analyzed for a parabolic trough solar process heat system for a cardboard corrugation fabrication facility in Texas. The program is briefly reviewed, including an analysis of the plant and process. The performance modeling for the system is discussed, and the solar system structural design, collector subsystem, heat transport and distribution subsystem are analyzed. The selection of the heat transfer fluid, and ullage and fluid maintenance are discussed, and the master control system and data acquisition system are described. Testing of environmental degradation of materials is briefly discussed. A brief preliminary cost analysis is included. (LEW)

  16. Waste receiving and processing facility module 1, detailed design report

    SciTech Connect

    Not Available

    1993-10-01

    WRAP 1 baseline documents which guided the technical development of the Title design included: (a) A/E Statement of Work (SOW) Revision 4C: This DOE-RL contractual document specified the workscope, deliverables, schedule, method of performance, and reference criteria for the Title design preparation. (b) Functional Design Criteria (FDC) Revision 1: This DOE-RL technical criteria document specified the overall operational criteria for the facility. The document was at Revision 0 at the beginning of the design and advanced to Revision 1 during the course of the Title design. (c) Supplemental Design Requirements Document (SDRD) Revision 3: This baseline criteria document prepared by WHC for DOE-RL augments the FDC by providing further definition of the process, operational safety, and facility requirements to the A/E for guidance in preparing the design. The document was at a very preliminary stage at the onset of Title design and was revised in concert with the results of the engineering studies that were performed to resolve the numerous technical issues that the project faced when Title I was initiated, as well as by requirements established during the course of the Title II design.

  17. Process Improvement Through Tool Integration in Aero-Mechanical Design

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2010-01-01

    Emerging capabilities in commercial design tools promise to significantly improve the multi-disciplinary and inter-disciplinary design and analysis coverage for aerospace mechanical engineers. This paper explores the analysis process for two example problems of a wing and flap mechanical drive system and an aircraft landing gear door panel. The examples begin with the design solid models and include various analysis disciplines such as structural stress and aerodynamic loads. Analytical methods include CFD, multi-body dynamics with flexible bodies and structural analysis. Elements of analysis data management, data visualization and collaboration are also included.

  18. Advanced computational research in materials processing for design and manufacturing

    SciTech Connect

    Zacharia, T.

    1994-12-31

    The computational requirements for design and manufacture of automotive components have seen dramatic increases for producing automobiles with three times the mileage. Automotive component design systems are becoming increasingly reliant on structural analysis requiring both overall larger analysis and more complex analyses, more three-dimensional analyses, larger model sizes, and routine consideration of transient and non-linear effects. Such analyses must be performed rapidly to minimize delays in the design and development process, which drives the need for parallel computing. This paper briefly describes advanced computational research in superplastic forming and automotive crash worthiness.

  19. Rethinking ASIC design with next generation lithography and process integration

    NASA Astrophysics Data System (ADS)

    Vaidyanathan, Kaushik; Liu, Renzhi; Liebmann, Lars; Lai, Kafai; Strojwas, Andrzej; Pileggi, Larry

    2013-03-01

    Given the deployment delays for EUV, several next generation lithography (NGL) options are being actively researched. Several cost-effective NGL solutions, such as self-aligned double patterning through sidewall image transfer (SIT) and directed self-assembly (DSA), in conjunction with process integration challenges, mandate grating-like pattern design. As part of the GRATEdd project, we have evaluated the design cost of grating-based design for ASICs (application specific ICs). Based on our observations we have engineered fundamental changes to the primary ASIC design components to make scaling affordable and useful in deeply scaled sub-20 nm technologies: unidirectional-M1 based standard cells, application-specific smart SRAM synthesis, and statistical and self-healing analog design.

  20. The engineering design process as a model for STEM curriculum design

    NASA Astrophysics Data System (ADS)

    Corbett, Krystal Sno

    Engaging pedagogics have been proven to be effective in the promotion of deep learning for science, technology, engineering, and mathematics (STEM) students. In many cases, academic institutions have shown a desire to improve education by implementing more engaging techniques in the classroom. The research framework established in this dissertation has been governed by the axiom that students should obtain a deep understanding of fundamental topics while being motivated to learn through engaging techniques. This research lays a foundation for future analysis and modeling of the curriculum design process where specific educational research questions can be considered using standard techniques. Further, a clear curriculum design process is a key step towards establishing an axiomatic approach for engineering education. A danger is that poor implementation of engaging techniques will counteract the intended effects. Poor implementation might provide students with a "fun" project, but not the desired deep understanding of the fundamental STEM content. Knowing that proper implementation is essential, this dissertation establishes a model for STEM curriculum design, based on the well-established engineering design process. Using this process as a perspective to model curriculum design allows for a structured approach. Thus, the framework for STEM curriculum design, established here, provides a guided approach for seamless integration of fundamental topics and engaging pedagogics. The main steps, or phases, in engineering design are: Problem Formulation, Solution Generation, Solution Analysis, and Solution Implementation. Layering engineering design with education curriculum theory, this dissertation establishes a clear framework for curriculum design. Through ethnographic engagement by this researcher, several overarching themes are revealed through the creation of curricula using the design process. The application of the framework to specific curricula was part of this

  1. Rethinking behavioral health processes by using design for six sigma.

    PubMed

    Lucas, Anthony G; Primus, Kelly; Kovach, Jamison V; Fredendall, Lawrence D

    2015-02-01

    Clinical evidence-based practices are strongly encouraged and commonly utilized in the behavioral health community. However, evidence-based practices that are related to quality improvement processes, such as Design for Six Sigma, are often not used in behavioral health care. This column describes the unique partnership formed between a behavioral health care provider in the greater Pittsburgh area, a nonprofit oversight and monitoring agency for behavioral health services, and academic researchers. The authors detail how the partnership used the multistep process outlined in Design for Six Sigma to completely redesign the provider's intake process. Implementation of the redesigned process increased access to care, decreased bad debt and uncollected funds, and improved cash flow--while consumer satisfaction remained high. PMID:25642607

  2. Informed Design: A Contemporary Approach to Design Pedagogy as the Core Process in Technology

    ERIC Educational Resources Information Center

    Burghardt, M. David; Hacker, Michael

    2004-01-01

    In classroom settings, most problems are usually well defined, so students have little experience with open-ended problems. Technological design problems, however, are seldom well defined. The design process begins with broad ideas and concepts and continues in the direction of ever-increasing detail, resulting in an acceptable solution. So using…

  3. Integrated Design System (IDS) Tools for the Spacecraft Aeroassist/Entry Vehicle Design Process

    NASA Technical Reports Server (NTRS)

    Olynick, David; Braun, Robert; Langhoff, Steven R. (Technical Monitor)

    1997-01-01

    The definition of the Integrated Design System technology focus area as presented in the NASA Information Technology center of excellence strategic plan is described. The need for IDS tools in the aeroassist/entry vehicle design process is illustrated. Initial and future plans for spacecraft IDS tool development are discussed.

  4. Which Events Can Cause Iteration in Instructional Design? An Empirical Study of the Design Process

    ERIC Educational Resources Information Center

    Verstegen, D. M. L.; Barnard, Y. F.; Pilot, A.

    2006-01-01

    Instructional design is not a linear process: designers have to weigh the advantages and disadvantages of alternative solutions, taking into account different kinds of conflicting and changing constraints. To make sure that they eventually choose the most optimal one, they have to keep on collecting information, reconsidering continuously whether…

  5. Motivating the Notion of Generic Design within Information Processing Theory: The Design Problem Space.

    ERIC Educational Resources Information Center

    Goel, Vinod; Pirolli, Peter

    The notion of generic design, while it has been around for 25 years, is not often articulated, especially within Newell and Simon's (1972) Information Processing Theory framework. Design is merely lumped in with other forms of problem solving activity. Intuitively it is felt that there should be a level of description of the phenomenon which…

  6. A Digital Methodology for the Design Process of Aerospace Assemblies with Sustainable Composite Processes & Manufacture

    NASA Astrophysics Data System (ADS)

    McEwan, W.; Butterfield, J.

    2011-05-01

    The well established benefits of composite materials are driving a significant shift in design and manufacture strategies for original equipment manufacturers (OEMs). Thermoplastic composites have advantages over the traditional thermosetting materials with regards to sustainability and environmental impact, features which are becoming increasingly pertinent in the aerospace arena. However, when sustainability and environmental impact are considered as design drivers, integrated methods for part design and product development must be developed so that any benefits of sustainable composite material systems can be assessed during the design process. These methods must include mechanisms to account for process induced part variation and techniques related to re-forming, recycling and decommissioning, which are in their infancy. It is proposed in this paper that predictive techniques related to material specification, part processing and product cost of thermoplastic composite components, be integrated within a Through Life Management (TLM) product development methodology as part of a larger strategy of product system modeling to improve disciplinary concurrency, realistic part performance, and to place sustainability at the heart of the design process. This paper reports the enhancement of digital manufacturing tools as a means of drawing simulated part manufacturing scenarios, real time costing mechanisms, and broader lifecycle performance data capture into the design cycle. The work demonstrates predictive processes for sustainable composite product manufacture and how a Product-Process-Resource (PPR) structure can be customised and enhanced to include design intent driven by 'real' part geometry and consequent assembly.

  7. Inverse Analysis to Formability Design in a Deep Drawing Process

    NASA Astrophysics Data System (ADS)

    Buranathiti, Thaweepat; Cao, Jian

    Deep drawing process is an important process adding values to flat sheet metals in many industries. An important concern in the design of a deep drawing process generally is formability. This paper aims to present the connection between formability and inverse analysis (IA), which is a systematical means for determining an optimal blank configuration for a deep drawing process. In this paper, IA is presented and explored by using a commercial finite element software package. A number of numerical studies on the effect of blank configurations to the quality of a part produced by a deep drawing process were conducted and analyzed. The quality of the drawing processes is numerically analyzed by using an explicit incremental nonlinear finite element code. The minimum distance between elemental principal strains and the strain-based forming limit curve (FLC) is defined as tearing margin to be the key performance index (KPI) implying the quality of the part. The initial blank configuration has shown that it plays a highly important role in the quality of the product via the deep drawing process. In addition, it is observed that if a blank configuration is not greatly deviated from the one obtained from IA, the blank can still result a good product. The strain history around the bottom fillet of the part is also observed. The paper concludes that IA is an important part of the design methodology for deep drawing processes.
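
    The tearing-margin KPI described above can be computed as the minimum Euclidean distance from element principal-strain points to a piecewise-linear FLC; the strain states and FLC vertices in the sketch below are made up for illustration.

```python
# Minimal "tearing margin" sketch: the smallest distance from element
# principal-strain points (minor strain e2, major strain e1) to a
# piecewise-linear forming limit curve. Strain data and FLC points are made up.
import numpy as np

def point_to_segment(p, a, b):
    ab, ap = b - a, p - a
    t = np.clip(np.dot(ap, ab) / np.dot(ab, ab), 0.0, 1.0)
    return np.linalg.norm(p - (a + t * ab))

def tearing_margin(strains, flc):
    segments = list(zip(flc[:-1], flc[1:]))
    return min(point_to_segment(p, a, b) for p in strains for a, b in segments)

# FLC given as (minor strain, major strain) vertices
flc = np.array([(-0.2, 0.45), (0.0, 0.30), (0.2, 0.40), (0.4, 0.55)])
strains = np.array([(-0.05, 0.18), (0.02, 0.22), (0.10, 0.27)])  # element strain states

print(f"tearing margin = {tearing_margin(strains, flc):.3f}")
```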

  8. Noise control, sound, and the vehicle design process

    NASA Astrophysics Data System (ADS)

    Donavan, Paul

    2005-09-01

    For many products, noise and sound are viewed as necessary evils that need to be dealt with in order to bring the product successfully to market. They are generally not product "exciters", although some vehicle manufacturers do tune and advertise specific sounds to enhance the perception of their products. In this paper, influencing the design process for the "evils", such as wind noise and road noise, is considered in more detail. There are three ingredients to successfully dealing with the evils in the design process. The first of these is knowing how excess noise affects the end customer in a tangible manner and how that affects customer satisfaction and, ultimately, sales. The second is having and delivering the knowledge of what is required of the design to achieve a satisfactory or even better level of noise performance. The third ingredient is having the commitment of the designers to incorporate the knowledge into their part, subsystem or system. In this paper, the elements of each of these ingredients are discussed in some detail and the attributes of a successful design process are enumerated.

  9. A FRAMEWORK TO DESIGN AND OPTIMIZE CHEMICAL FLOODING PROCESSES

    SciTech Connect

    Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori

    2005-07-01

    The goal of this proposed research is to provide an efficient and user-friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user-friendly interface to identify the variables that have the most impact on oil recovery using the concept of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objective of Task 1 is to develop three primary modules representing reservoir, chemical, and well data; the modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables and to develop the response surface maps that identify the significant variables from each module. The objective of Task 3 is to develop the economic model designed specifically for the chemical processes targeted in this proposal and to interface the economic model with UTCHEM production output. Task 4 covers validation of the framework and performing simulations of oil reservoirs to screen, design, and optimize the chemical processes.

  10. A FRAMEWORK TO DESIGN AND OPTIMIZE CHEMICAL FLOODING PROCESSES

    SciTech Connect

    Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori

    2004-11-01

    The goal of this proposed research is to provide an efficient and user-friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user-friendly interface to identify the variables that have the most impact on oil recovery using the concept of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objective of Task 1 is to develop three primary modules representing reservoir, chemical, and well data; the modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables and to develop the response surface maps that identify the significant variables from each module. The objective of Task 3 is to develop the economic model designed specifically for the chemical processes targeted in this proposal and to interface the economic model with UTCHEM production output. Task 4 covers validation of the framework and performing simulations of oil reservoirs to screen, design, and optimize the chemical processes.

  11. A Framework to Design and Optimize Chemical Flooding Processes

    SciTech Connect

    Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori

    2006-08-31

    The goal of this proposed research is to provide an efficient and user-friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user-friendly interface to identify the variables that have the most impact on oil recovery using the concept of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objective of Task 1 is to develop three primary modules representing reservoir, chemical, and well data; the modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables and to develop the response surface maps that identify the significant variables from each module. The objective of Task 3 is to develop the economic model designed specifically for the chemical processes targeted in this proposal and to interface the economic model with UTCHEM production output. Task 4 covers validation of the framework and performing simulations of oil reservoirs to screen, design, and optimize the chemical processes.
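
    The screening idea behind the framework can be sketched as a two-level factorial over a few design variables, a response evaluated by a simulator, and a least-squares response surface whose coefficients rank variable importance; the "simulator" below is a toy function standing in for UTCHEM, and the variable names are hypothetical.

```python
# Sketch of the screening step: run a two-level factorial over a few design
# variables, fit a linear response surface, and rank the variables by effect
# size. The "simulator" is a toy function standing in for UTCHEM.
import numpy as np
from itertools import product

names = ["surfactant_slug", "polymer_conc", "salinity"]
X = np.array(list(product([-1.0, 1.0], repeat=3)))          # coded 2^3 factorial

def toy_recovery(x):
    s, p, b = x
    return 35 + 6.0 * s + 3.5 * p - 1.0 * b + 1.5 * s * p    # % OOIP, made up

y = np.array([toy_recovery(x) for x in X])

A = np.column_stack([np.ones(len(X)), X])                    # intercept + main effects
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
for name, c in sorted(zip(names, coef[1:]), key=lambda t: -abs(t[1])):
    print(f"{name:16s} effect per coded unit: {c:+.2f}")
```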

  12. The Role of Dialogic Processes in Designing Career Expectations

    ERIC Educational Resources Information Center

    Bangali, Marcelline; Guichard, Jean

    2012-01-01

    This article examines the role played by dialogic processes in the designing or redesigning of future expectations during a career guidance intervention. It discusses a specific method ("Giving instruction to a double") developed and used during career counseling sessions with two recent doctoral graduates. It intends both to help them outline or…

  13. GREENING OF OXIDATION CATALYSIS THROUGH IMPROVED CATALYST AND PROCESS DESIGN

    EPA Science Inventory


    Greening of Oxidation Catalysis Through Improved Catalysts and Process Design
    Michael A. Gonzalez*, Thomas Becker, and Raymond Smith

    United States Environmental Protection Agency, Office of Research and Development, National Risk Management Research Laboratory, 26 W...

  14. An Exploration of Design Students' Inspiration Process

    ERIC Educational Resources Information Center

    Dazkir, Sibel S.; Mower, Jennifer M.; Reddy-Best, Kelly L.; Pedersen, Elaine L.

    2013-01-01

    Our purpose was to explore how different sources of inspiration influenced two groups of students' inspiration process and their attitudes toward their design projects. Assigned sources of inspiration and instructor's assistance in the search for inspiration varied for two groups of students completing a small culture inspired product…

  15. Processing and circuit design enhance a data converter's radiation tolerance

    SciTech Connect

    Heuner, R.; Zazzu, V.; Pennisi, L.

    1988-12-01

    Rad-hard CMOS/SOS processing has been applied to a novel comparator-inverter circuit design to develop 6 and 8-bit parallel (flash) ADC (analog-to-digital converter) circuits featuring high-speed operation, low power consumption, and total-dose radiation tolerances up to 1 Mrad(Si).
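
    For context, an idealized behavioral model of a parallel (flash) converter: the input is compared against 2^N - 1 ladder references at once, and the thermometer code gives the output code. This is generic, not the rad-hard comparator-inverter circuit described in the record.

```python
# Idealized behavioral model of an N-bit parallel (flash) ADC: the input is
# compared against 2**N - 1 ladder references simultaneously and the number of
# tripped comparators is the output code. Generic illustration only.
def flash_adc(vin, vref=1.0, bits=6):
    levels = 2 ** bits
    thresholds = [vref * (k + 0.5) / levels for k in range(levels - 1)]
    return sum(vin > t for t in thresholds)   # count of comparators that trip

for v in (0.0, 0.26, 0.5, 0.99):
    print(f"vin={v:4.2f} V -> code {flash_adc(v, bits=6):2d} of 63")
```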

  16. A SYSTEMATIC PROCEDURE FOR DESIGNING PROCESSES WITH MULTIPLE ENVIRONMENTAL OBJECTIVES

    EPA Science Inventory

    Evaluation and analysis of multiple objectives are very important in designing environmentally benign processes. They require a systematic procedure for solving multi-objective decision-making problems due to the complex nature of the problems and the need for complex assessment....

  17. DESIGNING EFFICIENT, ECONOMIC AND ENVIRONMENTALLY FRIENDLY CHEMICAL PROCESSES

    EPA Science Inventory

    A catalytic reforming process has been studied using hierarchical design and simulation calculations. Approximations for the fugitive emissions indicate which streams allow the most value to be lost and which have the highest potential environmental impact. One can use this infor...

  19. USING GENETIC ALGORITHMS TO DESIGN ENVIRONMENTALLY FRIENDLY PROCESSES

    EPA Science Inventory

    Genetic algorithm calculations are applied to the design of chemical processes to achieve improvements in environmental and economic performance. By finding the set of Pareto (i.e., non-dominated) solutions one can see how different objectives, such as environmental and economic ...

  20. Ingenuity in Action: Connecting Tinkering to Engineering Design Processes

    ERIC Educational Resources Information Center

    Wang, Jennifer; Werner-Avidon, Maia; Newton, Lisa; Randol, Scott; Smith, Brooke; Walker, Gretchen

    2013-01-01

    The Lawrence Hall of Science, a science center, seeks to replicate real-world engineering at the "Ingenuity in Action" exhibit, which consists of three open-ended challenges. These problems encourage children to engage in engineering design processes and problem-solving techniques through tinkering. We observed and interviewed 112…

  1. Portfolio Assessment on Chemical Reactor Analysis and Process Design Courses

    ERIC Educational Resources Information Center

    Alha, Katariina

    2004-01-01

    Assessment determines what students regard as important: if a teacher wants to change students' learning, he/she should change the methods of assessment. This article describes the use of portfolio assessment on five courses dealing with chemical reactor and process design during the years 1999-2001. Although the use of portfolio was a new…

  2. Developing 21st Century Process Skills through Project Design

    ERIC Educational Resources Information Center

    Yoo, Jeong-Ju; MacDonald, Nora M.

    2014-01-01

    The goal of this paper is to illustrate how the promotion of 21st Century process skills can be used to enhance student learning and workplace skill development: thinking, problem solving, collaboration, communication, leadership, and management. As an illustrative case, fashion merchandising and design students conducted research for a…

  3. Quality Control through Design and Process: Gambrel Roof Truss Challenge

    ERIC Educational Resources Information Center

    Ward, Dell; Jones, James

    2011-01-01

    Customers determine whether a product fulfills their needs or satisfies them. "Quality control", then, is the process of finding out what the customer wants, along with designing, producing, delivering, and servicing the product--and ultimately satisfying the customer's expectations. For many years, people considered a product to be of good…

  4. PRELIMINARY DESIGN FOR DRINKING WATER TREATMENT PROCESS SYSTEMS

    EPA Science Inventory

    A computer model has been developed for use in estimating the performance and associated costs of proposed and existing water supply systems. Design procedures and cost-estimating relationships for 25 unit processes that can be used for drinking water treatment are contained with...

  5. INCORPORATING INDUSTRIAL ECOLOGY INTO HIERARCHICAL CHEMICAL PROCESS DESIGN

    EPA Science Inventory

    Incorporating Industrial Ecology into Hierarchical Chemical Process Design: Determining Targets for the Exchange of Waste

    The exchange of waste to be used as a recycled feed has long been encouraged by practitioners of industrial ecology. Industrial ecology is a field t...

  6. Experiential Learning: A Course Design Process for Critical Thinking

    ERIC Educational Resources Information Center

    Hamilton, Janet G.; Klebba, Joanne M.

    2011-01-01

    This article describes a course design process to improve the effectiveness of using experiential learning techniques to foster critical thinking skills. The authors examine prior research to identify essential dimensions of experiential learning in relation to higher order thinking. These dimensions provide key insights for the selection of…

  7. Design characteristics for facilities which process hazardous particulate

    SciTech Connect

    Abeln, S.P.; Creek, K.; Salisbury, S.

    1998-12-01

    Los Alamos National Laboratory is establishing a research and processing capability for beryllium. The unique properties of beryllium, including light weight, rigidity, thermal conductivity, heat capacity, and nuclear properties make it critical to a number of US defense and aerospace programs. Concomitant with the unique engineering properties are the health hazards associated with processing beryllium in a particulate form and the potential for worker inhalation of aerosolized beryllium. Beryllium has the lowest airborne standard for worker protection compared to all other nonradioactive metals by more than an order of magnitude. This paper describes the design characteristics of the new beryllium facility at Los Alamos as they relate to protection of the workforce. Design characteristics to be reviewed include: facility layout, support systems to minimize aerosol exposure and spread, and detailed review of the ventilation system design for general room air cleanliness and extraction of particulate at the source.

  8. Tunable photonic filters: a digital signal processing design approach.

    PubMed

    Binh, Le Nguyen

    2009-05-20

    Digital signal processing techniques are used for synthesizing tunable optical filters with variable bandwidth and centered reference frequency, including the tunability of low-pass, high-pass, bandpass, and bandstop optical filters. Potential applications of such filters are discussed, and the design techniques and properties of recursive digital filters are outlined. The basic filter structures, namely the first-order all-pole optical filter (FOAPOF) and the first-order all-zero optical filter (FOAZOF), are described, and finally the design process of tunable optical filters and the designs of the second-order Butterworth low-pass, high-pass, bandpass, and bandstop tunable optical filters are presented. Indeed, we identify that the all-zero and all-pole networks are equivalent to the well-known optical principles of interference and resonance, respectively. It is thus very straightforward to implement tunable optical filters, which is a unique feature. PMID:19458728
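
    The electronic-filter half of that design step can be reproduced with standard DSP tools, e.g. a second-order Butterworth bandpass synthesized with scipy; the sampling rate and band edges below are arbitrary assumptions, and mapping the resulting sections onto the optical FOAPOF/FOAZOF structures is the part the paper develops.

```python
# DSP-style prototype of the design step described above: a second-order
# Butterworth bandpass filter synthesized with scipy. Sampling rate and band
# edges are arbitrary assumptions; mapping the resulting sections onto optical
# all-pole / all-zero stages (FOAPOF / FOAZOF) is the step the paper addresses.
import numpy as np
from scipy.signal import butter, sosfreqz

fs = 100e9                         # assumed sampling rate analogue, Hz
low, high = 10e9, 20e9             # assumed passband edges, Hz
sos = butter(2, [low, high], btype="bandpass", fs=fs, output="sos")

w, h = sosfreqz(sos, worN=2048, fs=fs)
passband = (w > low) & (w < high)
print("second-order sections:\n", np.round(sos, 4))
print(f"min passband gain: {20 * np.log10(np.abs(h[passband]).min()):.2f} dB")
```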

  9. Architectural design of heterogeneous metallic nanocrystals--principles and processes.

    PubMed

    Yu, Yue; Zhang, Qingbo; Yao, Qiaofeng; Xie, Jianping; Lee, Jim Yang

    2014-12-16

    CONSPECTUS: Heterogeneous metal nanocrystals (HMNCs) are a natural extension of simple metal nanocrystals (NCs), but as a research topic, they have been much less explored until recently. HMNCs are formed by integrating metal NCs of different compositions into a common entity, similar to the way atoms are bonded to form molecules. HMNCs can be built to exhibit an unprecedented architectural diversity and complexity by programming the arrangement of the NC building blocks ("unit NCs"). The architectural engineering of HMNCs involves the design and fabrication of the architecture-determining elements (ADEs), i.e., unit NCs with precise control of shape and size, and their relative positions in the design. Similar to molecular engineering, where structural diversity is used to create more property variations for application explorations, the architectural engineering of HMNCs can similarly increase the utility of metal NCs by offering a suite of properties to support multifunctionality in applications. The architectural engineering of HMNCs calls for processes and operations that can execute the design. Some enabling technologies already exist in the form of classical micro- and macroscale fabrication techniques, such as masking and etching. These processes, when used singly or in combination, are fully capable of fabricating nanoscopic objects. What is needed is a detailed understanding of the engineering control of ADEs and the translation of these principles into actual processes. For simplicity of execution, these processes should be integrated into a common reaction system and yet retain independence of control. The key to architectural diversity is therefore the independent controllability of each ADE in the design blueprint. The right chemical tools must be applied under the right circumstances in order to achieve the desired outcome. In this Account, after a short illustration of the infinite possibility of combining different ADEs to create HMNC design

  10. The role of CFD in the design process

    NASA Astrophysics Data System (ADS)

    Jennions, Ian K.

    1994-05-01

    Over the last decade the role played by CFD codes in turbomachinery design has changed remarkably. While convergence/stability or even the existence of unique solutions was discussed fervently ten years ago, CFD codes now form a valuable part of an overall integrated design system and have caused us to re-think much of what we do. The geometric and physical complexities addressed have also evolved, as have the number of software houses competing with in-house developers to provide solutions to daily design problems. This paper reviews how GE Aircraft Engines (GEAE) uses CFD in the turbomachinery design process and examines many of the issues faced in successful code implementation.

  11. Computer aided microbial safety design of food processes.

    PubMed

    Schellekens, M; Martens, T; Roberts, T A; Mackey, B M; Nicolaï, B M; Van Impe, J F; De Baerdemaeker, J

    1994-12-01

    To reduce the time required for product development, to avoid expensive experimental tests, and to quantify safety risks for fresh products and the consequences of processing, there is a growing interest in computer-aided food process design. This paper discusses the application of hybrid object-oriented and rule-based expert system technology to represent the data and knowledge of microbiology experts and food engineers. Finite element models for heat transfer calculation routines, microbial growth and inactivation models, and texture kinetics are combined with food composition data, thermophysical properties, process steps, and expert knowledge on the type and quantity of microbial contamination. A prototype system has been developed to evaluate the effects of changes in food composition, process steps, and process parameters on the microbiological safety and textural quality of foods. PMID:7703003

  12. Information processing systems, reasoning modules, and reasoning system design methods

    DOEpatents

    Hohimer, Ryan E; Greitzer, Frank L; Hampton, Shawn D

    2014-03-04

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.
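
    The routing idea in this claim, different reasoning modules handling abstractions of different ontology classification types, can be pictured with the hypothetical sketch below; the class names, type labels, and payloads are invented for illustration and are not taken from the patent.

      # Illustrative sketch (Python, hypothetical names): a reasoning system that
      # dispatches abstractions from a semantic graph to reasoning modules keyed by
      # ontology classification type.
      from dataclasses import dataclass

      @dataclass
      class Abstraction:
          classification_type: str   # classification of the "individual" per the ontology
          individual: dict           # payload describing the individual

      class ReasoningModule:
          handles = None                     # classification type this module processes
          def process(self, abstraction):    # overridden by concrete modules
              raise NotImplementedError

      class PersonModule(ReasoningModule):
          handles = "Person"
          def process(self, abstraction):
              return f"reasoned about person {abstraction.individual['name']}"

      class EventModule(ReasoningModule):
          handles = "Event"
          def process(self, abstraction):
              return f"reasoned about event {abstraction.individual['label']}"

      class ReasoningSystem:
          def __init__(self, modules):
              self.by_type = {m.handles: m for m in modules}
          def run(self, semantic_graph):
              # Route each abstraction to the module registered for its classification type.
              return [self.by_type[a.classification_type].process(a)
                      for a in semantic_graph if a.classification_type in self.by_type]

      graph = [Abstraction("Person", {"name": "analyst-1"}),
               Abstraction("Event", {"label": "badge-access"})]
      print(ReasoningSystem([PersonModule(), EventModule()]).run(graph))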

  13. Information processing systems, reasoning modules, and reasoning system design methods

    DOEpatents

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2015-08-18

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  14. Information processing systems, reasoning modules, and reasoning system design methods

    DOEpatents

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2016-08-23

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  15. Penetrator reliability investigation and design exploration : from conventional design processes to innovative uncertainty-capturing algorithms.

    SciTech Connect

    Martinez-Canales, Monica L.; Heaphy, Robert; Gramacy, Robert B.; Taddy, Matt; Chiesa, Michael L.; Thomas, Stephen W.; Swiler, Laura Painton; Hough, Patricia Diane; Lee, Herbert K. H.; Trucano, Timothy Guy; Gray, Genetha Anne

    2006-11-01

    This project focused on research and algorithmic development in optimization under uncertainty (OUU) problems driven by earth penetrator (EP) designs. While taking into account uncertainty, we addressed three challenges in current simulation-based engineering design and analysis processes. The first challenge required leveraging small local samples, already constructed by optimization algorithms, to build effective surrogate models. We used Gaussian Process (GP) models to construct these surrogates. We developed two OUU algorithms using 'local' GPs (OUU-LGP) and one OUU algorithm using 'global' GPs (OUU-GGP) that appear competitive with or better than current methods. The second challenge was to develop a methodical design process based on multi-resolution, multi-fidelity models. We developed a Multi-Fidelity Bayesian Auto-regressive process (MF-BAP). The third challenge involved the development of tools that are computationally feasible and accessible. We created MATLAB® and initial DAKOTA implementations of our algorithms.
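
    A minimal sketch of the "surrogate from a small local sample" idea follows: a Gaussian Process regressor is fit to a handful of points around a current design and then queried for a prediction with uncertainty. The toy response function, sample size, and kernel settings are assumptions for illustration and do not reproduce the OUU-LGP or OUU-GGP algorithms themselves.

      # Sketch (Python): fit a Gaussian Process surrogate to a small local sample of an
      # expensive response and use its mean and standard deviation in place of the true
      # function. The response function and sample size are illustrative only.
      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, ConstantKernel

      rng = np.random.default_rng(0)

      def expensive_response(x):               # placeholder for the penetrator simulation
          return np.sin(3 * x) + 0.1 * x ** 2

      x_local = rng.uniform(-1.0, 1.0, size=(8, 1))      # small local sample
      y_local = expensive_response(x_local).ravel()

      gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(length_scale=0.5),
                                    normalize_y=True)
      gp.fit(x_local, y_local)

      x_query = np.linspace(-1.0, 1.0, 5).reshape(-1, 1)
      mean, std = gp.predict(x_query, return_std=True)   # surrogate prediction + uncertainty
      for xq, m, s in zip(x_query.ravel(), mean, std):
          print(f"x = {xq:+.2f}  surrogate = {m:+.3f} +/- {s:.3f}")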

  16. Designed CVD growth of graphene via process engineering.

    PubMed

    Yan, Kai; Fu, Lei; Peng, Hailin; Liu, Zhongfan

    2013-10-15

    Graphene, the atomically thin carbon film with honeycomb lattice, holds great promise in a wide range of applications, due to its unique band structure and excellent electronic, optical, mechanical, and thermal properties. Scientists are researching this star material because of the development of various emerging preparation techniques, among which chemical vapor deposition (CVD) has received the fastest advances in the past few years. For the CVD growth of graphene, the ultimate goal is to achieve the highest quality in the largest scale and lowest cost with a precise control of layer thickness, stacking order, and crystallinity. To meet this goal, researchers need a comprehensive understanding and effective control of the growth process, especially of its elementary steps. In this Account, we focus on our recent progress toward the controlled surface growth of graphene and its two-dimensional (2D) hybrids via rational designs of CVD elementary processes, namely, process engineering. A typical CVD process consists of four main elementary steps: (A) adsorption and catalytic decomposition of precursor gas, (B) diffusion and dissolution of decomposed carbon species into bulk metal, (C) segregation of dissolved carbon atoms onto the metal surface, and finally, (D) surface nucleation and growth of graphene. Absence or enhancement of each elementary step would lead to significant changes in the whole growth process. Metals with certain carbon solubility, such as nickel and cobalt, involve all four elementary steps in a typical CVD process, thus providing us with an ideal system for process engineering. The elementary segregation process can be completely blocked if molybdenum is introduced into the system as an alloy catalyst, yielding perfect monolayer graphene almost independent of growth parameters. On the other hand, the segregation-only process of predissolved solid carbons is also capable of high-quality graphene growth. By using a synergetic Cu-Ni alloy, we are

  17. Innovative soil treatment process design for removal of trivalent chromium

    SciTech Connect

    Stallings, J.H.; Durkin, M.E.

    1997-12-31

    A soil treatment process has been developed as part of a US Air Force environmental compliance project at Air Force Plant 44, Tucson, AZ, for treating soil contaminated with heavy metals including trivalent chromium, cadmium, copper, and nickel. The process was designed to treat a total of 133,000 tons of soil in a 400 ton per day facility. Features of the soil treatment process include physical treatment and separation, followed by chemical treatment of the remaining fines using a hypochlorite leach that allows chromium to be solubilized at high pH. After treatment, the fines are washed in three-stage countercurrent thickeners and chromium hydroxide cake is recovered as a final product from the leach solution. Treatability studies were conducted in the laboratory and a pilot plant was built. Process design criteria, the flow sheet, material balances, and preliminary equipment selection and sizing for the facility have been completed. The facility was designed to remove Cr at an average soil concentration of 1230 mg/kg and to meet a risk-based clean-closure limit of 400 mg/kg of Cr. Capital costs for the 400 tpd plant were estimated at $9.6 million, with an operating and maintenance cost of $54 per ton. Because the process is most economical for large quantities of soil with relatively low concentrations of contaminants, it was not used in final closure when the estimated volume of contaminated soil removed dropped to 65,000 tons and the chromium concentration increased up to 4000 mg/kg. However, the process could have application in situations where economics and location warrant.

  18. Designing large-scale conservation corridors for pattern and process.

    PubMed

    Rouget, Mathieu; Cowling, Richard M; Lombard, Amanda T; Knight, Andrew T; Kerley, Graham I H

    2006-04-01

    A major challenge for conservation assessments is to identify priority areas that incorporate biological patterns and processes. Because large-scale processes are mostly oriented along environmental gradients, we propose to accommodate them by designing regional-scale corridors to capture these gradients. Based on systematic conservation planning principles such as representation and persistence, we identified large tracts of untransformed land (i.e., conservation corridors) for conservation that would achieve biodiversity targets for pattern and process in the Subtropical Thicket Biome of South Africa. We combined least-cost path analysis with a target-driven algorithm to identify the best option for capturing key environmental gradients while considering biodiversity targets and conservation opportunities and constraints. We identified seven conservation corridors on the basis of subtropical thicket representation, habitat transformation and degradation, wildlife suitability, irreplaceability of vegetation types, protected area networks, and future land-use pressures. These conservation corridors covered 21.1% of the planning region (ranging from 600 to 5200 km2) and successfully achieved targets for biological processes and to a lesser extent for vegetation types. The corridors we identified are intended to promote the persistence of ecological processes (gradients and fixed processes) and fulfill half of the biodiversity pattern target. We compared the conservation corridors with a simplified corridor design consisting of a fixed-width buffer along major rivers. Conservation corridors outperformed river buffers in seven out of eight criteria. Our corridor design can provide a tool for quantifying trade-offs between various criteria (biodiversity pattern and process, implementation constraints and opportunities). A land-use management model was developed to facilitate implementation of conservation actions within these corridors. PMID:16903115
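
    The least-cost path step of the corridor analysis can be sketched on a toy resistance grid as below; the grid values, anchor cells, and use of networkx are illustrative and are not the planning data or software used in the study.

      # Sketch (Python): least-cost path on a toy grid. Cells carry a "resistance"
      # (e.g., habitat transformation cost), edges are weighted by the mean resistance
      # of their endpoints, and the cheapest route between two anchor cells
      # approximates a corridor axis. Values are invented for illustration.
      import networkx as nx

      resistance = [[1, 1, 5, 5],
                    [4, 1, 1, 5],
                    [4, 4, 1, 1],
                    [4, 4, 4, 1]]

      G = nx.grid_2d_graph(4, 4)               # 4x4 lattice of planning cells
      for u, v in G.edges():
          G[u][v]["weight"] = 0.5 * (resistance[u[0]][u[1]] + resistance[v[0]][v[1]])

      corridor = nx.shortest_path(G, source=(0, 0), target=(3, 3), weight="weight")
      print("least-cost corridor cells:", corridor)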

  19. RATES OF REACTION AND PROCESS DESIGN DATA FOR THE HYDROCARB PROCESS

    EPA Science Inventory

    The report provides experimental and process design data in support of studies for developing the coprocessing of fossil fuels with biomass by the Hydrocarb process. The experimental work includes the hydropyrolysis of biomass and the thermal decomposition of methane in a 2.44 m ...

  20. A Taguchi study of the aeroelastic tailoring design process

    NASA Technical Reports Server (NTRS)

    Bohlmann, Jonathan D.; Scott, Robert C.

    1991-01-01

    A Taguchi study was performed to determine the important players in the aeroelastic tailoring design process and to find the best composition of the optimization's objective function. The Wing Aeroelastic Synthesis Procedure (TSO) was used to ascertain the effects that factors such as composite laminate constraints, roll effectiveness constraints, and built-in wing twist and camber have on the optimum, aeroelastically tailored wing skin design. The results show the Taguchi method to be a viable engineering tool for computational inquiries, and provide some valuable lessons about the practice of aeroelastic tailoring.
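
    To make the Taguchi screening idea concrete, the sketch below runs a two-level orthogonal plan over three hypothetical design factors and computes main effects as differences of level means; the factor names and the stand-in response function are placeholders and are not the TSO runs reported in the paper.

      # Sketch (Python): Taguchi-style screening with a two-level plan over three
      # factors (for three factors the L8 array coincides with the full 2^3 factorial).
      # The response function is a made-up stand-in for an aeroelastic tailoring run.
      import numpy as np

      factors = ["laminate_constraint", "roll_effectiveness_constraint", "built_in_twist"]
      levels = np.array([[a, b, c] for a in (-1, 1) for b in (-1, 1) for c in (-1, 1)])

      def run_case(x):
          # Placeholder response; NOT real TSO output.
          a, b, c = x
          return 10.0 + 1.5 * a - 0.4 * b + 0.1 * c + 0.2 * a * b

      response = np.array([run_case(x) for x in levels])

      for j, name in enumerate(factors):
          effect = response[levels[:, j] == 1].mean() - response[levels[:, j] == -1].mean()
          print(f"main effect of {name}: {effect:+.2f}")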

  1. Operation and design of selected industrial process heat field tests

    SciTech Connect

    Kearney, D. W.

    1981-02-01

    The DOE program of solar industrial process heat field tests has shown solar energy to be compatible with numerous industrial needs. Both the operational projects and the detailed designs of systems that are not yet operational have resulted in valuable insights into design and hardware practice. Typical of these insights are the experiences discussed for the four projects reviewed. Future solar IPH systems should benefit greatly not only from the availability of present information, but also from the wealth of operating experience from projects due to start up in 1981.

  2. Design of the HTGR for process heat applications

    SciTech Connect

    Vrable, D.L.; Quade, R.N.

    1980-05-01

    This paper discusses a design study of an advanced 842-MW(t) HTGR with a reactor outlet temperature of 850°C (1562°F), coupled with a chemical process whose product is hydrogen (or a mixture of hydrogen and carbon monoxide) generated by steam reforming of a light hydrocarbon mixture. The paper covers the plant layout and design for the major components of the primary and secondary heat transfer systems. Typical parametric system study results illustrate the capability of a computer code developed to model the plant performance and economics.

  3. Remote Maintenance Design Guide for Compact Processing Units

    SciTech Connect

    Draper, J.V.

    2000-07-13

    Oak Ridge National Laboratory (ORNL) Robotics and Process Systems Division (RPSD) personnel have extensive experience working with remotely operated and maintained systems. These systems require expert knowledge in teleoperation, human factors, telerobotics, and other robotic devices so that remote equipment may be manipulated, operated, serviced, surveyed, and moved about in a hazardous environment. The RPSD staff has a wealth of experience in this area, including knowledge in the broad topics of human factors, modular electronics, modular mechanical systems, hardware design, and specialized tooling. Examples of projects that illustrate and highlight RPSD's unique experience in remote systems design and application include the following: (1) design of remote shear and remote dissolver systems in support of U.S. Department of Energy (DOE) fuel recycling research and nuclear power missions; (2) building remotely operated mobile systems for metrology and characterizing hazardous facilities in support of remote operations within those facilities; (3) construction of modular robotic arms, including the Laboratory Telerobotic Manipulator, which was designed for the National Aeronautics and Space Administration (NASA), and the Advanced ServoManipulator, which was designed for the DOE; (4) design of remotely operated laboratories, including chemical analysis and biochemical processing laboratories; (5) construction of remote systems for environmental clean up and characterization, including underwater, buried waste, underground storage tank (UST), and decontamination and dismantlement (D&D) applications. Remote maintenance has played a significant role in fuel reprocessing because of combined chemical and radiological contamination. Furthermore, remote maintenance is expected to play a strong role in future waste remediation. The compact processing units (CPUs) being designed for use in underground waste storage tank remediation are examples of improvements in systems processing

  4. A Review of the Design Process for Implantable Orthopedic Medical Devices

    PubMed Central

    Aitchison, G.A; Hukins, D.W.L; Parry, J.J; Shepherd, D.E.T; Trotman, S.G

    2009-01-01

    The design process for medical devices is highly regulated to ensure the safety of patients. This paper will present a review of the design process for implantable orthopedic medical devices. It will cover the main stages of feasibility, design reviews, design, design verification, manufacture, design validation, design transfer and design changes. PMID:19662153

  5. VLSI systems design for digital signal processing. Volume 1 - Signal processing and signal processors

    NASA Astrophysics Data System (ADS)

    Bowen, B. A.; Brown, W. R.

    This book is concerned with the design of digital signal processing systems which utilize VLSI (Very Large Scale Integration) components. The presented material is intended for use by electrical engineers at the senior undergraduate or introductory graduate level. It is the purpose of this volume to present an overview of the important elements of background theory, processing techniques, and hardware evolution. Digital signals are considered along with linear systems and digital filters, taking into account the transform analysis of deterministic signals, a statistical signal model, time domain representations of discrete-time linear systems, and digital filter design techniques and implementation issues. Attention is given to aspects of detection and estimation, digital signal processing algorithms and techniques, issues which must be resolved in a processor design methodology, the fundamental concepts of high performance processing in terms of two early super computers, and the extension of these concepts to more recent processors.

  6. Automating the packing heuristic design process with genetic programming.

    PubMed

    Burke, Edmund K; Hyde, Matthew R; Kendall, Graham; Woodward, John

    2012-01-01

    The literature shows that one-, two-, and three-dimensional bin packing and knapsack packing are difficult problems in operational research. Many techniques, including exact, heuristic, and metaheuristic approaches, have been investigated to solve these problems and it is often not clear which method to use when presented with a new instance. This paper presents an approach which is motivated by the goal of building computer systems which can design heuristic methods. The overall aim is to explore the possibilities for automating the heuristic design process. We present a genetic programming system to automatically generate a good quality heuristic for each instance. It is not necessary to change the methodology depending on the problem type (one-, two-, or three-dimensional knapsack and bin packing problems), and it therefore has a level of generality unmatched by other systems in the literature. We carry out an extensive suite of experiments and compare with the best human designed heuristics in the literature. Note that our heuristic design methodology uses the same parameters for all the experiments. The contribution of this paper is to present a more general packing methodology than those currently available, and to show that, by using this methodology, it is possible for a computer system to design heuristics which are competitive with the human designed heuristics from the literature. This represents the first packing algorithm in the literature able to claim human competitive results in such a wide variety of packing domains. PMID:21609273
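
    A minimal sketch of what a generated packing heuristic looks like in use follows: a scoring expression over (item, bin) pairs drives placement, and the genetic programming system would evolve the body of that expression. The score function below is hand-written for illustration and is not one evolved by the authors' system.

      # Sketch (Python): one-dimensional bin packing driven by a scoring heuristic.
      # In the GP approach the body of `score` is the evolved expression; here a
      # hand-written best-fit-style score stands in for an evolved heuristic.
      def score(item, free_space):
          if item > free_space:
              return float("-inf")          # infeasible placement
          return -(free_space - item)       # prefer the tightest remaining gap

      def pack(items, capacity):
          bins = []                         # each entry is a bin's remaining free space
          for item in items:
              scores = [score(item, free) for free in bins]
              best = max(range(len(bins)), key=scores.__getitem__, default=None)
              if best is None or scores[best] == float("-inf"):
                  bins.append(capacity - item)    # open a new bin
              else:
                  bins[best] -= item
          return len(bins)

      items = [0.42, 0.25, 0.27, 0.07, 0.72, 0.86, 0.09, 0.44, 0.50, 0.68]
      print("bins used:", pack(items, capacity=1.0))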

  7. Process Design Concepts for Stabilization of High Level Waste Calcine

    SciTech Connect

    T. R. Thomas; A. K. Herbst

    2005-06-01

    The current baseline assumption is that packaging "as is" and direct disposal of high level waste (HLW) calcine in a Monitored Geologic Repository will be allowed. The fall-back position is to develop a stabilized waste form for the HLW calcine that will meet the repository waste acceptance criteria currently in place, in case regulatory initiatives are unsuccessful. A decision between direct disposal and a stabilization alternative is anticipated by June 2006. The purposes of this Engineering Design File (EDF) are to provide a pre-conceptual design for three low-temperature processes under development for stabilization of HLW calcine (i.e., the grout, hydroceramic grout, and iron phosphate ceramic processes) and to support a down-selection among the three candidates. The key assumptions for the pre-conceptual design assessment are that a) a waste treatment plant would operate over eight years for 200 days a year, b) a design processing rate of 3.67 m3/day or 4670 kg/day of HLW calcine would be needed, c) the performance of the waste form would remove the HLW calcine from the hazardous waste category, and d) the waste form loadings would range from about 21-25 wt% calcine. The conclusions of this EDF study are that: (a) To date, the grout formulation appears to be the best candidate stabilizer among the three being tested for HLW calcine and appears to be the easiest to mix, pour, and cure. (b) Only minor differences would exist between the process steps of the grout and hydroceramic grout stabilization processes. If temperature control of the mixer at about 80°C is required, it would add a major level of complexity to the iron phosphate stabilization process. (c) It is too early in the development program to determine which stabilizer will produce the minimum amount of stabilized waste form for the entire HLW inventory, but the volume is assumed to be within the range of 12,250 to 14,470 m3. (d) The stacked vessel height of the hot process vessels

  8. Virtual Welded - Joint Design Integrating Advanced Materials and Processing Technology

    SciTech Connect

    Yang, Zhishang; Ludewig, Howard W.; Babu, S. Suresh

    2005-06-30

    Virtual Welded-Joint Design, a systematic modeling approach, has been developed in this project to predict the relationship of welding process, microstructure, properties, residual stress, and ultimate weld fatigue strength. This systematic modeling approach was applied to the welding of high-strength steel. A special welding wire was developed in this project to introduce compressive residual stress at the weld toe. The results from both modeling and experiments demonstrated that more than a 10x fatigue life improvement can be achieved in high-strength steel welds by combining the compressive residual stress from the special welding wire with the desired weld bead shape from a unique welding process. The results indicate a technology breakthrough in the design of lightweight, high-fatigue-performance welded structures using high-strength steels.

  9. Robust process design and springback compensation of a decklid inner

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaojing; Grimm, Peter; Carleer, Bart; Jin, Weimin; Liu, Gang; Cheng, Yingchao

    2013-12-01

    Springback compensation is one of the key topics in current die face engineering. The accuracy of the springback simulation, the robustness of the method planning, and the springback itself are considered the main factors that influence the effectiveness of springback compensation. In the present paper, the basic principles of springback compensation are presented first; these principles rest on an accurate full-cycle simulation with final validation settings. Robust process design and optimization are then discussed in detail via an industrial example, a decklid inner. Moreover, an effective compensation strategy is put forward based on the analysis of springback, and simulation-based springback compensation is introduced in the process design phase. Finally, verification and comparison in tryout and production are given, confirming that the robust springback compensation methodology is effective during die development.

  10. Reliability-based design optimization under stationary stochastic process loads

    NASA Astrophysics Data System (ADS)

    Hu, Zhen; Du, Xiaoping

    2016-08-01

    Time-dependent reliability-based design ensures the satisfaction of reliability requirements for a given period of time, but with a high computational cost. This work improves the computational efficiency by extending the sequential optimization and reliability analysis (SORA) method to time-dependent problems with both stationary stochastic process loads and random variables. The challenge of the extension is the identification of the most probable point (MPP) associated with time-dependent reliability targets. Since a direct relationship between the MPP and reliability target does not exist, this work defines the concept of equivalent MPP, which is identified by the extreme value analysis and the inverse saddlepoint approximation. With the equivalent MPP, the time-dependent reliability-based design optimization is decomposed into two decoupled loops: deterministic design optimization and reliability analysis, and both are performed sequentially. Two numerical examples are used to show the efficiency of the proposed method.
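
    The time-dependent reliability target that the equivalent MPP is tied to can be written in the standard form below (a sketch of the usual definition, consistent with but not copied from the paper), where X collects the random design variables, Y(t) is the stationary stochastic process load, g is the limit-state function, and [0, T] is the design life:

      R(0, T) = \Pr\{\, g\big(\mathbf{X}, Y(t), t\big) > 0 \ \ \forall\, t \in [0, T] \,\},
      \qquad
      p_f(0, T) = 1 - R(0, T)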

  11. Using Process Visualizations to Validate Electronic Form Design

    PubMed Central

    Marquard, Jenna L.; Mei, Yi You

    2010-01-01

    Electronic reporting systems have the potential to support health care quality improvement initiatives across varied health care settings, specifically in low-technology settings such as long-term residential care facilities (LTRCFs). Yet, these organizations face financial barriers to implementing such systems and the LTRCF workforce is generally not as technology-ready as larger organizations’ workforces. Electronic reporting systems implemented in these settings must therefore be inexpensive and easy-to-use. This paper outlines a novel technique – process visualization – for systematically assessing the order in which users complete electronic forms, an inexpensively-developed patient falls reporting form in this case. These visualizations can help designers uncover usage patterns not evident via other usability methods. Based on this knowledge, designers can validate the design of the electronic forms, informing their subsequent redesign. PMID:21347028

  12. Characterizing Building Construction Decision Processes to Enhance DOE Program Design

    SciTech Connect

    Hostick, Donna J.; Slavich, Antoinette L.; Larson, Lars E.; Hostick, Cody J.; Skumanich, Marina; Crawford, Marjorie A.; Weber, Tami M.

    2003-10-01

    There is an established process for the design and construction of buildings. While the particulars will vary greatly from one project to the next, the players (e.g., architects, owners, suppliers, builders) and activities (e.g., design, specify, construct) are basically the same, as are the decisions (e.g., which windows where, what type of heating system). The U.S. Department of Energy's (DOE) Office of Energy Efficiency and Renewable Energy (EERE) tasked Pacific Northwest National Laboratory (PNNL) with the development of a formal framework that could be used to analyze the critical decision path for energy-efficient technologies in the construction of buildings. The goal was to demonstrate whether these technologies could be related to decision points in the construction process, the decision makers, and a rudimentary understanding of what helped to form those decisions. The theory to be tested is whether this Critical Path Analysis can enhance project planning and design. A continuous goal of EERE is to increase the effectiveness of its efforts through better targeting of projects. This requires a good understanding of the markets in which EERE technologies and practices, as developed or implemented by those projects, must compete. One significant measure of project success is market adoption of EERE technologies and practices. The goal of this study is to characterize the typical design, construction, and building renovation decision points and decision makers to see if this information could prove useful to DOE Project Managers by helping them understand how market adoption decisions are made. The approach of this study is to develop a framework characterizing decision points, decision makers, and decision influences in the building industry. As many building design and construction decisions are time-sequenced and constrained by earlier decisions, the framework selected is based on a critical path characterization of the design and construction process

  13. Improving Tools and Processes in Mechanical Design Collaboration

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2009-01-01

    Cooperative product development projects in the aerospace and defense industry are held hostage to high cost and risk due to poor alignment of collaborative design tools and processes. This impasse can be broken if companies will jointly develop implementation approaches and practices in support of high value working arrangements. The current tools can be used to better advantage in many situations and there is reason for optimism that tool vendors will provide significant support.

  14. Space Station Freedom pressurized element interior design process

    NASA Technical Reports Server (NTRS)

    Hopson, George D.; Aaron, John; Grant, Richard L.

    1990-01-01

    The process used to develop the on-orbit working and living environment of the Space Station Freedom has some unique constraints and conditions to satisfy. The goal is to provide maximum efficiency and utilization of the available space, in on-orbit, zero-G conditions that establish a comfortable, productive, and safe working environment for the crew. The Space Station Freedom on-orbit living and working space can be divided into support for three major functions: (1) operations, maintenance, and management of the station; (2) conduct of experiments, both directly in the laboratories and remotely for experiments outside the pressurized environment; and (3) crew-related functions for food preparation, housekeeping, storage, personal hygiene, health maintenance, zero-G environment conditioning, individual privacy, and rest. The process used to implement these functions, the major requirements driving the design, unique considerations and constraints that influence the design, and summaries of the analysis performed to establish the current configurations are described. Sketches and pictures showing the layout and internal arrangement of the Nodes, U.S. Laboratory and Habitation modules identify the current design relationships of the common and unique station housekeeping subsystems. The crew facilities, work stations, food preparation and eating areas (galley and wardroom), and exercise/health maintenance configurations, waste management and personal hygiene area configuration are shown. U.S. Laboratory experiment facilities and maintenance work areas planned to support the wide variety and mixtures of life science and materials processing payloads are described.

  15. Design of multichannel image processing on the Space Solar Telescope

    NASA Astrophysics Data System (ADS)

    Zhang, Bin

    2000-07-01

    The multi-channel image processing system on the Space Solar Telescope (SST) is described in this paper. This system is the main part of the science data unit (SDU), which is designed to handle the science data from every payload on the SST. First, every payload on the SST and its scientific objectives are introduced: the main optical telescope, four soft X-ray telescopes, an H-alpha and white-light (full disc) telescope, a coronagraph, a wide-band X-ray and gamma-ray spectrometer, and a solar and interplanetary radio spectrometer. Then the structure of the SDU is presented, including its hardware and software architecture, which is designed for multiple payloads, and the science data stream of every payload is summarized. Solar magnetic and velocity field processing, which occupies more than 90% of the SDU's data processing and includes the polarizing unit, image receiver, and image adding unit, is discussed. Finally, the plan for image data compression and the mass memory designed for science data storage are presented.

  16. Process Cost Modeling for Multi-Disciplinary Design Optimization

    NASA Technical Reports Server (NTRS)

    Bao, Han P.; Freeman, William (Technical Monitor)

    2002-01-01

    For early design concepts, the conventional approach to cost is normally some kind of parametric weight-based cost model. There is now ample evidence that this approach can be misleading and inaccurate. By the nature of its development, a parametric cost model requires historical data and is valid only if the new design is analogous to those for which the model was derived. Advanced aerospace vehicles have no historical production data and are nowhere near the vehicles of the past. Using an existing weight-based cost model would only lead to errors and distortions of the true production cost. This report outlines the development of a process-based cost model in which the physical elements of the vehicle are costed according to a first-order dynamics model. This theoretical cost model, first advocated by early work at MIT, has been expanded to cover the basic structures of an advanced aerospace vehicle. Elemental costs based on the geometry of the design can be summed up to provide an overall estimation of the total production cost for a design configuration. This capability to directly link any design configuration to realistic cost estimation is a key requirement for high payoff MDO problems. Another important consideration in this report is the handling of part or product complexity. Here the concept of cost modulus is introduced to take into account variability due to different materials, sizes, shapes, precision of fabrication, and equipment requirements. The most important implication of the development of the proposed process-based cost model is that different design configurations can now be quickly related to their cost estimates in a seamless calculation process easily implemented on any spreadsheet tool. In successive sections, the report addresses the issues of cost modeling as follows. First, an introduction is presented to provide the background for the research work. Next, a quick review of cost estimation techniques is made with the intention to
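
    The "sum of elemental costs scaled by a cost modulus" idea reduces to a short roll-up calculation; in the sketch below the element list, first-order labor contents, modulus values, and labor rate are invented placeholders, not figures from the report.

      # Sketch (Python): process-based cost roll-up. Each structural element gets a
      # first-order cost from its geometry/process dynamics, multiplied by a cost
      # modulus capturing material, size, precision, and equipment effects.
      # All numbers are illustrative placeholders.
      elements = [
          # (name, first_order_labor_hours, cost_modulus)
          ("wing skin panel",  120.0, 1.8),   # composite, tight tolerance
          ("fuselage frame",    60.0, 1.2),   # aluminum, standard tolerance
          ("cryotank barrel",  200.0, 2.5),   # exotic material, precision fabrication
      ]
      labor_rate = 95.0  # assumed dollars per hour

      total_hours = sum(hours * modulus for _, hours, modulus in elements)
      print(f"estimated production cost: ${total_hours * labor_rate:,.0f}")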

  17. Development of the Planar Inlet Design and Analysis Process (PINDAP)

    NASA Technical Reports Server (NTRS)

    Gruber, Christopher R.

    2004-01-01

    The aerodynamic development of an engine inlet requires a comprehensive program of both wind tunnel testing and Computational Fluid Dynamics (CFD) simulations. To save time and resources, much "testing" is done using CFD before any design ever enters a wind tunnel. The focus of my project this summer is on CFD analysis tool development. In particular, I am working to further develop the capabilities of the Planar Inlet Design and Analysis Process (PINDAP). "PINDAP" is a collection of computational tools that allow for efficient and accurate design and analysis of the aerodynamics about and through inlets that can make use of a planar (two-dimensional or axisymmetric) geometric and flow assumption. PINDAP utilizes the WIND CFD flow solver, which is capable of simulating the turbulent, compressible flow field. My project this summer is a continuation of work that I performed for two previous summers. Two years ago, I used basic features of the PINDAP to design a Mach 5 hypersonic scramjet engine inlet and to demonstrate the feasibility of the PINDAP. The following summer, I worked to develop its geometry and grid generation capabilities to include subsonic and supersonic inlets, complete bodies and cowls, conic leading and trailing edges, as well as airfoils. These additions allowed for much more design flexibility when using the program.

  18. ArF processing of 90-nm design rule lithography achieved through enhanced thermal processing

    NASA Astrophysics Data System (ADS)

    Kagerer, Markus; Miller, Daniel; Chang, Wayne; Williams, Daniel J.

    2006-03-01

    As the lithography community has moved to ArF processing on 300 mm wafers for 90 nm design rules the process characterization of the components of variance continues to highlight the thermal requirements for the post exposure bake (PEB) processing step. In particular as the thermal systems have become increasingly uniform, the transient behavior of the thermal processing system has received the focus of attention. This paper demonstrates how a newly designed and patented thermal processing system was optimized for delivering improved thermal uniformity during a typical 90 second PEB processing cycle, rather than being optimized for steady state performance. This was accomplished with the aid of a wireless temperature measurement wafer system for obtaining real time temperature data and by using a response surface model (RSM) experimental design for optimizing parameters of the temperature controller of the thermal processing system. The new units were field retrofitted seamlessly in <2 days at customer sites without disruption to process recipes or flows. After evaluating certain resist parameters such as PEB temperature sensitivity and post exposure delay (PED) - stability of the baseline process, the new units were benchmarked against the previous PEB plates by processing a split lot experiment. Additional hardware characterization included environmental factors such as air velocity in the vicinity of the PEB plates and transient time between PEB and chill plate. At the completion of the optimization process, the within wafer CD uniformity displayed a significant improvement when compared to the previous hardware. The demonstrated within wafer CD uniformity improved by 27% compared to the initial hardware and baseline process. ITRS requirements for the 90 nm node were exceeded.

  19. Moving bed biofilm reactor technology: process applications, design, and performance.

    PubMed

    McQuarrie, James P; Boltz, Joshua P

    2011-06-01

    The moving bed biofilm reactor (MBBR) can operate as a 2- (anoxic) or 3-(aerobic) phase system with buoyant free-moving plastic biofilm carriers. These systems can be used for municipal and industrial wastewater treatment, aquaculture, potable water denitrification, and, in roughing, secondary, tertiary, and sidestream applications. The system includes a submerged biofilm reactor and liquid-solids separation unit. The MBBR process benefits include the following: (1) capacity to meet treatment objectives similar to activated sludge systems with respect to carbon-oxidation and nitrogen removal, but requires a smaller tank volume than a clarifier-coupled activated sludge system; (2) biomass retention is clarifier-independent and solids loading to the liquid-solids separation unit is reduced significantly when compared with activated sludge systems; (3) the MBBR is a continuous-flow process that does not require a special operational cycle for biofilm thickness, L(F), control (e.g., biologically active filter backwashing); and (4) liquid-solids separation can be achieved with a variety of processes, including conventional and compact high-rate processes. Information related to system design is fragmented and poorly documented. This paper seeks to address this issue by summarizing state-of-the art MBBR design procedures and providing the reader with an overview of some commercially available systems and their components. PMID:21751715

  20. System Design For A Dental Image Processing System

    NASA Astrophysics Data System (ADS)

    Cady, Fredrick M.; Stover, John C.; Senecal, William J.

    1988-12-01

    An image processing system for a large clinic dental practice has been designed and tested. An analysis of spatial resolution requirements and field tests by dentists show that a system built with presently available, PC-based, image processing equipment can provide diagnostic quality images without special digital image processing. By giving the dentist a tool to digitally enhance x-ray images, increased diagnostic capabilities can be achieved. Very simple image processing procedures such as linear and non-linear contrast expansion, edge enhancement, and image zooming can be shown to be very effective. In addition to providing enhanced imagery in the dentist's treatment room, the system is designed to be a fully automated, dental records management system. It is envisioned that a patient's record, including x-rays and tooth charts, may be retrieved from optical disk storage as the patient enters the office. Dental procedures undertaken during the visit may be entered into the record via the imaging workstation by the dentist or the dental assistant. Patient billing and records keeping may be generated automatically.
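
    The simple enhancement steps named above, linear contrast expansion and edge enhancement, amount to a few array operations; the sketch below applies them to a synthetic 8-bit image and is illustrative only, not the clinic system's code (in practice the input would be a digitized radiograph).

      # Sketch (Python): linear contrast stretch plus unsharp-mask edge enhancement on
      # an 8-bit grayscale image, the kind of simple processing the abstract describes.
      import numpy as np
      from scipy.ndimage import uniform_filter

      rng = np.random.default_rng(1)
      img = rng.normal(120, 15, size=(256, 256)).clip(0, 255).astype(np.uint8)

      # Linear contrast expansion to the full 0-255 range.
      lo, hi = int(img.min()), int(img.max())
      stretched = (img.astype(float) - lo) / max(hi - lo, 1) * 255.0

      # Unsharp mask: add back a scaled difference between the image and a local blur.
      blurred = uniform_filter(stretched, size=5)
      sharpened = np.clip(stretched + 1.5 * (stretched - blurred), 0, 255).astype(np.uint8)

      print("input range:", (lo, hi), "-> enhanced range:",
            (int(sharpened.min()), int(sharpened.max())))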

  1. System design and performances of ASTER Level-1 data processing

    NASA Astrophysics Data System (ADS)

    Nishida, Sumiyuki; Hachiya, Jun; Matsumoto, Ken; Fujisada, Hiroyuki; Kato, Masatane

    1998-12-01

    ASTER is a multispectral imager that covers a wide spectral region from the visible to the thermal infrared with 14 spectral bands, and will fly on EOS-AM1 in 1999. To provide this wide spectral coverage, ASTER has three optical sensing subsystems (a multi-telescope system): VNIR, SWIR, and TIR. This multi-telescope configuration requires highly refined ground processing to generate Level-1 data products that are radiometrically calibrated and geometrically corrected. A prototype Level-1 processing software system was developed to satisfy these requirements. The adopted system design concepts include: (1) 'Automatic Processing'; (2) 'ALL-IN-ONE-CONCEPT', in which the processing is carried out using only information included in the Level-0 data product; (3) 'MODULE INDEPENDENCE', in which only the process control module independently controls the other modules to change operational conditions; and (4) 'FLEXIBILITY', in which important operational parameters are set from an external component to make changes to the processing conditions easier. The adaptability and performance of the developed software system are evaluated using simulation data.

  2. Material, process, and product design of thermoplastic composite materials

    NASA Astrophysics Data System (ADS)

    Dai, Heming

    Thermoplastic composites made of polypropylene (PP) and E-glass fibers were investigated experimentally as well as theoretically for two new classes of product designs. The first application was for reinforcement of wood. Commingled PP/glass yarn was consolidated and bonded on wood panel using a tie layer. The processing parameters, including temperature, pressure, heating time, cooling time, bonding strength, and bending strength were tested experimentally and evaluated analytically. The thermoplastic adhesive interface was investigated with environmental scanning electron microscopy. The wood/composite structural design was optimized and evaluated using a Graphic Method. In the second application, we evaluated use of thermoplastic composites for explosion containment in an arrester. PP/glass yarn was fabricated in a sleeve form and wrapped around the arrester. After consolidation, the flexible composite sleeve forms a solid composite shell. The composite shell acts as a protection layer in a surge test to contain the fragments of the arrester. The manufacturing process for forming the composite shell was designed. Woven, knitted, and braided textile composite shells made of commingled PP/glass yarn were tested and evaluated. Mechanical performance of the woven, knitted, and braided composite shells was examined analytically. The theoretical predictions were used to verify the experimental results.

  3. Concurrent Image Processing Executive (CIPE). Volume 1: Design overview

    NASA Technical Reports Server (NTRS)

    Lee, Meemong; Groom, Steven L.; Mazer, Alan S.; Williams, Winifred I.

    1990-01-01

    The design and implementation of a Concurrent Image Processing Executive (CIPE), which is intended to become the support system software for a prototype high performance science analysis workstation are described. The target machine for this software is a JPL/Caltech Mark 3fp Hypercube hosted by either a MASSCOMP 5600 or a Sun-3, Sun-4 workstation; however, the design will accommodate other concurrent machines of similar architecture, i.e., local memory, multiple-instruction-multiple-data (MIMD) machines. The CIPE system provides both a multimode user interface and an applications programmer interface, and has been designed around four loosely coupled modules: user interface, host-resident executive, hypercube-resident executive, and application functions. The loose coupling between modules allows modification of a particular module without significantly affecting the other modules in the system. In order to enhance hypercube memory utilization and to allow expansion of image processing capabilities, a specialized program management method, incremental loading, was devised. To minimize data transfer between host and hypercube, a data management method which distributes, redistributes, and tracks data set information was implemented. The data management also allows data sharing among application programs. The CIPE software architecture provides a flexible environment for scientific analysis of complex remote sensing image data, such as planetary data and imaging spectrometry, utilizing state-of-the-art concurrent computation capabilities.

  4. Co-Simulation for Advanced Process Design and Optimization

    SciTech Connect

    Stephen E. Zitney

    2009-01-01

    Meeting the increasing demand for clean, affordable, and secure energy is arguably the most important challenge facing the world today. Fossil fuels can play a central role in a portfolio of carbon-neutral energy options provided CO2 emissions can be dramatically reduced by capturing CO2 and storing it safely and effectively. The fossil energy industry faces the challenge of meeting aggressive design goals for next-generation power plants with carbon capture and storage (CCS). Process designs will involve large, highly integrated, multipurpose systems with advanced equipment items that have complex geometries and multiphysics. APECS is enabling software that facilitates effective integration, solution, and analysis of high-fidelity process/equipment (CFD) co-simulations. APECS helps to optimize fluid flow and related phenomena that impact overall power plant performance. APECS offers many advanced capabilities, including reduced-order models (ROMs), design optimization, parallel execution, stochastic analysis, and virtual plant co-simulations. NETL and its collaborative R&D partners are using APECS to reduce the time, cost, and technical risk of developing high-efficiency, zero-emission power plants with CCS.

  5. Structure and Functional Characteristics of Rat's Left Ventricle Cardiomyocytes under Antiorthostatic Suspension of Various Duration and Subsequent Reloading

    PubMed Central

    Ogneva, I. V.; Mirzoev, T. M.; Biryukov, N. S.; Veselova, O. M.; Larina, I. M.

    2012-01-01

    The goal of the research was to identify the structural and functional characteristics of the rat left ventricle under antiorthostatic suspension of 1, 3, 7, and 14 days, and during subsequent 3- and 7-day reloading after a 14-day suspension. The transversal stiffness of the cardiomyocytes was determined by atomic force microscopy, cell respiration by polarography, and protein content by Western blotting. Stiffness of the cortical cytoskeleton increases as soon as one day after suspension, continues to increase up to the 14th day, and starts decreasing during reloading, reaching the control level after 7 days. The stiffness of the contractile apparatus and the intensity of cell respiration also increase. The content of non-muscle actin isoforms in the cytoplasmic protein fraction does not change during the whole experiment, nor does the beta-actin content in the membrane fraction. The gamma-actin content in the membrane fraction correlates with the change in the transversal stiffness of the cortical cytoskeleton. The increased content of alpha-actinin-1 and alpha-actinin-4 in the membrane protein fraction during suspension is consistent with the increased gamma-actin content there. The opposite directions of change of the alpha-actinin-1 and alpha-actinin-4 content suggest their involvement in signaling pathways. PMID:23093854

  6. Safeguards design strategies: designing and constructing new uranium and plutonium processing facilities in the United States

    SciTech Connect

    Scherer, Carolynn P; Long, Jon D

    2010-09-28

    In the United States, the Department of Energy (DOE) is transforming its outdated and oversized complex of aging nuclear material facilities into a smaller, safer, and more secure National Security Enterprise (NSE). Environmental concerns, worker health and safety risks, material security, and the goal of reducing the role of nuclear weapons in the national security strategy while maintaining an effective nuclear deterrent are influencing this transformation. As part of the nation's Uranium Center of Excellence (UCE), the Uranium Processing Facility (UPF) at the Y-12 National Security Complex in Oak Ridge, Tennessee, will advance the U.S.'s capability to meet all concerns when processing uranium and is located adjacent to the Highly Enriched Uranium Materials Facility (HEUMF), designed for consolidated storage of enriched uranium. The HEUMF became operational in March 2010, and the UPF is currently entering its final design phase. The designs of both facilities are intended to meet anticipated security challenges of the 21st century. For plutonium research, development, and manufacturing, the Chemistry and Metallurgy Research Replacement (CMRR) building at the Los Alamos National Laboratory (LANL) in Los Alamos, New Mexico, is now under construction. The first phase of the CMRR Project is the design and construction of a Radiological Laboratory/Utility/Office Building. The second phase consists of the design and construction of the Nuclear Facility (NF). The National Nuclear Security Administration (NNSA) selected these two sites as part of the national plan to consolidate nuclear materials and to provide for nuclear deterrence and nonproliferation mission requirements. This work examines these two projects' independent approaches to design requirements and objectives for safeguards, security, and safety (3S) systems, as well as the subsequent construction of these modern processing facilities. Emphasis is on the use of Safeguards-by-Design (SBD

  7. Design and Evaluation of Computer Generated Hologram with Binary Subwavelength Structure Designed by Deterministic Process

    NASA Astrophysics Data System (ADS)

    Oonishi, Takehito; Konishi, Tsuyoshi; Itoh, Kazuyoshi

    2007-09-01

    A binary subwavelength structure for multilevel phase modulation can be designed by our previously proposed deterministic design method, without an iterative optimization method. To use this design technique in various applications of a computer generated hologram (CGH), such as array illuminators, beam shaping, and signal processing, the image quality of the image reconstructed from the CGH becomes much more important. In this paper, we verify the image quality of a reconstructed image from a CGH designed by our method in terms of the modulation transfer function (MTF) and the spatial resolution. Simulation results show that our technique can theoretically achieve an MTF of more than 99% over a wide range and a spatial resolution of less than 9.66 μm.

  8. The FEYNMAN tools for quantum information processing: Design and implementation

    NASA Astrophysics Data System (ADS)

    Fritzsche, S.

    2014-06-01

    The FEYNMAN tools have been re-designed with the goal to establish and implement a high-level (computer) language that is capable to deal with the physics of finite, n-qubit systems, from frequently required computations to mathematically advanced tasks in quantum information processing. In particular, emphasis has been placed to introduce a small but powerful set of keystring-driven commands in order to support both, symbolic and numerical computations. Though the current design is implemented again within the framework of MAPLE, it is general and flexible enough to be utilized and combined with other languages and computational environments. The present implementation facilitates a large number of computational tasks, including the definition, manipulation and parametrization of quantum states, the evaluation of quantum measures and quantum operations, the evolution of quantum noise in discrete models, quantum measurements and state estimation, and several others. The design is based on a few high-level commands, with a syntax close to the mathematical notation and its use in the literature, and which can be generalized quite readily in order to solve computational tasks at even higher degree of complexity. In this work, I present and discuss the (re-design of the) FEYNMAN tools and make major parts of the code available for public use. Moreover, a few selected examples are shown and demonstrate possible application of this toolbox. The FEYNMAN tools are provided as MAPLE library and can hence be used on all platforms on which this computer-algebra system is accessible.
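
    As a generic illustration, in numpy rather than the MAPLE syntax of the FEYNMAN tools, of the kinds of tasks listed above (defining a qubit state, applying a quantum operation, and evaluating a measure), consider the following sketch; it is not the toolbox's API.

      # Generic sketch (Python): define a one-qubit state, apply a depolarizing channel
      # written with Kraus operators, and evaluate a simple measure (purity).
      # This illustrates the class of tasks only; it is not the FEYNMAN tools' syntax.
      import numpy as np

      plus = np.array([1, 1], dtype=complex) / np.sqrt(2)   # |+> state
      rho = np.outer(plus, plus.conj())                     # density matrix

      p = 0.2                                               # depolarizing probability
      I = np.eye(2, dtype=complex)
      X = np.array([[0, 1], [1, 0]], dtype=complex)
      Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
      Z = np.array([[1, 0], [0, -1]], dtype=complex)
      kraus = [np.sqrt(1 - 3 * p / 4) * I] + [np.sqrt(p / 4) * K for K in (X, Y, Z)]

      rho_noisy = sum(K @ rho @ K.conj().T for K in kraus)

      print("purity before:", np.real(np.trace(rho @ rho)))
      print("purity after :", np.real(np.trace(rho_noisy @ rho_noisy)))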

  9. Waste receiving and processing plant control system; system design description

    SciTech Connect

    LANE, M.P.

    1999-02-24

    The Plant Control System (PCS) is a heterogeneous computer system composed of numerous sub-systems. The PCS represents every major computer system that is used to support operation of the Waste Receiving and Processing (WRAP) facility. This document, the System Design Description (PCS SDD), includes several chapters and appendices. Each chapter is devoted to a separate PCS sub-system. Typically, each chapter includes an overview description of the system, a list of associated documents related to operation of that system, and a detailed description of relevant system features. Each appendix provides configuration information for selected PCS sub-systems. The appendices are designed as separate sections to assist in maintaining this document due to frequent changes in system configurations. This document is intended to serve as the primary reference for configuration of PCS computer systems. The use of this document is further described in the WRAP System Configuration Management Plan, WMH-350, Section 4.1.

  10. A process for free-space laser communications system design

    NASA Astrophysics Data System (ADS)

    Walther, Frederick G.; Moores, John D.; Murphy, Robert J.; Michael, Steven; Nowak, George A.

    2009-08-01

    We present a design methodology for free-space laser communications systems. The first phase includes a characterization through numerical simulations of the channel to evaluate the range of extinction and scintillation. The second phase is the selection of fade mitigation schemes, which would incorporate pointing, acquisition, tracking, and communication system parameters specifically tailored to the channel. Ideally, the process would include sufficient flexibility to adapt to a wide range of channel conditions. We provide an example of the successful application of this design approach to a recent set of field experiments. This work was sponsored by the Department of Defense, RRCO DDR&E, under Air Force Contract FA8721-05-C-0002. Opinions, interpretations, conclusions and recommendations are those of the authors and are not necessarily endorsed by the United States Government.

  11. Design of electrochemical processes for treatment of unusual waste streams

    SciTech Connect

    Farmer, J.C.

    1998-01-01

    UCRL-JC-129438 PREPRINT. Introduction. An overview of work done on the development of three electrochemical processes that meet the specific needs of low-level waste treatment is presented. These technologies include: mediated electrochemical oxidation [1-4]; bipolar membrane electrodialysis [5]; and electrosorption on carbon aerogel electrodes [6-9]. Design strategies are presented to assess the suitability of these electrochemical processes. Mediated electrochemical oxidation: Mixed wastes include both hazardous and radioactive components. It is desirable to reduce the overall volume of the waste before immobilization and disposal in repositories. While incineration is an attractive technique for the destruction of organic fractions of mixed wastes, such high-temperature thermal processes pose the threat of volatilizing various radionuclides. By destroying organics in the aqueous phase at low temperature and ambient pressure, the risk of volatilization can be reduced. One approach that is

  12. Process design of press hardening with gradient material property influence

    SciTech Connect

    Neugebauer, R.; Schieck, F.; Rautenstrauch, A.

    2011-05-04

    Press hardening is currently used in the production of automotive structures that require very high strength and controlled deformation during crash tests. Press hardening can achieve significant reductions of sheet thickness at constant strength and is therefore a promising technology for the production of lightweight and energy-efficient automobiles. The manganese-boron steel 22MnB5 has been implemented in sheet press hardening owing to its excellent hot formability, high hardenability, and good temperability even at low cooling rates. However, press-hardened components have shown poor ductility and cracking at relatively small strains. A possible solution to this problem is a selective increase of steel sheet ductility by press hardening process design in areas where the component is required to deform plastically during crash tests. To this end, process designers require information about microstructure and mechanical properties as a function of the wide spectrum of cooling rates and sequences and austenitizing treatment conditions that can be encountered in production environments. In the present work, a Continuous Cooling Transformation (CCT) diagram with corresponding material properties of sheet steel 22MnB5 was determined for a wide spectrum of cooling rates. Heating and cooling programs were conducted in a quenching dilatometer. Motivated by the importance of residual elasticity in crash test performance, this property was measured using a micro-bending test and the results were integrated into the CCT diagrams to complement the hardness testing results. This information is essential for the process design of press hardening of sheet components with gradient material properties.
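
    As a hedged illustration of how such CCT data feed into process design, the sketch below interpolates an expected hardness for a candidate cooling rate; the tabulated cooling rates and hardness values are placeholders, not the measured 22MnB5 data from this work.

      # Interpolate expected hardness from (placeholder) CCT-derived data.
      import numpy as np

      cooling_rates_K_s = np.array([1, 5, 10, 30, 50, 100])    # assumed grid
      hardness_hv = np.array([180, 230, 300, 420, 470, 500])   # placeholder HV

      candidate_rate = 25.0  # K/s, a tool-contact condition to be checked
      expected_hv = np.interp(candidate_rate, cooling_rates_K_s, hardness_hv)
      print(f"Expected hardness at {candidate_rate} K/s: {expected_hv:.0f} HV")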

  13. Process design of press hardening with gradient material property influence

    NASA Astrophysics Data System (ADS)

    Neugebauer, R.; Schieck, F.; Rautenstrauch, A.

    2011-05-01

    Press hardening is currently used in the production of automotive structures that require very high strength and controlled deformation during crash tests. Press hardening can achieve significant reductions of sheet thickness at constant strength and is therefore a promising technology for the production of lightweight and energy-efficient automobiles. The manganese-boron steel 22MnB5 has been implemented in sheet press hardening owing to its excellent hot formability, high hardenability, and good temperability even at low cooling rates. However, press-hardened components have shown poor ductility and cracking at relatively small strains. A possible solution to this problem is a selective increase of steel sheet ductility by press hardening process design in areas where the component is required to deform plastically during crash tests. To this end, process designers require information about microstructure and mechanical properties as a function of the wide spectrum of cooling rates and sequences and austenitizing treatment conditions that can be encountered in production environments. In the present work, a Continuous Cooling Transformation (CCT) diagram with corresponding material properties of sheet steel 22MnB5 was determined for a wide spectrum of cooling rates. Heating and cooling programs were conducted in a quenching dilatometer. Motivated by the importance of residual elasticity in crash test performance, this property was measured using a micro-bending test and the results were integrated into the CCT diagrams to complement the hardness testing results. This information is essential for the process design of press hardening of sheet components with gradient material properties.

  14. Design and Process Considerations for a Tunneling Tip Accelerometer

    NASA Technical Reports Server (NTRS)

    Zavracky, Paul M.; McClelland, Bob; Warner, Keith; Sherman, Neil; Hartley, Frank

    1995-01-01

    In this paper, we discuss issues related to the fabrication of a bulk micromachined single-axis accelerometer. The accelerometer is designed to have a full-scale range of ten milli-g and a sensitivity of tens of nano-g. During the process, three distinctly different die are fabricated. These are subsequently assembled using an alloy bonding technique. During the bonding operation, electrical contacts are made between layers. The accelerometer is controlled by electrostatic force plates above and below the proof mass. The lower electrode has a dual role. In operation, it provides a necessary control electrode. When not in operation, it is used to clamp the proof mass and prevent its motion. Results of the fabrication process and initial testing of the clamping function are reported.

  15. Design and programming of systolic array cells for signal processing

    SciTech Connect

    Smith, R.A.W.

    1989-01-01

    This thesis presents a new methodology for the design, simulation, and programming of systolic arrays in which the algorithms and architecture are simultaneously optimized. The algorithms determine the initial architecture, and simulation is used to optimize the architecture. The simulator provides a register-transfer level model of a complete systolic array computation. To establish the validity of this design methodology two novel programmable systolic array cells were designed and programmed. The cells were targeted for applications in high-speed signal processing and associated matrix computations. A two-chip programmable systolic array cell using a 16-bit multiplier-accumulator chip and a semi-custom VLSI controller chip was designed and fabricated. A low chip count allows large arrays to be constructed, but the cell is flexible enough to be a building-block for either one- or two-dimensional systolic arrays. Another more flexible and powerful cell using a 32-bit floating-point processor and a second VLSI controller chip was also designed. It contains several architectural features that are unique in a systolic array cell: (1) each instruction is 32 bits, yet all resources can be updated every cycle, (2) two on-chip interchangeable memories are used, and (3) one input port can be used as either a global or local port. The key issues involved in programming the cells are analyzed in detail. A set of modules is developed which can be used to construct large programs in an effective manner. The utility of this programming approach is demonstrated with several important examples.
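
    The matrix computations targeted by these cells can be illustrated with a small, cycle-accurate software sketch of an output-stationary systolic matrix multiplication (written in Python, not in the cells' microcode); the array organization and register names are illustrative assumptions.

      # Cycle-accurate simulation of an N x N output-stationary systolic array:
      # A-operands flow right, B-operands flow down, each cell performs one
      # multiply-accumulate per cycle, and inputs are skewed by one cycle per
      # row/column so that matching operands meet in the correct cell.
      import numpy as np

      def systolic_matmul(A, B):
          N = A.shape[0]
          C = np.zeros((N, N))
          a_reg = np.zeros((N, N))   # horizontal (rightward-moving) registers
          b_reg = np.zeros((N, N))   # vertical (downward-moving) registers
          for t in range(3 * N - 2): # cycles until the skewed data drain out
              a_new = np.zeros_like(a_reg)
              b_new = np.zeros_like(b_reg)
              a_new[:, 1:] = a_reg[:, :-1]       # shift right
              b_new[1:, :] = b_reg[:-1, :]       # shift down
              for i in range(N):                 # inject skewed row operands
                  k = t - i
                  a_new[i, 0] = A[i, k] if 0 <= k < N else 0.0
              for j in range(N):                 # inject skewed column operands
                  k = t - j
                  b_new[0, j] = B[k, j] if 0 <= k < N else 0.0
              a_reg, b_reg = a_new, b_new
              C += a_reg * b_reg                 # one MAC per cell per cycle
          return C

      A = np.arange(9, dtype=float).reshape(3, 3)
      B = 2.0 * np.eye(3)
      assert np.allclose(systolic_matmul(A, B), A @ B)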

  16. Using instructional design process to improve design and development of Internet interventions.

    PubMed

    Hilgart, Michelle M; Ritterband, Lee M; Thorndike, Frances P; Kinzie, Mable B

    2012-01-01

    Given the wide reach and extensive capabilities of the Internet, it is increasingly being used to deliver comprehensive behavioral and mental health intervention and prevention programs. Their goals are to change user behavior, reduce unwanted complications or symptoms, and improve health status and health-related quality of life. Internet interventions have been found efficacious in addressing a wide range of behavioral and mental health problems, including insomnia, nicotine dependence, obesity, diabetes, depression, and anxiety. Despite the existence of many Internet-based interventions, there is little research to inform their design and development. A model for behavior change in Internet interventions has been published to help guide future Internet intervention development and to help predict and explain behavior changes and symptom improvement outcomes through the use of Internet interventions. An argument is made for grounding the development of Internet interventions within a scientific framework. To that end, the model highlights a multitude of design-related components, areas, and elements, including user characteristics, environment, intervention content, level of intervention support, and targeted outcomes. However, more discussion is needed regarding how the design of the program should be developed to address these issues. While there is little research on the design and development of Internet interventions, there is a rich, related literature in the field of instructional design (ID) that can be used to inform Internet intervention development. ID models are prescriptive models that describe a set of activities involved in the planning, implementation, and evaluation of instructional programs. Using ID process models has been shown to increase the effectiveness of learning programs in a broad range of contexts. ID models specify a systematic method for assessing the needs of learners (intervention users) to determine the gaps between current

  17. Using Instructional Design Process to Improve Design and Development of Internet Interventions

    PubMed Central

    Hilgart, Michelle M; Thorndike, Frances P; Kinzie, Mable B

    2012-01-01

    Given the wide reach and extensive capabilities of the Internet, it is increasingly being used to deliver comprehensive behavioral and mental health intervention and prevention programs. Their goals are to change user behavior, reduce unwanted complications or symptoms, and improve health status and health-related quality of life. Internet interventions have been found efficacious in addressing a wide range of behavioral and mental health problems, including insomnia, nicotine dependence, obesity, diabetes, depression, and anxiety. Despite the existence of many Internet-based interventions, there is little research to inform their design and development. A model for behavior change in Internet interventions has been published to help guide future Internet intervention development and to help predict and explain behavior changes and symptom improvement outcomes through the use of Internet interventions. An argument is made for grounding the development of Internet interventions within a scientific framework. To that end, the model highlights a multitude of design-related components, areas, and elements, including user characteristics, environment, intervention content, level of intervention support, and targeted outcomes. However, more discussion is needed regarding how the design of the program should be developed to address these issues. While there is little research on the design and development of Internet interventions, there is a rich, related literature in the field of instructional design (ID) that can be used to inform Internet intervention development. ID models are prescriptive models that describe a set of activities involved in the planning, implementation, and evaluation of instructional programs. Using ID process models has been shown to increase the effectiveness of learning programs in a broad range of contexts. ID models specify a systematic method for assessing the needs of learners (intervention users) to determine the gaps between current

  18. Calderon coal gasification Process Development Unit design and test program

    SciTech Connect

    Calderon, A.; Madison, E.; Probert, P.

    1992-01-01

    The Process Development Unit (PDU) was designed and constructed to demonstrate the novel Calderon gasification/hot gas cleanup process. In the process, run-of-mine high sulfur coal is first pyrolyzed to recover a rich gas (medium Btu gas), after which the resulting char is subjected to airblown gasification to yield a lean gas (low Btu gas). The process incorporates a proprietary integrated system for the conversion of coal to gases and for the hot cleanup of the gases which removes both particulate and sulfur components of the gaseous products. The yields are: a syngas (CO and H{sub 2} mix) suitable for further conversion to liquid fuel (e.g. methanol/gasoline), and a lean gas suitable to fuel the combustion turbine of a combined cycle power generation plant with very low levels of NO{sub x} (15 ppmv). The fused slag (from the gasified char ash content) and the sulfur recovered during the hot gas cleanup will be sold as by-products. The small quantity of spent sorbent generated will be combined with the coal feed as a fluxing agent for the slag. The small quantity of wastewater from slag drainings and steam generation blowdown will be mixed with the coal feed for disposal. The Calderon gasification/hot gas cleanup, which is a completely closed system, operates at a pressure suitable for combined cycle power generation.

  19. Calderon coal gasification Process Development Unit design and test program

    SciTech Connect

    Calderon, A.; Madison, E.; Probert, P.

    1992-11-01

    The Process Development Unit (PDU) was designed and constructed to demonstrate the novel Calderon gasification/hot gas cleanup process. In the process, run-of-mine high sulfur coal is first pyrolyzed to recover a rich gas (medium Btu gas), after which the resulting char is subjected to airblown gasification to yield a lean gas (low Btu gas). The process incorporates a proprietary integrated system for the conversion of coal to gases and for the hot cleanup of the gases which removes both particulate and sulfur components of the gaseous products. The yields are: a syngas (CO and H{sub 2} mix) suitable for further conversion to liquid fuel (e.g. methanol/gasoline), and a lean gas suitable to fuel the combustion turbine of a combined cycle power generation plant with very low levels of NO{sub x} (15 ppmv). The fused slag (from the gasified char ash content) and the sulfur recovered during the hot gas cleanup will be sold as by-products. The small quantity of spent sorbent generated will be combined with the coal feed as a fluxing agent for the slag. The small quantity of wastewater from slag drainings and steam generation blowdown will be mixed with the coal feed for disposal. The Calderon gasification/hot gas cleanup, which is a completely closed system, operates at a pressure suitable for combined cycle power generation.

  20. [Design of an HACCP program for a cocoa processing facility].

    PubMed

    López D'Sola, Patrizia; Sandia, María Gabriela; Bou Rached, Lizet; Hernández Serrano, Pilar

    2012-12-01

    The HACCP plan is a food safety management tool used to control physical, chemical and biological hazards associated with food processing throughout the processing chain. The aim of this work is to design a HACCP plan for a Venezuelan cocoa processing facility. The production of safe food products requires that the HACCP system be built upon a solid foundation of prerequisite programs such as Good Manufacturing Practices (GMP) and Sanitation Standard Operating Procedures (SSOP). The existence and effectiveness of these prerequisite programs were previously assessed. Good Agriculture Practices (GAP) audits of cocoa nib suppliers were performed. To develop the HACCP plan, the five preliminary tasks and the seven HACCP principles were accomplished according to Codex Alimentarius procedures. Three Critical Control Points (CCP) were identified using a decision tree: winnowing (control of ochratoxin A), roasting (Salmonella control) and metallic particle detection. For each CCP, critical limits, monitoring procedures, corrective actions, verification procedures, and documentation concerning all procedures and records appropriate to these principles and their application were established. Implementation and maintenance of the HACCP plan at this processing plant are suggested. Recently, ochratoxin A (OTA) has been associated with cocoa beans. Although separation of the shell from the nib has been reported as an effective measure to control this chemical hazard, a study of ochratoxin prevalence in cocoa beans produced in the country is recommended, as well as validation of the winnowing step. PMID:24020255
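
    The CCP identification step can be illustrated with a short sketch of Codex-style decision-tree logic; the four questions are paraphrased and simplified (the modify-the-step branch of question 1 is omitted), so the Codex Alimentarius text remains the authoritative reference.

      # Simplified, paraphrased Codex-style CCP decision tree for one hazard
      # at one processing step.
      def is_ccp(control_measures_exist, step_designed_to_eliminate,
                 contamination_could_exceed_limits, later_step_eliminates):
          if not control_measures_exist:
              return False          # (or modify the step, product or process)
          if step_designed_to_eliminate:
              return True
          if not contamination_could_exceed_limits:
              return False
          return not later_step_eliminates

      # Example: roasting is designed to eliminate Salmonella -> a CCP
      print(is_ccp(True, True, False, False))   # True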

  1. Architecting Usability Properties in the E-Learning Instructional Design Process

    ERIC Educational Resources Information Center

    Koohang, Alex; du Plessis, Jacques

    2004-01-01

    This paper advances a framework for architecting usability properties in the e-learning instructional design process. To understand the framework for architecting usability properties into the e-learning instructional design process, the following have been defined: instructional design process, e-learning instructional design process, usability…

  2. Design, processing, and testing of lsi arrays for space station

    NASA Technical Reports Server (NTRS)

    Lile, W. R.; Hollingsworth, R. J.

    1972-01-01

    The design of a MOS 256-bit Random Access Memory (RAM) is discussed. Technological achievements comprise computer simulations that accurately predict performance; aluminum-gate COS/MOS devices including a 256-bit RAM with current sensing; and a silicon-gate process that is being used in the construction of a 256-bit RAM with voltage sensing. The Si-gate process increases speed by reducing the overlap capacitance between gate and source-drain, thus reducing the crossover capacitance and allowing shorter interconnections. The design of a Si-gate RAM, which is pin-for-pin compatible with an RCA bulk silicon COS/MOS memory (type TA 5974), is discussed in full. The Integrated Circuit Tester (ICT) is limited to dc evaluation, but the diagnostics and data collecting are under computer control. The Silicon-on-Sapphire Memory Evaluator (SOS-ME, previously called SOS Memory Exerciser) measures power supply drain and performs a minimum number of tests to establish operation of the memory devices. The Macrodata MD-100 is a microprogrammable tester which has capabilities of extensive testing at speeds up to 5 MHz. Beam-lead technology was successfully integrated with SOS technology to make a simple device with beam leads. This device and the scribing are discussed.

  3. Fiber optic sensor design for chemical process and environmental monitoring

    NASA Astrophysics Data System (ADS)

    Mahendran, R. S.; Harris, D.; Wang, L.; Machavaram, V. R.; Chen, R.; Kukureka, St. N.; Fernando, G. F.

    2007-07-01

    Cure monitoring is a term used to describe the monitoring of the cross-linking reactions in a thermosetting resin system. Advanced fiber reinforced composites are being used increasingly in a number of industrial sectors including aerospace, marine, sport, automotive and civil engineering. There is a general realization that the processing conditions used to manufacture the composites can have a major influence on their hot-wet mechanical properties. This paper is concerned with the design and demonstration of a number of sensor designs for in-situ cure monitoring of a model thermosetting resin system. Simple fixtures were constructed to enable a pair of cleaved optical fibers with a defined gap between the end-faces to be held in position. The resin system was introduced into this gap and the cure kinetics were followed by transmission infrared spectroscopy. A semi-empirical model was used to describe the cure process using the data obtained at different cure temperatures. The same sensor system was used to detect the ingress of moisture in the cured resin system.

  4. Software Design Improvements. Part 2; Software Quality and the Design and Inspection Process

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R.; Packard, Michael H.; Ziemianski, Tom

    1997-01-01

    The application of assurance engineering techniques improves the duration of failure-free performance of software. The totality of features and characteristics of a software product are what determine its ability to satisfy customer needs. Software in safety-critical systems is very important to NASA. We follow the System Safety Working Group's definition of system safety software: 'The optimization of system safety in the design, development, use and maintenance of software and its integration with safety-critical systems in an operational environment.' 'If it is not safe, say so' has become our motto. This paper goes over methods that have been used by NASA to make software design improvements by focusing on software quality and the design and inspection process.

  5. Simulative design and process optimization of the two-stage stretch-blow molding process

    NASA Astrophysics Data System (ADS)

    Hopmann, Ch.; Rasche, S.; Windeck, C.

    2015-05-01

    The total production costs of PET bottles are significantly affected by the costs of raw material. Approximately 70 % of the total costs are spent for the raw material. Therefore, the stretch-blow molding industry intends to reduce total production costs through optimized material efficiency. However, there is often a trade-off between an optimized material efficiency and required product properties. Due to a multitude of complex boundary conditions, the design process of new stretch-blow molded products is still a challenging task and is often based on empirical knowledge. Application of current CAE-tools supports the design process by reducing development time and costs. This paper describes an approach to determine optimized preform geometry and corresponding process parameters iteratively. The wall thickness distribution and the local stretch ratios of the blown bottle are calculated in a three-dimensional process simulation. Thereby, the wall thickness distribution is correlated with an objective function and preform geometry as well as process parameters are varied by an optimization algorithm. Taking into account the correlation between material usage, process history and resulting product properties, integrative coupled simulation steps, e.g. structural analyses or barrier simulations, are performed. The approach is applied on a 0.5 liter PET bottle of Krones AG, Neutraubling, Germany. The investigations point out that the design process can be supported by applying this simulative optimization approach. In an optimization study the total bottle weight is reduced from 18.5 g to 15.5 g. The validation of the computed results is in progress.
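
    A schematic version of the iterative optimization loop described above is sketched below, with the three-dimensional process simulation replaced by a placeholder function; the parameter names, bounds and penalty weighting are illustrative assumptions, not values from the bottle study.

      # Minimize a weight proxy subject to a minimum-wall-thickness requirement,
      # with the process simulation replaced by a toy surrogate.
      from scipy.optimize import differential_evolution

      def simulate_wall_thickness(params):
          """Placeholder surrogate for the stretch-blow molding simulation.
          params = (preform wall thickness in mm, preheat temperature in deg C);
          returns a predicted minimum bottle wall thickness in mm."""
          preform_thk, preheat_temp = params
          return 0.08 * preform_thk + 0.0004 * (preheat_temp - 100.0)

      MIN_WALL_MM = 0.25   # assumed product requirement

      def objective(params):
          shortfall = max(0.0, MIN_WALL_MM - simulate_wall_thickness(params))
          return params[0] + 1e3 * shortfall  # preform thickness as weight proxy

      result = differential_evolution(objective,
                                      bounds=[(2.0, 5.0), (90.0, 130.0)], seed=0)
      print(result.x, result.fun)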

  6. Simulative design and process optimization of the two-stage stretch-blow molding process

    SciTech Connect

    Hopmann, Ch.; Rasche, S.; Windeck, C.

    2015-05-22

    The total production costs of PET bottles are significantly affected by the costs of raw material. Approximately 70 % of the total costs are spent for the raw material. Therefore, the stretch-blow molding industry intends to reduce total production costs through optimized material efficiency. However, there is often a trade-off between an optimized material efficiency and required product properties. Due to a multitude of complex boundary conditions, the design process of new stretch-blow molded products is still a challenging task and is often based on empirical knowledge. Application of current CAE-tools supports the design process by reducing development time and costs. This paper describes an approach to determine optimized preform geometry and corresponding process parameters iteratively. The wall thickness distribution and the local stretch ratios of the blown bottle are calculated in a three-dimensional process simulation. Thereby, the wall thickness distribution is correlated with an objective function and preform geometry as well as process parameters are varied by an optimization algorithm. Taking into account the correlation between material usage, process history and resulting product properties, integrative coupled simulation steps, e.g. structural analyses or barrier simulations, are performed. The approach is applied on a 0.5 liter PET bottle of Krones AG, Neutraubling, Germany. The investigations point out that the design process can be supported by applying this simulative optimization approach. In an optimization study the total bottle weight is reduced from 18.5 g to 15.5 g. The validation of the computed results is in progress.

  7. Climate Monitoring Satellite Designed in a Concurrent Engineering Process

    NASA Astrophysics Data System (ADS)

    Bauer, Waldemar; Braukhane, A.; Quantius, D.; Dumont, E.; Grundmann, J. T.; Romberg, O.

    An effective method of detecting greenhouse gases (GHGs such as CO2 and CH4) is the use of satellites operating in low Earth orbit (LEO). Satellite-based greenhouse gas emissions monitoring is challenging and imposes an ambitious level of requirements. Until now it has been common to use a purpose-built satellite bus for the corresponding scientific payload, or to install the payload on board a larger conventional satellite. These approaches fulfil all customer requirements but can be critical from a financial point of view. Between 2014 and 2020, no space-based CH4 detection capabilities, and at best limited CO2 detection capabilities, are planned internationally. In order to fill this gap the Institute for Environmental Physics (IUP) of the University of Bremen plans a GHG satellite mission with near-surface sensitivity called "CarbonSat". It shall perform synchronous global atmospheric CO2 and CH4 observations with the accuracy, precision and coverage needed to significantly advance our knowledge about the sources and sinks of greenhouse gases. In order to verify the technical and financial feasibility of a small satellite, a Concurrent Engineering Study (CE-study) has been performed at DLR Bremen, Germany. To reuse knowledge in compact satellite design, the Compact/SSB (Standard Satellite Bus) was chosen as the baseline design. The SSB has been developed by DLR and was already used for the BIRD (Bispectral Infra-Red Detection) mission but has also been adapted to ongoing missions like TET (Technologie-Erprobungs-Träger) and AsteroidFinder. This paper deals with the highly effective design process within the DLR CE-Facility and with the outcomes of the CE-study. It gives an overview of the design status as well as an outlook for comparable missions.

  8. Space Shuttle Ascent Flight Design Process: Evolution and Lessons Learned

    NASA Technical Reports Server (NTRS)

    Picka, Bret A.; Glenn, Christopher B.

    2011-01-01

    The Space Shuttle Ascent Flight Design team is responsible for defining a launch to orbit trajectory profile that satisfies all programmatic mission objectives and defines the ground and onboard reconfiguration requirements for this high-speed and demanding flight phase. This design, verification and reconfiguration process ensures that all applicable mission scenarios are enveloped within integrated vehicle and spacecraft certification constraints and criteria, and includes the design of the nominal ascent profile and trajectory profiles for both uphill and ground-to-ground aborts. The team also develops a wide array of associated training, avionics flight software verification, onboard crew and operations facility products. These key ground and onboard products provide the ultimate users and operators the necessary insight and situational awareness for trajectory dynamics, performance and event sequences, abort mode boundaries and moding, flight performance and impact predictions for launch vehicle stages for use in range safety, and flight software performance. These products also provide the necessary insight into, or reconfiguration of, communications and tracking systems, launch collision avoidance requirements, and day-of-launch crew targeting and onboard guidance, navigation and flight control updates that incorporate the final vehicle configuration and environment conditions for the mission. Over the course of the Space Shuttle Program, ascent trajectory design and mission planning has evolved in order to improve program flexibility and reduce cost, while maintaining outstanding data quality. Along the way, the team has implemented innovative solutions and technologies in order to overcome significant challenges. A number of these solutions may have applicability to future human spaceflight programs.

  9. Design of a Small Scale High Temperature Gas Loop for Process Heat Exchanger Design Tests

    SciTech Connect

    Hong, SungDeok; Oh, DongSeok; Lee, WonJae; Chang, JongHwa

    2006-07-01

    We designed a small scale gas loop that can simulate reference operating conditions, that is, a temperature up to 950 deg C and a pressure up to 6 MPa. The main objective of the loop is to screen candidate process heat exchanger designs of a very small capacity of 10 {approx} 20 kW. We arranged the components of a primary gas loop and a secondary SO{sub 3} loop. Design requirements were prepared for the safe design of a main heater, a hot-gas duct and a process heat exchanger that avoids the risk of failure owing to thermal stresses, flow-induced vibration or acoustic vibration in both nitrogen and helium media. In the primary and secondary loops, the hot-gas ducts are internally insulated by a ceramic fiber insulation material to protect the pressure housing from high gas temperatures. We determined the total pressure loss of the primary loop to be 66 kPa and the minimum outer diameter of the loop pressure pipe to be 90 mm at a hot location, which will prevent a thermal failure. The highly toxic SO{sub 3} secondary loop requires a scrubber and an SO{sub 3} collector for safety and for preventing contamination of the environment. (authors)
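
    As a rough illustration of the kind of pressure-loss estimate quoted above, the sketch below applies the Darcy-Weisbach relation; the friction factor, loop geometry, gas density and velocity are assumed values, not the paper's design data.

      # Darcy-Weisbach frictional pressure loss: dp = f * (L/D) * rho * v^2 / 2
      friction_factor = 0.02     # assumed Darcy friction factor
      length_m = 20.0            # assumed equivalent loop length
      diameter_m = 0.05          # assumed flow-channel inner diameter
      density_kg_m3 = 3.0        # helium at ~6 MPa and high temperature (assumed)
      velocity_m_s = 30.0        # assumed bulk velocity

      dp_pa = friction_factor * (length_m / diameter_m) \
              * 0.5 * density_kg_m3 * velocity_m_s**2
      print(f"Estimated frictional pressure loss: {dp_pa / 1e3:.0f} kPa")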

  10. Preconceptual design of a salt splitting process using ceramic membranes

    SciTech Connect

    Kurath, D.E.; Brooks, K.P.; Hollenberg, G.W.; Clemmer, R.; Balagopal, S.; Landro, T.; Sutija, D.P.

    1997-01-01

    Inorganic ceramic membranes for salt splitting of radioactively contaminated sodium salt solutions are being developed for treating U. S. Department of Energy tank wastes. The process consists of electrochemical separation of sodium ions from the salt solution using sodium (Na) Super Ion Conductors (NaSICON) membranes. The primary NaSICON compositions being investigated are based on rare- earth ions (RE-NaSICON). Potential applications include: caustic recycling for sludge leaching, regenerating ion exchange resins, inhibiting corrosion in carbon-steel tanks, or retrieving tank wastes; reducing the volume of low-level wastes to be disposed of; adjusting pH and reducing competing cations to enhance cesium ion exchange processes; reducing sodium in high-level-waste sludges; and removing sodium from acidic wastes to facilitate calcining. These applications encompass wastes stored at the Hanford, Savannah River, and Idaho National Engineering Laboratory sites. The overall project objective is to supply a salt splitting process unit that impacts the waste treatment and disposal flowsheets and meets user requirements. The potential flowsheet impacts include improving the efficiency of the waste pretreatment processes, reducing volume, and increasing the quality of the final waste disposal forms. Meeting user requirements implies developing the technology to the point where it is available as standard equipment with predictable and reliable performance. This report presents two preconceptual designs for a full-scale salt splitting process based on the RE-NaSICON membranes to distinguish critical items for testing and to provide a vision that site users can evaluate.
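
    A back-of-the-envelope sizing sketch based on Faraday's law is given below to show how membrane area scales with sodium throughput; the throughput and the sustainable current density are assumed values, not figures from the preconceptual designs.

      # Faraday's law sizing: I = z * F * n_dot; membrane area = I / current density
      F = 96485.0                    # C/mol, Faraday constant
      z = 1                          # charge number of Na+
      sodium_rate_mol_s = 0.5        # assumed Na+ throughput
      current_density_A_m2 = 1000.0  # assumed sustainable NaSICON current density

      total_current_A = z * F * sodium_rate_mol_s
      membrane_area_m2 = total_current_A / current_density_A_m2
      print(f"Current: {total_current_A / 1e3:.1f} kA, "
            f"membrane area: {membrane_area_m2:.1f} m^2")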

  11. Materials, design and processing of air encapsulated MEMS packaging

    NASA Astrophysics Data System (ADS)

    Fritz, Nathan T.

    This work uses a three-dimensional air cavity technology to improve the fabrication and functionality of microelectronic devices, the performance of on-board transmission lines, and the packaging of micro-electromechanical systems (MEMS). The air cavity process makes use of the decomposition of a patterned sacrificial polymer followed by the diffusion of its by-products through a curing polymer overcoat to obtain the embedded air structure. Applications and research of air cavities have focused on simple designs that concentrate on the size and functionality of the particular device. However, a lack of guidelines for fabrication, materials used, and structural design has led to mechanical stability issues and processing refinements. This work investigates improved air gap cavities for use in MEMS packaging processes, resulting in fewer fabrication flaws and lower cost. The identification of new materials, such as novel photo-definable organic/inorganic hybrid polymers, was studied for increased strength and rigidity due to their glass-like structure. A novel epoxy polyhedral oligomeric silsesquioxane (POSS) material was investigated and characterized for use as a photodefinable, permanent dielectric with improved mechanical properties. The POSS material improved the air gap fabrication because it served as a high-selectivity etch mask for patterning sacrificial materials as well as a cavity overcoat material with improved rigidity. An investigation of overcoat thickness and decomposition kinetics provided a fundamental understanding of the properties that impart mechanical stability to cavities of different shape and volume. Metallization of the cavities was investigated so as to provide hermetic sealing and improved cavity strength. The improved air cavity, wafer-level packages were tested using resonator-type devices and chip-level lead frame packaging. The air cavity package was molded under traditional lead frame molding pressures and tested for mechanical

  12. From Safe Nanomanufacturing to Nanosafe-by-Design processes

    NASA Astrophysics Data System (ADS)

    Schuster, F.; Lomello, F.

    2013-04-01

    Industrial needs in terms of multifunctional components are increasing. Many sectors are concerned, from integrated direct nanoparticle production to emerging combinations that include metal matrix composites (MMC), ductile ceramics and ceramic matrix composites, and polymer matrix composites (PMC) for bulk applications and advanced surface coatings in the automotive, aerospace, energy production and building fields. Moreover, domains with a planetary impact, such as environmental issues, as well as aspects such as health (toxicity) and hazard assessment (ignition and explosion severity), were also taken into account. Nanotechnologies play an important role in promoting innovation in the design and realization of multifunctional products for the future, either by improving usual products or by creating new functions and/or new products. Nevertheless, this huge evolution in materials can only be promoted by increasing social acceptance, by addressing the main technological and economic challenges, and by developing safety-oriented processes. Nowadays, a large number of nanoparticle developments are potentially scalable to industrial production. However, some doubts exist about the handling safety of the current technologies. For these reasons, the main purpose was to develop self-monitored automation of the production line, coupling different techniques in order to simplify processes such as in-situ growth of nanoparticles in a nanostructured matrix or over different substrates, and/or nanopowder synthesis, functionalization, safe dry or wet recovery, granulation, and consolidation in a single step, while monitoring processing parameters such as powder stoichiometry in real time. With the aim of assuring the traceability of the product during its whole life, starting from conception and including R&D, distribution and use were also considered. The optimization in terms of processing, recovery and conditioning

  13. On the optimal design of the disassembly and recovery processes

    SciTech Connect

    Xanthopoulos, A.; Iakovou, E.

    2009-05-15

    This paper tackles the problem of the optimal design of the recovery processes of the end-of-life (EOL) electric and electronic products, with a special focus on the disassembly issues. The objective is to recover as much ecological and economic value as possible, and to reduce the overall produced quantities of waste. In this context, a medium-range tactical problem is defined and a novel two-phased algorithm is presented for a remanufacturing-driven reverse supply chain. In the first phase, we propose a multicriteria/goal-programming analysis for the identification and the optimal selection of the most 'desirable' subassemblies and components to be disassembled for recovery, from a set of different types of EOL products. In the second phase, a multi-product, multi-period mixed-integer linear programming (MILP) model is presented, which addresses the optimization of the recovery processes, while taking into account explicitly the lead times of the disassembly and recovery processes. Moreover, a simulation-based solution approach is proposed for capturing the uncertainties in reverse logistics. The overall approach leads to an easy-to-use methodology that could support effectively middle level management decisions. Finally, the applicability of the developed methodology is illustrated by its application on a specific case study.
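
    A toy, single-period version of the first-phase selection idea is sketched below using the open-source PuLP library: choose which subassemblies to disassemble so that recovered value is maximized under a disassembly-capacity limit. The data and the knapsack-style formulation are illustrative; the paper's actual model is a multicriteria/goal-programming analysis followed by a multi-period MILP.

      # Select subassemblies for recovery: maximize value within a time budget.
      from pulp import LpProblem, LpVariable, LpMaximize, lpSum, LpBinary

      subassemblies = {          # name: (recovered value, disassembly time, min)
          "housing": (4.0, 3.0),
          "pcb": (9.0, 6.0),
          "battery": (5.0, 2.0),
          "display": (7.0, 5.0),
      }
      capacity_min = 8.0         # available disassembly time per unit

      prob = LpProblem("eol_recovery_selection", LpMaximize)
      x = {s: LpVariable(f"x_{s}", cat=LpBinary) for s in subassemblies}
      prob += lpSum(v * x[s] for s, (v, _) in subassemblies.items())
      prob += lpSum(t * x[s] for s, (_, t) in subassemblies.items()) <= capacity_min
      prob.solve()
      print({s: int(x[s].value()) for s in subassemblies})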

  14. Data Quality Objectives Process for Designation of K Basins Debris

    SciTech Connect

    WESTCOTT, J.L.

    2000-05-22

    The U.S. Department of Energy has developed a schedule and approach for the removal of spent fuels, sludge, and debris from the K East (KE) and K West (KW) Basins, located in the 100 Area at the Hanford Site. The project that is the subject of this data quality objective (DQO) process is focused on the removal of debris from the K Basins and onsite disposal of the debris at the Environmental Restoration Disposal Facility (ERDF). This material previously has been dispositioned at the Hanford Low-Level Burial Grounds (LLBGs) or Central Waste Complex (CWC). The goal of this DQO process and the resulting Sampling and Analysis Plan (SAP) is to provide the strategy for characterizing and designating the K-Basin debris to determine if it meets the Environmental Restoration Disposal Facility Waste Acceptance Criteria (WAC), Revision 3 (BHI 1998). A critical part of the DQO process is to agree on regulatory and WAC interpretation, to support preparation of the DQO workbook and SAP.

  15. On the optimal design of the disassembly and recovery processes.

    PubMed

    Xanthopoulos, A; Iakovou, E

    2009-05-01

    This paper tackles the problem of the optimal design of the recovery processes of the end-of-life (EOL) electric and electronic products, with a special focus on the disassembly issues. The objective is to recover as much ecological and economic value as possible, and to reduce the overall produced quantities of waste. In this context, a medium-range tactical problem is defined and a novel two-phased algorithm is presented for a remanufacturing-driven reverse supply chain. In the first phase, we propose a multicriteria/goal-programming analysis for the identification and the optimal selection of the most 'desirable' subassemblies and components to be disassembled for recovery, from a set of different types of EOL products. In the second phase, a multi-product, multi-period mixed-integer linear programming (MILP) model is presented, which addresses the optimization of the recovery processes, while taking into account explicitly the lead times of the disassembly and recovery processes. Moreover, a simulation-based solution approach is proposed for capturing the uncertainties in reverse logistics. The overall approach leads to an easy-to-use methodology that could support effectively middle level management decisions. Finally, the applicability of the developed methodology is illustrated by its application on a specific case study. PMID:19138507

  16. Superior metallic alloys through rapid solidification processing (RSP) by design

    SciTech Connect

    Flinn, J.E.

    1995-05-01

    Rapid solidification processing using powder atomization methods and the control of minor elements such as oxygen, nitrogen, and carbon can provide metallic alloys with superior properties and performance compared to conventionally processed alloys. Previous studies on nickel- and iron-base superalloys have provided the baseline information to properly couple RSP with alloy composition, and, therefore, enable alloys to be designed for performance improvements. The RSP approach produces powders, which need to be consolidated into suitable monolithic forms. This normally involves canning, consolidation, and decanning of the powders. Canning/decanning is expensive and raises the fabrication cost significantly above that of conventional, ingot metallurgy production methods. The cost differential can be offset by the superior performance of the RSP metallic alloys. However, without the performance database, it is difficult to convince potential users to adopt the RSP approach. Spray casting of the atomized molten droplets into suitable preforms for subsequent fabrication can be cost competitive with conventional processing. If the fine and stable microstructural features observed for the RSP approach are preserved during spray casting, a cost competitive product can be obtained that has superior properties and performance that cannot be obtained by conventional methods.

  17. Tools for efficient design of multicomponent separation processes

    NASA Astrophysics Data System (ADS)

    Huff, Joshua Lee

    formulation and the relative effect of capital and operating cost is weighed for an example feed. Previous methods based on Underwood's equations do not account for the temperature at which utilities are required. To account for this, a thermodynamic efficiency function is developed which allows the complete search space to be rank-listed in order of the exergy loss occurring within the configuration. Examining these results shows that this objective function favors configurations which move their reboiler and condenser duties to milder temperature exchangers. A graphical interface is presented which allows interpretation of any of the above results in a quick and intuitive fashion, complete with system flow and composition data and the ability to filter the complete search space based on numerical and structural criteria. This provides a unique way to compare and contrast configurations as well as allowing considerations like column retrofit and maximum controllability to be considered. Using all five of these screening techniques, the traditional intuition-based methods of separations process design can be augmented with analytical and algorithmic tools which enable selection of a process design with low cost and high efficiency.
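
    One simple way to score configurations on an exergy basis is sketched below: a heat duty Q supplied or removed at temperature T carries exergy Q(1 - T0/T), and configurations are ranked by their net exergy demand. This stand-in scoring and the duty/temperature numbers are illustrative assumptions, not the thesis's actual efficiency function.

      # Rank column configurations by a simple net-exergy-demand score.
      T0 = 298.15  # K, ambient (dead-state) temperature

      def carnot_factor(T_kelvin):
          return 1.0 - T0 / T_kelvin

      def exergy_score(exchangers):
          """exchangers: list of (duty_kW, T_K); duty > 0 reboiler, < 0 condenser."""
          return sum(q * carnot_factor(T) for q, T in exchangers)

      config_a = [(+1200.0, 430.0), (-1150.0, 330.0)]  # hotter utility levels
      config_b = [(+1250.0, 390.0), (-1200.0, 315.0)]  # milder utility levels
      ranked = sorted({"A": config_a, "B": config_b}.items(),
                      key=lambda kv: exergy_score(kv[1]))
      print([name for name, _ in ranked])   # lower exergy demand ranked first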

  18. Process and Prospects for the Designed Hydrograph, Lower Missouri River

    NASA Astrophysics Data System (ADS)

    Jacobson, R. B.; Galat, D. L.; Hay, C. H.

    2005-05-01

    The flow regime of the Lower Missouri River (LMOR, Gavins Point, SD to St. Louis, MO) is being redesigned to restore elements of natural variability while maintaining project purposes such as power production, flood control, water supply, and navigation. Presently, an experimental hydrograph alteration is planned for Spring, 2006. Similar to many large, multi-purpose rivers, the ongoing design process involves negotiation among many management and stakeholder groups. The negotiated process has simplified the hydrograph into two key elements -- the spring rise and the summer low - with emphasis on the influence of these elements on three threatened or endangered species. The spring rise has been hypothesized to perform three functions: build sandbars for nesting of the interior least tern and piping plover, provide episodic connectivity with low-lying flood plain, and provide a behavioral spawning cue for the pallid sturgeon. Among these, most emphasis has been placed on the spawning cue because concerns about downstream flood hazards have limited flow magnitudes to those that are thought to be geomorphically ineffective, and channelization and incision provide little opportunity for moderate flows to connect to the flood plain. Our analysis of the natural hydrologic regime provides some insight into possible spring rise design elements, including timing, rate of rise and fall, and length of spring flow pulses. The summer low has been hypothesized to emerge sandbars for nesting and to maximize area of shallow, slow water for rearing of larval and juvenile fish. Re-engineering of the navigation channel to provide greater diversity of habitat during navigation flows has been offered as an alternative to the summer low. Our analysis indicates that re-engineering has potential to increase habitat availability substantially, but the ecological results are so-far unknown. The designed hydrograph that emerges from the multi-objective process will likely represent a

  19. Conceptual Design for the Pilot-Scale Plutonium Oxide Processing Unit in the Radiochemical Processing Laboratory

    SciTech Connect

    Lumetta, Gregg J.; Meier, David E.; Tingey, Joel M.; Casella, Amanda J.; Delegard, Calvin H.; Edwards, Matthew K.; Jones, Susan A.; Rapko, Brian M.

    2014-08-05

    This report describes a conceptual design for a pilot-scale capability to produce plutonium oxide for use as exercise and reference materials, and for use in identifying and validating nuclear forensics signatures associated with plutonium production. This capability is referred to as the Pilot-scale Plutonium oxide Processing Unit (P3U), and it will be located in the Radiochemical Processing Laboratory at the Pacific Northwest National Laboratory. The key unit operations are described, including plutonium dioxide (PuO2) dissolution, purification of the Pu by ion exchange, precipitation, and conversion to oxide by calcination.

  20. Lignocellulosic ethanol: Technology design and its impact on process efficiency.

    PubMed

    Paulova, Leona; Patakova, Petra; Branska, Barbora; Rychtera, Mojmir; Melzoch, Karel

    2015-11-01

    This review provides current information on the production of ethanol from lignocellulosic biomass, with the main focus on relationships between process design and efficiency, expressed as ethanol concentration, yield and productivity. In spite of unquestionable advantages of lignocellulosic biomass as a feedstock for ethanol production (availability, price, non-competitiveness with food, waste material), many technological bottlenecks hinder its wide industrial application and competitiveness with 1st generation ethanol production. Among the main technological challenges are the recalcitrant structure of the material, and thus the need for extensive pretreatment (usually physico-chemical followed by enzymatic hydrolysis) to yield fermentable sugars, and a relatively low concentration of monosaccharides in the medium that hinder the achievement of ethanol concentrations comparable with those obtained using 1st generation feedstocks (e.g. corn or molasses). The presence of both pentose and hexose sugars in the fermentation broth, the price of cellulolytic enzymes, and the presence of toxic compounds that can inhibit cellulolytic enzymes and microbial producers of ethanol are major issues. In this review, different process configurations of the main technological steps (enzymatic hydrolysis, fermentation of hexose/and or pentose sugars) are discussed and their efficiencies are compared. The main features, benefits and drawbacks of simultaneous saccharification and fermentation (SSF), simultaneous saccharification and fermentation with delayed inoculation (dSSF), consolidated bioprocesses (CBP) combining production of cellulolytic enzymes, hydrolysis of biomass and fermentation into one step, together with an approach combining utilization of both pentose and hexose sugars are discussed and compared with separate hydrolysis and fermentation (SHF) processes. The impact of individual technological steps on final process efficiency is emphasized and the potential for use

  1. fMRI paradigm designing and post-processing tools.

    PubMed

    James, Jija S; Rajesh, Pg; Chandran, Anuvitha Vs; Kesavadas, Chandrasekharan

    2014-01-01

    In this article, we first review some aspects of functional magnetic resonance imaging (fMRI) paradigm designing for major cognitive functions by using stimulus delivery systems like Cogent, E-Prime, Presentation, etc., along with their technical aspects. We also review the stimulus presentation possibilities (block, event-related) for visual or auditory paradigms and their advantages in both clinical and research settings. The second part mainly focuses on various fMRI data post-processing tools such as Statistical Parametric Mapping (SPM) and Brain Voyager, and discusses the particulars of various preprocessing steps involved (realignment, co-registration, normalization, smoothing) in these software and also the statistical analysis principles of General Linear Modeling for final interpretation of a functional activation result. PMID:24851001
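
    The General Linear Modeling step mentioned above can be illustrated with a minimal NumPy sketch: a block-design regressor is fit to a synthetic voxel time series and tested with a t-statistic. SPM and Brain Voyager add HRF convolution, drift regressors and multiple-comparison correction on top of this; the data here are simulated.

      # GLM fit of one voxel: y = X beta + e, then a t-test on the task effect.
      import numpy as np

      rng = np.random.default_rng(1)
      n_scans = 120
      block = np.tile(np.r_[np.zeros(10), np.ones(10)], n_scans // 20)  # on/off
      X = np.column_stack([block, np.ones(n_scans)])        # [task, baseline]
      y = 2.5 * block + 100.0 + rng.normal(0.0, 1.0, n_scans)

      beta, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
      resid = y - X @ beta
      sigma2 = resid @ resid / (n_scans - X.shape[1])
      c = np.array([1.0, 0.0])                              # contrast: task
      t_stat = (c @ beta) / np.sqrt(sigma2 * c @ np.linalg.pinv(X.T @ X) @ c)
      print(f"task beta = {beta[0]:.2f}, t = {t_stat:.1f}")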

  2. fMRI paradigm designing and post-processing tools

    PubMed Central

    James, Jija S; Rajesh, PG; Chandran, Anuvitha VS; Kesavadas, Chandrasekharan

    2014-01-01

    In this article, we first review some aspects of functional magnetic resonance imaging (fMRI) paradigm designing for major cognitive functions by using stimulus delivery systems like Cogent, E-Prime, Presentation, etc., along with their technical aspects. We also review the stimulus presentation possibilities (block, event-related) for visual or auditory paradigms and their advantages in both clinical and research settings. The second part mainly focuses on various fMRI data post-processing tools such as Statistical Parametric Mapping (SPM) and Brain Voyager, and discusses the particulars of various preprocessing steps involved (realignment, co-registration, normalization, smoothing) in these software and also the statistical analysis principles of General Linear Modeling for final interpretation of a functional activation result. PMID:24851001

  3. An advanced microcomputer design for processing of semiconductor materials

    NASA Technical Reports Server (NTRS)

    Bjoern, L.; Lindkvist, L.; Zaar, J.

    1988-01-01

    In the Get Away Special 330 payload, two germanium samples doped with gallium will be processed. The aim of the experiments is to create a planar solid/liquid interface, and to study the breakdown of this interface as the crystal growth rate increases. For the experiments a gradient furnace was designed which is heated by resistive heaters. Cooling is provided by circulating gas from the atmosphere in the canister through cooling channels in the furnace. The temperatures along the sample are measured by platinum/rhodium thermocouples. The furnace is controlled by a microcomputer system, based upon the 80C88 processor. A data acquisition system is integrated into the system. In order to synchronize the different actions in time, a multitask manager is used.

  4. Hairy root culture: bioreactor design and process intensification.

    PubMed

    Stiles, Amanda R; Liu, Chun-Zhao

    2013-01-01

    The cultivation of hairy roots for the production of secondary metabolites offers numerous advantages; hairy roots have a fast growth rate, are genetically stable, and are relatively simple to maintain in phytohormone free media. Hairy roots provide a continuous source of secondary metabolites, and are useful for the production of chemicals for pharmaceuticals, cosmetics, and food additives. In order for hairy roots to be utilized on a commercial scale, it is necessary to scale-up their production. Over the last several decades, significant research has been conducted on the cultivation of hairy roots in various types of bioreactor systems. In this review, we discuss the advantages and disadvantages of various bioreactor systems and the major factors related to large-scale bioreactor cultures and process intensification technologies, and give an overview of the mathematical models and computer-aided methods that have been utilized for bioreactor design and development. PMID:23604206

  5. A Design Verification of the Parallel Pipelined Image Processings

    NASA Astrophysics Data System (ADS)

    Wasaki, Katsumi; Harai, Toshiaki

    2008-11-01

    This paper presents a case study of the design and verification of a parallel and pipelined image processing unit based on an extended Petri net, which is called a Logical Colored Petri net (LCPN). This is suitable for Flexible Manufacturing System (FMS) modeling and discussion of structural properties. LCPN is another family of colored place/transition nets (CPN) with the addition of the following features: integer value assignment of marks, representation of firing conditions as formulae based on mark values, and coupling of output procedures with transition firing. Therefore, to study the behavior of a system modeled with this net, we provide a means of searching the reachability tree for markings.
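
    The LCPN features listed above (integer-valued marks and firing conditions expressed as formulae over mark values) can be illustrated with a small interpreter sketch; this is an assumption-laden illustration, not the authors' tool or exact semantics.

      # Fire a transition whose guard is a formula over the values of the
      # tokens consumed; the produced token's value comes from an output rule.
      def fire(marking, transition):
          """marking: dict place -> list of int token values.
          transition: (input_places, guard, output_place, output_rule)."""
          inputs, guard, out_place, out_rule = transition
          if not all(marking.get(p) for p in inputs):
              return False                      # some input place is empty
          values = [marking[p][0] for p in inputs]
          if not guard(values):
              return False                      # firing condition not met
          for p in inputs:                      # consume one token per input
              marking[p].pop(0)
          marking.setdefault(out_place, []).append(out_rule(values))
          return True

      marking = {"raw": [3, 5], "tool_free": [1]}
      t_process = (["raw", "tool_free"], lambda v: v[0] > 2,
                   "done", lambda v: v[0] * 10)
      print(fire(marking, t_process), marking)
      # True {'raw': [5], 'tool_free': [], 'done': [30]}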

  6. Identifying User Needs and the Participative Design Process

    NASA Astrophysics Data System (ADS)

    Meiland, Franka; Dröes, Rose-Marie; Sävenstedt, Stefan; Bergvall-Kåreborn, Birgitta; Andersson, Anna-Lena

    As the number of persons with dementia increases and also the demands on care and support at home, additional solutions to support persons with dementia are needed. The COGKNOW project aims to develop an integrated, user-driven cognitive prosthetic device to help persons with dementia. The project focuses on support in the areas of memory, social contact, daily living activities and feelings of safety. The design process is user-participatory and consists of iterative cycles at three test sites across Europe. In the first cycle persons with dementia and their carers (n = 17) actively participated in the developmental process. Based on their priorities of needs and solutions, on their disabilities and after discussion between the team, a top four list of Information and Communication Technology (ICT) solutions was made and now serves as the basis for development: in the area of remembering - day and time orientation support, find mobile service and reminding service, in the area of social contact - telephone support by picture dialling, in the area of daily activities - media control support through a music playback and radio function, and finally, in the area of safety - a warning service to indicate when the front door is open and an emergency contact service to enhance feelings of safety. The results of this first project phase show that, in general, the people with mild dementia as well as their carers were able to express and prioritize their (unmet) needs, and the kind of technological assistance they preferred in the selected areas. In next phases it will be tested if the user-participatory design and multidisciplinary approach employed in the COGKNOW project result in a user-friendly, useful device that positively impacts the autonomy and quality of life of persons with dementia and their carers.

  7. Design Process of an Area-Efficient Photobioreactor

    PubMed Central

    Janssen, Marcel; Tramper, Johannes; Wijffels, René H.

    2008-01-01

    This article describes the design process of the Green Solar Collector (GSC), an area-efficient photobioreactor for the outdoor cultivation of microalgae. The overall goal has been to design a system in which all incident sunlight on the area covered by the reactor is delivered to the algae at such intensities that the light energy can be efficiently used for biomass formation. A statement of goals is formulated and constraints are specified to which the GSC needs to comply. Specifications are generated for a prototype whose form and function achieve the stated goals and satisfy the specified constraints. This results in a design in which sunlight is captured into vertical plastic light guides. Sunlight reflects internally in the guide and eventually scatters out of the light guide into flat-panel photobioreactor compartments. Sunlight is focused on top of the light guides by dual-axis positioning of linear Fresnel lenses. The shape and material of the light guide are such that light is maintained in the guides when surrounded by air. The bottom part of a light guide is sandblasted to obtain a more uniform distribution of light inside the bioreactor compartment and is triangular in shape to ensure the efflux of all light out of the guide. Dimensions of the guide are such that light enters the flat-panel photobioreactor compartment at intensities that can be efficiently used by the biomass present. The integration of light capturing, transportation, distribution and usage is such that high biomass productivities per area can be achieved. PMID:18266033

  8. Design process of an area-efficient photobioreactor.

    PubMed

    Zijffers, Jan-Willem F; Janssen, Marcel; Tramper, Johannes; Wijffels, René H

    2008-01-01

    This article describes the design process of the Green Solar Collector (GSC), an area-efficient photobioreactor for the outdoor cultivation of microalgae. The overall goal has been to design a system in which all incident sunlight on the area covered by the reactor is delivered to the algae at such intensities that the light energy can be efficiently used for biomass formation. A statement of goals is formulated and constraints are specified to which the GSC needs to comply. Specifications are generated for a prototype whose form and function achieve the stated goals and satisfy the specified constraints. This results in a design in which sunlight is captured into vertical plastic light guides. Sunlight reflects internally in the guide and eventually scatters out of the light guide into flat-panel photobioreactor compartments. Sunlight is focused on top of the light guides by dual-axis positioning of linear Fresnel lenses. The shape and material of the light guide are such that light is maintained in the guides when surrounded by air. The bottom part of a light guide is sandblasted to obtain a more uniform distribution of light inside the bioreactor compartment and is triangular in shape to ensure the efflux of all light out of the guide. Dimensions of the guide are such that light enters the flat-panel photobioreactor compartment at intensities that can be efficiently used by the biomass present. The integration of light capturing, transportation, distribution and usage is such that high biomass productivities per area can be achieved. PMID:18266033

  9. Design of dual working electrodes for concentration process in metalloimmunoassay.

    PubMed

    Hori, Nobuyasu; Chikae, Miyuki; Kirimura, Hiroya; Takamura, Yuzuru

    2016-10-01

    Electrochemical immunosensing, particularly through a metalloimmunoassay, is a promising approach for the development of point-of-care (POC) diagnostic devices. This study investigated the structure of dual working electrodes (W1 and W2) used in a silver nanoparticle-labeled sandwich-type immunoassay and silver concentration process, paying special attention to the position of W1 relative to W2. New structures of the dual working electrodes were fabricated for efficient silver concentration and evaluated experimentally, showing that the duration of prereduction before current measurement decreased from 480 s to 300 s when W1 was rearranged from 1 line into 2 lines or 6 parts. The experimental results were also compared with numerical simulations based on three-dimensional diffusion, and the prereduction step closely followed the three-dimensional diffusion equation. Using numerical simulations, ideal structures of the dual working electrodes were designed based on the relationships between the electrode structure and the prereduction duration or the LOD. In the case of 36 lines at an area ratio of W1 to W1 + W2 of 1 to 10, the prereduction duration decreased to 96 s. The dual working electrodes designed in this study promise to shorten the total analysis time and lower the LOD for POC diagnostics. PMID:27572238
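
    As a rough, one-dimensional stand-in for the diffusion-limited prereduction behavior discussed above (the paper uses three-dimensional simulations), the sketch below integrates a 1-D diffusion equation toward a perfectly collecting electrode with an explicit finite-difference scheme and reports the time needed to collect most of the dissolved silver. The diffusion coefficient, diffusion-layer thickness, and 95% cutoff are assumed values, not parameters from the study.

```python
import numpy as np

# 1-D stand-in for the diffusion-limited prereduction step: silver ions diffuse
# toward a perfectly collecting electrode at x = 0 (assumed parameters).
D = 1.5e-9          # diffusion coefficient, m^2/s (typical ionic value, assumed)
L = 200e-6          # diffusion-layer thickness, m (assumed)
nx = 100
dx = L / nx
dt = 0.4 * dx**2 / D        # within the explicit-scheme stability limit
c = np.ones(nx)             # normalized concentration profile
c0 = c.sum()

t, collected = 0.0, 0.0
while collected < 0.95 * c0:            # stop when 95% of the ions are collected
    c_new = c.copy()
    c_new[1:-1] += D * dt / dx**2 * (c[2:] - 2 * c[1:-1] + c[:-2])
    c_new[0] = 0.0                      # collecting electrode: c = 0
    c_new[-1] = c_new[-2]               # far boundary: no flux
    collected = c0 - c_new.sum()
    c = c_new
    t += dt

print(f"time to collect 95% of the ions: {t:.0f} s")
```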

  10. Process and reactor design for biophotolytic hydrogen production.

    PubMed

    Tamburic, Bojan; Dechatiwongse, Pongsathorn; Zemichael, Fessehaye W; Maitland, Geoffrey C; Hellgardt, Klaus

    2013-07-14

    The green alga Chlamydomonas reinhardtii has the ability to produce molecular hydrogen (H2), a clean and renewable fuel, through the biophotolysis of water under sulphur-deprived anaerobic conditions. The aim of this study was to advance the development of a practical and scalable biophotolytic H2 production process. Experiments were carried out using a purpose-built flat-plate photobioreactor, designed to facilitate green algal H2 production at the laboratory scale and equipped with a membrane-inlet mass spectrometry system to accurately measure H2 production rates in real time. The nutrient control method of sulphur deprivation was used to achieve spontaneous H2 production following algal growth. Sulphur dilution and sulphur feed techniques were used to extend algal lifetime in order to increase the duration of H2 production. The sulphur dilution technique proved effective at encouraging cyclic H2 production, resulting in alternating Chlamydomonas reinhardtii recovery and H2 production stages. The sulphur feed technique enabled photobioreactor operation in chemostat mode, resulting in a small improvement in H2 production duration. A conceptual design for a large-scale photobioreactor was proposed based on these experimental results. This photobioreactor has the capacity to enable continuous and economical H2 and biomass production using green algae. The success of these complementary approaches demonstrates that engineering advances can lead to improvements in the scalability and affordability of biophotolytic H2 production, giving increased confidence that H2 can fulfil its potential as a sustainable fuel of the future. PMID:23689756

  11. Optimum Design Of Addendum Surfaces In Sheet Metal Forming Process

    NASA Astrophysics Data System (ADS)

    Debray, K.; Sun, Z. C.; Radjai, R.; Guo, Y. Q.; Dai, L.; Gu, Y. X.

    2004-06-01

    The design of addendum surfaces in the sheet forming process is very important for product quality, but it is very time-consuming and requires tedious trial-and-error corrections. In this paper, we propose a methodology to automatically generate the addendum surfaces and then to optimize them using a forming modelling solver. The surfaces' parameters are taken as design variables and modified in the course of optimization. The finite element mesh is created on the initial addendum surfaces and mapped onto the modified surfaces without a remeshing operation. Feasible Sequential Quadratic Programming (FSQP) is adopted as the optimization algorithm. Two objective functions are used: the first is the thickness function, which minimizes the thickness variation on the workpiece; the second is the appearance function, which aims to avoid scratching defects on the external surfaces of panels. The FSQP is combined with our "Inverse Approach" or "One Step Approach", which is a very fast forming solver. This leads to a very efficient optimization procedure. The present methodology is applied to a square box. The addendum surfaces are characterised by four geometrical variables. The influence of the optimization criteria is studied and discussed.
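
    To make the optimization loop concrete, the sketch below minimizes a thickness-variation objective over four geometric design variables using SciPy's SLSQP solver as a stand-in for FSQP; the forming solver is replaced by a hypothetical analytic surrogate, so the parameter names and cost model are illustrative assumptions rather than the authors' Inverse Approach.

```python
import numpy as np
from scipy.optimize import minimize

def forming_surrogate(params):
    """Hypothetical stand-in for the fast 'Inverse Approach' forming solver:
    returns simulated sheet thicknesses along a section for a given geometry."""
    radius, wall_angle, depth, draw_bead = params
    base = 1.0 - 0.02 * depth / radius                        # nominal thinning
    spread = 0.05 * abs(wall_angle - 12.0) + 0.01 * abs(draw_bead - 0.5)
    return base + spread * np.sin(np.linspace(0.0, np.pi, 50))

def thickness_objective(params):
    """Thickness criterion: minimize thickness variation over the workpiece."""
    return float(np.var(forming_surrogate(params)))

x0 = np.array([8.0, 15.0, 30.0, 0.4])            # radius, wall angle, depth, bead
bounds = [(5.0, 15.0), (5.0, 25.0), (20.0, 40.0), (0.1, 1.0)]
res = minimize(thickness_objective, x0, method="SLSQP", bounds=bounds)
print("optimized addendum parameters:", np.round(res.x, 3),
      "thickness variance:", res.fun)
```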

  12. Working on the Boundaries: Philosophies and Practices of the Design Process

    NASA Technical Reports Server (NTRS)

    Ryan, R.; Blair, J.; Townsend, J.; Verderaime, V.

    1996-01-01

    While the systems engineering process is a formal, contractually binding program management technique, the design process is the informal practice of achieving the design project requirements throughout all design phases of the systems engineering process. The design process and organization are system and component dependent. Informal reviews include technical information meetings and concurrent engineering sessions, and formal technical discipline reviews are conducted through the systems engineering process. This paper discusses and references major philosophical principles in the design process, identifies its role in interacting systems and discipline analyses and integrations, and illustrates the process application in experienced aerostructural designs.

  13. MIMO variable structure controller design for a bioreactor benchmark process.

    PubMed

    Efe, M O

    2007-10-01

    In this paper, variable structure control of a bioreactor is studied. The process has two state variables, cell mass and nutrient amount, and two control inputs to maintain the state variables at their desired levels. Although the state space representation of the system seems simple, the system displays several challenges that make it necessary to develop a good flowrate (control) management strategy. Due to the plant-model mismatch, a variable structure control technique is applied, and it is seen that the sliding subspace is reached in finite time and that the behavior thereafter is insensitive to considerable degrees of variation in the parameters and disturbances. The design is based on the nominal model, and a comparison with a feedback linearizing controller is presented. The objective of the paper is to illustrate the efficacy of MIMO sliding mode control on a benchmark problem. Overall, the results with the proposed controller demonstrate the following desirable characteristics: (i) very good tracking precision, (ii) small percent overshoot values, and (iii) good decoupling of the process states. PMID:17521653
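
    The sketch below is a minimal single-input illustration of the sliding-mode idea on a toy bioreactor model (cell mass and nutrient states, dilution rate as input). The Monod growth model, gains, setpoint, and switching surface are assumed for illustration and do not reproduce the paper's MIMO benchmark or controller.

```python
import numpy as np

# Toy bioreactor: x1 = cell mass, x2 = nutrient; u = dilution (flow) rate.
# Single-input sliding-mode regulation of x1 toward a setpoint (illustrative
# model and gains; not the MIMO benchmark process from the paper).
mu_max, k_s, s_in = 0.4, 0.1, 1.0    # assumed Monod parameters and feed concentration
x = np.array([0.2, 0.8])             # initial cell mass and nutrient
x1_ref, lam, eta, dt = 0.5, 2.0, 1.0, 0.01

def dynamics(x, u):
    mu = mu_max * x[1] / (k_s + x[1])          # Monod specific growth rate
    return np.array([(mu - u) * x[0],
                     u * (s_in - x[1]) - mu * x[0]])

for _ in range(5000):
    s = x[0] - x1_ref                          # switching surface s = x1 - x1_ref
    mu = mu_max * x[1] / (k_s + x[1])
    # equivalent control (u = mu keeps s_dot = 0 for the nominal model) plus a
    # reaching term; tanh() is a smooth stand-in for sign() to limit chattering
    u = mu + (lam * s + eta * np.tanh(s / 0.05)) / max(x[0], 1e-6)
    u = float(np.clip(u, 0.0, 2.0))            # dilution rate must stay feasible
    x = x + dt * dynamics(x, u)

print("final cell mass:", round(float(x[0]), 3), "target:", x1_ref)
```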

  14. HYBRID SULFUR PROCESS REFERENCE DESIGN AND COST ANALYSIS

    SciTech Connect

    Gorensek, M.; Summers, W.; Boltrunis, C.; Lahoda, E.; Allen, D.; Greyvenstein, R.

    2009-05-12

    This report documents a detailed study to determine the expected efficiency and product costs for producing hydrogen via water-splitting using energy from an advanced nuclear reactor. It was determined that the overall efficiency from nuclear heat to hydrogen is high, and the cost of hydrogen is competitive under a high energy cost scenario. It would require over 40% more nuclear energy to generate an equivalent amount of hydrogen using conventional water-cooled nuclear reactors combined with water electrolysis compared to the proposed plant design described herein. There is a great deal of interest worldwide in reducing dependence on fossil fuels, while also minimizing the impact of the energy sector on global climate change. One potential opportunity to contribute to this effort is to replace the use of fossil fuels for hydrogen production by the use of water-splitting powered by nuclear energy. Hydrogen production is required for fertilizer (e.g. ammonia) production, oil refining, synfuels production, and other important industrial applications. It is typically produced by reacting natural gas, naphtha or coal with steam, which consumes significant amounts of energy and produces carbon dioxide as a byproduct. In the future, hydrogen could also be used as a transportation fuel, replacing petroleum. New processes are being developed that would permit hydrogen to be produced from water using only heat or a combination of heat and electricity produced by advanced, high temperature nuclear reactors. The U.S. Department of Energy (DOE) is developing these processes under a program known as the Nuclear Hydrogen Initiative (NHI). The Republic of South Africa (RSA) also is interested in developing advanced high temperature nuclear reactors and related chemical processes that could produce hydrogen fuel via water-splitting. This report focuses on the analysis of a nuclear hydrogen production system that combines the Pebble Bed Modular Reactor (PBMR), under development by

  15. Using experimental design modules for process characterization in manufacturing/materials processes laboratories

    NASA Technical Reports Server (NTRS)

    Ankenman, Bruce; Ermer, Donald; Clum, James A.

    1994-01-01

    Modules dealing with statistical experimental design (SED), process modeling and improvement, and response surface methods have been developed and tested in two laboratory courses. One course was a manufacturing processes course in Mechanical Engineering and the other was a materials processing course in Materials Science and Engineering. Each module is used as an 'experiment' in the course with the intent that subsequent course experiments will use SED methods for analysis and interpretation of data. Evaluation of the modules' effectiveness has been done both by survey questionnaires and by inclusion of the module methodology in course examination questions. Results of the evaluation have been very positive. Those evaluation results and details of the modules' content and implementation are presented. The modules represent an important component for updating laboratory instruction and for providing training in quality for improved engineering practice.

  16. (New process modeling, design and control strategies for energy efficiency, high product quality and improved productivity in the process industries)

    SciTech Connect

    Not Available

    1991-01-01

    Highlights are reported of work to date on: resilient design and control of chemical reactors (polymerization, packed bed), operation of complex processing systems (compensators for multivariable systems with delays and Right Half Plane zeroes, process identification and controller design for multivariable systems, nonlinear systems control, distributed parameter systems), and computer-aided design software (CONSYD, POLYRED, expert systems). 15 figs, 54 refs. (DLC)

  17. [New process modeling, design and control strategies for energy efficiency, high product quality and improved productivity in the process industries

    SciTech Connect

    Not Available

    1991-12-31

    Highlights are reported of work to date on: resilient design and control of chemical reactors (polymerization, packed bed), operation of complex processing systems (compensators for multivariable systems with delays and Right Half Plane zeroes, process identification and controller design for multivariable systems, nonlinear systems control, distributed parameter systems), and computer-aided design software (CONSYD, POLYRED, expert systems). 15 figs, 54 refs. (DLC)

  18. A Concept Transformation Learning Model for Architectural Design Learning Process

    ERIC Educational Resources Information Center

    Wu, Yun-Wu; Weng, Kuo-Hua; Young, Li-Ming

    2016-01-01

    Generally, in the foundation course of architectural design, much emphasis is placed on teaching of the basic design skills without focusing on teaching students to apply the basic design concepts in their architectural designs or promoting students' own creativity. Therefore, this study aims to propose a concept transformation learning model to…

  19. Universal Design in Postsecondary Education: Process, Principles, and Applications

    ERIC Educational Resources Information Center

    Burgstahler, Sheryl

    2009-01-01

    Designing any product or environment involves the consideration of many factors, including aesthetics, engineering options, environmental issues, safety concerns, industry standards, and cost. Typically, designers focus their attention on the average user. In contrast, universal design (UD), according to the Center for Universal Design, "is the…

  20. 23 CFR 636.109 - How does the NEPA process relate to the design-build procurement process?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 23 Highways 1 2010-04-01 2010-04-01 false How does the NEPA process relate to the design-build procurement process? 636.109 Section 636.109 Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION ENGINEERING AND TRAFFIC OPERATIONS DESIGN-BUILD CONTRACTING General § 636.109 How does the NEPA process relate to the...

  1. Design of a tomato packing system by image processing and optimization processing

    NASA Astrophysics Data System (ADS)

    Li, K.; Kumazaki, T.; Saigusa, M.

    2016-02-01

    In recent years, with the development of environmental control systems in plant factories, tomato production has rapidly increased in Japan. However, with the decline in the availability of agricultural labor, there is a need to automate grading, sorting and packing operations. In this research, we designed an automatic packing program that estimates tomato weight by image processing and packs the tomatoes in an optimized configuration. The weight was estimated from the pixel area properties after L*a*b* color model conversion, noise rejection, hole filling and boundary preprocessing. The packing optimization program was designed using a 0-1 knapsack algorithm for dynamic combinatorial optimization.
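
    A minimal sketch of the packing step follows, assuming 0-1 knapsack dynamic programming over integer tomato weights (in grams) against a per-box weight capacity; the weights and capacity are made-up example values, not data from the study.

```python
def pack_box(weights, capacity):
    """0-1 knapsack by dynamic programming: choose a subset of tomatoes whose
    total weight is as close to 'capacity' (grams) as possible without exceeding it."""
    n = len(weights)
    # best[i][w] = True if some subset of the first i tomatoes reaches total weight w
    best = [[False] * (capacity + 1) for _ in range(n + 1)]
    best[0][0] = True
    for i, wt in enumerate(weights, start=1):
        for w in range(capacity + 1):
            best[i][w] = best[i - 1][w] or (w >= wt and best[i - 1][w - wt])
    # find the largest reachable weight, then backtrack to recover the subset
    target = max(w for w in range(capacity + 1) if best[n][w])
    chosen, w = [], target
    for i in range(n, 0, -1):
        if not best[i - 1][w]:          # tomato i had to be used to reach w
            chosen.append(i - 1)
            w -= weights[i - 1]
    return target, sorted(chosen)

weights = [152, 147, 160, 138, 155, 149, 141]   # estimated tomato weights, g (example)
total, picked = pack_box(weights, capacity=600)
print(f"packed {total} g using tomatoes at indices {picked}")
```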

  2. The design of a distributed image processing and dissemination system

    SciTech Connect

    Rafferty, P.; Hower, L.

    1990-01-01

    The design and implementation of a distributed image processing and dissemination system was undertaken and accomplished as part of a prototype communication and intelligence (CI) system, the contingency support system (CSS), which is intended to support contingency operations of the Tactical Air Command. The system consists of six (6) Sun 3/180C workstations with integrated ITEX image processors and three (3) 3/50 diskless workstations located at four (4) system nodes (INEL, base, and mobiles). All 3/180C workstations are capable of image system server functions, whereas the 3/50s are image system clients only. Distribution is accomplished via both local and wide area networks using standard Defense Data Network (DDN) protocols (i.e., TCP/IP, et al.) and Defense Satellite Communication Systems (DSCS) compatible SHF Transportable Satellite Earth Terminals (TSET). Image applications utilize Sun's Remote Procedure Call (RPC) to facilitate the image system client and server relationships. The system provides functions to acquire, display, annotate, process, transfer, and manage images via an icon, panel, and menu oriented SunView (trademark) based user interface. Image spatial resolution is 512 x 480 with 8 bits/pixel black and white and 12/24 bits/pixel color depending on system configuration. Compression is used during various image display and transmission functions to reduce the dynamic range of image data to 12/6/3/2 bits/pixel depending on the application. Image acquisition is accomplished in real-time or near-real-time by special purpose ITEX image hardware. As a result, all image displays are highly interactive with attention given to subsecond response time. 3 refs., 7 figs.

  3. Multidisciplinary Design Optimization (MDO) Methods: Their Synergy with Computer Technology in Design Process

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw

    1998-01-01

    The paper identifies speed, agility, human interface, generation of sensitivity information, task decomposition, and data transmission (including storage) as important attributes for a computer environment to have in order to support engineering design effectively. It is argued that when examined in terms of these attributes the presently available environment can be shown to be inadequate; a radical improvement is needed, and it may be achieved by combining new methods that have recently emerged from multidisciplinary design optimization (MDO) with massively parallel processing computer technology. The caveat is that, for successful use of that technology in engineering computing, new paradigms for computing will have to be developed - specifically, innovative algorithms that are intrinsically parallel so that their performance scales up linearly with the number of processors. It may be speculated that the idea of simulating a complex behavior by interaction of a large number of very simple models may be an inspiration for the above algorithms; the cellular automata are an example. Because of the long lead time needed to develop and mature new paradigms, development should begin now, even though the widespread availability of massively parallel processing is still a few years away.

  4. Multidisciplinary Design Optimisation (MDO) Methods: Their Synergy with Computer Technology in the Design Process

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw

    1999-01-01

    The paper identifies speed, agility, human interface, generation of sensitivity information, task decomposition, and data transmission (including storage) as important attributes for a computer environment to have in order to support engineering design effectively. It is argued that when examined in terms of these attributes the presently available environment can be shown to be inadequate. A radical improvement is needed, and it may be achieved by combining new methods that have recently emerged from multidisciplinary design optimisation (MDO) with massively parallel processing computer technology. The caveat is that, for successful use of that technology in engineering computing, new paradigms for computing will have to be developed - specifically, innovative algorithms that are intrinsically parallel so that their performance scales up linearly with the number of processors. It may be speculated that the idea of simulating a complex behaviour by interaction of a large number of very simple models may be an inspiration for the above algorithms; the cellular automata are an example. Because of the long lead time needed to develop and mature new paradigms, development should begin now, even though the widespread availability of massively parallel processing is still a few years away.

  5. Critical Zone Experimental Design to Assess Soil Processes and Function

    NASA Astrophysics Data System (ADS)

    Banwart, Steve

    2010-05-01

    experimental design studies soil processes across the temporal evolution of the soil profile, from its formation on bare bedrock, through managed use as productive land to its degradation under longstanding pressures from intensive land use. To understand this conceptual life cycle of soil, we have selected 4 European field sites as Critical Zone Observatories. These are to provide data sets of soil parameters, processes and functions which will be incorporated into the mathematical models. The field sites are 1) the BigLink field station which is located in the chronosequence of the Damma Glacier forefield in alpine Switzerland and is established to study the initial stages of soil development on bedrock; 2) the Lysina Catchment in the Czech Republic which is representative of productive soils managed for intensive forestry, 3) the Fuchsenbigl Field Station in Austria which is an agricultural research site that is representative of productive soils managed as arable land and 4) the Koiliaris Catchment in Crete, Greece which represents degraded Mediterranean region soils, heavily impacted by centuries of intensive grazing and farming, under severe risk of desertification.

  6. An acceptable role for computers in the aircraft design process

    NASA Technical Reports Server (NTRS)

    Gregory, T. J.; Roberts, L.

    1980-01-01

    Some of the reasons why the computerization trend is not wholly accepted are explored for two typical cases: computer use in the technical specialties and computer use in aircraft synthesis. The factors that limit acceptance are traced, in part, to the large resources needed to understand the details of computer programs, the inability to include measured data as input to many of the theoretical programs, and the presentation of final results without supporting intermediate answers. Other factors are due solely to technical issues such as limited detail in aircraft synthesis and major simplifying assumptions in the technical specialties. These factors and others can be influenced by the technical specialist and aircraft designer. Some of these factors may become less significant as the computerization process evolves, but some issues, such as understanding large integrated systems, may remain in the future. Suggestions for improved acceptance include publishing computer programs so that they may be reviewed, edited, and read. Other mechanisms include extensive modularization of programs and ways to include measured information as part of the input to theoretical approaches.

  7. Optimal design of the satellite constellation arrangement reconfiguration process

    NASA Astrophysics Data System (ADS)

    Fakoor, Mahdi; Bakhtiari, Majid; Soleymani, Mahshid

    2016-08-01

    In this article, a novel approach is introduced for satellite constellation reconfiguration based on Lambert's theorem. Several critical problems arise in the reconfiguration phase, such as minimization of the overall fuel cost, collision avoidance between the satellites on the final orbital pattern, and the maneuvers necessary for the satellites to be deployed in the desired positions on the target constellation. To implement the reconfiguration phase of the satellite constellation arrangement at minimal cost, the hybrid Invasive Weed Optimization/Particle Swarm Optimization (IWO/PSO) algorithm is used to design sub-optimal transfer orbits for the satellites existing in the constellation. Also, the dynamics of the problem are modeled in such a way that optimal assignment of the satellites to the initial and target orbits and optimal orbital transfer are combined in one step. Finally, we claim that the presented idea, i.e., coupled non-simultaneous flight of satellites from the initial orbital pattern, leads to minimal cost. The obtained results show that by employing the presented method, the cost of the reconfiguration process is clearly reduced.
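
    The following is a small, self-contained particle swarm optimization sketch (plain PSO rather than the paper's hybrid IWO/PSO) that minimizes a hypothetical total transfer-cost function over the satellites' target phase angles; the cost surrogate is invented for illustration and is not Lambert-based.

```python
import numpy as np

rng = np.random.default_rng(0)

def transfer_cost(phases):
    """Hypothetical surrogate for the total delta-v of a constellation rearrangement:
    penalizes large phase changes and closely spaced satellites (not Lambert-based)."""
    move = np.sum(np.abs(np.angle(np.exp(1j * (phases - INITIAL)))))
    gaps = np.diff(np.sort(np.mod(phases, 2 * np.pi)))
    crowding = np.sum(np.exp(-gaps / 0.05))
    return move + 10.0 * crowding

N_SATS, N_PARTICLES, N_ITER = 6, 40, 300
INITIAL = rng.uniform(0, 2 * np.pi, N_SATS)          # current phase of each satellite

# standard PSO loop: inertia + cognitive + social velocity terms
x = rng.uniform(0, 2 * np.pi, (N_PARTICLES, N_SATS))
v = np.zeros_like(x)
pbest = x.copy()
pbest_cost = np.array([transfer_cost(p) for p in x])
gbest = pbest[np.argmin(pbest_cost)].copy()

for _ in range(N_ITER):
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
    x = x + v
    cost = np.array([transfer_cost(p) for p in x])
    improved = cost < pbest_cost
    pbest[improved], pbest_cost[improved] = x[improved], cost[improved]
    gbest = pbest[np.argmin(pbest_cost)].copy()

print("best rearrangement cost found:", round(float(np.min(pbest_cost)), 3))
```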

  8. Process of videotape making: presentation design, software, and hardware

    NASA Astrophysics Data System (ADS)

    Dickinson, Robert R.; Brady, Dan R.; Bennison, Tim; Burns, Thomas; Pines, Sheldon

    1991-06-01

    The use of technical videotape presentations for communicating abstractions of complex data is now becoming commonplace. While the use of videotapes in the day-to-day work of scientists and engineers is still in its infancy, their use at applications-oriented conferences is now growing rapidly. Despite these advancements, there is still very little written down about the process of making technical videotapes. For printed media, different presentation styles are well known for categories such as results reports, executive summary reports, and technical papers and articles. In this paper, the authors present ideas on the topic of technical videotape presentation design in a format that is worth referring to. They have started to document the ways in which the experience of media specialists, teaching professionals, and character animators can be applied to scientific animation. Software and hardware considerations are also discussed. For this portion, distinctions are drawn between the software and hardware required for computer animation (frame-at-a-time) productions and live recorded interaction with a computer graphics display.

  9. Design of the Laboratory-Scale Plutonium Oxide Processing Unit in the Radiochemical Processing Laboratory

    SciTech Connect

    Lumetta, Gregg J.; Meier, David E.; Tingey, Joel M.; Casella, Amanda J.; Delegard, Calvin H.; Edwards, Matthew K.; Orton, Robert D.; Rapko, Brian M.; Smart, John E.

    2015-05-01

    This report describes a design for a laboratory-scale capability to produce plutonium oxide (PuO2) for use in identifying and validating nuclear forensics signatures associated with plutonium production, as well as for use as exercise and reference materials. This capability will be located in the Radiochemical Processing Laboratory at the Pacific Northwest National Laboratory. The key unit operations are described, including PuO2 dissolution, purification of the Pu by ion exchange, precipitation, and re-conversion to PuO2 by calcination.

  10. Articulating the Resources for Business Process Analysis and Design

    ERIC Educational Resources Information Center

    Jin, Yulong

    2012-01-01

    Effective process analysis and modeling are important phases of the business process management lifecycle. When many activities and multiple resources are involved, it is very difficult to build a correct business process specification. This dissertation provides a resource perspective of business processes. It aims at a better process analysis…

  11. Predictive Modeling in Plasma Reactor and Process Design

    NASA Technical Reports Server (NTRS)

    Hash, D. B.; Bose, D.; Govindan, T. R.; Meyyappan, M.; Arnold, James O. (Technical Monitor)

    1997-01-01

    Research continues toward the improvement and increased understanding of high-density plasma tools. Such reactor systems are lauded for their independent control of ion flux and energy, enabling high etch rates with low ion damage, and for their improved ion velocity anisotropy resulting from thin collisionless sheaths and low neutral pressures. Still, with the transition to 300 mm processing, achieving etch uniformity and high etch rates concurrently may be a formidable task for such large diameter wafers, for which computational modeling can play an important role in successful reactor and process design. The inductively coupled plasma (ICP) reactor is the focus of the present investigation. The present work attempts to understand the fundamental physical phenomena of such systems through computational modeling. Simulations will be presented using both computational fluid dynamics (CFD) techniques and the direct simulation Monte Carlo (DSMC) method for argon and chlorine discharges. ICP reactors generally operate at pressures on the order of 1 to 10 mTorr. At such low pressures, rarefaction can be significant to the degree that the constitutive relations used in typical CFD techniques become invalid and a particle simulation must be employed. This work will assess the extent to which CFD can be applied and evaluate the degree to which accuracy is lost in predicting the phenomenon of interest, i.e., etch rate. If the CFD approach is found to be reasonably accurate when benchmarked against DSMC and experimental results, it has the potential to serve as a design tool due to its rapid turnaround time relative to DSMC. The continuum CFD simulation solves the governing equations for plasma flow using a finite difference technique with an implicit Gauss-Seidel Line Relaxation method for time marching toward a converged solution. The equation set consists of mass conservation for each species, separate energy equations for the electrons and heavy species, and momentum equations for the gas
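
    As a minimal illustration of the kind of relaxation solver mentioned above (not the CFD code described in the abstract), the sketch below applies Gauss-Seidel sweeps to a 1-D steady diffusion equation with a uniform source and checks the result against the analytic solution; the grid size, source strength, and boundary values are assumed.

```python
import numpy as np

# Pointwise Gauss-Seidel relaxation for a 1-D steady diffusion equation,
#   d^2 u / dx^2 = -S  on [0, 1],  u(0) = u(1) = 0        (assumed toy problem)
nx, S = 51, 1.0
dx = 1.0 / (nx - 1)
u = np.zeros(nx)

for sweep in range(100000):
    max_change = 0.0
    for i in range(1, nx - 1):          # sweep in order so each update uses
        new = 0.5 * (u[i - 1] + u[i + 1] + S * dx * dx)   # already-relaxed neighbors
        max_change = max(max_change, abs(new - u[i]))
        u[i] = new
    if max_change < 1e-9:               # converged when updates stall
        break

# analytic solution of the toy problem is u(x) = S*x*(1-x)/2, so u(0.5) = 0.125
print(f"converged after {sweep + 1} sweeps; u(0.5) = {u[nx // 2]:.5f} (exact 0.12500)")
```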

  12. The Changing Metropolitan Designation Process and Rural America

    ERIC Educational Resources Information Center

    Slifkin, Rebecca T.; Randolph, Randy; Ricketts, Thomas C.

    2004-01-01

    In June 2003, the Office of Management and Budget (OMB) released new county-based designations of Core Based Statistical Areas (CBSAs), replacing Metropolitan Statistical Area designations that were last revised in 1990. In this article, the new designations are briefly described, and counties that have changed classifications are identified.…

  13. Measuring the development process: A tool for software design evaluation

    NASA Technical Reports Server (NTRS)

    Moy, S. S.

    1980-01-01

    The design metrics evaluator (DME), a component of an automated software design analysis system, is described. The DME quantitatively evaluates software design attributes. Its use directs attention to areas of a procedure, module, or complete program having a high potential for error.

  14. Impact of gin saw tooth design on textile processing

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Toothed gin saws have been used to separate cotton fiber from the seed for over 200 years. There have been many saw tooth designs developed over the years. Most of these designs were developed by trial and error. A complete and scientific analysis of tooth design has never been done. It is not k...

  15. Reliability and the design process at Honeywell Avionics Division

    NASA Technical Reports Server (NTRS)

    Bezat, A.

    1981-01-01

    The division's philosophy for designed-in reliability and a comparison of reliability programs for space, manned military aircraft, and commercial aircraft, are presented. Topics include: the reliability interface with design and production; the concept phase through final proposal; the design, development, test and evaluation phase; the production phase; and the commonality among space, military, and commercial avionics.

  16. DECISION SUPPORT SYSTEM TO ENHANCE AND ENCOURAGE SUSTAINABLE CHEMICAL PROCESS DESIGN

    EPA Science Inventory

    There is an opportunity to minimize the potential environmental impacts (PEIs) of industrial chemical processes by providing process designers with timely data and models elucidating environmentally favorable design options. The second generation of the Waste Reduction (WAR) algo...

  17. Design, experimentation, and modeling of a novel continuous biodrying process

    NASA Astrophysics Data System (ADS)

    Navaee-Ardeh, Shahram

    Massive production of sludge in the pulp and paper industry has made effective sludge management an increasingly critical issue for the industry due to high landfill and transportation costs and complex regulatory frameworks for options such as sludge landspreading and composting. Sludge dewatering challenges are exacerbated at many mills due to improved in-plant fiber recovery coupled with increased production of secondary sludge, leading to a mixed sludge with a high proportion of biological matter which is difficult to dewater. In this thesis, a novel continuous biodrying reactor was designed and developed for drying pulp and paper mixed sludge to an economic dry solids level so that the dried sludge can be economically and safely combusted in a biomass boiler for energy recovery. In all experimental runs the economic dry solids level was achieved, proving the process successful. In the biodrying process, in addition to the forced aeration, the drying rates are enhanced by biological heat generated through the microbial activity of mesophilic and thermophilic microorganisms naturally present in the porous matrix of mixed sludge. This makes the biodrying process more attractive compared to conventional drying techniques because the reactor is a self-heating process. The reactor is divided into four nominal compartments and the mixed sludge dries as it moves downward in the reactor. The residence times were 4-8 days, which are 2-3 times shorter than the residence times achieved in a batch biodrying reactor previously studied by our research group for mixed sludge drying. A process variable analysis was performed to determine the key variable(s) in the continuous biodrying reactor. Several variables were investigated, namely: type of biomass feed, pH of biomass, nutrition level (C/N ratio), residence times, recycle ratio of biodried sludge, and outlet relative humidity profile along the reactor height. The key variables that were identified in the continuous

  18. Design, experimentation, and modeling of a novel continuous biodrying process

    NASA Astrophysics Data System (ADS)

    Navaee-Ardeh, Shahram

    Massive production of sludge in the pulp and paper industry has made effective sludge management an increasingly critical issue for the industry due to high landfill and transportation costs and complex regulatory frameworks for options such as sludge landspreading and composting. Sludge dewatering challenges are exacerbated at many mills due to improved in-plant fiber recovery coupled with increased production of secondary sludge, leading to a mixed sludge with a high proportion of biological matter which is difficult to dewater. In this thesis, a novel continuous biodrying reactor was designed and developed for drying pulp and paper mixed sludge to an economic dry solids level so that the dried sludge can be economically and safely combusted in a biomass boiler for energy recovery. In all experimental runs the economic dry solids level was achieved, proving the process successful. In the biodrying process, in addition to the forced aeration, the drying rates are enhanced by biological heat generated through the microbial activity of mesophilic and thermophilic microorganisms naturally present in the porous matrix of mixed sludge. This makes the biodrying process more attractive compared to conventional drying techniques because the reactor is a self-heating process. The reactor is divided into four nominal compartments and the mixed sludge dries as it moves downward in the reactor. The residence times were 4-8 days, which are 2-3 times shorter than the residence times achieved in a batch biodrying reactor previously studied by our research group for mixed sludge drying. A process variable analysis was performed to determine the key variable(s) in the continuous biodrying reactor. Several variables were investigated, namely: type of biomass feed, pH of biomass, nutrition level (C/N ratio), residence times, recycle ratio of biodried sludge, and outlet relative humidity profile along the reactor height. The key variables that were identified in the continuous

  19. Using GREENSCOPE Indicators for Sustainable Computer-Aided Process Evaluation and Design

    EPA Science Inventory

    Manufacturing sustainability can be increased by educating those who design, construct, and operate facilities, and by using appropriate tools for process evaluation and design. The U.S. Environmental Protection Agency's GREENSCOPE methodology and tool, for evaluation and design ...

  20. Direct selective laser sintering of high performance metals: Machine design, process development and process control

    NASA Astrophysics Data System (ADS)

    Das, Suman

    1998-11-01

    development of machine, processing and control technologies during this research effort enabled successful production of a number of integrally canned test specimens in Alloy 625 (InconelRTM 625 superalloy) and Ti-6Al-4V alloy. The overall goal of this research was to develop direct SLS of metals armed with a fundamental understanding of the underlying physics. The knowledge gained from experimental and analytical work is essential for three key objectives: machine design, process development and process control. (Abstract shortened by UMI.)

  1. Rethinking Design Process: Using 3D Digital Models as an Interface in Collaborative Session

    ERIC Educational Resources Information Center

    Ding, Suining

    2008-01-01

    This paper describes a pilot study for an alternative design process by integrating a designer-user collaborative session with digital models. The collaborative session took place in a 3D AutoCAD class for a real world project. The 3D models served as an interface for designer-user collaboration during the design process. Students not only learned…

  2. The Integrated Design Process from the Facilitator's Perspective

    ERIC Educational Resources Information Center

    Lee, Jeehyun

    2014-01-01

    The focus of this study was to clarify the integrated design process from an educational standpoint, and identify its influencing factors and the role of facilitator. Through a literature review, the integrated design process and the role of facilitator were framed, and through the case study, the whole process of integrated design and the…

  3. Collaborative Course Design: Changing the Process, Acknowledging the Context, and Implications for Academic Development

    ERIC Educational Resources Information Center

    Ziegenfuss, Donna Harp; Lawler, Patricia A.

    2008-01-01

    This research study describes the experiences and perceptions of an instructor and an instructional design specialist who collaborated on the design and implementation of a university course using a new course design process. Findings uncovered differences between an informal collaboration process and the adaptation of that process for…

  4. Development of Integrated Programs for Aerospace-vehicle Design (IPAD): Product manufacture interactions with the design process

    NASA Technical Reports Server (NTRS)

    Crowell, H. A.

    1979-01-01

    The product manufacturing interactions with the design process and the IPAD requirements to support the interactions are described. The data requirements supplied to manufacturing by design are identified and quantified. Trends in computer-aided manufacturing are discussed and the manufacturing process of the 1980's is anticipated.

  5. Integrating optical fabrication and metrology into the optical design process

    NASA Astrophysics Data System (ADS)

    Harvey, James E.

    2014-12-01

    Image degradation due to scattered radiation from residual optical fabrication errors is a serious problem in many short wavelength (X-ray/EUV) imaging systems. Most commercially available image analysis codes (ZEMAX, Code V, ASAP, FRED, etc.) currently require the scatter behavior (BSDF data) to be provided as input in order to calculate the image quality of such systems. This BSDF data is difficult to measure and rarely available for the operational wavelengths of interest. Since the smooth-surface approximation is often not satisfied at these short wavelengths, the classical Rayleigh-Rice expression, which indicates that the BRDF is directly proportional to the surface PSD, cannot be used to calculate BRDFs from surface metrology data for even slightly rough surfaces. However, an FFTLog numerical Hankel transform algorithm enables the practical use of the computationally intensive Generalized Harvey-Shack (GHS) surface scatter theory [1] to calculate BRDFs from surface PSDs for increasingly short wavelengths that violate the smooth-surface approximation implicit in the Rayleigh-Rice surface scatter theory [2-3]. The recent numerical validation [4] of the GHS theory (a generalized linear systems formulation of surface scatter theory) and an analysis of image degradation due to surface scatter in the presence of aberrations [5] have provided credence to the development of a systems engineering analysis of image quality as degraded not only by diffraction effects and geometrical aberrations, but also by scattering effects due to residual optical fabrication errors. These advances, combined with the continuing increase in computer speed, leave us poised to fully integrate optical metrology and fabrication into the optical design process.
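
    As a simple worked illustration of the smooth-surface relation referenced above, the sketch below evaluates a Rayleigh-Rice-form BRDF, BRDF = (16*pi^2/lambda^4) * cos(theta_i) * cos(theta_s) * Q * PSD(f), from an assumed isotropic surface PSD at a visible and an EUV wavelength. The PSD model, the constant factor Q, and all numerical values are assumptions, and the Generalized Harvey-Shack calculation itself is not reproduced here; the point is only that the same surface PSD implies far stronger scatter at the shorter wavelength.

```python
import numpy as np

# Smooth-surface Rayleigh-Rice relation (one common form):
#   BRDF(theta_i, theta_s) = (16*pi^2 / lam^4) * cos(theta_i) * cos(theta_s) * Q * PSD(f)
# with f the surface spatial frequency sampled by the scatter geometry.
# Units are kept consistent in micrometers: lam [um], f [1/um], PSD [um^4].

def psd_iso(f):
    """Assumed isotropic surface PSD [um^4] of a well-polished mirror (illustrative)."""
    return 1.0e-3 / (1.0 + (10.0 * f) ** 2) ** 1.5

def brdf_rayleigh_rice(theta_i, theta_s, lam, Q=0.9):
    # in-plane spatial frequency sampled at scatter angle theta_s (grating equation)
    f = np.abs(np.sin(theta_s) - np.sin(theta_i)) / lam
    return (16.0 * np.pi ** 2 / lam ** 4) * np.cos(theta_i) * np.cos(theta_s) * Q * psd_iso(f)

theta_i = np.radians(5.0)
theta_s = np.radians(np.linspace(-80.0, 80.0, 161))
for lam in (0.633, 0.0135):                       # HeNe (633 nm) vs EUV (13.5 nm)
    brdf = brdf_rayleigh_rice(theta_i, theta_s, lam)
    idx30 = np.argmin(np.abs(theta_s - np.radians(30.0)))
    print(f"lambda = {1000 * lam:6.1f} nm: peak BRDF = {brdf.max():.3e} 1/sr, "
          f"BRDF at 30 deg scatter = {brdf[idx30]:.3e} 1/sr")
```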

  6. Developing a 3D Game Design Authoring Package to Assist Students' Visualization Process in Design Thinking

    ERIC Educational Resources Information Center

    Kuo, Ming-Shiou; Chuang, Tsung-Yen

    2013-01-01

    The teaching of 3D digital game design requires the development of students' meta-skills, from story creativity to 3D model construction, and even the visualization process in design thinking. The characteristics a good game designer should possess have been identified as including redesign things, creativity thinking and the ability to…

  7. Affordable Design: A Methodolgy to Implement Process-Based Manufacturing Cost into the Traditional Performance-Focused Multidisciplinary Design Optimization

    NASA Technical Reports Server (NTRS)

    Bao, Han P.; Samareh, J. A.

    2000-01-01

    The primary objective of this paper is to demonstrate the use of process-based manufacturing and assembly cost models in a traditional performance-focused multidisciplinary design and optimization process. The use of automated cost-performance analysis is an enabling technology that could bring realistic process-based manufacturing and assembly cost into multidisciplinary design and optimization. In this paper, we present a new methodology for incorporating process costing into a standard multidisciplinary design optimization process. Material, manufacturing process, and assembly process costs could then be used as the objective function for the optimization method. A case study involving forty-six different configurations of a simple wing is presented, indicating that a design based on performance criteria alone may not necessarily be the most affordable as far as manufacturing and assembly cost is concerned.

  8. Design Ideas, Reflection, and Professional Identity: How Graduate Students Explore the Idea Generation Process

    ERIC Educational Resources Information Center

    Hutchinson, Alisa; Tracey, Monica W.

    2015-01-01

    Within design thinking, designers are responsible for generating, testing, and refining design ideas as a means to refine the design problem and arrive at an effective solution. Thus, understanding one's individual idea generation experiences and processes can be seen as a component of professional identity for designers, which involves the…

  9. An Examination of the Decision-Making Process Used by Designers in Multiple Disciplines

    ERIC Educational Resources Information Center

    Stefaniak, Jill E.; Tracey, Monica W.

    2014-01-01

    Design-thinking is an inductive and participatory process in which designers are required to manage constraints, generate solutions, and follow project timelines in order to complete project goals. The researchers used this exploration study to look at how designers in various disciplinary fields approach design projects. Designers were asked to…

  10. Role of Graphics Tools in the Learning Design Process

    ERIC Educational Resources Information Center

    Laisney, Patrice; Brandt-Pomares, Pascale

    2015-01-01

    This paper discusses the design activities of students in secondary school in France. Graphics tools are now part of the capacity of design professionals. It is therefore apt to reflect on their integration into the technological education. Has the use of intermediate graphical tools changed students' performance, and if so in what direction,…

  11. Student Evaluation of CALL Tools during the Design Process

    ERIC Educational Resources Information Center

    Nesbitt, Dallas

    2013-01-01

    This article discusses the comparative effectiveness of student input at different times during the design of CALL tools for learning kanji, the Japanese characters of Chinese origin. The CALL software "package" consisted of tools to facilitate the writing, reading and practising of kanji characters in context. A pre-design questionnaire…

  12. Authenticity in the Process of Learning about Instructional Design

    ERIC Educational Resources Information Center

    Wilson, Jay R.; Schwier, Richard A.

    2009-01-01

    Authentic learning is touted as a powerful learning approach, particularly in the context of problem-based learning (Savery, 2006). Teaching and learning in the area of instructional design appears to offer a strong fit between the tenets of authentic learning and the practice of instructional design. This paper details the efforts to broaden and…

  13. Modeling Web-Based Educational Systems: Process Design Teaching Model

    ERIC Educational Resources Information Center

    Rokou, Franca Pantano; Rokou, Elena; Rokos, Yannis

    2004-01-01

    Using modeling languages is essential to the construction of educational systems based on software engineering principles and methods. Furthermore, the instructional design is undoubtedly the cornerstone of the design and development of educational systems. Although several methodologies and languages have been proposed for the specification of…

  14. Incorporating Academic Standards in Instructional Systems Design Process.

    ERIC Educational Resources Information Center

    Wang, Charles Xiaoxue

    Almost every state is "imposing" academic standards. Helping students to meet those standards is a key task for teachers and school administrators, as well as instructional systems designers. Thus, instructional designers in the K-12 environments are facing the challenge of using appropriately and effectively academic standards in their…

  15. Development of Chemical Process Design and Control for Sustainability

    EPA Science Inventory

    This contribution describes a novel process systems engineering framework that couples advanced control with sustainability evaluation and decision making for the optimization of process operations to minimize environmental impacts associated with products, materials, and energy....

  16. Cooperation System Design for the XMDR-Based Business Process

    NASA Astrophysics Data System (ADS)

    Moon, Seokjae; Jung, Gyedong; Hwang, Chigon; Choi, Youngkeun

    This paper proposes a cooperation system for the XMDR-based business process. The proposed system solves the problem of heterogeneity that may arise regarding the interoperability of queries in an XMDR-based business process. Heterogeneity in the operation of a business process may involve metadata collisions, schema collisions, or data collisions. This can be handled by operating the business process with an XMDR-based Global Query and Local Query.

  17. Innovation Process Design: A Change Management and Innovation Dimension Perspective

    NASA Astrophysics Data System (ADS)

    Peisl, Thomas; Reger, Veronika; Schmied, Juergen

    The authors propose an innovative approach to the management of innovation, integrating business, process, and maturity dimensions. The core element of the concept is the adaptation of ISO/IEC 15504 to the innovation process, including 14 innovation drivers. Two managerial models, the Balanced Scorecard and a Barriers in Change Processes model, are applied to conceptualize and visualize the respective innovation strategies. An illustrative case study shows a practical implementation process.

  18. PROCESS DESIGN MANUAL FOR LAND TREATMENT OF MUNICIPAL WASTEWATER

    EPA Science Inventory

    The USEPA guidance on land treatment of municipal and industrial wastewater is updated for the first time since 1984. The significant new technological changes include phytoremediation, vadose zone monitoring, new design approaches to surface irrigation, center pivot irrigation,...

  19. DESIGN PROCEDURES FOR DISSOLVED OXYGEN CONTROL OF ACTIVATED SLUDGE PROCESSES

    EPA Science Inventory

    This report presents design procedures and guidelines for the selection of aeration equipment and dissolved (DO) control systems for activated sludge treatment plants. Aeration methods, equipment and application techniques are examined and selection procedures offered. Various DO...

  20. DESIGNING SUSTAINABLE PROCESSES WITH SIMULATION: THE WASTE REDUCTION (WAR) ALGORITHM

    EPA Science Inventory

    The WAR Algorithm, a methodology for determining the potential environmental impact (PEI) of a chemical process, is presented with modifications that account for the PEI of the energy consumed within that process. From this theory, four PEI indexes are used to evaluate the envir...

  1. Designing and Securing an Event Processing System for Smart Spaces

    ERIC Educational Resources Information Center

    Li, Zang

    2011-01-01

    Smart spaces, or smart environments, represent the next evolutionary development in buildings, banking, homes, hospitals, transportation systems, industries, cities, and government automation. By riding the tide of sensor and event processing technologies, the smart environment captures and processes information about its surroundings as well as…

  2. Data processing with microcode designed with source coding

    DOEpatents

    McCoy, James A; Morrison, Steven E

    2013-05-07

    Programming for a data processor to execute a data processing application is provided using microcode source code. The microcode source code is assembled to produce microcode that includes digital microcode instructions with which to signal the data processor to execute the data processing application.

  3. Impact of Process Protocol Design on Virtual Team Effectiveness

    ERIC Educational Resources Information Center

    Cordes, Christofer Sean

    2013-01-01

    This dissertation examined the influence of action process dimensions on team decision performance, and attitudes toward team work environment and procedures given different degrees of collaborative technology affordance. Process models were used to provide context for understanding team behavior in the experimental task, and clarify understanding…

  4. Design and acquisition process for the Multimegawatt Terrestrial Power Plant

    SciTech Connect

    Gertz, C.P.; Bowman, A.L.

    1986-01-01

    In August 1984, the USAF entered into a joint agreement with the US Department of Energy (DOE) to consider nuclear energy as an alternative to meet secure power requirements. With USAF approval, the DOE began development of the Multimegawatt Terrestrial Power Plant (MTPP), a nuclear reactor power plant capable of providing safe, sustained, secure, reliable, and economic power to key mission facilities. The MTPP would provide base-load electricity and/or thermal energy as well as energy to key mission facilities. The plant would operate for extended time periods (months or years) without refueling and be independent of off-site support. The MTPP is to be designed to the most modern standards and is to meet USAF and DOE design requirements. Except for a few unique features that assure the plant is capable of supporting military missions, the design would meet licensing criteria. A few of the major design criteria are described. Procurement and design efforts are also discussed.

  5. Design and Implementation of Process Migrating among Multiple Virtual Machines

    NASA Astrophysics Data System (ADS)

    Shen, Si; Zhang, Zexian; Yang, Shuangxi; Guo, Ruilin; Jiang, Murong

    Process migration technology is usually used to solve problems such as user process death, system crashes, or low execution efficiency caused by load imbalance among multiple processors. A virtual machine can supply system-level backup and migration, but its overhead is sometimes too high. In this paper, a program-level process migration technique is put forward and a demo program has been developed for validation. It offers high performance, low cost, and pertinence. To capture the information involved in process migration, process data are obtained from the JVM by calling the Java JDI API and transmitted to a node with idle computing resources. This technology is platform-independent, and the efficiency of a distributed system can be enhanced with it. It also has advantages such as strong generality, protection of the local environment from intrusion, and prevention of malicious code from stealing local information.

  6. Risk-based decision making for staggered bioterrorist attacks : resource allocation and risk reduction in "reload" scenarios.

    SciTech Connect

    Lemaster, Michelle Nicole; Gay, David M.; Ehlen, Mark Andrew; Boggs, Paul T.; Ray, Jaideep

    2009-10-01

    Staggered bioterrorist attacks with aerosolized pathogens on population centers present a formidable challenge to resource allocation and response planning. The response and planning will commence immediately after the detection of the first attack and with little or no information about the second attack. In this report, we outline a method by which resource allocation may be performed. It involves probabilistic reconstruction of the bioterrorist attack from partial observations of the outbreak, followed by an optimization-under-uncertainty approach to perform resource allocations. We consider both single-site and time-staggered multi-site attacks (i.e., a reload scenario) under conditions when resources (personnel and equipment, which are difficult to gather and transport) are insufficient. Both communicable (plague) and non-communicable (anthrax) diseases are addressed, and we also consider cases when the data, the time series of people reporting with symptoms, are confounded with a reporting delay. We demonstrate how our approach develops allocation profiles that have the potential to reduce the probability of an extremely adverse outcome in exchange for a more certain, but less adverse, outcome. We explore the effect of placing limits on daily allocations. Further, since our method is data-driven, the resource allocation progressively improves as more data become available.
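
    The sketch below is a toy, scenario-based version of the allocation idea: it minimizes the expected shortfall of treatment courses across sampled outbreak scenarios at two cities, subject to a total supply cap, by solving a small linear program. The scenario demands, probabilities, and supply limit are invented for illustration; this is not the report's probabilistic reconstruction or optimization machinery.

```python
import numpy as np
from scipy.optimize import linprog

# Scenario data (assumed): demand for treatment courses at two cities under
# three equally weighted reconstructions of a staggered attack.
demand = np.array([[40_000, 10_000],     # scenario 1: large first-city attack
                   [15_000, 35_000],     # scenario 2: heavier second attack
                   [25_000, 25_000]])    # scenario 3: balanced
prob = np.array([1 / 3, 1 / 3, 1 / 3])
supply = 50_000                          # total courses available (assumed cap)

n_scen, n_city = demand.shape
# variables: x (allocation per city), then s[k, i] (shortfall in scenario k, city i)
n_var = n_city + n_scen * n_city
c = np.zeros(n_var)
c[n_city:] = np.repeat(prob, n_city)     # minimize expected total shortfall

A_ub, b_ub = [], []
row = np.zeros(n_var)
row[:n_city] = 1.0
A_ub.append(row)                         # sum of allocations <= supply
b_ub.append(supply)
for k in range(n_scen):
    for i in range(n_city):
        row = np.zeros(n_var)
        row[i] = 1.0                     # -x_i - s_ki <= -d_ki,
        row[n_city + k * n_city + i] = 1.0   # i.e. s_ki >= d_ki - x_i
        A_ub.append(-row)
        b_ub.append(-demand[k, i])

res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
              bounds=[(0, None)] * n_var, method="highs")
alloc = res.x[:n_city]
print("allocation per city:", np.round(alloc), "expected shortfall:", round(res.fun))
```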

  7. 46 CFR 164.019-9 - Procedure for acceptance of revisions of design, process, or materials.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...) The manufacturer shall not change the design, material, manufacturing process, or construction of a... requests for acceptance of revisions in design, material, manufacturing process, or construction of a non... 46 Shipping 6 2010-10-01 2010-10-01 false Procedure for acceptance of revisions of design,...

  8. 46 CFR 164.019-9 - Procedure for acceptance of revisions of design, process, or materials.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...) The manufacturer shall not change the design, material, manufacturing process, or construction of a... requests for acceptance of revisions in design, material, manufacturing process, or construction of a non... 46 Shipping 6 2012-10-01 2012-10-01 false Procedure for acceptance of revisions of design,...

  9. 46 CFR 164.019-9 - Procedure for acceptance of revisions of design, process, or materials.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...) The manufacturer shall not change the design, material, manufacturing process, or construction of a... requests for acceptance of revisions in design, material, manufacturing process, or construction of a non... 46 Shipping 6 2013-10-01 2013-10-01 false Procedure for acceptance of revisions of design,...

  10. 46 CFR 164.019-9 - Procedure for acceptance of revisions of design, process, or materials.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...) The manufacturer shall not change the design, material, manufacturing process, or construction of a... requests for acceptance of revisions in design, material, manufacturing process, or construction of a non... 46 Shipping 6 2011-10-01 2011-10-01 false Procedure for acceptance of revisions of design,...

  11. 46 CFR 164.019-9 - Procedure for acceptance of revisions of design, process, or materials.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...) The manufacturer shall not change the design, material, manufacturing process, or construction of a... requests for acceptance of revisions in design, material, manufacturing process, or construction of a non... 46 Shipping 6 2014-10-01 2014-10-01 false Procedure for acceptance of revisions of design,...

  12. Design and Testing of a Friction Stir Processing Machine for Laboratory Research

    SciTech Connect

    Karen S. Miller; Rodney J. Bitsoi; Eric D. Larsen; Herschel B. Smartt

    2006-08-01

    This presentation describes the design, fabrication and testing of a friction stir processing machine. The machine is intended to be a flexible research tool for a broad range of friction stir processing studies. The machine design also addresses the need for an affordable, robust design for general laboratory use.

  13. From Concept to Software: Developing a Framework for Understanding the Process of Software Design.

    ERIC Educational Resources Information Center

    Mishra, Punyashloke; Zhao, Yong; Tan, Sophia

    1999-01-01

    Discussion of technological innovation and the process of design focuses on the design of computer software. Offers a framework for understanding the design process by examining two computer programs: FliPS, a multimedia program for learning complex problems in chemistry; and Tiger, a Web-based program for managing and publishing electronic…

  14. The Design Studio as Teaching/Learning Medium--A Process-Based Approach

    ERIC Educational Resources Information Center

    Ozturk, Maya N.; Turkkan, Elif E.

    2006-01-01

    This article discusses a design studio teaching experience exploring the design process itself as a methodological tool. We consider the structure of important phases of the process that contain different levels of design thinking: conception, function and practical knowledge as well as the transitions from inception to construction. We show how…

  15. 23 CFR 636.109 - How does the NEPA process relate to the design-build procurement process?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... design and construction for any projects, or portions thereof, for which the NEPA process has been... and physical construction prior to the completion of the NEPA process (contract hold points or another... 23 Highways 1 2014-04-01 2014-04-01 false How does the NEPA process relate to the...

  16. 23 CFR 636.109 - How does the NEPA process relate to the design-build procurement process?

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... design and construction for any projects, or portions thereof, for which the NEPA process has been... and physical construction prior to the completion of the NEPA process (contract hold points or another... 23 Highways 1 2012-04-01 2012-04-01 false How does the NEPA process relate to the...

  17. 23 CFR 636.109 - How does the NEPA process relate to the design-build procurement process?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... design and construction for any projects, or portions thereof, for which the NEPA process has been... and physical construction prior to the completion of the NEPA process (contract hold points or another... 23 Highways 1 2011-04-01 2011-04-01 false How does the NEPA process relate to the...

  18. 23 CFR 636.109 - How does the NEPA process relate to the design-build procurement process?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... design and construction for any projects, or portions thereof, for which the NEPA process has been... and physical construction prior to the completion of the NEPA process (contract hold points or another... 23 Highways 1 2013-04-01 2013-04-01 false How does the NEPA process relate to the...

  19. Design alternatives for process group membership and multicast

    NASA Technical Reports Server (NTRS)

    Birman, Kenneth P.; Cooper, Robert; Gleeson, Barry

    1991-01-01

    Process groups are a natural tool for distributed programming, and are increasingly important in distributed computing environments. However, there is little agreement on the most appropriate semantics for process group membership and group communication. These issues are of special importance in the Isis system, a toolkit for distributed programming. Isis supports several styles of process group, and a collection of group communication protocols spanning a range of atomicity and ordering properties. This flexibility makes Isis adaptable to a variety of applications, but is also a source of complexity that limits performance. This paper reports on a new architecture that arose from an effort to simplify Isis process group semantics. Our findings include a refined notion of how the clients of a group should be treated, what the properties of a multicast primitive should be when systems contain large numbers of overlapping groups, and a new construct called the causality domain. As an illustration, we apply the architecture to the problem of converting processes into fault-tolerant process groups in a manner that is 'transparent' to other processes in the system.
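
    As a side note, the causal-delivery rule that underlies causality-domain reasoning in group communication can be sketched with vector clocks; the following minimal example is a textbook-style illustration, not the Isis protocol stack.

```python
# Toy illustration of causally ordered delivery with vector clocks -- the
# standard mechanism behind causal multicast; this is not the Isis
# implementation, just a minimal sketch of the delivery rule.
class Process:
    def __init__(self, pid, group_size):
        self.pid = pid
        self.vc = [0] * group_size      # vector clock
        self.pending = []               # messages buffered until causally ready

    def multicast(self, payload):
        self.vc[self.pid] += 1
        return (self.pid, list(self.vc), payload)   # timestamped message

    def receive(self, msg):
        self.pending.append(msg)
        self._deliver_ready()

    def _deliver_ready(self):
        delivered = True
        while delivered:
            delivered = False
            for msg in list(self.pending):
                sender, ts, payload = msg
                # Deliver when it is the next message from the sender and we
                # have already delivered everything it causally depends on.
                ok = ts[sender] == self.vc[sender] + 1 and all(
                    ts[k] <= self.vc[k] for k in range(len(ts)) if k != sender)
                if ok:
                    self.vc[sender] = ts[sender]
                    print(f"p{self.pid} delivers {payload!r} from p{sender}")
                    self.pending.remove(msg)
                    delivered = True

# p0 multicasts m1; p1 delivers m1 and then multicasts m2 (so m1 -> m2).
# p2 receives m2 before m1 but must deliver them in causal order.
p0, p1, p2 = Process(0, 3), Process(1, 3), Process(2, 3)
m1 = p0.multicast("m1")
p1.receive(m1)
m2 = p1.multicast("m2")
p2.receive(m2)   # buffered: m1 not yet delivered at p2
p2.receive(m1)   # now both deliver, in causal order m1 then m2
```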

  20. Preparing Instructional Designers for Game-Based Learning: Part III. Game Design as a Collaborative Process

    ERIC Educational Resources Information Center

    Hirumi, Atsusi; Appelman, Bob; Rieber, Lloyd; Van Eck, Richard

    2010-01-01

    In this three part series, four professors who teach graduate level courses on the design of instructional video games discuss their perspectives on preparing instructional designers to optimize game-based learning. Part I set the context for the series and one of four panelists discussed what he believes instructional designers should know about…

  1. Process Design of Wastewater Treatment for the NREL Cellulosic Ethanol Model

    SciTech Connect

    Steinwinder, T.; Gill, E.; Gerhardt, M.

    2011-09-01

    This report describes a preliminary process design for treating the wastewater from NREL's cellulosic ethanol production process to quality levels required for recycle. In this report, Brown and Caldwell describe three main tasks: 1) characterization of the effluent from NREL's ammonia-conditioned hydrolyzate fermentation process; 2) development of the wastewater treatment process design; and 3) development of a capital and operational cost estimate for the treatment concept option. This wastewater treatment design was incorporated into NREL's cellulosic ethanol process design update published in May 2011 (NREL/TP-5100-47764).

  2. High School Student Modeling in the Engineering Design Process

    ERIC Educational Resources Information Center

    Mentzer, Nathan; Huffman, Tanner; Thayer, Hilde

    2014-01-01

    A diverse group of 20 high school students from four states in the US were individually provided with an engineering design challenge. Students chosen were in capstone engineering courses and had taken multiple engineering courses. As students considered the problem and developed a solution, observational data were recorded and artifacts…

  3. Process Design Manual: Wastewater Treatment Facilities for Sewered Small Communities.

    ERIC Educational Resources Information Center

    Leffel, R. E.; And Others

    This manual attempts to describe new treatment methods, and discuss the application of new techniques for more effectively removing a broad spectrum of contaminants from wastewater. Topics covered include: fundamental design considerations, flow equalization, headworks components, clarification of raw wastewater, activated sludge, package plants,…

  4. An Elective Course on Computer-Aided Process Design.

    ERIC Educational Resources Information Center

    Sommerfeld, Jude T.

    1979-01-01

    Describes an undergraduate chemical engineering course which has been offered at the Georgia Institute of Technology. The objectives, structure, instructional materials and content of this course, which emphasizes the structure and usage of computer-aided design systems, are also included. (HM)

  5. PROCESS DESIGN MANUAL FOR SLUDGE TREATMENT AND DISPOSAL

    EPA Science Inventory

    The purpose of this manual is to provide the engineering community and related industry with a new source of information to be used in the planning, design, and operation of present and future wastewater pollution control facilities. This manual supplements this existing knowledg...

  6. Natural Workgroups and the Process of Job Design.

    ERIC Educational Resources Information Center

    Fincham, Robin

    1989-01-01

    In the literature on industrial sociology and psychology, workgroups are discussed in terms of informal relations that unite cohesive groups. In job redesign they tend to be overlooked, possibly because there is a minimum of formal design in natural work settings or because informal relations at work are often viewed within a conflict framework.…

  7. Integrating the Affective Domain into the Instructional Design Process.

    ERIC Educational Resources Information Center

    Main, Robert G.

    This study begins with a definition of the affective domain and its importance to learning, outlining its impact both in achieving affective behaviors and in facilitating cognitive and psychomotor objectives. The study then develops a model of instructional design that incorporates the affective domain as an integral component. The model combines…

  8. Conjecture Mapping to Optimize the Educational Design Research Process

    ERIC Educational Resources Information Center

    Wozniak, Helen

    2015-01-01

    While educational design research promotes closer links between practice and theory, reporting its outcomes from iterations across multiple contexts is often constrained by the volumes of data generated, and the context bound nature of the research outcomes. Reports tend to focus on a single iteration of implementation without further research to…

  9. New Materials Design Through Friction Stir Processing Techniques

    SciTech Connect

    Buffa, G.; Fratini, L.; Shivpuri, R.

    2007-04-07

    Friction Stir Welding (FSW) has attracted considerable interest in the scientific community and, in recent years, in industry as well, due to the advantages of this solid-state welding process with respect to classic ones. The complex material flow occurring during the process plays a fundamental role, since it determines dramatic changes in the material microstructure of the so-called weld nugget, which affects the effectiveness of the joints. What is more, Friction Stir Processing (FSP) is mainly being considered for producing high-strain-rate-superplastic (HSRS) microstructure in commercial aluminum alloys. The aim of the present research is the development of a locally composite material through the Friction Stir Processing (FSP) of two AA7075-T6 blanks and an insert of a different material. The results of a preliminary experimental campaign, carried out while varying the additional material placed at the sheet interface under different conditions, are presented. Micro- and macro-observation of the resulting joints made it possible to investigate the effects of the process on overall joint performance.

  10. Transparent process migration: Design alternatives and the Sprite implementation

    NASA Technical Reports Server (NTRS)

    Douglis, Fred; Ousterhout, John

    1991-01-01

    The Sprite operating system allows executing processes to be moved between hosts at any time. We use this process migration mechanism to offload work onto idle machines, and also to evict migrated processes when idle workstations are reclaimed by their owners. Sprite's migration mechanism provides a high degree of transparency both for migrated processes and for users. Idle machines are identified, and eviction is invoked, automatically by daemon processes. On Sprite it takes up to a few hundred milliseconds on SPARCstation 1 workstations to perform a remote exec, while evictions typically occur in a few seconds. The pmake program uses remote invocation to invoke tasks concurrently. Compilations commonly obtain speedup factors in the range of three to six; they are limited primarily by contention for centralized resources such as file servers. CPU-bound tasks such as simulations can make more effective use of idle hosts, obtaining as much as eight-fold speedup over a period of hours. Process migration has been in regular service for over two years.
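
    The offload-and-evict lifecycle described above can be sketched, at the scheduling level only, as a toy queue of tasks placed on idle hosts and evicted when owners return; host and task names are invented and nothing here reflects Sprite's kernel mechanism.

```python
# Toy sketch of Sprite-style offloading: tasks run on idle hosts and are
# evicted (and re-queued) when a host's owner returns. This is a
# scheduling-level illustration, not Sprite's kernel mechanism.
from collections import deque

idle_hosts = {"hostA", "hostB", "hostC"}
running = {}                      # host -> task
queue = deque(f"compile-{i}" for i in range(6))

def schedule():
    """Place queued tasks on currently idle hosts."""
    while queue and idle_hosts:
        host = idle_hosts.pop()
        running[host] = queue.popleft()
        print(f"migrate {running[host]} -> {host}")

def owner_returns(host):
    """Owner reclaims the workstation: evict the migrated task and re-queue it."""
    if host in running:
        task = running.pop(host)
        print(f"evict {task} from {host}")
        queue.appendleft(task)    # will be migrated again when a host frees up

def task_finishes(host):
    if host in running:
        print(f"{running.pop(host)} finished on {host}")
        idle_hosts.add(host)
        schedule()

schedule()                 # three tasks start on the three idle hosts
owner_returns("hostB")     # hostB's owner comes back; its task is evicted
task_finishes("hostA")     # hostA frees up and picks up the next queued task
```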

  11. New Materials Design Through Friction Stir Processing Techniques

    NASA Astrophysics Data System (ADS)

    Buffa, G.; Fratini, L.; Shivpuri, R.

    2007-04-01

    Friction Stir Welding (FSW) has attracted considerable interest in the scientific community and, in recent years, in industry as well, due to the advantages of this solid-state welding process with respect to classic ones. The complex material flow occurring during the process plays a fundamental role, since it determines dramatic changes in the material microstructure of the so-called weld nugget, which affects the effectiveness of the joints. What is more, Friction Stir Processing (FSP) is mainly being considered for producing high-strain-rate-superplastic (HSRS) microstructure in commercial aluminum alloys. The aim of the present research is the development of a locally composite material through the Friction Stir Processing (FSP) of two AA7075-T6 blanks and an insert of a different material. The results of a preliminary experimental campaign, carried out while varying the additional material placed at the sheet interface under different conditions, are presented. Micro- and macro-observation of the resulting joints made it possible to investigate the effects of the process on overall joint performance.

  12. The Role of Collaboration in a Comprehensive Programme Design Process in Inclusive Education

    ERIC Educational Resources Information Center

    Zundans-Fraser, Lucia; Bain, Alan

    2016-01-01

    This study focused on the role of collaboration in a comprehensive programme design process in inclusive education. The participants were six members of an inclusive education team and an educational designer who together comprised the design team. The study examined whether collaboration was evident in the practice of programme design and…

  13. Advanced computational research in materials processing for design and manufacturing

    SciTech Connect

    Zacharia, T.

    1995-04-01

    Advanced mathematical techniques and computer simulation play a major role in providing enhanced understanding of conventional and advanced materials processing operations. Development and application of mathematical models and computer simulation techniques can provide a quantitative understanding of materials processes and will minimize the need for expensive and time consuming trial- and error-based product development. As computer simulations and materials databases grow in complexity, high performance computing and simulation are expected to play a key role in supporting the improvements required in advanced material syntheses and processing by lessening the dependence on expensive prototyping and re-tooling. Many of these numerical models are highly compute-intensive. It is not unusual for an analysis to require several hours of computational time on current supercomputers despite the simplicity of the models being studied. For example, to accurately simulate the heat transfer in a 1-m{sup 3} block using a simple computational method requires 10{sup 12} arithmetic operations per second of simulated time. For a computer to do the simulation in real time would require a sustained computation rate 1000 times faster than that achievable by current supercomputers. Massively parallel computer systems, which combine several thousand processors able to operate concurrently on a problem, are expected to provide orders of magnitude increase in performance. This paper briefly describes advanced computational research in materials processing at ORNL. Continued development of computational techniques and algorithms utilizing the massively parallel computers will allow the simulation of conventional and advanced materials processes in sufficient generality.

  14. The Design, Testing and Operation of the IUE Data Processing Unit Power Supply

    NASA Technical Reports Server (NTRS)

    Gillis, J. A., Jr.

    1974-01-01

    The design and operation is reported of the power supply for the IUE data processing unit. Design specifications are presented along with performance data and parts selection. Illustrations show the completed circuit with and without its covers.

  15. Design and development of a layer-based additive manufacturing process for the realization of metal parts of designed mesostructure

    NASA Astrophysics Data System (ADS)

    Williams, Christopher Bryant

    Low-density cellular materials, metallic bodies with gaseous voids, are a unique class of materials that are characterized by their high strength, low mass, good energy absorption characteristics, and good thermal and acoustic insulation properties. In an effort to take advantage of this entire suite of positive mechanical traits, designers are tailoring the cellular mesostructure for multiple design objectives. Unfortunately, existing cellular material manufacturing technologies limit the design space as they are limited to certain part mesostructure, material type, and macrostructure. The opportunity that exists to improve the design of existing products, and the ability to reap the benefits of cellular materials in new applications is the driving force behind this research. As such, the primary research goal of this work is to design, embody, and analyze a manufacturing process that provides a designer the ability to specify the material type, material composition, void morphology, and mesostructure topology for any conceivable part geometry. The accomplishment of this goal is achieved in three phases of research: (1) Design---Following a systematic design process and a rigorous selection exercise, a layer-based additive manufacturing process is designed that is capable of meeting the unique requirements of fabricating cellular material geometry. Specifically, metal parts of designed mesostructure are fabricated via three-dimensional printing of metal oxide ceramic powder followed by post-processing in a reducing atmosphere. (2) Embodiment ---The primary research hypothesis is verified through the use of the designed manufacturing process chain to successfully realize metal parts of designed mesostructure. (3) Modeling & Evaluation ---The designed manufacturing process is modeled in this final research phase so as to increase understanding of experimental results and to establish a foundation for future analytical modeling research. In addition to an analysis of

  16. Seventeen Projects Carried out by Students Designing for and with Disabled Children: Identifying Designers' Difficulties during the Whole Design Process

    ERIC Educational Resources Information Center

    Magnier, Cecile; Thomann, Guillaume; Villeneuve, Francois

    2012-01-01

    This article aims to identify the difficulties that may arise when designing assistive devices for disabled children. Seventeen design projects involving disabled children, engineering students, and special schools were analysed. A content analysis of the design reports was performed. For this purpose, a coding scheme was built based on a review…

  17. Microwave sensor design for noncontact process monitoring at elevated temperature

    NASA Astrophysics Data System (ADS)

    Yadam, Yugandhara Rao; Arunachalam, Kavitha

    2016-02-01

    In this work we present a microwave sensor for noncontact monitoring of liquid level at high temperatures. The sensor is a high gain, directional conical lensed horn antenna with narrow beam width (BW) designed for operation over 10 GHz - 15 GHz. Sensor design and optimization was carried out using 3D finite element method based electromagnetic (EM) simulation software HFSS®. A rectangular to circular waveguide feed was designed to convert TE10 to TE11 mode for wave propagation in the conical horn. Swept frequency simulations were carried out to optimize antenna flare angle and length to achieve better than -10 dB return loss (S11), standing wave ratio (SWR) less than 2.0, 20° half power BW (HPBW) and 15 dB gain over 10 GHz - 15 GHz. The sensor was fabricated using Aluminum and was characterized in an anechoic test box using a vector network analyzer (E5071C, Agilent Technologies, USA). Experimental results of noncontact level detection are presented for boiling water in a metal canister.

  18. Design of a High-Throughput Plasma-Processing System

    SciTech Connect

    Darkazalli, Ghazi; Matthei, Keith; Ruby, Douglas S.

    1999-07-20

    Sandia National Laboratories has demonstrated significant performance gains in crystalline silicon solar cell technology through the use of plasma-processing for the deposition of silicon nitride by Plasma Enhanced Chemical Vapor Deposition (PECVD), plasma-hydrogenation of the nitride layer, and reactive-ion etching of the silicon surface prior to the deposition to decrease the reflectivity of the surface. One of the major problems of implementing plasma processing into a cell production line is the batch configuration and/or low throughput of the systems currently available. This report describes the concept of a new in-line plasma processing system that could meet the industrial requirements for a high-throughput and cost effective solution for mass production of solar cells.

  19. Process design for wastewater treatment: catalytic ozonation of organic pollutants.

    PubMed

    Derrouiche, S; Bourdin, D; Roche, P; Houssais, B; Machinal, C; Coste, M; Restivo, J; Orfão, J J M; Pereira, M F R; Marco, Y; Garcia-Bordeje, E

    2013-01-01

    Emerging micropollutants have recently been the target of interest for their potentially harmful effects in the environment and their resistance to conventional water treatments. Catalytic ozonation is an advanced oxidation process consisting of the formation of highly reactive radicals from the decomposition of ozone promoted by a catalyst. Nanocarbon materials have been shown to be effective catalysts for this process, either in powder form or grown on the surface of a monolithic structure. In this work, carbon nanofibers grown on the surface of a cordierite honeycomb monolith are tested as a catalyst for the ozonation of five selected micropollutants: atrazine (ATZ), bezafibrate, erythromycin, metolachlor, and nonylphenol. The process is tested under both laboratory and real conditions. Later on, ATZ was selected as a target pollutant to further investigate the role of the catalytic material. It is shown that the inclusion of a catalyst improves the mineralization degree compared to single ozonation. PMID:24056437

  20. Microstructure Sensitive Design and Processing in Solid Oxide Electrolyzer Cell

    SciTech Connect

    Dr. Hamid Garmestani; Dr. Stephen Herring

    2009-06-12

    The aim of this study was to develop an inexpensive manufacturing process for deposition of functionally graded thin films of LSM oxides with porosity-graded microstructures for use as IT-SOFC cathodes. The spray pyrolysis method was chosen as a low-temperature processing technique for deposition of porous LSM films onto dense YSZ substrates. The effort was directed toward the optimization of the processing conditions for deposition of high-quality LSM films with a variety of morphologies in the range of dense to porous microstructures. Results of optimization studies of the spray parameters revealed that the substrate surface temperature is the most critical parameter influencing the roughness and morphology, porosity, cracking, and crystallinity of the film.

  1. SmartWeld/SmartProcess - intelligent model based system for the design and validation of welding processes

    SciTech Connect

    Mitchner, J.

    1996-04-01

    Diagrams are presented on an intelligent model based system for the design and validation of welding processes. Key capabilities identified include `right the first time` manufacturing, continuous improvement, and on-line quality assurance.

  2. Rethinking the Purposes and Processes for Designing Digital Portfolios

    ERIC Educational Resources Information Center

    Hicks, Troy; Russo, Anne; Autrey, Tara; Gardner, Rebecca; Kabodian, Aram; Edington, Cathy

    2007-01-01

    As digital portfolios become more prevalent in teacher education, the purposes and processes for creating them have become contested. Originally meant to be critical and reflective spaces for learning about multimedia and conceived as contributing to professional growth, research shows that digital portfolios are now increasingly being used to…

  3. EVALUATING AND DESIGNING CHEMICAL PROCESSES FOR ENVIRONMENTAL SUSTAINABILITY

    EPA Science Inventory

    Chemicals and chemical processes are at the heart of most environmental problems. This isn't surprising since chemicals make up all of the products we use in our lives. The common use of chemicals makes them of high interest for systems analysis, particularly because of environ...

  4. Cost model relationships between textile manufacturing processes and design details for transport fuselage elements

    NASA Technical Reports Server (NTRS)

    Metschan, Stephen L.; Wilden, Kurtis S.; Sharpless, Garrett C.; Andelman, Rich M.

    1993-01-01

    Textile manufacturing processes offer potential cost and weight advantages over traditional composite materials and processes for transport fuselage elements. In the current study, design cost modeling relationships between textile processes and element design details were developed. Such relationships are expected to help future aircraft designers to make timely decisions on the effect of design details and overall configurations on textile fabrication costs. The fundamental advantage of a design cost model is to insure that the element design is cost effective for the intended process. Trade studies on the effects of processing parameters also help to optimize the manufacturing steps for a particular structural element. Two methods of analyzing design detail/process cost relationships developed for the design cost model were pursued in the current study. The first makes use of existing databases and alternative cost modeling methods (e.g. detailed estimating). The second compares design cost model predictions with data collected during the fabrication of seven foot circumferential frames for ATCAS crown test panels. The process used in this case involves 2D dry braiding and resin transfer molding of curved 'J' cross section frame members having design details characteristic of the baseline ATCAS crown design.
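
    A hedged sketch of the kind of parametric relationship such a design cost model encodes is shown below; the cost drivers, rates, and coefficients are invented placeholders rather than values from the ATCAS study.

```python
# Hypothetical parametric design-cost relationship of the kind such a model
# captures: braiding and resin-transfer-molding time scale with frame size
# and cross-section detail. Rates and coefficients below are invented for
# illustration, not the ATCAS model's values.
def frame_cost(perimeter_m, n_plies, n_joggles,
               braid_rate_m_per_h=2.0, machine_rate_per_h=150.0,
               rtm_base_h=3.0, joggle_penalty_h=0.4):
    """Estimate fabrication cost of one braided/RTM 'J' frame member."""
    braid_hours = n_plies * perimeter_m / braid_rate_m_per_h
    rtm_hours = rtm_base_h + joggle_penalty_h * n_joggles  # detail-driven tooling time
    return (braid_hours + rtm_hours) * machine_rate_per_h

# Trade study: how much does adding design detail (joggles) cost per frame?
for joggles in (0, 2, 4):
    print(joggles, "joggles ->", frame_cost(perimeter_m=6.5, n_plies=8,
                                            n_joggles=joggles))
```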

  5. Making Explicit in Design Education: Generic Elements in the Design Process

    ERIC Educational Resources Information Center

    van Dooren, Elise; Boshuizen, Els; van Merriënboer, Jeroen; Asselbergs, Thijs; van Dorst, Machiel

    2014-01-01

    In general, designing is conceived as a complex, personal, creative and open-ended skill. Performing a well-developed skill is mainly an implicit activity. In teaching, however, it is essential to make explicit. Learning a complex skill like designing is a matter of doing and becoming aware how to do it. For teachers and students therefore, it…

  6. Process designed for experimentation for increased-caliper Fresnel lenses

    SciTech Connect

    Zderad, A.J.

    1992-04-01

    The feasibility of producing increased caliper linear and point focus Fresnel lenses in a continuous sheet is described. Both a 8.16-inch-square radial 2 {times} 7 parquet, and a 22-inch-wide linear lens were produced at .11-inch in caliper. The primary purpose of this experimentation is to determine the replication effectiveness and production rate of the polymeric web process at increased thickness. The results demonstrated that both radial and linear lenses, at increased caliper, can be replicated with performance comparable to that of the current state-of-the-art 3M laminated lenses; however, the radial parquets were bowed on the edges. Additional process development is necessary to solve this problem. Current estimates are that the .11-inch caliper parquets cost significantly more than customer laminated parquets using 0.022-inch thick lensfilm.

  7. Process Options for Nominal 2-K Helium Refrigeration System Designs

    SciTech Connect

    Peter Knudsen, Venkatarao Ganni

    2012-07-01

    Nominal 2-K helium refrigeration systems are frequently used for superconducting radio frequency and magnet string technologies used in accelerators. This paper examines the trade-offs and approximate performance of four basic types of processes used for the refrigeration of these technologies; direct vacuum pumping on a helium bath, direct vacuum pumping using full or partial refrigeration recovery, cold compression, and hybrid compression (i.e., a blend of cold and warm sub-atmospheric compression).

  8. Design, processing and testing of LSI arrays: Hybrid microelectronics task

    NASA Technical Reports Server (NTRS)

    Himmel, R. P.; Stuhlbarg, S. M.; Ravetti, R. G.; Zulueta, P. J.

    1979-01-01

    Mathematical cost factors were generated for both hybrid microcircuit and printed wiring board packaging methods. A mathematical cost model was created for analysis of microcircuit fabrication costs. The costing factors were refined and reduced to formulae for computerization. Efficient methods were investigated for low cost packaging of LSI devices as a function of density and reliability. Technical problem areas such as wafer bumping, inner/outer lead bonding, testing on tape, and tape processing were investigated.

  9. Geothermal injection treatment: process chemistry, field experiences, and design options

    SciTech Connect

    Kindle, C.H.; Mercer, B.W.; Elmore, R.P.; Blair, S.C.; Myers, D.A.

    1984-09-01

    The successful development of geothermal reservoirs to generate electric power will require the injection disposal of approximately 700,000 gal/h (2.6 x 10{sup 6} l/h) of heat-depleted brine for every 50,000 kW of generating capacity. To maintain injectability, the spent brine must be compatible with the receiving formation. The factors that influence this brine/formation compatibility and tests to quantify them are discussed in this report. Some form of treatment will be necessary prior to injection for most situations; the process chemistry involved to avoid and/or accelerate the formation of precipitate particles is also discussed. The treatment processes, either avoidance or controlled precipitation approaches, are described in terms of their principles and demonstrated applications in the geothermal field and, when such experience is limited, in other industrial use. Monitoring techniques for tracking particulate growth and the effect of process parameters on corrosion and well injectability are presented. Examples of brine injection, preinjection treatment, and recovery from injectivity loss are examined and related to the aspects listed above.

  10. Towards health care process description framework: an XML DTD design.

    PubMed

    Staccini, P; Joubert, M; Quaranta, J F; Aymard, S; Fieschi, D; Fieschi, M

    2001-01-01

    The development of health care and hospital information systems has to meet users' needs as well as requirements such as the tracking of all care activities and the support of quality improvement. The use of process-oriented analysis is of value for providing analysts with: (i) a systematic description of activities; (ii) the elicitation of the useful data to perform and record care tasks; (iii) the selection of relevant decision-making support. But paper-based tools are not a very suitable way to manage and share the documentation produced during this step. The purpose of this work is to propose a method to implement the results of process analysis according to XML techniques (eXtensible Markup Language). It is based on the IDEF0 activity modeling language (Integration DEfinition for Function modeling). A hierarchical description of a process and its components has been defined through a flat XML file with a grammar of proper metadata tags. Perspectives of this method are discussed. PMID:11825265
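
    A minimal sketch of the approach, assuming illustrative tag and attribute names rather than the authors' DTD, is a flat XML file in which each IDEF0-style activity is a node that references its parent and lists its inputs and controls:

```python
# Minimal sketch of a flat XML encoding of an IDEF0-style activity breakdown,
# in the spirit of the framework described above. Tag and attribute names are
# illustrative, not the authors' DTD.
import xml.etree.ElementTree as ET

process = ET.Element("process", name="blood_transfusion")
for node, parent, label in [
    ("A0", None, "Deliver blood product"),
    ("A1", "A0", "Check prescription"),
    ("A2", "A0", "Record transfusion act"),
]:
    act = ET.SubElement(process, "activity", id=node, label=label)
    if parent:
        act.set("parent", parent)           # hierarchy kept flat via references
    ET.SubElement(act, "input", kind="data").text = "patient record"
    ET.SubElement(act, "control", kind="rule").text = "traceability requirement"

ET.indent(process)                          # pretty-print (Python 3.9+)
print(ET.tostring(process, encoding="unicode"))
```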

  11. Process Design Report for Stover Feedstock: Lignocellulosic Biomass to Ethanol Process Design and Economics Utilizing Co-Current Dilute Acid Prehydrolysis and Enzymatic Hydrolysis for Corn Stover

    SciTech Connect

    Aden, A.; Ruth, M.; Ibsen, K.; Jechura, J.; Neeves, K.; Sheehan, J.; Wallace, B.; Montague, L.; Slayton, A.; Lukas, J.

    2002-06-01

    The U.S. Department of Energy (DOE) is promoting the development of ethanol from lignocellulosic feedstocks as an alternative to conventional petroleum-based transportation fuels. DOE funds both fundamental and applied research in this area and needs a method for predicting cost benefits of many research proposals. To that end, the National Renewable Energy Laboratory (NREL) has modeled many potential process designs and estimated the economics of each process during the last 20 years. This report is an update of the ongoing process design and economic analyses at NREL.

  12. INCORPORATING ENVIRONMENTAL AND ECONOMIC CONSIDERATIONS INTO PROCESS DESIGN: THE WASTE REDUCTION (WAR) ALGORITHM

    EPA Science Inventory

    A general theory known as the WAste Reduction (WAR) algorithm has been developed to describe the flow and the generation of potential environmental impact through a chemical process. This theory integrates environmental impact assessment into chemical process design. Potential en...

  13. Behavioral modeling and simulation for the design process of aerospatial micro-instrumentation based on MEMS

    NASA Astrophysics Data System (ADS)

    Barrachina, L.; Lorente, B.; Ferrer, C.

    2006-05-01

    The extended use of microelectromechanical systems (MEMS) in the development of new microinstrumentation for aerospatial applications, which combines extreme sensitivity, accuracy, and compactness, has introduced the need to simplify the design process in order to reduce design time and cost. The recent appearance of analog and mixed-signal extensions of hardware description languages (VHDL-AMS, Verilog-AMS, and SystemC-AMS) makes it possible to co-simulate the HDL (VHDL and Verilog) design models for the digital signal processing and communication circuitry with behavioral models for the non-digital parts (analog and mixed-signal processing, RF circuitry, and MEMS components). From the beginning of the microinstrumentation design process, modeling and simulation can help to better define the specifications, guide architecture selection, and support the SoC design process in a more realistic environment. We present our experience in applying these languages to the design of microinstruments using behavioral modeling of MEMS.

  14. Design process of a photonics network for military platforms

    NASA Astrophysics Data System (ADS)

    Nelson, George F.; Rao, Nagarajan M.; Krawczak, John A.; Stevens, Rick C.

    1999-02-01

    Technology development in photonics is rapidly progressing. The concept of a Unified Network will provide re-configurable network access to platform sensors, Vehicle Management Systems, Stores and avionics. The re-configurable taps into the network will accommodate present interface standards and provide scalability for the insertion of future interfaces. Significant to this development is the design and test of the Optical Backplane Interconnect System (OBIS) funded by Naval Air Systems Command and developed by Lockheed Martin Tactical Defense Systems - Eagan. OBIS results in the merging of the electrical backplane and the optical backplane, with interconnect fabric and card edge connectors finally providing adequate electrical and optical card access. Presently OBIS will support 1.2 Gb/s per fiber over multiples of 12 fibers per ribbon cable.

  15. Human performance model support for a human-centric design process

    NASA Astrophysics Data System (ADS)

    Campbell, Gwendolyn E.; Cannon-Bowers, Janis A.

    2000-11-01

    For years, systems designers following a traditional design process have made use of models of hardware and software. A human-centric design process imposes additional requirements and analyses on the designer, and we believe that additional types of models -- models of human performance -- are necessary to support this approach to design. Fortunately, there have been recent technological advances in our ability to model all aspects of human performance. This paper will describe three specific applications of human performance modeling that we are exploring to support the design of human- centric systems, such as future Navy ships. Specifically, this technology can be used to generate team design concepts, to provide human-centric decision support for systems engineers, and to allow simulation-based evaluation of human performance. We believe that human performance modeling technology has matured to the point where it can play a significant role in the human-centric design process, reducing both cost and risk.

  16. Biological sulfuric acid transformation: Reactor design and process optimization.

    PubMed

    Stucki, G; Hanselmann, K W; Hürzeler, R A

    1993-02-01

    As an alternative to the current disposal technologies for waste sulfuric acid, a new combination of recycling processes was developed. The strong acid (H(2)SO(4)) is biologically converted with the weak acid (CH(3)COOH) into two volatile weak acids (H(2)S, H(2)CO(3)) by sulfate-reducing bacteria. The transformation is possible without prior neutralization of the sulfuric acid. The microbially mediated transformation can be followed by physicochemical processes for the further conversion of the H(2)S. The reduction of sulfate to H(2)S is carried out under carbon-limited conditions at pH 7.5 to 8.5. A fixed-bed biofilm column reactor is used in conjunction with a separate gas-stripping column which was installed in the recycle stream. Sulfate, total sulfide, and the carbon substrate (in most cases acetate) were determined quantitatively. H(2)S and CO(2) are continually removed by stripping with N(2). Optimal removal is achieved under pH conditions which are adjusted to values below the pK(a)-values of the acids. The H(2)S concentration in the stripped gas was 2% to 8% (v/v) if H(2)SO(4) and CH(3)COOH are fed to the recycle stream just before the stripping column. Microbial conversion rates of 65 g of sulfate reduced per liter of bioreactor volume per day are achieved and bacterial conversion efficiencies for sulfate of more than 95% can be maintained if the concentration of undissociated H(2)S is kept below 40 to 50 mg/L. Porous glass spheres, lava beads, and polyurethane pellets are useful matrices for the attachment of the bacterial biomass. Theoretical aspects and the dependence of the overall conversion performance on selected process parameters are illustrated in the Appendix to this article. PMID:18609554
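
    A back-of-envelope feed check, assuming the textbook acetate-driven reduction CH(3)COO(-) + SO(4)(2-) -> 2 HCO(3)(-) + HS(-), ties the reported volumetric rate to the carbon-limited acetate requirement; the calculation below is illustrative only.

```python
# Back-of-envelope feed check for the reported 65 g sulfate per litre of
# bioreactor per day, assuming the textbook acetate-driven reduction
# CH3COO- + SO4(2-) -> 2 HCO3- + HS-  (1:1 molar). Values are illustrative.
M_SO4, M_ACETATE, M_H2S = 96.06, 59.04, 34.08   # g/mol

sulfate_load = 65.0                # g SO4 per L reactor per day (from the abstract)
mol_sulfate = sulfate_load / M_SO4
acetate_needed = mol_sulfate * M_ACETATE        # carbon-limited feed requirement
h2s_produced = 0.95 * mol_sulfate * M_H2S       # at the reported >95% conversion

print(f"{mol_sulfate:.2f} mol sulfate/L/day")
print(f"~{acetate_needed:.0f} g acetate/L/day needed (1:1 stoichiometry)")
print(f"~{h2s_produced:.0f} g H2S/L/day stripped at 95% conversion")
```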

  17. Advanced Simulation Technology to Design Etching Process on CMOS Devices

    NASA Astrophysics Data System (ADS)

    Kuboi, Nobuyuki

    2015-09-01

    Prediction and control of plasma-induced damage is needed to mass-produce high performance CMOS devices. In particular, side-wall (SW) etching with low damage is a key process for the next generation of MOSFETs and FinFETs. To predict and control the damage, we have developed a SiN etching simulation technique for CHxFy/Ar/O2 plasma processes using a three-dimensional (3D) voxel model. This model includes new concepts for the gas transportation in the pattern, detailed surface reactions on the SiN reactive layer divided into several thin slabs and C-F polymer layer dependent on the H/N ratio, and use of ``smart voxels''. We successfully predicted the etching properties such as the etch rate, polymer layer thickness, and selectivity for Si, SiO2, and SiN films along with process variations and demonstrated the 3D damage distribution time-dependently during SW etching on MOSFETs and FinFETs. We confirmed that a large amount of Si damage was caused in the source/drain region with the passage of time in spite of the existing SiO2 layer of 15 nm in the over etch step and the Si fin having been directly damaged by a large amount of high energy H during the removal step of the parasitic fin spacer leading to Si fin damage to a depth of 14 to 18 nm. By analyzing the results of these simulations and our previous simulations, we found that it is important to carefully control the dose of high energy H, incident energy of H, polymer layer thickness, and over-etch time considering the effects of the pattern structure, chamber-wall condition, and wafer open area ratio. In collaboration with Masanaga Fukasawa and Tetsuya Tatsumi, Sony Corporation. We thank Mr. T. Shigetoshi and Mr. T. Kinoshita of Sony Corporation for their assistance with the experiments.

  18. An interfaces approach to TES ground data system processing design with the Science Investigator-led Processing System (SIPS)

    NASA Technical Reports Server (NTRS)

    Kurian, R.; Grifin, A.

    2002-01-01

    Developing production-quality software to process the large volumes of scientific data is the responsibility of the TES Ground Data System, which is being developed at the Jet Propulsion Laboratory together with support contractor Raytheon/ITSS. The large data volume and processing requirements of the TES pose significant challenges to the design.

  19. Power of experimental design studies for the validation of pharmaceutical processes: case study of a multilayer tablet manufacturing process.

    PubMed

    Goutte, F; Guemguem, F; Dragan, C; Vergnault, G; Wehrlé, P

    2002-08-01

    Experimental design studies (EDS) are already widely used in the pharmaceutical industry for drug formulation or process optimization. Rare are the situations in which this methodology is applied for validation purposes. The power of this statistical tool, a key element of a global validation strategy, is demonstrated for a multilayer tablet manufacturing process. Applied to the Geomatrix system, which generally comprises one compression and three granulation processes, the gains in time and rigor are non-negligible. Experimental design studies are not used in this work for modeling. Introduced at each important step of the process development, they allow for the evaluation of process ruggedness at pilot scale and the setting of specifications for full production. A demonstration of the complete control of the key process parameters, identified through preliminary studies, is given. PMID:12236070
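
    For readers unfamiliar with the methodology, a generic two-level full factorial design and its main-effect calculation can be sketched as follows; the factors and the response function are invented placeholders, not the Geomatrix process parameters.

```python
# Illustrative two-level full factorial design of the kind used in such
# ruggedness studies; the factors and the response function are invented
# placeholders, not the Geomatrix process parameters.
from itertools import product

factors = ["compression_force", "granulation_time", "binder_level"]
design = list(product([-1, +1], repeat=len(factors)))   # 2^3 = 8 runs

def response(run):
    """Stand-in for a measured tablet attribute (e.g. hardness)."""
    f, t, b = run
    return 50 + 4 * f + 1.5 * t - 0.5 * b + 0.8 * f * t

results = [response(run) for run in design]

# Main effect of each factor = mean(response at +1) - mean(response at -1)
for i, name in enumerate(factors):
    hi = [y for run, y in zip(design, results) if run[i] == +1]
    lo = [y for run, y in zip(design, results) if run[i] == -1]
    print(f"{name:18s} main effect = {sum(hi)/len(hi) - sum(lo)/len(lo):+.2f}")
```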

  20. Resistance identification and rational process design in Capacitive Deionization.

    PubMed

    Dykstra, J E; Zhao, R; Biesheuvel, P M; van der Wal, A

    2016-01-01

    Capacitive Deionization (CDI) is an electrochemical method for water desalination employing porous carbon electrodes. To enhance the performance of CDI, identification of electronic and ionic resistances in the CDI cell is important. In this work, we outline a method to identify these resistances. We illustrate our method by calculating the resistances in a CDI cell with membranes (MCDI) and by using this knowledge to improve the cell design. To identify the resistances, we derive a full-scale MCDI model. This model is validated against experimental data and used to calculate the ionic resistances across the MCDI cell. We present a novel way to measure the electronic resistances in a CDI cell, as well as the spacer channel thickness and porosity after assembly of the MCDI cell. We identify that for inflow salt concentrations of 20 mM the resistance is mainly located in the spacer channel and the external electrical circuit, not in the electrodes. Based on these findings, we show that the carbon electrode thickness can be increased without significantly increasing the energy consumption per mol salt removed, which has the advantage that the desalination time can be lengthened significantly. PMID:26512814
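
    The kind of ionic-resistance estimate used to locate the dominant resistance can be sketched with a simple spacer-channel calculation; the geometry, porosity, Bruggeman-type correction, and external resistance below are assumptions for illustration, not the cell of this work.

```python
# Rough estimate of the spacer-channel ionic resistance, illustrating why at a
# 20 mM feed it can dominate the cell resistance. Geometry, porosity and the
# Bruggeman-type correction (kappa_eff = kappa * porosity**1.5) are assumed
# values for illustration, not the cell of the paper.
def spacer_resistance(thickness_m, area_m2, c_mM, porosity=0.70):
    kappa_bulk = 0.0126 * c_mM          # S/m; roughly 12.6 mS/m per mM for dilute NaCl
    kappa_eff = kappa_bulk * porosity ** 1.5
    return thickness_m / (kappa_eff * area_m2)   # ohm

area = 33.8e-4      # m^2, assumed face area
d_spacer = 250e-6   # m, assumed spacer thickness after cell assembly
r_external = 0.4    # ohm, assumed electronic resistance of wiring and contacts

for c in (5, 20, 100):   # feed concentrations in mM
    r_sp = spacer_resistance(d_spacer, area, c)
    print(f"{c:4d} mM feed: spacer {r_sp:5.2f} ohm vs external {r_external} ohm")
```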

  1. Algorithm design for a gun simulator based on image processing

    NASA Astrophysics Data System (ADS)

    Liu, Yu; Wei, Ping; Ke, Jun

    2015-08-01

    In this paper, an algorithm is designed for shooting games under strong background light. Six LEDs are uniformly distributed on the edge of a game machine screen. They are located at the four corners and in the middle of the top and the bottom edges. Three LEDs are lit in the odd frames, and the other three are lit in the even frames. A simulator is furnished with one camera, which is used to obtain the image of the LEDs by applying inter-frame difference between the even and odd frames. In the resulting images, the six LEDs appear as six bright spots. To obtain the LEDs' coordinates rapidly, we propose a method based on the area of the bright spots. After calibrating the camera based on a pinhole model, four equations can be found using the relationship between the image coordinate system and the world coordinate system with perspective transformation. The center point of the LED image is taken as the virtual shooting point. The perspective transformation matrix is applied to the coordinate of the center point. Then we can obtain the virtual shooting point's coordinate in the world coordinate system. When a game player shoots a target about two meters away, using the method discussed in this paper, the calculated coordinate error is less than 10 mm. We can obtain 65 coordinate results per second, which meets the requirement of a real-time system. This demonstrates that the algorithm is reliable and effective.
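
    The geometric core of the method, estimating a planar mapping from the detected LED image coordinates to known screen positions and projecting the image centre through it, can be sketched with a standard direct linear transform; the point coordinates below are made up for illustration.

```python
# Sketch of the geometric core: estimate a planar homography from the detected
# corner-LED image coordinates to their known positions on the screen, then map
# the image centre (taken as the virtual aiming point) onto the screen. The
# coordinates below are made-up detections, not data from the paper.
import numpy as np

def fit_homography(src, dst):
    """Direct linear transform: solve for H (3x3, up to scale) from >=4 points."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, vt = np.linalg.svd(np.asarray(rows, float))
    return vt[-1].reshape(3, 3)

def apply_h(H, pt):
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return x / w, y / w

# Image coordinates (pixels) of the four corner LEDs found by inter-frame
# differencing, and their physical positions on the screen (mm).
img_pts = [(112.0, 88.0), (530.0, 95.0), (541.0, 402.0), (105.0, 395.0)]
scr_pts = [(0.0, 0.0), (600.0, 0.0), (600.0, 340.0), (0.0, 340.0)]

H = fit_homography(img_pts, scr_pts)
aim = apply_h(H, (320.0, 240.0))          # image centre = virtual shooting point
print(f"aim point on screen: ({aim[0]:.1f} mm, {aim[1]:.1f} mm)")
```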

  2. Design of processes with reactive distillation line diagrams

    SciTech Connect

    Bessling, B.; Schembecker, G.; Simmrock, K.H.

    1997-08-01

    On the basis of the transformation of concentration coordinates, the concept of reactive distillation lines is developed. It is applied to study the feasibility of a reactive distillation with an equilibrium reaction on all trays of a distillation column. The singular points in the distillation line diagrams are characterized in terms of nodes and saddles. Depending on the characterization of the reactive distillation line diagrams, it can be decided whether a column with two feed stages is required. On the basis of the reaction space concept, a procedure for identification of reactive distillation processes is developed, in which the reactive distillation column has to be divided into reactive and nonreactive sections. This can be necessary to overcome the limitations in separation which result from the chemical equilibrium. The concentration profile of this combined reactive/nonreactive distillation column is estimated using combined reactive/nonreactive distillation lines.
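
    The transformation of concentration coordinates on which reactive distillation lines rest can be sketched with a Doherty-type transformed mole fraction; the MTBE-like stoichiometry and composition below are illustrative, not a system studied in the paper.

```python
# Minimal sketch of the kind of coordinate transformation on which reactive
# distillation lines are built: a Doherty-type transformed mole fraction
# X_i = (nu_k*x_i - nu_i*x_k) / (nu_k - nu_tot*x_k), with component k as
# reference. The MTBE example (isobutene + methanol <-> MTBE) is illustrative.
def transformed_fractions(x, nu, ref):
    """x: mole fractions, nu: stoichiometric coefficients, ref: index of the
    reference (reacting) component. The transformed fractions are invariant
    along the extent of the equilibrium reaction."""
    nu_tot = sum(nu)
    denom = nu[ref] - nu_tot * x[ref]
    return [(nu[ref] * xi - nui * x[ref]) / denom
            for i, (xi, nui) in enumerate(zip(x, nu)) if i != ref]

nu = [-1, -1, +1]                     # isobutene, methanol, MTBE
x = [0.30, 0.25, 0.45]                # some equilibrium-stage composition
X_ib, X_meoh = transformed_fractions(x, nu, ref=2)
print(f"X_isobutene = {X_ib:.3f}, X_methanol = {X_meoh:.3f}, sum = {X_ib + X_meoh:.3f}")
```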

  3. Is biomass fractionation by Organosolv-like processes economically viable? A conceptual design study.

    PubMed

    Viell, Jörn; Harwardt, Andreas; Seiler, Jan; Marquardt, Wolfgang

    2013-12-01

    In this work, the conceptual designs of the established Organosolv process and a novel biphasic, so-called Organocat process are developed and analyzed. Solvent recycling and energy integration are emphasized to properly assess economic viability. Both processes show a similar energy consumption (approximately 5 MJ/kg(dry biomass)). However, they still show a lack of economic attractiveness even at larger scale. The Organocat process is more favorable due to more efficient lignin separation. The analysis uncovers the remaining challenges toward an economically viable design. They largely originate from by-product formation, product isolation, and solvent recycling. Necessary improvements in process chemistry, equipment design, energy efficiency and process design are discussed to establish economically attractive Organosolv-like processes of moderate capacity as a building block of a future biorefinery. PMID:24157680

  4. 30 CFR 942.764 - Process for designating areas unsuitable for surface coal mining operations.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... WITHIN EACH STATE TENNESSEE § 942.764 Process for designating areas unsuitable for surface coal mining... 30 Mineral Resources 3 2011-07-01 2011-07-01 false Process for designating areas unsuitable for surface coal mining operations. 942.764 Section 942.764 Mineral Resources OFFICE OF SURFACE...

  5. 30 CFR 942.764 - Process for designating areas unsuitable for surface coal mining operations.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... WITHIN EACH STATE TENNESSEE § 942.764 Process for designating areas unsuitable for surface coal mining... 30 Mineral Resources 3 2012-07-01 2012-07-01 false Process for designating areas unsuitable for surface coal mining operations. 942.764 Section 942.764 Mineral Resources OFFICE OF SURFACE...

  6. 30 CFR 942.764 - Process for designating areas unsuitable for surface coal mining operations.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... WITHIN EACH STATE TENNESSEE § 942.764 Process for designating areas unsuitable for surface coal mining... 30 Mineral Resources 3 2013-07-01 2013-07-01 false Process for designating areas unsuitable for surface coal mining operations. 942.764 Section 942.764 Mineral Resources OFFICE OF SURFACE...

  7. 5 CFR Appendix A to Part 581 - List of Agents Designated To Accept Legal Process

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 1 2010-01-01 2010-01-01 false List of Agents Designated To Accept Legal Process A Appendix A to Part 581 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS PROCESSING GARNISHMENT ORDERS FOR CHILD SUPPORT AND/OR ALIMONY Pt. 581, App. A Appendix A to Part 581—List of Agents Designated...

  8. 5 CFR Appendix A to Part 581 - List of Agents Designated To Accept Legal Process

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 5 Administrative Personnel 1 2011-01-01 2011-01-01 false List of Agents Designated To Accept Legal Process A Appendix A to Part 581 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS PROCESSING GARNISHMENT ORDERS FOR CHILD SUPPORT AND/OR ALIMONY Pt. 581, App. A Appendix A to Part 581—List of Agents Designated...

  9. 5 CFR Appendix A to Part 581 - List of Agents Designated To Accept Legal Process

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 5 Administrative Personnel 1 2013-01-01 2013-01-01 false List of Agents Designated To Accept Legal Process A Appendix A to Part 581 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS PROCESSING GARNISHMENT ORDERS FOR CHILD SUPPORT AND/OR ALIMONY Pt. 581, App. A Appendix A to Part 581—List of Agents Designated...

  10. 5 CFR Appendix A to Part 581 - List of Agents Designated To Accept Legal Process

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 5 Administrative Personnel 1 2014-01-01 2014-01-01 false List of Agents Designated To Accept Legal Process A Appendix A to Part 581 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS PROCESSING GARNISHMENT ORDERS FOR CHILD SUPPORT AND/OR ALIMONY Pt. 581, App. A Appendix A to Part 581—List of Agents Designated...

  11. 5 CFR Appendix A to Part 581 - List of Agents Designated To Accept Legal Process

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 5 Administrative Personnel 1 2012-01-01 2012-01-01 false List of Agents Designated To Accept Legal Process A Appendix A to Part 581 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS PROCESSING GARNISHMENT ORDERS FOR CHILD SUPPORT AND/OR ALIMONY Pt. 581, App. A Appendix A to Part 581—List of Agents Designated...

  12. The Use of Executive Control Processes in Engineering Design by Engineering Students and Professional Engineers

    ERIC Educational Resources Information Center

    Dixon, Raymond A.; Johnson, Scott D.

    2012-01-01

    A cognitive construct that is important when solving engineering design problems is executive control process, or metacognition. It is a central feature of human consciousness that enables one "to be aware of, monitor, and control mental processes." The framework for this study was conceptualized by integrating the model for creative design, which…

  13. 36 CFR 62.4 - Natural landmark designation and recognition process.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 36 Parks, Forests, and Public Property 1 2014-07-01 2014-07-01 false Natural landmark designation and recognition process. 62.4 Section 62.4 Parks, Forests, and Public Property NATIONAL PARK SERVICE, DEPARTMENT OF THE INTERIOR NATIONAL NATURAL LANDMARKS PROGRAM § 62.4 Natural landmark designation and recognition process. (a)...

  14. 36 CFR 62.4 - Natural landmark designation and recognition process.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 36 Parks, Forests, and Public Property 1 2011-07-01 2011-07-01 false Natural landmark designation and recognition process. 62.4 Section 62.4 Parks, Forests, and Public Property NATIONAL PARK SERVICE, DEPARTMENT OF THE INTERIOR NATIONAL NATURAL LANDMARKS PROGRAM § 62.4 Natural landmark designation and recognition process. (a)...

  15. Analyzing Team Based Engineering Design Process in Computer Supported Collaborative Learning

    ERIC Educational Resources Information Center

    Lee, Dong-Kuk; Lee, Eun-Sang

    2016-01-01

    The engineering design process has been largely implemented in a collaborative project format. Recently, technological advancement has helped collaborative problem solving processes such as engineering design to have efficient implementation using computers or online technology. In this study, we investigated college students' interaction and…

  16. A Digital Tool Set for Systematic Model Design in Process-Engineering Education

    ERIC Educational Resources Information Center

    van der Schaaf, Hylke; Tramper, Johannes; Hartog, Rob J.M.; Vermue, Marian

    2006-01-01

    One of the objectives of the process technology curriculum at Wageningen University is that students learn how to design mathematical models in the context of process engineering, using a systematic problem analysis approach. Students find it difficult to learn to design a model and little material exists to meet this learning objective. For these…

  17. A Comparison of Diary Method Variations for Enlightening Form Generation in the Design Process

    ERIC Educational Resources Information Center

    Babapour, Maral; Rehammar, Bjorn; Rahe, Ulrike

    2012-01-01

    This paper presents two studies in which an empirical approach was taken to understand and explain form generation and decisions taken in the design process. In particular, the activities addressing aesthetic aspects when exteriorising form ideas in the design process have been the focus of the present study. Diary methods were the starting point…

  18. A Tutorial Design Process Applied to an Introductory Materials Engineering Course

    ERIC Educational Resources Information Center

    Rosenblatt, Rebecca; Heckler, Andrew F.; Flores, Katharine

    2013-01-01

    We apply a "tutorial design process", which has proven to be successful for a number of physics topics, to design curricular materials or "tutorials" aimed at improving student understanding of important concepts in a university-level introductory materials science and engineering course. The process involves the identification…

  19. A Critical Review of Instructional Design Process of Distance Learning System

    ERIC Educational Resources Information Center

    Chaudry, Muhammad Ajmal; ur-Rahman, Fazal

    2010-01-01

    Instructional design refers to planning, development, delivery and evaluation of instructional system. It is an applied field of study aiming at the application of descriptive research outcomes in regular instructional settings. The present study was designed to critically review the process of instructional design at Allama Iqbal Open University…

  20. A Process Model for Developing Learning Design Patterns with International Scope

    ERIC Educational Resources Information Center

    Lotz, Nicole; Law, Effie Lai-Chong; Nguyen-Ngoc, Anh Vu

    2014-01-01

    This paper investigates the process of identifying design patterns in international collaborative learning environments. In this context, design patterns are referred to as structured descriptions of best practice with pre-defined sections such as problem, solution and consequences. We pay special attention to how the scope of a design pattern is…

  1. Design and Construction Process of Two LEED Certified University Buildings: A Collective Case Study

    ERIC Educational Resources Information Center

    Rich, Kim

    2011-01-01

    This study was conducted during the early stages of integrating LEED into the design process, when a clearer understanding of sustainable and ecological design emerged over the course of designing and building two academic buildings on a university campus. In this case study, due to utilizing a grounded theory…

  2. Design considerations for solar industrial process heat systems: nontracking and line focus collector technologies

    SciTech Connect

    Kutscher, C.F.

    1981-03-01

    Items are listed that should be considered in each aspect of the design of a solar industrial process heat system. The collector technologies covered are flat-plate, evacuated tube, and line focus. Qualitative design considerations are stressed rather than specific design recommendations. (LEW)

  3. Business Process Design Method Based on Business Event Model for Enterprise Information System Integration

    NASA Astrophysics Data System (ADS)

    Kobayashi, Takashi; Komoda, Norihisa

    The traditional business process design methods, of which the use case is the most typical, provide no useful framework for designing the activity sequence. As a result, design efficiency and quality vary widely with the designer's experience and skill. In this paper, to solve this problem, we propose a model of business events and their state transitions (a basic business event model) based on the language/action perspective, a result from the cognitive science domain. In business process design, using this model, we determine event occurrence conditions so that the events remain synchronized with one another. We also propose a design pattern for deciding the event occurrence conditions (a business event improvement strategy). Lastly, we apply the business process design method based on the business event model and the business event improvement strategy to a credit card issue process and estimate its effect.
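
    To make the event-synchronization idea above concrete, the following minimal Python sketch (the event names, conditions and credit-card steps are invented for illustration and are not the authors' model) represents business events as small state machines whose occurrence conditions reference the states of other events:

```python
# Minimal sketch: business events whose occurrence conditions reference the
# states of other events, so that related events stay synchronized.

class BusinessEvent:
    def __init__(self, name, condition=lambda events: True):
        self.name = name
        self.state = "waiting"          # waiting -> occurred
        self.condition = condition      # occurrence condition over other events

    def try_occur(self, events):
        if self.state == "waiting" and self.condition(events):
            self.state = "occurred"
            print("event occurred:", self.name)

events = {}
events["application_received"] = BusinessEvent("application_received")
events["credit_checked"] = BusinessEvent(
    "credit_checked",
    lambda ev: ev["application_received"].state == "occurred")
events["card_issued"] = BusinessEvent(
    "card_issued",
    lambda ev: ev["credit_checked"].state == "occurred")

# Drive the model: re-evaluate occurrence conditions until nothing changes.
changed = True
while changed:
    before = [e.state for e in events.values()]
    for e in events.values():
        e.try_occur(events)
    changed = before != [e.state for e in events.values()]
```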

  4. Operational concepts and implementation strategies for the design configuration management process.

    SciTech Connect

    Trauth, Sharon Lee

    2007-05-01

    This report describes operational concepts and implementation strategies for the Design Configuration Management Process (DCMP). It presents a process-based systems engineering model for the successful configuration management of the products generated during the operation of the design organization as a business entity. The DCMP model focuses on Pro/E and associated activities and information. It can serve as the framework for interconnecting all essential aspects of the product design business. A design operation scenario offers a sense of how to do business at a time when DCMP is second nature within the design organization.

  5. Singlet oxygen sensitizing materials based on porous silicone: photochemical characterization, effect of dye reloading and application to water disinfection with solar reactors.

    PubMed

    Manjón, Francisco; Santana-Magaña, Montserrat; García-Fresnadillo, David; Orellana, Guillermo

    2010-06-01

    Photogeneration of singlet molecular oxygen (¹O₂) is applied to organic synthesis (photooxidations), atmosphere/water treatment (disinfection), antibiofouling materials and in photodynamic therapy of cancer. In this paper, ¹O₂ photosensitizing materials containing the dyes tris(4,4'-diphenyl-2,2'-bipyridine)ruthenium(II) (1, RDB²⁺) or tris(4,7-diphenyl-1,10-phenanthroline)ruthenium(II) (2, RDP²⁺), immobilized on porous silicone (abbreviated RDB/pSil and RDP/pSil), have been produced and tested for waterborne Enterococcus faecalis inactivation using a laboratory solar simulator and a compound parabolic collector (CPC)-based solar photoreactor. In order to investigate the feasibility of its reuse, the sunlight-exposed RDP/pSil sensitizing material (RDP/pSil-a) has been reloaded with RDP²⁺ (RDP/pSil-r). Surprisingly, results for bacteria inactivation with the reloaded material have demonstrated a 4-fold higher efficiency compared to those of either RDP/pSil-a, unused RDB/pSil or the original RDP/pSil. Surface and bulk photochemical characterization of the new material (RDP/pSil-r) has shown that the bactericidal efficiency enhancement is due to aggregation of the silicone-supported photosensitizer on the surface of the polymer, as evidenced by confocal fluorescence lifetime imaging microscopy (FLIM). Photogenerated ¹O₂ lifetimes in the wet sensitizer-doped silicone have been determined to be ten times longer than in water. These facts, together with the water rheology in the solar reactor and the interfacial production of the biocidal species, account for the more effective disinfection observed with the reloaded photosensitizing material. These results extend and improve the operational lifetime of photocatalytic materials for point-of-use ¹O₂-mediated solar water disinfection. PMID:20393668

  6. Conversion of microalgae to jet fuel: process design and simulation.

    PubMed

    Wang, Hui-Yuan; Bluck, David; Van Wie, Bernard J

    2014-09-01

    Microalgae's aquatic, non-edible, highly genetically modifiable nature and fast growth rate are considered ideal for biomass conversion to liquid fuels, offering promise against future shortages in fossil fuels and for reducing greenhouse gas and pollutant emissions from combustion. We demonstrate the adaptability of PRO/II software by simulating a microalgae photo-bio-reactor and thermolysis with fixed-conversion isothermal reactors, adding a heat exchanger for thermolysis. We model a cooling tower and gas floatation with zero-duty flash drums, adding solids removal for floatation. Properties data are from PRO/II's thermodynamic data manager. Hydrotreating is analyzed within PRO/II's case study option, made subject to Jet B fuel constraints, and we determine an optimal 6.8% bioleum bypass ratio, 230°C hydrotreater temperature, and 20:1 bottoms-to-overhead distillation ratio. Process economic feasibility occurs if cheap CO2, H2O and nutrient resources are available, along with solar energy and energy from byproduct combustion, and hydrotreater H2 from product reforming. PMID:24997379

  7. Discussion: the design and analysis of the Gaussian process model

    SciTech Connect

    Williams, Brian J; Loeppky, Jason L

    2008-01-01

    The investigation of complex physical systems utilizing sophisticated computer models has become commonplace with the advent of modern computational facilities. In many applications, experimental data on the physical systems of interest is extremely expensive to obtain and hence is available in limited quantities. The mathematical systems implemented by the computer models often include parameters having uncertain values. This article provides an overview of statistical methodology for calibrating uncertain parameters to experimental data. This approach assumes that prior knowledge about such parameters is represented as a probability distribution, and the experimental data is used to refine our knowledge about these parameters, expressed as a posterior distribution. Uncertainty quantification for computer model predictions of the physical system are based fundamentally on this posterior distribution. Computer models are generally not perfect representations of reality for a variety of reasons, such as inadequacies in the physical modeling of some processes in the dynamic system. The statistical model includes components that identify and adjust for such discrepancies. A standard approach to statistical modeling of computer model output for unsampled inputs is introduced for the common situation where limited computer model runs are available. Extensions of the statistical methods to functional outputs are available and discussed briefly.
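
    As a rough, hedged illustration of the calibration approach described above, the following Python sketch computes a grid posterior for a single uncertain parameter of a cheap stand-in simulator; the simulator, prior, and noise level are assumptions made for illustration, and the Gaussian process emulator and model-discrepancy components of the full methodology are omitted for brevity:

```python
import numpy as np

# Minimal sketch: calibrate one uncertain model parameter theta against
# limited, noisy experimental data using a grid approximation of the posterior.

def computer_model(x, theta):
    # hypothetical simulator output for input x and uncertain parameter theta
    return theta * np.sin(x)

rng = np.random.default_rng(0)
x_obs = np.linspace(0.0, np.pi, 8)           # limited experimental conditions
theta_true, sigma = 1.7, 0.05                # "reality" and measurement noise
y_obs = computer_model(x_obs, theta_true) + rng.normal(0.0, sigma, x_obs.size)

theta_grid = np.linspace(0.5, 3.0, 501)
log_prior = -0.5 * ((theta_grid - 2.0) / 0.5) ** 2      # Gaussian prior belief
log_like = np.array([                                   # Gaussian likelihood
    -0.5 * np.sum((y_obs - computer_model(x_obs, t)) ** 2) / sigma ** 2
    for t in theta_grid
])

log_post = log_prior + log_like
post = np.exp(log_post - log_post.max())
dtheta = theta_grid[1] - theta_grid[0]
post /= post.sum() * dtheta                             # normalized posterior

print("posterior mean of theta:",
      round(float(np.sum(theta_grid * post) * dtheta), 3))
```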

  8. Yucca Mountain Project: ESF Title I design control process review report

    SciTech Connect

    1989-01-19

    The Exploratory Shaft Facility (ESF) Title 1 Design Control Process Review was initiated in response to direction from the Office of Civilian Radioactive Waste Management (OCRWM) (letter: Kale to Gertz, NRC Concerns on Title 1 Design Control Process, November 17, 1988). The direction was to identify the existing documentation that described "… the design control process and the quality assurance that governed …" (a) the development of the requirements documents for the ESF design, (b) the various interfaces between activities, (c) analyses and definitions leading to additional requirements in the System Design Requirements Documents, and (d) completion of Title 1 Design. This report provides historical information for general use in determining the extent of the quality assurance program in existence during the ESF Title 1 Design.

  9. HAL/SM system functional design specification. [systems analysis and design analysis of central processing units

    NASA Technical Reports Server (NTRS)

    Ross, C.; Williams, G. P. W., Jr.

    1975-01-01

    The functional design of a preprocessor and subsystems is described. A structure chart and a data flow diagram are included for each subsystem. Also, a group of intermodule interface definitions (one definition per module) is included immediately following the structure chart and data flow for a particular subsystem. Each of these intermodule interface definitions consists of the identification of the module, the function the module is to perform, the identification and definition of parameter interfaces to the module, and any design notes associated with the module. Also described are compilers and computer libraries.

  10. Design criteria for Waste Coolant Processing Facility and preliminary proposal 722 for Waste Coolant Processing Facility

    SciTech Connect

    Not Available

    1991-09-27

    This document contains the design criteria to be used by the architect-engineer (A-E) in the performance of Titles 1 and 2 design for the construction of a facility to treat the biodegradable, water soluble, waste machine coolant generated at the Y-12 plant. The purpose of this facility is to reduce the organic loading of coolants prior to final treatment at the proposed West Tank Farm Treatment Facility.

  11. Virtual Display Design and Evaluation of Clothing: A Design Process Support System

    ERIC Educational Resources Information Center

    Zhang, Xue-Fang; Huang, Ren-Qun

    2014-01-01

    This paper proposes a new computer-aided educational system for clothing visual merchandising and display. It aims to provide an operating environment that supports the various stages of display design in a user-friendly and intuitive manner. First, this paper provides a brief introduction to current software applications in the field of…

  12. Application of the cost-per-good-die metric for process design co-optimization

    NASA Astrophysics Data System (ADS)

    Jhaveri, Tejas; Arslan, Umut; Rovner, Vyacheslav; Strojwas, Andrzej; Pileggi, Larry

    2010-03-01

    The semiconductor industry has pursued a rapid pace of technology scaling to achieve an exponential reduction in component cost. Over the years the goal of technology scaling has been distilled down to two discrete targets: process engineers focus on sustaining wafer costs while manufacturing smaller dimensions, whereas design engineers work towards creating newer IC designs that can feed the next generation of electronic products. In doing so, the impact of process choices made by the manufacturing community on the design of ICs, and vice versa, has been conveniently ignored. However, with the lack of cost-effective lithography solutions at the forefront, the process and design communities are struggling to minimize IC die costs by following the traditional scaling practices described. In this paper we discuss a framework for quantifying the economic impact of design and process decisions on the overall product by comparing the cost-per-good-die. We discuss the intricacies involved in computing the cost-per-good-die as we make design and technology choices. We also discuss the impact of design and lithography choices for the 32nm and 22nm technology nodes. The results demonstrate a strong volume dependence of the optimum design style and corresponding lithography strategy. Most importantly, using this framework process and design engineers can collaborate to define design style and lithography solutions that will lead to continued IC cost scaling.
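
    A minimal worked example of the metric itself (the yield model, wafer costs, die areas and defect density below are illustrative assumptions, not figures from the paper) shows how two design/lithography options can be compared on cost-per-good-die rather than wafer cost alone:

```python
import math

# Minimal sketch: compare two design/process options by cost-per-good-die,
# using a simple gross-die estimate and a Poisson yield model.

def cost_per_good_die(wafer_cost, die_area_mm2, defect_density_per_cm2,
                      wafer_diameter_mm=300.0):
    r = wafer_diameter_mm / 2.0
    # gross dies per wafer: area term minus an edge-loss correction
    gross = (math.pi * r ** 2 / die_area_mm2
             - math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2))
    # Poisson yield: defect density is per cm^2, die area converted from mm^2
    yield_ = math.exp(-defect_density_per_cm2 * die_area_mm2 / 100.0)
    return wafer_cost / (gross * yield_)

# Option A: denser design rules, smaller die, but costlier multi-patterning
print("A:", round(cost_per_good_die(wafer_cost=6000, die_area_mm2=80,
                                    defect_density_per_cm2=0.25), 2))
# Option B: relaxed design style, larger die, cheaper single-exposure litho
print("B:", round(cost_per_good_die(wafer_cost=4500, die_area_mm2=95,
                                    defect_density_per_cm2=0.25), 2))
```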

  13. Some trends and proposals for the inclusion of sustainability in the design of manufacturing process

    NASA Astrophysics Data System (ADS)

    Fradinho, J.; Nedelcu, D.; Gabriel-Santos, A.; Gonçalves-Coelho, A.; Mourão, A.

    2015-11-01

    Production processes are designed to meet requirements of three different natures: quality, cost and time. Environmental concerns have expanded the field of conceptual design through the introduction of sustainability requirements that are driven by growing societal thoughtfulness about environmental issues. One could say that the major concern has been the definition of metrics or indices for sustainability; however, those metrics usually lack consistency. More than ever, there is a need for an all-inclusive view at every level of decision-making, from establishing the design requirements to implementing the solutions. According to the Axiomatic Design Theory, sustainable designs are usually coupled designs, which should be avoided. This raises a concern related to the very nature of sustainability: the cross effects between the actions that should be considered in the attempt to decouple the design solutions. In terms of production, one should clarify the characterization of the sustainability of production systems. The objectives of this paper are: i) to analyse some trends for approaching the sustainability of production processes; ii) to define sustainability in terms of requirements for the design of production processes; iii) to make some proposals based on the Axiomatic Design Theory, in order to establish the principles with which the guidelines for designing production processes must comply; iv) to discuss how to introduce this matter in teaching both manufacturing technology and the design of production systems.

  14. Materials and Process Design for High-Temperature Carburizing: Integrating Processing and Performance

    SciTech Connect

    D. Apelian

    2007-07-23

    The objective of the project is to develop an integrated process for fast, high-temperature carburizing. The new process results in an order of magnitude reduction in cycle time compared to conventional carburizing and represents significant energy savings in addition to a corresponding reduction of scrap associated with distortion free carburizing steels.

  15. Elementary teachers' mental models of engineering design processes: A comparison of two communities of practice

    NASA Astrophysics Data System (ADS)

    McMahon, Ann P.

    Educating K-12 students in the processes of design engineering is gaining popularity in public schools. Several states have adopted standards for engineering design despite the fact that no common agreement exists on what should be included in the K-12 engineering design process. Furthermore, little pre-service and in-service professional development exists that will prepare teachers to teach a design process that is fundamentally different from the science teaching process found in typical public schools. This study provides a glimpse into what teachers think happens in engineering design compared to articulated best practices in engineering design. Wenger's communities of practice work and van Dijk's multidisciplinary theory of mental models provide the theoretical bases for comparing the mental models of two groups of elementary teachers (one group that teaches engineering and one that does not) to the mental models of design engineers (including this engineer/researcher/educator and professionals described elsewhere). The elementary school teachers and this engineer/researcher/educator observed the design engineering process enacted by professionals, then answered questions designed to elicit their mental models of the process they saw in terms of how they would teach it to elementary students. The key finding is this: Both groups of teachers embedded the cognitive steps of the design process into the matrix of the social and emotional roles and skills of students. Conversely, the engineers embedded the social and emotional aspects of the design process into the matrix of the cognitive steps of the design process. In other words, teachers' mental models show that they perceive that students' social and emotional communicative roles and skills in the classroom drive their cognitive understandings of the engineering process, while the mental models of this engineer/researcher/educator and the engineers in the video show that we perceive that cognitive

  16. Design Process of Flight Vehicle Structures for a Common Bulkhead and an MPCV Spacecraft Adapter

    NASA Technical Reports Server (NTRS)

    Aggarwal, Pravin; Hull, Patrick V.

    2015-01-01

    Designing and manufacturing space flight vehicle structures is a skill set that has grown considerably at NASA during the last several years. Beginning with the Ares program and followed by the Space Launch System (SLS), in-house designs were produced for both the Upper Stage and the SLS Multipurpose Crew Vehicle (MPCV) spacecraft adapter. Specifically, critical design review (CDR) level analysis and flight production drawings were produced for the above mentioned hardware. In particular, the experience of this in-house design work led to increased manufacturing infrastructure for both Marshall Space Flight Center (MSFC) and Michoud Assembly Facility (MAF), improved skill sets in both analysis and design, and hands-on experience in building and testing full-scale (MSA) hardware. The hardware design and development processes from initiation to CDR and finally flight resulted in many challenges and experiences that produced valuable lessons. This paper builds on these experiences of NASA in recent years in designing and fabricating flight hardware and examines the design/development processes used, as well as the challenges and lessons learned, i.e., from the initial design, loads estimation and mass constraints to structural optimization/affordability to release of production drawings to hardware manufacturing. While there are many documented design processes which a design engineer can follow, these unique experiences can offer insight into designing hardware in current program environments and present solutions to many of the challenges experienced by the engineering team.

  17. The Chandra automated processing system: challenges, design enhancements, and lessons learned

    NASA Astrophysics Data System (ADS)

    Plummer, David; Grier, John; Masters, Sreelatha

    2006-06-01

    Chandra standard data processing involves hundreds of different types of data products and pipelines. Pipelines are initiated by different types of events or notifications and may depend upon many other pipelines for input data. The Chandra automated processing system (AP) was designed to handle the various notifications and orchestrate the pipeline processing. Certain data sets may require "special" handling that deviates slightly from the standard processing thread. Also, bulk reprocessing of data often involves new processing requirements. Most recently, a new type of processing to produce source catalogs has introduced requirements not anticipated by the original AP design. Managing these complex dependencies and evolving processing requirements in an efficient, flexible, and automated fashion presents many challenges. This paper describes the most significant of these challenges, the AP design changes required to address these issues and the lessons learned along the way.
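
    As a loose software analogue of the dependency-driven orchestration described above (the pipeline names and dependencies are hypothetical, and the real AP also handles notifications, special processing and bulk reprocessing), a minimal Python sketch that starts each pipeline only after the pipelines producing its inputs have completed:

```python
from graphlib import TopologicalSorter

# Minimal sketch: dependency-driven dispatch of processing pipelines.
pipelines = {
    "telemetry_decom": set(),
    "aspect_solution": {"telemetry_decom"},
    "event_filtering": {"telemetry_decom"},
    "level2_products": {"aspect_solution", "event_filtering"},
    "source_catalog":  {"level2_products"},   # later addition to the system
}

def run(name):
    print("running pipeline:", name)

ts = TopologicalSorter(pipelines)
for name in ts.static_order():   # serial stand-in for the real scheduler
    run(name)
```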

  18. Integrating a Genetic Algorithm Into a Knowledge-Based System for Ordering Complex Design Processes

    NASA Technical Reports Server (NTRS)

    Rogers, James L.; McCulley, Collin M.; Bloebaum, Christina L.

    1996-01-01

    The design cycle associated with large engineering systems requires an initial decomposition of the complex system into design processes which are coupled through the transference of output data. Some of these design processes may be grouped into iterative subcycles. In analyzing or optimizing such a coupled system, it is essential to be able to determine the best ordering of the processes within these subcycles to reduce design cycle time and cost. Many decomposition approaches assume the capability is available to determine what design processes and couplings exist and what order of execution will be imposed during the design cycle. Unfortunately, this is often a complex problem and beyond the capabilities of a human design manager. A new feature, a genetic algorithm, has been added to DeMAID (Design Manager's Aid for Intelligent Decomposition) to allow the design manager to rapidly examine many different combinations of ordering processes in an iterative subcycle and to optimize the ordering based on cost, time, and iteration requirements. Two sample test cases are presented to show the effects of optimizing the ordering with a genetic algorithm.
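
    A minimal sketch of the underlying idea (the toy coupling matrix and GA settings are assumptions for illustration, not the DeMAID implementation): a permutation-encoded genetic algorithm searching for an ordering of coupled design processes that minimizes feedback couplings within a subcycle:

```python
import random

# Minimal sketch: GA over process orderings, minimizing "backward" couplings.
# couplings[i] = set of processes whose output process i needs as input
couplings = {0: {2}, 1: {0}, 2: {1, 3}, 3: {0}, 4: {1, 2}}
N = len(couplings)

def feedbacks(order):
    """Count couplings that point backward for a given execution order."""
    pos = {p: k for k, p in enumerate(order)}
    return sum(1 for i, deps in couplings.items() for j in deps if pos[j] > pos[i])

def crossover(a, b):
    """Order crossover (OX): keep a slice of parent a, fill the rest from b."""
    i, j = sorted(random.sample(range(N), 2))
    child = [None] * N
    child[i:j] = a[i:j]
    fill = [g for g in b if g not in child]
    for k in range(N):
        if child[k] is None:
            child[k] = fill.pop(0)
    return child

def mutate(order, rate=0.2):
    if random.random() < rate:
        i, j = random.sample(range(N), 2)
        order[i], order[j] = order[j], order[i]
    return order

random.seed(1)
pop = [random.sample(range(N), N) for _ in range(30)]
for _ in range(100):
    pop.sort(key=feedbacks)
    parents = pop[:10]                       # simple truncation selection
    pop = parents + [mutate(crossover(random.choice(parents),
                                      random.choice(parents)))
                     for _ in range(20)]
best = min(pop, key=feedbacks)
print("best order:", best, "feedback couplings:", feedbacks(best))
```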

  19. Design Considerations of Polishing Lap for Computer-Controlled Cylindrical Polishing Process

    NASA Technical Reports Server (NTRS)

    Khan, Gufran S.; Gubarev, Mikhail; Arnold, William; Ramsey, Brian D.

    2009-01-01

    This paper establishes a relationship between the polishing process parameters and the generation of mid-spatial-frequency error. Considerations in the design of the polishing lap and in the optimization of the process parameters (speeds, stroke, etc.) to keep the residual mid-spatial-frequency error to a minimum are also presented.

  20. Function-based design process for an intelligent ground vehicle vision system

    NASA Astrophysics Data System (ADS)

    Nagel, Robert L.; Perry, Kenneth L.; Stone, Robert B.; McAdams, Daniel A.

    2010-10-01

    An engineering design framework for an autonomous ground vehicle vision system is discussed. We present both the conceptual and physical design by following the design process, development and testing of an intelligent ground vehicle vision system constructed for the 2008 Intelligent Ground Vehicle Competition. During conceptual design, the requirements for the vision system are explored via functional and process analysis considering the flows into the vehicle and the transformations of those flows. The conceptual design phase concludes with a vision system design that is modular in both hardware and software and is based on a laser range finder and camera for visual perception. During physical design, prototypes are developed and tested independently, following the modular interfaces identified during conceptual design. Prototype models, once functional, are implemented into the final design. The final vision system design uses a ray-casting algorithm to process camera and laser range finder data and identify potential paths. The ray-casting algorithm is a single thread of the robot's multithreaded application. Other threads control motion, provide feedback, and process sensory data. Once integrated, both hardware and software testing are performed on the robot. We discuss the robot's performance and the lessons learned.
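
    To illustrate the path-identification step in a hedged way (the occupancy grid, robot pose and parameters below are invented; the actual system fuses camera and laser range finder data), a minimal Python ray-casting sketch over a small occupancy grid:

```python
import math

# Minimal sketch: cast rays through an occupancy grid and keep headings
# whose ray stays clear, in the spirit of the path-finding step above.
GRID = [  # 1 = obstacle reported by the sensors
    [0, 0, 0, 1, 0],
    [0, 0, 0, 1, 0],
    [0, 0, 0, 0, 0],
    [0, 1, 0, 0, 0],
    [0, 0, 0, 0, 0],
]
CELL = 0.5           # metres per cell (assumed)
ROBOT = (2.0, 0.25)  # robot position in metres, near the bottom of the grid

def ray_is_clear(angle_deg, max_range=2.0, step=0.05):
    """March along the ray; return False as soon as an occupied cell is hit."""
    a = math.radians(angle_deg)
    d = 0.0
    while d <= max_range:
        x = ROBOT[0] + d * math.sin(a)   # 0 deg points "up" the grid
        y = ROBOT[1] + d * math.cos(a)
        col, row = int(x / CELL), len(GRID) - 1 - int(y / CELL)
        if not (0 <= row < len(GRID) and 0 <= col < len(GRID[0])):
            return True                  # left the mapped area: treat as open
        if GRID[row][col] == 1:
            return False
        d += step
    return True

candidate_headings = range(-60, 61, 15)
open_paths = [h for h in candidate_headings if ray_is_clear(h)]
print("clear headings (deg):", open_paths)
```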

  1. The MSFC Collaborative Engineering Process for Preliminary Design and Concept Definition Studies

    NASA Technical Reports Server (NTRS)

    Mulqueen, Jack; Jones, David; Hopkins, Randy

    2011-01-01

    This paper describes a collaborative engineering process developed by the Marshall Space Flight Center's Advanced Concepts Office for performing rapid preliminary design and mission concept definition studies for potential future NASA missions. The process has been developed and demonstrated for a broad range of mission studies including human space exploration missions, space transportation system studies and in-space science missions. The paper will describe the design team structure and specialized analytical tools that have been developed to enable a unique rapid design process. The collaborative engineering process consists of an integrated analysis approach for mission definition, vehicle definition and system engineering. The relevance of the collaborative process elements to the standard NASA NPR 7120.1 system engineering process will be demonstrated. The study definition process flow for each study discipline will be outlined, beginning with the study planning process, followed by definition of ground rules and assumptions, definition of study trades, mission analysis and subsystem analyses leading to a standardized set of mission concept study products. The flexibility of the collaborative engineering design process to accommodate a wide range of study objectives from technology definition and requirements definition to preliminary design studies will be addressed. The paper will also describe the applicability of the collaborative engineering process to include an integrated systems analysis approach for evaluating the functional requirements of evolving system technologies and capabilities needed to meet the needs of future NASA programs.

  2. A Framework for Preliminary Design of Aircraft Structures Based on Process Information. Part 1

    NASA Technical Reports Server (NTRS)

    Rais-Rohani, Masoud

    1998-01-01

    This report discusses the general framework and development of a computational tool for preliminary design of aircraft structures based on process information. The described methodology is suitable for multidisciplinary design optimization (MDO) activities associated with integrated product and process development (IPPD). The framework consists of three parts: (1) product and process definitions; (2) engineering synthesis, and (3) optimization. The product and process definitions are part of input information provided by the design team. The backbone of the system is its ability to analyze a given structural design for performance as well as manufacturability and cost assessment. The system uses a database on material systems and manufacturing processes. Based on the identified set of design variables and an objective function, the system is capable of performing optimization subject to manufacturability, cost, and performance constraints. The accuracy of the manufacturability measures and cost models discussed here depend largely on the available data on specific methods of manufacture and assembly and associated labor requirements. As such, our focus in this research has been on the methodology itself and not so much on its accurate implementation in an industrial setting. A three-tier approach is presented for an IPPD-MDO based design of aircraft structures. The variable-complexity cost estimation methodology and an approach for integrating manufacturing cost assessment into design process are also discussed. This report is presented in two parts. In the first part, the design methodology is presented, and the computational design tool is described. In the second part, a prototype model of the preliminary design Tool for Aircraft Structures based on Process Information (TASPI) is described. Part two also contains an example problem that applies the methodology described here for evaluation of six different design concepts for a wing spar.

  3. The Matrix Reloaded: How Sensing the Extracellular Matrix Synchronizes Bacterial Communities

    PubMed Central

    Steinberg, Nitai

    2015-01-01

    In response to chemical communication, bacterial cells often organize themselves into complex multicellular communities that carry out specialized tasks. These communities are frequently referred to as biofilms, which involve the collective behavior of different cell types. Like cells of multicellular eukaryotes, the biofilm cells are surrounded by self-produced polymers that constitute the extracellular matrix (ECM), which binds them to each other and to the surface. In multicellular eukaryotes, it has been evident for decades that cell-ECM interactions control multiple cellular processes during development. While cells both in biofilms and in multicellular eukaryotes are surrounded by ECM and activate various genetic programs, until recently it has been unclear whether cell-ECM interactions are recruited in bacterial communicative behaviors. In this review, we describe the examples reported thus far for ECM involvement in control of cell behavior throughout the different stages of biofilm formation. The studies presented in this review have provided a newly emerging perspective of the bacterial ECM as an active player in regulation of biofilm development. PMID:25825428

  4. Coal conversion systems design and process modeling. Volume 1: Application of MPPR and Aspen computer models

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The development of a coal gasification system design and mass and energy balance simulation program for the TVA and other similar facilities is described. The materials-process-product model (MPPM) and the advanced system for process engineering (ASPEN) computer program were selected from available steady state and dynamic models. The MPPM was selected to serve as the basis for development of system level design model structure because it provided the capability for process block material and energy balance and high-level systems sizing and costing. The ASPEN simulation serves as the basis for assessing detailed component models for the system design modeling program. The ASPEN components were analyzed to identify particular process blocks and data packages (physical properties) which could be extracted and used in the system design modeling program. While ASPEN physical properties calculation routines are capable of generating physical properties required for process simulation, not all required physical property data are available, and must be user-entered.

  5. A novel double loop control model design for chemical unstable processes.

    PubMed

    Cong, Er-Ding; Hu, Ming-Hui; Tu, Shan-Tung; Xuan, Fu-Zhen; Shao, Hui-He

    2014-03-01

    In this manuscript, based on the Smith predictor control scheme for unstable processes in industry, an improved double loop control model is proposed for chemical unstable processes. The inner loop stabilizes the integrating or unstable process and transforms the original process into a stable first-order plus pure dead-time dynamic. The outer loop enhances the set-point response, and a disturbance controller is designed to improve disturbance rejection. The improved control system is simple, has a clear physical meaning, and its characteristic equation makes stabilization straightforward. The three controllers in the improved scheme are designed separately, so each controller is easy to design and good control performance is obtained for its respective closed-loop transfer function. The robust stability of the proposed control scheme is analyzed. Finally, case studies illustrate that the improved method can give better system performance than existing design methods. PMID:24309506
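
    A minimal simulation sketch of the double-loop idea (the toy unstable plant and controller gains are assumptions for illustration, not the tuning rules proposed in the paper): an inner proportional loop first stabilizes an unstable first-order process, and an outer PI loop then shapes the set-point response:

```python
# Minimal sketch: inner stabilizing loop plus outer PI loop on a toy
# unstable first-order plant dy/dt = a*y + b*u (open-loop pole at +0.5).
a, b = 0.5, 1.0
Kc = 2.0                   # inner-loop proportional gain (stabilizing)
Kp, Ki = 0.5, 0.3          # outer-loop PI gains
dt, T = 0.01, 20.0

y, integ = 0.0, 0.0
r = 1.0                    # unit set-point step
y_hist = []
for k in range(int(T / dt)):
    e = r - y
    integ += e * dt
    v = Kp * e + Ki * integ          # outer PI output = inner-loop set point
    u = Kc * (v - y)                 # inner stabilizing loop
    y += (a * y + b * u) * dt        # explicit Euler step of the plant
    y_hist.append(y)

print("final output y(T) ~", round(y_hist[-1], 3))   # settles near 1.0
```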

  6. Integration of computer-aided design and manufacturing through artificial-intelligence-based process planning

    SciTech Connect

    Arunthavanathan, V.

    1988-01-01

    The research effort reported in this thesis is directed towards the integration of design, process planning, and manufacturing. The principal notion used is system integration through information integration. The main outcome of this research effort is an artificial-intelligence-based computer-aided generative process planning system, which uses a feature-based symbolic geometry as its input. The feature-based symbolic data structure is used as the common data between design, process planning, and manufacturing. As commercial computer-aided design systems do not generate a feature-based database, special interfaces are designed and used. As part of the solution strategy, a module to analyze the symbolic geometry from a global perspective is developed. This module imitates a human process planner and derives some overall assertions. The enhanced geometry data is then used by a rule-based expert system to develop the process plan.

  7. The Impact of Building Information Modeling on the Architectural Design Process

    NASA Astrophysics Data System (ADS)

    Moreira, P. F.; Silva, Neander F.; Lima, Ecilamar M.

    Many benefits of Building Information Modeling, BIM, have been suggested by several authors and by software vendors. In this paper we describe an experiment in which two groups of designers were observed developing an assigned design task. One of the groups used a BIM system, while the other used a standard computer-aided drafting system. The results show that some of the promises of BIM hold true, such as consistency maintenance and error avoidance in the design documentation process. Others, such as changing the design process itself, also seem plausible, but more research is needed to determine the depth of such changes.

  8. Improving the Aircraft Design Process Using Web-based Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.

    2003-01-01

    Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.

  9. Improving the Aircraft Design Process Using Web-Based Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.; Follen, Gregory J. (Technical Monitor)

    2000-01-01

    Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.

  10. Detailed design procedure for solar industrial-process-heat systems: overview

    SciTech Connect

    Kutscher, C F

    1982-12-01

    A large number of handbooks have been written on the subject of designing solar heating and cooling systems for buildings. One of these is summarized here. Design Approaches for Solar Industrial Process Heat Systems, published in September 1982, addresses the complete spectrum of problems associated with the design of a solar IPH system. A highly general method, derived from computer simulations, is presented for determining actual energy delivered to the process load. Also covered are siting and selection of subsystem components, cost estimation, safety and environmental considerations, and installation concerns. An overview of the design methodology developed is given and some specific examples of technical issues addressed are provided.

  11. The Iterative Design Process in Research and Development: A Work Experience Paper

    NASA Technical Reports Server (NTRS)

    Sullivan, George F. III

    2013-01-01

    The iterative design process is one of many strategies used in new product development. Top-down development strategies, like waterfall development, place a heavy emphasis on planning and simulation. The iterative process, on the other hand, is better suited to the management of small to medium scale projects. Over the past four months, I have worked with engineers at Johnson Space Center on a multitude of electronics projects. By describing the work I have done these last few months, analyzing the factors that have driven design decisions, and examining the testing and verification process, I will demonstrate that iterative design is the obvious choice for research and development projects.

  12. Computer-aided designing of automatic process control systems for thermal power stations

    NASA Astrophysics Data System (ADS)

    Trofimov, A. V.

    2009-10-01

    The structure of modern microprocessor systems for automated control of technological processes at cogeneration stations is considered. Methods for computer-aided designing of the lower (sensors and actuators) and upper (cabinets of computerized automation equipment) levels of an automated process control system are proposed. The composition of project documents, the structures of a project database and database of a computer-aided design system, and the way they interact with one another in the course of developing the project of an automated process control system are described. Elements of the interface between a design engineer and computer program are shown.

  13. Disruption of adaptor protein 2μ (AP-2μ) in cochlear hair cells impairs vesicle reloading of synaptic release sites and hearing.

    PubMed

    Jung, SangYong; Maritzen, Tanja; Wichmann, Carolin; Jing, Zhizi; Neef, Andreas; Revelo, Natalia H; Al-Moyed, Hanan; Meese, Sandra; Wojcik, Sonja M; Panou, Iliana; Bulut, Haydar; Schu, Peter; Ficner, Ralf; Reisinger, Ellen; Rizzoli, Silvio O; Neef, Jakob; Strenzke, Nicola; Haucke, Volker; Moser, Tobias

    2015-11-01

    Active zones (AZs) of inner hair cells (IHCs) indefatigably release hundreds of vesicles per second, requiring each release site to reload vesicles at tens per second. Here, we report that the endocytic adaptor protein 2μ (AP-2μ) is required for release site replenishment and hearing. We show that hair cell-specific disruption of AP-2μ slows IHC exocytosis immediately after fusion of the readily releasable pool of vesicles, despite normal abundance of membrane-proximal vesicles and intact endocytic membrane retrieval. Sound-driven postsynaptic spiking was reduced in a use-dependent manner, and the altered interspike interval statistics suggested a slowed reloading of release sites. Sustained strong stimulation led to accumulation of endosome-like vacuoles, fewer clathrin-coated endocytic intermediates, and vesicle depletion of the membrane-distal synaptic ribbon in AP-2μ-deficient IHCs, indicating a further role of AP-2μ in clathrin-dependent vesicle reformation on a timescale of many seconds. Finally, we show that AP-2 sorts its IHC-cargo otoferlin. We propose that binding of AP-2 to otoferlin facilitates replenishment of release sites, for example, via speeding AZ clearance of exocytosed material, in addition to a role of AP-2 in synaptic vesicle reformation. PMID:26446278

  14. Developing Elementary Math and Science Process Skills Through Engineering Design Instruction

    NASA Astrophysics Data System (ADS)

    Strong, Matthew G.

    This paper examines how elementary students can develop math and science process skills through an engineering design approach to instruction. The performance and development of individual process skills overall and by gender were also examined. The study, preceded by a pilot, took place in a grade four extracurricular engineering design program in a public, suburban school district. Students worked in pairs and small groups to design and construct airplane models from styrofoam, paper clips, and toothpicks. The development and performance of process skills were assessed through a student survey of learning gains, an engineering design packet rubric (student work), observation field notes, and focus group notes. The results indicate that students can significantly develop process skills, that female students may develop process skills through engineering design better than male students, and that engineering design is most helpful for developing the measuring, suggesting improvements, and observing process skills. The study suggests that a more regular engineering design program or curriculum could be beneficial for students' math and science abilities both in this school and for the elementary field as a whole.

  15. Defining process design space for biotech products: case study of Pichia pastoris fermentation.

    PubMed

    Harms, Jean; Wang, Xiangyang; Kim, Tina; Yang, Xiaoming; Rathore, Anurag S

    2008-01-01

    The concept of "design space" has been proposed in the ICH Q8 guideline and is gaining momentum in its application in the biotech industry. It has been defined as "the multidimensional combination and interaction of input variables (e.g., material attributes) and process parameters that have been demonstrated to provide assurance of quality." This paper presents a stepwise approach for defining process design space for a biologic product. A case study, involving P. pastoris fermentation, is presented to facilitate this. First, risk analysis via Failure Modes and Effects Analysis (FMEA) is performed to identify parameters for process characterization. Second, small-scale models are created and qualified prior to their use in these experimental studies. Third, studies are designed using Design of Experiments (DOE) in order for the data to be amenable for use in defining the process design space. Fourth, the studies are executed and the results analyzed for decisions on the criticality of the parameters as well as on establishing process design space. For the application under consideration, it is shown that the fermentation unit operation is very robust with a wide design space and no critical operating parameters. The approach presented here is not specific to the illustrated case study. It can be extended to other biotech unit operations and processes that can be scaled down and characterized at small scale. PMID:18412404
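
    A minimal sketch of the characterization step (the factors, ranges and response function below are hypothetical stand-ins for scaled-down fermentation runs): a two-level full-factorial design analysed for main effects, the kind of DOE output used to judge parameter criticality and bound a design space:

```python
import itertools
import numpy as np

# Minimal sketch: two-level full-factorial study analysed for main effects.
factors = {"temperature": (28.0, 32.0), "pH": (4.5, 5.5), "DO": (20.0, 40.0)}

def titer(temperature, pH, DO):
    # stand-in for a qualified scale-down model run; real data come from the lab
    return 5.0 + 0.05 * (temperature - 30.0) - 0.3 * (pH - 5.0) + 0.002 * DO

names = list(factors)
runs, responses = [], []
for levels in itertools.product([-1, 1], repeat=len(names)):
    setting = {n: factors[n][0] if s < 0 else factors[n][1]
               for n, s in zip(names, levels)}
    runs.append(levels)
    responses.append(titer(**setting))

# Main effect of a factor = mean response at +1 minus mean response at -1
X = np.array(runs, dtype=float)
y = np.array(responses)
for j, n in enumerate(names):
    effect = y[X[:, j] > 0].mean() - y[X[:, j] < 0].mean()
    print(f"{n:12s} main effect on titer: {effect:+.3f}")
```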

  16. Human-system interface design review guideline -- Process and guidelines: Final report. Revision 1, Volume 1

    SciTech Connect

    1996-06-01

    NUREG-0700, Revision 1, provides human factors engineering (HFE) guidance to the US Nuclear Regulatory Commission staff for its: (1) review of the human system interface (HSI) design submittals prepared by licensees or applicants for a license or design certification of commercial nuclear power plants, and (2) performance of HSI reviews that could be undertaken as part of an inspection or other type of regulatory review involving HSI design or incidents involving human performance. The guidance consists of a review process and HFE guidelines. The document describes those aspects of the HSI design review process that are important to the identification and resolution of human engineering discrepancies that could adversely affect plant safety. Guidance is provided that could be used by the staff to review an applicant's HSI design review process or to guide the development of an HSI design review plan, e.g., as part of an inspection activity. The document also provides detailed HFE guidelines for the assessment of HSI design implementations. NUREG-0700, Revision 1, consists of three stand-alone volumes. Volume 1 consists of two major parts. Part 1 describes those aspects of the review process of the HSI design that are important to identifying and resolving human engineering discrepancies. Part 2 contains detailed guidelines for a human factors engineering review which identify criteria for assessing the implementation of an applicant's or licensee's HSI design.

  17. Holistic and Consistent Design Process for Hollow Structures Based on Braided Textiles and RTM

    NASA Astrophysics Data System (ADS)

    Gnädinger, Florian; Karcher, Michael; Henning, Frank; Middendorf, Peter

    2014-06-01

    The present paper elaborates a holistic and consistent design process for 2D braided composites in conjunction with Resin Transfer Moulding (RTM). These technologies allow a cost-effective production of composites due to their high degree of automation. Literature can be found that deals with specific tasks of the respective technologies but there is no work available that embraces the complete process chain. Therefore, an overall design process is developed within the present paper. It is based on a correlated conduction of sub-design processes for the braided preform, RTM-injection, mandrel plus mould and manufacturing. For each sub-process both, individual tasks and reasonable methods to accomplish them are presented. The information flow within the design process is specified and interdependences are illustrated. Composite designers will be equipped with an efficient set of tools because the respective methods regard the complexity of the part. The design process is applied for a demonstrator in a case study. The individual sub-design processes are accomplished exemplarily to judge about the feasibility of the presented work. For validation reasons, predicted braiding angles and fibre volume fractions are compared with measured ones and a filling and curing simulation based on PAM-RTM is checked against mould filling studies. Tool concepts for a RTM mould and mandrels that realise undercuts are tested. The individual process parameters for manufacturing are derived from previous design steps. Furthermore, the compatibility of the chosen fibre and matrix system is investigated based on pictures of a scanning electron microscope (SEM). The annual production volume of the demonstrator part is estimated based on these findings.

  18. Optimal cure cycle design for autoclave processing of thick composites laminates: A feasibility study

    NASA Technical Reports Server (NTRS)

    Hou, Jean W.

    1985-01-01

    The thermal analysis and the calculation of thermal sensitivity of a cure cycle in autoclave processing of thick composite laminates were studied. A finite element program for the thermal analysis and design derivatives calculation for temperature distribution and the degree of cure was developed and verified. It was found that the direct differentiation was the best approach for the thermal design sensitivity analysis. In addition, the approach of the direct differentiation provided time histories of design derivatives which are of great value to the cure cycle designers. The approach of direct differentiation is to be used for further study, i.e., the optimal cycle design.
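
    A minimal sketch of the direct differentiation approach (the first-order cure kinetics and rate constant are assumptions for illustration, not the finite element model of the study): the sensitivity equation is integrated alongside the state, yielding the time history of the design derivative:

```python
import math

# Minimal sketch: direct differentiation of a toy cure model.
# Cure model:      d(alpha)/dt = k * (1 - alpha),      alpha(0) = 0
# Sensitivity:     d/dt (d alpha/dk) = (1 - alpha) - k * (d alpha/dk)
k = 0.002          # 1/s, illustrative rate constant at a fixed cure temperature
dt, T = 1.0, 600.0
alpha, s = 0.0, 0.0
for step in range(int(T / dt)):
    dalpha = k * (1.0 - alpha)
    ds = (1.0 - alpha) - k * s        # direct differentiation of the model
    alpha += dalpha * dt
    s += ds * dt

exact_alpha = 1.0 - math.exp(-k * T)
exact_s = T * math.exp(-k * T)        # analytic d(alpha)/dk for this model
print(f"alpha(T)   euler={alpha:.4f}  exact={exact_alpha:.4f}")
print(f"dalpha/dk  euler={s:.1f}  exact={exact_s:.1f}")
```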

  19. Design of a high-speed digital processing element for parallel simulation

    NASA Technical Reports Server (NTRS)

    Milner, E. J.; Cwynar, D. S.

    1983-01-01

    A prototype of a custom designed computer to be used as a processing element in a multiprocessor based jet engine simulator is described. The purpose of the custom design was to give the computer the speed and versatility required to simulate a jet engine in real time. Real time simulations are needed for closed loop testing of digital electronic engine controls. The prototype computer has a microcycle time of 133 nanoseconds. This speed was achieved by: prefetching the next instruction while the current one is executing, transporting data using high speed data busses, and using state of the art components such as a very large scale integration (VLSI) multiplier. Included are discussions of processing element requirements, design philosophy, the architecture of the custom designed processing element, the comprehensive instruction set, the diagnostic support software, and the development status of the custom design.

  20. An Effective Design Process for the Successful Development of Medical Devices

    NASA Astrophysics Data System (ADS)

    Colvin, Mike

    The most important point in the successful development of a medical device is the proper overall design. The quality, safety, and effectiveness of a device are established during the design phase. The design process is the foundation of the medical device and will be the basis for the device from its inception till the end of its lifetime. There are domestic and international guidelines on the proper steps to develop a medical device. However, these are guides; they do not specify when and how to implement each phase of design control. The guides also do not specify to what depth an organization must go as it progresses in the overall developmental process. The challenge that faces development organizations is to create a design process plan that is simple, straightforward, and not overburdening.

  1. Design and implementation of the parallel processing system of multi-channel polarization images

    NASA Astrophysics Data System (ADS)

    Li, Zhi-yong; Huang, Qin-chao

    2013-08-01

    Compared with traditional optical intensity image processing, polarization image processing has two main problems: the amount of data is larger, and the processing tasks are more complex. To resolve these problems, a parallel processing system for multi-channel polarization images is designed using a multi-DSP technique. It contains a communication control unit (CCU) and a data processing array (DPA). The CCU controls communications inside and outside the system; its logic is implemented in an FPGA chip. The DPA is made up of four Digital Signal Processor (DSP) chips, which are interlinked by a loose coupling method. The DPA implements processing tasks, including image registration and image synthesis, by parallel processing methods. The polarization image parallel processing model is designed at multiple levels, including the system task, the algorithm, and the operation, and its program is written in assembly language. In the experiment, the polarization image resolution is 782x582 pixels and the pixel data length is 12 bits. After receiving 3 channels of polarization images simultaneously, the system performs the parallel tasks to acquire the target polarization characteristics. Experimental results show that the system has good real-time performance and reliability. The processing time for image registration is 293.343 ms, while the registration accuracy achieves 0.5 pixel. The processing time for image synthesis is 3.199 ms.
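
    As a hedged software analogue of the multi-channel processing (synthetic data and Python multiprocessing stand in for the registered camera channels and the multi-DSP data processing array), a minimal sketch that fuses 0°, 45° and 90° polarization channels into degree- and angle-of-linear-polarization maps, tile by tile in parallel:

```python
import numpy as np
from multiprocessing import Pool

# Minimal sketch: Stokes-based fusion of three polarization channels, with
# image tiles processed in parallel as a software analogue of the DPA.
H, W = 582, 782                           # resolution quoted in the abstract

def synth_channel(angle_deg):
    rng = np.random.default_rng(angle_deg)
    return rng.uniform(0, 4095, (H, W))   # synthetic 12-bit pixel values

def polarization_tile(args):
    i0, i45, i90 = args                   # registered tiles, one per channel
    s0 = i0 + i90
    s1 = i0 - i90
    s2 = 2.0 * i45 - i0 - i90
    dolp = np.sqrt(s1 ** 2 + s2 ** 2) / np.maximum(s0, 1e-6)
    aolp = 0.5 * np.arctan2(s2, s1)
    return dolp, aolp

if __name__ == "__main__":
    ch = [synth_channel(a) for a in (0, 45, 90)]
    tiles = [tuple(c[r::4] for c in ch) for r in range(4)]  # 4 row-striped tiles
    with Pool(4) as pool:
        results = pool.map(polarization_tile, tiles)
    print("processed", len(results), "tiles; tile shape:", results[0][0].shape)
```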

  2. Comprehensive design and process flow configuration for micro and nano tech devices

    NASA Astrophysics Data System (ADS)

    Hahn, Kai; Schmidt, Thilo; Mielke, Matthias; Ortloff, Dirk; Popp, Jens; Brück, Rainer

    2010-04-01

    The development of micro and nano tech devices based on semiconductor manufacturing processes comprises the structural design as well as the definition of the manufacturing process flow. The approach is characterized by application-specific fabrication flows, i.e. fabrication processes (built up from a large variety of process steps and materials) that depend on the final product. Technology constraints have a great impact on the device design and vice versa. In this paper we introduce a comprehensive methodology and, based on it, an environment for customer-oriented product engineering of MEMS products. The development is currently carried out in an international multi-site research project.

  3. New process modeling [sic], design, and control strategies for energy efficiency, high product quality, and improved productivity in the process industries. Final project report

    SciTech Connect

    Ray, W. Harmon

    2002-06-05

    This project was concerned with the development of process design and control strategies for improving energy efficiency, product quality, and productivity in the process industries. In particular, (i) the resilient design and control of chemical reactors, and (ii) the operation of complex processing systems, was investigated. Specific topics studied included new process modeling procedures, nonlinear controller designs, and control strategies for multiunit integrated processes. Both fundamental and immediately applicable results were obtained. The new design and operation results from this project were incorporated into computer-aided design software and disseminated to industry. The principles and design procedures have found their way into industrial practice.

  4. DESIGN OF A TRAP GREASE UPGRADER FOR BIOFUEL PROCESSING - PHASE I

    EPA Science Inventory

    This project provides capstone senior design experience to several teams of engineering undergraduates at Drexel University through the technical and economic evaluation of a trap grease to biodiesel conversion process. The project incorporates two phases: Phase I characteri...

  5. A Systematic Instructional Design Strategy Derived from Information-Processing Theory.

    ERIC Educational Resources Information Center

    Bell, Margaret E.

    1981-01-01

    Recommends an instructional design strategy derived from information processing theory and research which would include the planned presentation of retrieval events and encourage learners to manage some aspects of their own learning. Twelve references are listed. (MER)

  6. GREENER CHEMICAL PROCESS DESIGN ALTERNATIVES ARE REVEALED USING THE WASTE REDUCTION DECISION SUPPORT SYSTEM (WAR DSS)

    EPA Science Inventory

    The Waste Reduction Decision Support System (WAR DSS) is a Java-based software product providing comprehensive modeling of potential adverse environmental impacts (PEI) predicted to result from newly designed or redesigned chemical manufacturing processes. The purpose of this so...

  7. Climbing The Knowledge Mountain - The New Solids Processing Design And Management Manual (Presentation)

    EPA Science Inventory

    The USEPA, Water Environment Federation (WEF) and Water Environment Research Foundation (WERF), under a Cooperative Research and Development Agreement (CRADA), are undertaking a massive effort to produce a Solids Processing Design and Management Manual (Manual). The Manual, repr...

  8. Climbing The Knowledge Mountain - The New Solids Processing Design And Management Manual

    EPA Science Inventory

    The USEPA, Water Environment Federation (WEF) and Water Environment Research Foundation (WERF), under a Cooperative Research and Development Agreement (CRADA), are undertaking a massive effort to produce a Solids Processing Design and Management Manual (Manual). The Manual, repr...

  9. PROCESS DESIGN FOR ENVIRONMENT: A MULTI-OBJECTIVE FRAMEWORK UNDER UNCERTAINTY

    EPA Science Inventory

    Designing chemical processes for the environment requires consideration of several indexes of environmental impact, including ozone depletion and global warming potentials, human and aquatic toxicity, photochemical oxidation, and acid rain potentials. Current methodologies like t...

  10. EVALUATING THE ECONOMICS AND ENVIRONMENTAL FRIENDLINESS OF NEWLY DESIGNED OR RETROFITTED CHEMICAL PROCESSES

    EPA Science Inventory

    This work describes a method for using spreadsheet analyses of process designs and retrofits to provide simple and quick economic and environmental evaluations simultaneously. The method focuses attention onto those streams and components that have the largest monetary values and...

  11. CENSUS DESIGNATED PLACE AND FEDERAL INFORMATION PROCESSING STANDARD (FIPS) POPULATED PLACE BOUNDARIES FOR THE STATE OF ARIZONA

    EPA Science Inventory

    This data set consists of Census Designated Place and Federal Information Processing Standard (FIPS) Populated Place boundaries for the State of Arizona which were extracted from the 1992 U.S. Census Bureau TIGER line files.

  12. Design of visual prosthesis image processing system based on SoC

    NASA Astrophysics Data System (ADS)

    Guo, Fei; Yang, Yuan; Gao, Yong; Wu, Chuan Ke

    2014-07-01

    This paper presents a visual prosthesis image processing system based on the Leon3 SoC (System on Chip) platform. The system is built with the GRLIB system development platform and integrates the image preprocessing, image encoding, and image data modulation IP cores that we designed. We ported the system to an FPGA development board and verified its functions. The results show that the designed system effectively performs the functions of a visual prosthesis image processing system.
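
    The record does not describe the preprocessing IP core internals. As a software sketch only, visual prosthesis preprocessing typically reduces a camera frame to a low-resolution, low-bit-depth stimulation map; the grid size, grey levels, block-averaging approach, and function name below are assumptions and do not represent the paper's hardware design:

      import numpy as np

      def preprocess_for_prosthesis(frame_rgb, grid=(32, 32), levels=8):
          # Reduce a camera frame to a coarse, quantised map (illustrative only).
          gray = frame_rgb.mean(axis=2)                   # crude luminance estimate
          h, w = gray.shape
          gh, gw = grid
          # Crop to a multiple of the grid and block-average down to it.
          blocks = gray[:h - h % gh, :w - w % gw].reshape(gh, h // gh, gw, w // gw)
          reduced = blocks.mean(axis=(1, 3))
          # Quantise to a small number of stimulation levels.
          return np.round((levels - 1) * reduced / 255.0).astype(np.uint8)

      if __name__ == "__main__":
          frame = np.random.randint(0, 256, size=(480, 640, 3), dtype=np.uint8)
          print(preprocess_for_prosthesis(frame).shape)   # (32, 32)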

  13. ALARA Design Review for the Resumption of the Plutonium Finishing Plant (PFP) Cementation Process Project Activities

    SciTech Connect

    DAYLEY, L.

    2000-06-14

    The requirements for the performance of radiological design reviews are codified in 10 CFR 835, Occupational Radiation Protection. The basic requirements for the performance of ALARA design reviews are presented in the Hanford Site Radiological Control Manual (HSRCM). The HSRCM has established trigger levels requiring radiological reviews of non-routine or complex work activities. These requirements are implemented in site procedures HNF-PRO-1622 and HNF-PRO-1623. HNF-PRO-1622, Radiological Design Review Process, requires that "radiological design reviews [be performed] of new facilities and equipment and modifications of existing facilities and equipment". In addition, HNF-PRO-1623, Radiological Work Planning Process, requires a formal ALARA review for planned activities that are estimated to exceed 1 person-rem total Dose Equivalent (DE). The purpose of this review is to validate that the original design for the PFP Cementation Process incorporated the principles of ALARA (As Low As Reasonably Achievable); that is, that the design and operation of the existing Cementation Process equipment and processes allow personnel exposure during operation, maintenance, and decommissioning to be minimized, and that the generation of radioactive waste is kept to a minimum.
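
    The 1 person-rem trigger cited from HNF-PRO-1623 amounts to a simple collective-dose check; a minimal sketch, assuming the collective dose is estimated as the sum of per-worker dose estimates (the function and the breakdown are hypothetical, not part of the procedure):

      ALARA_TRIGGER_PERSON_REM = 1.0  # trigger level cited in the record

      def formal_alara_review_required(individual_dose_estimates_rem):
          # Collective dose (person-rem) is the sum of individual dose estimates.
          collective_dose = sum(individual_dose_estimates_rem)
          return collective_dose > ALARA_TRIGGER_PERSON_REM

      # e.g. four workers at 0.3 rem each -> 1.2 person-rem -> formal review required
      print(formal_alara_review_required([0.3, 0.3, 0.3, 0.3]))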

  14. Comparing Simple and Advanced Video Tools as Supports for Complex Collaborative Design Processes

    ERIC Educational Resources Information Center

    Zahn, Carmen; Pea, Roy; Hesse, Friedrich W.; Rosen, Joe

    2010-01-01

    Working with digital video technologies, particularly advanced video tools with editing capabilities, offers new prospects for meaningful learning through design. However, it is also possible that the additional complexity of such tools does "not" advance learning. We compared in an experiment the design processes and learning outcomes of 24…

  15. Toward Understanding the Cognitive Processes of Software Design in Novice Programmers

    ERIC Educational Resources Information Center

    Yeh, Kuo-Chuan

    2009-01-01

    This study provides insights with regard to the types of cognitive processes that are involved in the formation of mental models and the way those models change over the course of a semester in novice programmers doing a design task. Eight novice programmers participated in this study for three distinct software design sessions, using the same…

  16. Architectural Design: An American Indian Process. An Interview with Dennis Sun Rhodes.

    ERIC Educational Resources Information Center

    Barreiro, Jose

    1990-01-01

    A Northern Arapaho architect discusses his design process, which uses American Indian cultures, symbols, and attitudes as creative inspiration; his use of space and design elements from aboriginal housing styles; and his experiences with HUD and the Bureau of Indian Affairs Housing Improvement Program. (SV)

  17. Learning from Experts: Fostering Extended Thinking in the Early Phases of the Design Process

    ERIC Educational Resources Information Center

    Haupt, Grietjie

    2015-01-01

    Empirical evidence on the way in which expert designers from different domains cognitively connect their internal processes with external resources is presented in the context of an extended cognition model. The article focuses briefly on the main trends in the extended design cognition theory and in particular on recent trends in information…

  18. Influences of Training and Strategical Information Processing Style on Spatial Performance in Apparel Design

    ERIC Educational Resources Information Center

    Gitimu, Priscilla N.; Workman, Jane E.; Anderson, Marcia A.

    2005-01-01

    The study investigated how performance on a spatial task in apparel design was influenced by training and strategical information processing style. The sample consisted of 278 undergraduate apparel design students from six universities in the U.S. Instruments used to collect data were the Apparel Spatial Visualization Test (ASVT) and the…

  19. Learning Effects of a Science Textbook Designed with Adapted Cognitive Process Principles on Grade 5 Students

    ERIC Educational Resources Information Center

    Cheng, Ming-Chang; Chou, Pei-I; Wang, Ya-Ting; Lin, Chih-Ho

    2015-01-01

    This study investigates how the illustrations in a science textbook, with their design modified according to cognitive process principles, affected students' learning performance. The quasi-experimental design recruited two Grade 5 groups (N = 58) as the research participants. The treatment group (n = 30) used the modified version of the textbook,…

  20. 36 CFR 62.4 - Natural landmark designation and recognition process.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    Title 36, Parks, Forests, and Public Property; DEPARTMENT OF THE INTERIOR, NATIONAL NATURAL LANDMARKS PROGRAM; § 62.4 Natural landmark designation and recognition process. (a) Identification. Potential national natural landmarks are identified in the...