Science.gov

Sample records for reload design process

  1. Reload design process at Yankee Atomic Electric Company

    SciTech Connect

    Weader, R.J.

    1986-01-01

    Yankee Atomic Electric Company (YAEC) performs reload design and licensing for its nuclear power plants: Yankee Rowe, Maine Yankee, and Vermont Yankee. Significant savings in labor and computer costs have been achieved in the reload design process by using the SIMULATE nodal code, with input from the CASMO assembly burnup code or the LEOPARD pin-cell burnup code, to replace the PDQ diffusion theory code in many of the required calculations for the Yankee Rowe and Maine Yankee pressurized water reactors (PWRs). An efficient process has also evolved for the design of reloads for the Vermont Yankee boiling water reactor (BWR). Because of the major differences in the core designs of the three plants, a different reload design process has evolved for each plant.

  2. Modeling and design of a reload PWR core for a 48-month fuel cycle

    SciTech Connect

    McMahon, M.V.; Driscoll, M.J.; Todreas, N.E.

    1997-05-01

    The objective of this research was to use state-of-the-art nuclear and fuel performance packages to evaluate the feasibility and costs of a 48-calendar-month core in existing pressurized water reactor (PWR) designs, considering the full range of practical design and economic constraints. The driving force behind this research is the desire to make nuclear power more economically competitive with fossil fuel options by expanding the scope for achieving higher capacity factors. Using CASMO/SIMULATE, a core design with fuel enriched to 7{sup w}/{sub o} U{sup 235} for a single-batch-loaded, 48-month fuel cycle has been developed. This core achieves an ultra-long cycle length without exceeding current fuel burnup limits. The design uses two different types of burnable poisons. Gadolinium, in the form of gadolinium oxide (Gd{sub 2}O{sub 3}) mixed with the UO{sub 2} of selected pins, is used to hold down initial reactivity and to control flux peaking throughout the life of the core. A zirconium diboride (ZrB{sub 2}) integral fuel burnable absorber (IFBA) coating on the Gd{sub 2}O{sub 3}-UO{sub 2} fuel pellets is added to reduce the critical soluble boron concentration in the reactor coolant to within acceptable limits. Fuel performance issues of concern to this design are also outlined, and areas requiring further research are highlighted.
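
    The single-batch versus multi-batch tradeoff behind this design can be illustrated with the textbook linear reactivity model (a generic sketch, not the paper's CASMO/SIMULATE analysis; the reactivity parameters below are hypothetical):

```python
# Linear reactivity model (illustrative only): core reactivity falls
# linearly with burnup, rho(B) = rho_0 - A*B.  For an n-batch core the
# cycle burnup is Bc = 2*rho_0/(A*(n+1)) and discharge burnup is n*Bc.

def cycle_burnup(rho_0, slope, n_batches):
    """Cycle burnup (MWd/kg) of an n-batch core under the linear model."""
    return 2.0 * rho_0 / (slope * (n_batches + 1))

def discharge_burnup(rho_0, slope, n_batches):
    """Burnup of fuel at discharge after n residence cycles."""
    return n_batches * cycle_burnup(rho_0, slope, n_batches)

# Hypothetical numbers: slope chosen so a 3-batch core discharges at
# ~50 MWd/kg.
rho_0, slope = 0.25, 0.0075
for n in (1, 2, 3):
    print(n, round(cycle_burnup(rho_0, slope, n), 1),
          round(discharge_burnup(rho_0, slope, n), 1))
```

    Under this model a single-batch core exactly doubles the cycle length of a three-batch core while discharging fuel at lower burnup, which is why a single-batch 48-month cycle can stay within burnup limits at the cost of higher enrichment.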

  3. From Reload to ReCourse: Learning from IMS Learning Design Implementations

    ERIC Educational Resources Information Center

    Griffiths, David; Beauvoir, Phillip; Liber, Oleg; Barrett-Baxendale, Mark

    2009-01-01

    The use of the Web to deliver open, distance, and flexible learning has opened up the potential for social interaction and adaptive learning, but the usability, expressivity, and interoperability of the available tools leave much to be desired. This article explores these issues as they relate to teachers and learning designers through the case of…

  4. Whorf Reloaded: Language Effects on Nonverbal Number Processing in First Grade--A Trilingual Study

    ERIC Educational Resources Information Center

    Pixner, S.; Moeller, K.; Hermanova, V.; Nuerk, H. -C.; Kaufmann, L.

    2011-01-01

    The unit-decade compatibility effect is interpreted to reflect processes of place value integration in two-digit number magnitude comparisons. The current study aimed at elucidating the influence of language properties on the compatibility effect of Arabic two-digit numbers in Austrian, Italian, and Czech first graders. The number word systems of…

  5. Whorf reloaded: language effects on nonverbal number processing in first grade--a trilingual study.

    PubMed

    Pixner, S; Moeller, K; Hermanova, V; Nuerk, H-C; Kaufmann, L

    2011-02-01

    The unit-decade compatibility effect is interpreted to reflect processes of place value integration in two-digit number magnitude comparisons. The current study aimed at elucidating the influence of language properties on the compatibility effect of Arabic two-digit numbers in Austrian, Italian, and Czech first graders. The number word systems of the three countries differ with respect to the correspondence between their name and place value systems; the German language is characterized by its inversion of the order of tens and units in number words as compared with digital notations, whereas Italian number words are generally not inverted, and both forms exist for Czech number words. Interestingly, the German-speaking children showed the most pronounced compatibility effect with respect to both accuracy and speed. We interpret our results as evidence for a detrimental influence of an intransparent number word system on place value processing. The data corroborate a weak Whorfian hypothesis in children, with even nonverbal Arabic number processing seeming to be influenced by linguistic properties. Copyright © 2010 Elsevier Inc. All rights reserved.

  6. Optimal reload strategies for identify-and-destroy missions

    NASA Astrophysics Data System (ADS)

    Hyland, John C.; Smith, Cheryl M.

    2004-09-01

    In this problem, an identification vehicle must re-acquire a fixed set of suspected targets and determine whether each suspected target is a mine or a false alarm. If a target is determined to be a mine, the identification vehicle must neutralize it, either by delivering one of a limited number of on-board bombs or by assigning the neutralization task to one of a limited number of single-shot suicide vehicles. The identification vehicle has the option to reload. The single-shot suicide vehicles, however, cannot be replenished. We have developed an optimal path planning and reload strategy for this identify-and-destroy mission that takes into account the probabilities that suspected targets are mines, the costs to move between targets, the costs to travel to and from the reload point, and the cost to reload. The mission is modeled as a discrete multi-dimensional Markov process. At each target position the vehicle decides, based on the known costs and probabilities, the number of bombs on board (r), and the number of remaining one-shot vehicles (s), whether to move directly on to the next target or to reload before continuing, and whether to destroy any mine with an on-board bomb or a one-shot suicide vehicle. The approach recursively calculates the minimum expected overall cost conditioned on all possible values of r and s. The recursion is similar to dynamic programming in that it starts at the last suspected target location and works backwards to the starting point. The approach also uses a suboptimal traveling-salesman strategy to search over candidate deployment locations and calculate the best initial deployment point, where the reloads will take place.
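
    The backward recursion described above can be sketched as a small dynamic program over the state (next target, bombs on board r, suicide vehicles left s). This is an illustrative stand-in with made-up costs and probabilities, not the authors' implementation:

```python
from functools import lru_cache

# Targets are visited in a fixed order; target i is a mine with
# probability P[i].  The vehicle may reload bombs to capacity R_CAP at a
# fixed cost.  All numbers below are hypothetical.
P = [0.8, 0.3, 0.9, 0.5]            # mine probabilities per target
MOVE, RELOAD_COST, R_CAP = 1.0, 5.0, 2
BOMB_COST, SUICIDE_COST = 0.5, 3.0
FAIL = 100.0                        # penalty: mine faced with no weapons

@lru_cache(maxsize=None)
def cost(i, r, s):
    """Minimum expected cost from target i with r bombs, s suicide vehicles."""
    if i == len(P):
        return 0.0
    options = []
    if r > 0:   # neutralize a mine (prob P[i]) with an on-board bomb
        options.append(P[i] * BOMB_COST + MOVE +
                       P[i] * cost(i + 1, r - 1, s) +
                       (1 - P[i]) * cost(i + 1, r, s))
    if s > 0:   # or expend a single-shot suicide vehicle
        options.append(P[i] * SUICIDE_COST + MOVE +
                       P[i] * cost(i + 1, r, s - 1) +
                       (1 - P[i]) * cost(i + 1, r, s))
    if r == 0 and s == 0:   # nothing left: pay the failure penalty
        options.append(P[i] * FAIL + MOVE + cost(i + 1, r, s))
    if r < R_CAP:           # reload first, then face this target with a bomb
        options.append(RELOAD_COST + P[i] * BOMB_COST + MOVE +
                       P[i] * cost(i + 1, R_CAP - 1, s) +
                       (1 - P[i]) * cost(i + 1, R_CAP, s))
    return min(options)

print(round(cost(0, R_CAP, 1), 2))  # expected mission cost from the start
```

    As in the paper's recursion, memoization makes each state's cost depend only on the states at the following target, so the table effectively fills backwards from the last target.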

  7. NASA reload program

    NASA Technical Reports Server (NTRS)

    Byington, Marshall

    1993-01-01

    Atlantic Research Corporation (ARC) contracted with NASA to manufacture and deliver thirteen small-scale Solid Rocket Motors (SRMs). These motors, containing five distinct propellant formulations, will be used for plume-induced radiation studies. The information contained herein summarizes and documents the program accomplishments and results. Several modifications were made to the scope of work during the course of the program. The effort was on hold from late 1991 through August 1992 while propellant formulation changes were developed. Modifications to the baseline program were completed in late August, and Modification No. 6 was received by ARC on September 14, 1992. The modifications include changes to the propellant formulation and the nozzle design. The required motor deliveries were completed in late December 1992. However, ARC agreed to perform an additional mix-and-cast effort at no cost to NASA, and another motor was delivered in March 1993.

  8. The Heliogyro Reloaded

    NASA Technical Reports Server (NTRS)

    Wilkie, William K.; Warren, Jerry E.; Thompson, M. W.; Lisman, P. D.; Walkemeyer, P. E.; Guerrant, D. V.; Lawrence, D. A.

    2011-01-01

    The heliogyro is a high-performance, spinning solar sail architecture that uses long (on the order of kilometers) reflective membrane strips to produce thrust from solar radiation pressure. The heliogyro's membrane blades spin about a central hub and are stiffened by centrifugal forces only, making the design exceedingly lightweight. Blades are also stowed and deployed from rolls, eliminating the deployment and packaging problems associated with handling the extremely large, delicate membrane sheets used in most traditional square-rigged or spinning-disk solar sail designs. The heliogyro solar sail concept was first advanced in the 1960s by MacNeal. A 15 km diameter version was later extensively studied in the 1970s by JPL for an ambitious Comet Halley rendezvous mission, but was ultimately not selected because of the need for a risk-reduction flight demonstration. Demonstrating system-level feasibility of a large, spinning heliogyro solar sail on the ground is impossible; however, recent advances in microsatellite bus technologies, coupled with the successful flight demonstration of reflectance control technologies on the JAXA IKAROS solar sail, now make an affordable, small-scale heliogyro technology flight demonstration potentially feasible. In this paper, we present an overview of the history of the heliogyro solar sail concept, with particular attention paid to the MIT 200-meter-diameter heliogyro study of 1989, followed by a description of our updated, low-cost heliogyro flight demonstration concept. Our preliminary heliogyro concept (HELIOS) should be capable of demonstrating an order-of-magnitude improvement in characteristic acceleration over existing solar sail demonstrators (HELIOS target: 0.5 to 1.0 mm/s2 at 1.0 AU), placing heliogyro technology in the range required to enable a variety of science and human exploration support missions.
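
    For context, the quoted characteristic-acceleration target pins down the allowable spacecraft areal density under the ideal flat-reflector approximation a_c = 2*P/sigma, where P is the solar radiation pressure on an absorber at 1.0 AU. This back-of-the-envelope check is our own sketch, not a figure from the paper:

```python
# Ideal-reflector sizing: characteristic acceleration a_c = 2*P_SUN/sigma,
# with sigma the total spacecraft mass per unit sail area.
P_SUN = 4.563e-6                      # N/m^2, radiation pressure at 1 AU

for a_c in (0.5e-3, 1.0e-3):          # HELIOS target range, m/s^2
    sigma = 2.0 * P_SUN / a_c         # kg/m^2
    print(f"a_c = {a_c*1e3:.1f} mm/s^2  ->  sigma = {sigma*1e3:.1f} g/m^2")
```

    The target range works out to a total areal density of roughly 9 to 18 g/m^2, which conveys why only extremely lightweight, centrifugally stiffened blades are viable.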

  9. Hybrid expert system implementation to determine core reload patterns

    SciTech Connect

    Greek, K.J.; Robinson, A.H.

    1989-01-01

    Determining reactor reload fuel patterns is a computationally intensive problem solving process for which automation can be of significant benefit. Often much effort is expended in the search for an optimal loading. While any modern programming language could be used to automate solution, the specialized tools of artificial intelligence (AI) are the most efficient means of introducing the fuel management expert's knowledge into the search for an optimum reload pattern. Prior research in pressurized water reactor refueling strategies developed FORTRAN programs that automated an expert's basic knowledge to direct a search for an acceptable minimum peak power loading. The dissatisfaction with maintenance of compiled knowledge in FORTRAN programs has served as the motivation for the development of the SHUFFLE expert system. SHUFFLE is written in Smalltalk, an object-oriented programming language, and evaluates loadings as it generates them using a two-group, two-dimensional nodal power calculation compiled in a personal computer-based FORTRAN. This paper reviews the object-oriented representation developed to solve the core reload problem with an expert system tool and its operating prototype, SHUFFLE.
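
    The flavor of such an automated pattern search can be sketched with a toy greedy-swap loop. This is illustrative only: SHUFFLE uses expert rules plus a two-group, two-dimensional nodal power calculation, whereas this proxy simply weights hypothetical assembly reactivities by fixed position importances:

```python
import itertools

# Toy reload-pattern search: place assemblies (reduced to relative
# reactivity values) at core positions (reduced to importance weights)
# so as to minimize a crude peaking-factor proxy.
IMPORTANCE = [1.0, 0.9, 0.8, 0.6, 0.5, 0.3]        # center -> periphery
ASSEMBLIES = [1.25, 1.15, 1.05, 0.95, 0.90, 0.80]  # hypothetical k-inf

def peak_power(pattern):
    """Peak-to-average 'power' for a candidate loading (proxy only)."""
    powers = [k * w for k, w in zip(pattern, IMPORTANCE)]
    return max(powers) / (sum(powers) / len(powers))

def improve(pattern):
    """Greedy pairwise swaps until no swap lowers the peaking factor."""
    best = list(pattern)
    improved = True
    while improved:
        improved = False
        for i, j in itertools.combinations(range(len(best)), 2):
            trial = list(best)
            trial[i], trial[j] = trial[j], trial[i]
            if peak_power(trial) < peak_power(best):
                best, improved = trial, True
    return best

final = improve(ASSEMBLIES)
print(round(peak_power(final), 3))
```

    An expert system like SHUFFLE replaces the blind swap loop with heuristic rules about which swaps are worth evaluating, and the proxy with a real nodal power solution.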

  10. Insulin-like growth factor-1 receptor in mature osteoblasts is required for periosteal bone formation induced by reloading.

    PubMed

    Kubota, Takuo; Elalieh, Hashem Z; Saless, Neema; Fong, Chak; Wang, Yongmei; Babey, Muriel; Cheng, Zhiqiang; Bikle, Daniel D

    2013-11-01

    Skeletal loading and unloading have a pronounced impact on bone remodeling, a process also regulated by insulin-like growth factor 1 (IGF-1) signaling. Skeletal unloading leads to resistance to the anabolic effect of IGF-1, while reloading after unloading restores responsiveness to IGF-1. However, the importance of IGF-1 signaling in the skeletal response to mechanical loading has yet to be tested directly. In this study, we assessed the skeletal response of osteoblast-specific Igf-1 receptor deficient (Igf-1r(-/-) ) mice to unloading and reloading. The mice were hindlimb unloaded for 14 days and then reloaded for 16 days. Igf-1r(-/-) mice displayed smaller cortical bone and diminished periosteal and endosteal bone formation at baseline. Periosteal and endosteal bone formation decreased with unloading in Igf-1r(+/+) mice. However, the recovery of periosteal bone formation with reloading was completely inhibited in Igf-1r(-/-) mice, although reloading-induced endosteal bone formation was not hampered. These changes in bone formation resulted in the abolishment of the expected increase in total cross-sectional area with reloading in Igf-1r(-/-) mice compared to the control mice. These results suggest that the Igf-1r in mature osteoblasts has a critical role in periosteal bone formation in the skeletal response to mechanical loading.

  11. Insulin-like growth factor-1 receptor in mature osteoblasts is required for periosteal bone formation induced by reloading

    NASA Astrophysics Data System (ADS)

    Kubota, Takuo; Elalieh, Hashem Z.; Saless, Neema; Fong, Chak; Wang, Yongmei; Babey, Muriel; Cheng, Zhiqiang; Bikle, Daniel D.

    2013-11-01

    Skeletal loading and unloading have a pronounced impact on bone remodeling, a process also regulated by insulin-like growth factor-1 (IGF-1) signaling. Skeletal unloading leads to resistance to the anabolic effect of IGF-1, while reloading after unloading restores responsiveness to IGF-1. However, the importance of IGF-1 signaling in the skeletal response to mechanical loading has yet to be tested directly. In this study, we assessed the skeletal response of osteoblast-specific Igf-1 receptor deficient (Igf-1r-/-) mice to unloading and reloading. The mice were hindlimb unloaded for 14 days and then reloaded for 16 days. Igf-1r-/- mice displayed smaller cortical bone and diminished periosteal and endosteal bone formation at baseline. Periosteal and endosteal bone formation decreased with unloading in Igf-1r+/+ mice. However, the recovery of periosteal bone formation with reloading was completely inhibited in Igf-1r-/- mice, although reloading-induced endosteal bone formation was not hampered. These changes in bone formation resulted in the abolishment of the expected increase in total cross-sectional area with reloading in Igf-1r-/- mice compared to the control mice. These results suggest that the Igf-1r in mature osteoblasts has a critical role in periosteal bone formation in the skeletal response to mechanical loading.

  12. Future integrated design process

    NASA Technical Reports Server (NTRS)

    Meyer, D. D.

    1980-01-01

    The design process is one of the sources used to produce requirements for a computer system to integrate and manage product design data, program management information, and technical computation and engineering data management activities of the aerospace design process. Design activities were grouped chronologically and explored for activity type, activity interface, data quantity, and data flow. The work was based on analysis of the design process of several typical aerospace products, including both conventional and supersonic airplanes and a hydrofoil design. Activities examined included research, preliminary design, detail design, manufacturing interface, product verification, and product support. The design process was then described in an IPAD environment--the future.

  13. Composite Reliability Enhancement Via Reloading

    DTIC Science & Technology

    1988-09-01

    ...the design of composite structures, which in turn adds weight and size and causes other related problems that reduce design efficiency. ... such large structures. This is due to the lower weak tail of the strength distributions of the constituent fibers. This effect has been ... The simulation was run on an IBM Personal Computer using Microsoft Fortran 4.01 for source code and Lotus 1-2-3 for graphing. When the simulation program had...

  14. Lyophilization process design space.

    PubMed

    Patel, Sajal Manubhai; Pikal, Michael J

    2013-11-01

    The application of key elements of quality by design (QbD), such as risk assessment, process analytical technology, and design space, is discussed widely as it relates to freeze-drying process design and development. However, this commentary focuses on constructing the Design and Control Space, particularly for the primary drying step of the freeze-drying process. Also, practical applications and considerations of claiming a process Design Space under the QbD paradigm have been discussed. © 2013 Wiley Periodicals, Inc. and the American Pharmacists Association.

  15. Signal Processing Suite Design

    NASA Technical Reports Server (NTRS)

    Sahr, John D.; Mir, Hasan; Morabito, Andrew; Grossman, Matthew

    2003-01-01

    Our role in this project was to participate in the design of the signal processing suite to analyze plasma density measurements on board a small constellation (3 or 4) satellites in Low Earth Orbit. As we are new to space craft experiments, one of the challenges was to simply gain understanding of the quantity of data which would flow from the satellites, and possibly to interact with the design teams in generating optimal sampling patterns. For example, as the fleet of satellites were intended to fly through the same volume of space (displaced slightly in time and space), the bulk plasma structure should be common among the spacecraft. Therefore, an optimal, limited bandwidth data downlink would take advantage of this commonality. Also, motivated by techniques in ionospheric radar, we hoped to investigate the possibility of employing aperiodic sampling in order to gain access to a wider spatial spectrum without suffering aliasing in k-space.
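
    The aliasing benefit of aperiodic sampling mentioned at the end can be demonstrated numerically (our own sketch, unrelated to the actual flight data system): with uniform sampling at 10 Hz, a 13 Hz tone is indistinguishable from its 3 Hz alias, while jittered sample times break the degeneracy.

```python
import numpy as np

rng = np.random.default_rng(0)
f_true, f_alias, fs = 13.0, 3.0, 10.0   # 13 Hz aliases to 3 Hz at fs = 10 Hz

t_uni = np.arange(64) / fs                                   # uniform grid
t_jit = t_uni + rng.uniform(-0.02, 0.02, size=t_uni.size)    # aperiodic

def residual(t, f_signal, f_model):
    """Least-squares misfit of a sin/cos model at f_model to a tone at f_signal."""
    y = np.sin(2 * np.pi * f_signal * t)
    A = np.column_stack([np.sin(2 * np.pi * f_model * t),
                         np.cos(2 * np.pi * f_model * t)])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return np.linalg.norm(y - A @ coef)

# Uniform sampling: 3 Hz and 13 Hz models both fit perfectly (aliased).
print(residual(t_uni, f_true, f_alias), residual(t_uni, f_true, f_true))
# Jittered sampling: only the true 13 Hz model fits.
print(residual(t_jit, f_true, f_alias), residual(t_jit, f_true, f_true))
```

    The jitter spreads the phase of the 10 Hz difference frequency across samples, so the 3 Hz model no longer absorbs the 13 Hz signal; this is the sense in which aperiodic sampling opens up a wider spatial spectrum without aliasing.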

  17. NASA Collaborative Design Processes

    NASA Technical Reports Server (NTRS)

    Jones, Davey

    2017-01-01

    This is Block 1, the first evolution of the world's most powerful and versatile rocket, the Space Launch System, built to return humans to the area around the moon. Eventually, larger and even more powerful and capable configurations will take astronauts and cargo to Mars. On the sides of the rocket are the twin solid rocket boosters, which provide more than 75 percent of thrust during liftoff and burn for about two minutes, after which they are jettisoned, lightening the load for the rest of the space flight. Four RS-25 main engines, among the world's most reliable rocket engines, provide thrust for the first stage of the rocket. The core stage is the main body of the rocket and houses the fuel for the RS-25 engines, liquid hydrogen and liquid oxygen, as well as the avionics, or "brain," of the rocket. The core stage is all new and is being manufactured at NASA's "rocket factory," Michoud Assembly Facility near New Orleans. The Launch Vehicle Stage Adapter, or LVSA, connects the core stage to the Interim Cryogenic Propulsion Stage. The Interim Cryogenic Propulsion Stage, or ICPS, uses one RL-10 rocket engine and will propel the Orion spacecraft on its deep-space journey after first-stage separation. Finally, the Orion human-rated spacecraft sits atop the massive Saturn V-sized launch vehicle. Managed out of Johnson Space Center in Houston, Orion is the first spacecraft in history capable of taking humans to multiple destinations within deep space. Each element of the SLS utilizes collaborative design processes to achieve the goal of sending humans into deep space: early phases focus on feasibility and requirements development, while later phases focus on detailed design, testing, and operations. There are four basic phases typically found in the development of each element.

  18. Translational Signalling, Atrogenic and Myogenic Gene Expression during Unloading and Reloading of Skeletal Muscle in Myostatin-Deficient Mice

    PubMed Central

    Smith, Heather K.; Matthews, Kenneth G.; Oldham, Jenny M.; Jeanplong, Ferenc; Falconer, Shelley J.; Bass, James J.; Senna-Salerno, Mônica; Bracegirdle, Jeremy W.; McMahon, Christopher D.

    2014-01-01

    Skeletal muscles of myostatin null (Mstn(−/−)) mice are more susceptible to atrophy during hind limb suspension (HS) than are muscles of wild-type mice. Here we sought to elucidate the mechanism for this susceptibility and to determine if Mstn(−/−) mice can regain muscle mass after HS. Male Mstn(−/−) and wild-type mice were subjected to 0, 2 or 7 days of HS or 7 days of HS followed by 1, 3 or 7 days of reloading (n = 6 per group). Mstn(−/−) mice lost more mass from muscles expressing the fast type IIb myofibres during HS and muscle mass was recovered in both genotypes after reloading for 7 days. Concentrations of MAFbx and MuRF1 mRNA, crucial ligases regulating the ubiquitin-proteasome system, but not MUSA1, a BMP-regulated ubiquitin ligase, were increased more in muscles of Mstn(−/−) mice, compared with wild-type mice, during HS and concentrations decreased in both genotypes during reloading. Similarly, concentrations of LC3b, Gabarapl1 and Atg4b, key effectors of the autophagy-lysosomal system, were increased further in muscles of Mstn(−/−) mice, compared with wild-type mice, during HS and decreased in both genotypes during reloading. There was a greater abundance of 4E-BP1 and more bound to eIF4E in muscles of Mstn(−/−) compared with wild-type mice (P<0.001). The ratio of phosphorylated to total eIF2α increased during HS and decreased during reloading, while the opposite pattern was observed for rpS6. Concentrations of myogenic regulatory factors (MyoD, Myf5 and myogenin) mRNA were increased during HS in muscles of Mstn(−/−) mice compared with controls (P<0.001). We attribute the susceptibility of skeletal muscles of Mstn(−/−) mice to atrophy during HS to an up- and downregulation, respectively, of the mechanisms regulating atrophy of myofibres and translation of mRNA. These processes are reversed during reloading to aid a faster rate of recovery of muscle mass in Mstn(−/−) mice. PMID:24718581

  19. Bassoon Speeds Vesicle Reloading at a Central Excitatory Synapse

    PubMed Central

    Hallermann, Stefan; Fejtova, Anna; Schmidt, Hartmut; Weyhersmüller, Annika; Silver, R. Angus; Gundelfinger, Eckart D.; Eilers, Jens

    2010-01-01

    Sustained rate-coded signals encode many types of sensory modalities. Some sensory synapses possess specialized ribbon structures, which tether vesicles, to enable high-frequency signaling. However, central synapses lack these structures, yet some can maintain signaling over a wide bandwidth. To analyze the underlying molecular mechanisms, we investigated the function of the active zone core component Bassoon in cerebellar mossy fiber to granule cell synapses. We show that short-term synaptic depression is enhanced in Bassoon knockout mice during sustained high-frequency trains but basal synaptic transmission is unaffected. Fluctuation and quantal analysis as well as quantification with constrained short-term plasticity models revealed that the vesicle reloading rate was halved in the absence of Bassoon. Thus, our data show that the cytomatrix protein Bassoon speeds the reloading of vesicles to release sites at a central excitatory synapse. PMID:21092860
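
    The reported effect of halving the vesicle reloading rate can be illustrated with a minimal depletion-refill model of a stimulus train (our simplified sketch of the kind of constrained short-term-plasticity model mentioned, not the authors' fitted model; all rate constants are hypothetical):

```python
import math

def train_response(k_reload, freq_hz=100.0, p=0.4, n_stim=50):
    """Relative response at each stimulus of a train.

    Release sites start full; each stimulus releases a fraction p of
    occupied sites, and empty sites are refilled between stimuli at
    rate k_reload (per second).
    """
    dt = 1.0 / freq_hz
    refill = 1.0 - math.exp(-k_reload * dt)   # P(empty site refilled in dt)
    occ, out = 1.0, []
    for _ in range(n_stim):
        out.append(occ * p)                   # relative EPSC amplitude
        occ *= (1.0 - p)                      # release vacates sites
        occ += (1.0 - occ) * refill           # reloading between stimuli
    return out

wt = train_response(k_reload=40.0)   # hypothetical wild-type reloading rate
ko = train_response(k_reload=20.0)   # rate halved, as in Bassoon knockouts
print(round(wt[-1] / wt[0], 3), round(ko[-1] / ko[0], 3))
```

    Even with identical initial (basal) responses, the halved reloading rate produces markedly deeper steady-state depression during the high-frequency train, mirroring the knockout phenotype.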

  20. Introducing the "Decider" Design Process

    ERIC Educational Resources Information Center

    Prasa, Anthony R., Jr.; Del Guercio, Ryan

    2016-01-01

    Engineers are faced with solving important problems every day and must follow a step-by-step design process to arrive at solutions. Students who are taught an effective design process to apply to engineering projects begin to see problems as an engineer would, consider all ideas, and arrive at the best solution. Using an effective design process…

  2. A Process for Design Engineering

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2004-01-01

    The American Institute of Aeronautics and Astronautics Design Engineering Technical Committee has developed a draft Design Engineering Process with the participation of the technical community. This paper reviews similar engineering activities, lays out common terms for the life cycle and proposes a Design Engineering Process.

  3. DESIGNING PROCESSES FOR ENVIRONMENTAL PROBLEMS

    EPA Science Inventory

    Designing for the environment requires consideration of environmental impacts. The Generalized WAR Algorithm is the methodology that allows the user to evaluate the potential environmental impact of the design of a chemical process. In this methodology, chemicals are assigned val...
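
    The WAR-style bookkeeping can be sketched as follows (a simplified illustration; the chemicals, category weights, and impact scores are invented, not EPA values):

```python
# Each chemical gets a normalized impact score per category; a stream's
# potential-environmental-impact (PEI) rate is the sum over components
# of mass flow times the weighted sum of its category scores.
IMPACT = {  # hypothetical normalized scores per impact category
    "toluene": {"toxicity": 0.8, "smog": 0.6},
    "water":   {"toxicity": 0.0, "smog": 0.0},
}
WEIGHTS = {"toxicity": 1.0, "smog": 0.5}   # relative category weights

def stream_pei(flows_kg_h):
    """PEI per hour for a stream given component mass flows (kg/h)."""
    return sum(m * sum(WEIGHTS[c] * s for c, s in IMPACT[chem].items())
               for chem, m in flows_kg_h.items())

feed = {"toluene": 10.0, "water": 100.0}
product = {"toluene": 0.5, "water": 100.0}
# PEI generated by the process = output rate - input rate
# (negative means the design destroys potential impact).
print(stream_pei(product) - stream_pei(feed))
```

    Comparing the PEI generated by alternative flowsheets is the sense in which such a methodology lets a designer rank candidate chemical processes by potential environmental impact.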

  5. Teaching Process Design through Integrated Process Synthesis

    ERIC Educational Resources Information Center

    Metzger, Matthew J.; Glasser, Benjamin J.; Patel, Bilal; Hildebrandt, Diane; Glasser, David

    2012-01-01

    The design course is an integral part of chemical engineering education. A novel approach to the design course was recently introduced at the University of the Witwatersrand, Johannesburg, South Africa. The course aimed to introduce students to systematic tools and techniques for setting and evaluating performance targets for processes, as well as…

  7. Book Processing Facility Design.

    ERIC Educational Resources Information Center

    Sheahan (Drake)-Stewart Dougall, Marketing and Physical Distribution Consultants, New York, NY.

    The Association of New York Libraries for Technical Services (ANYLTS) is established to develop and run a centralized book processing facility for the public library systems in New York State. ANYLTS plans to receive book orders from the 22 library systems, transmit orders to publishers, receive the volumes from the publishers, print and attach…

  8. Reengineering the Project Design Process

    NASA Technical Reports Server (NTRS)

    Casani, E.; Metzger, R.

    1994-01-01

    In response to NASA's goal of working faster, better and cheaper, JPL has developed extensive plans to minimize cost, maximize customer and employee satisfaction, and implement small- and moderate-size missions. These plans include improved management structures and processes, enhanced technical design processes, the incorporation of new technology, and the development of more economical space- and ground-system designs. The Laboratory's new Flight Projects Implementation Office has been chartered to oversee these innovations and the reengineering of JPL's project design process, including establishment of the Project Design Center and the Flight System Testbed. Reengineering at JPL implies a cultural change whereby the character of its design process will change from sequential to concurrent and from hierarchical to parallel. The Project Design Center will support missions offering high science return, design to cost, demonstrations of new technology, and rapid development. Its computer-supported environment will foster high-fidelity project life-cycle development and cost estimating.

  10. Automating the automobile design process

    SciTech Connect

    Smith, M.R.

    1986-03-01

    Traditional CAD/CAM speeds product design, analysis, and manufacturing by giving engineers and designers the ability to view and manipulate computer models of automobiles from a variety of perspectives, such as interiors, exteriors, and cross sections. Computer-aided styling (CAS) hastens the automobile design process in the same manner by allowing data to be captured earlier in the design cycle. The goal of CAS is to be able to determine in advance if a design can be aesthetically pleasing - without having to build even the first prototype. Just like CAD/CAM, styling is an iterative process, with CAS techniques speeding the design. Faster iterations mean that more designs can be reviewed and that designers can react more quickly to changing market trends.

  11. Myocardial Reloading after Extracorporeal Membrane Oxygenation Alters Substrate Metabolism While Promoting Protein Synthesis

    SciTech Connect

    Kajimoto, Masaki; Priddy, Colleen M.; Ledee, Dolena; Xu, Chun; Isern, Nancy G.; Olson, Aaron; Des Rosiers, Christine; Portman, Michael A.

    2013-08-19

    Extracorporeal membrane oxygenation (ECMO) unloads the heart, providing a bridge to recovery in children after myocardial stunning. Mortality after ECMO remains high. Cardiac substrate and amino acid requirements upon weaning are unknown and may impact recovery. We assessed the hypothesis that ventricular reloading modulates both substrate entry into the citric acid cycle (CAC) and myocardial protein synthesis. Fourteen immature piglets (7.8-15.6 kg) were separated into 2 groups based on ventricular loading status: 8-hour ECMO (UNLOAD) and post-wean from ECMO (RELOAD). We infused [2-13C]-pyruvate as an oxidative substrate and [13C6]-L-leucine as a tracer of amino acid oxidation and protein synthesis into the coronary artery. RELOAD showed marked elevations in myocardial oxygen consumption above baseline and UNLOAD. Pyruvate uptake was markedly increased, though RELOAD decreased the pyruvate contribution to oxidative CAC metabolism. RELOAD also increased absolute concentrations of all CAC intermediates, while maintaining or increasing 13C-molar percent enrichment, and significantly increased cardiac fractional protein synthesis rates by >70% over UNLOAD. Conclusions: RELOAD produced a high energy metabolic requirement and rebound protein synthesis. Relative pyruvate decarboxylation decreased with RELOAD, which instead promoted anaplerotic pyruvate carboxylation and amino acid incorporation into protein rather than oxidation in the CAC. These perturbations may serve as therapeutic targets to improve contractile function after ECMO.
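
    The cardiac fractional protein synthesis rates mentioned above are conventionally derived from the precursor-product relation for a labeled amino acid such as [13C6]-L-leucine. The sketch below shows that textbook formula with made-up numbers; it is not necessarily the exact calculation used in this study.

```python
def fractional_synthesis_rate(e_bound_rise, e_precursor, hours):
    """Textbook precursor-product fractional synthesis rate (%/h):
    the rise in protein-bound tracer enrichment divided by the
    precursor enrichment and the labeling time.
    """
    return e_bound_rise / (e_precursor * hours) * 100

# Illustrative numbers only, not the study's data:
# bound enrichment rises 0.02 over 4 h with precursor enrichment 0.08.
print(round(fractional_synthesis_rate(0.02, 0.08, 4.0), 2))  # -> 6.25 (%/h)
```

    Any intervention that raises the bound-enrichment rise for a given precursor enrichment, as reloading does here, shows up directly as a higher computed rate.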

  12. Fully Integrating the Design Process

    SciTech Connect

    T.A. Bjornard; R.S. Bean

    2008-03-01

    The basic approach to designing nuclear facilities in the United States does not currently reflect the routine consideration of proliferation resistance and international safeguards. The fully integrated design process is an approach for bringing consideration of international safeguards and proliferation resistance, together with state safeguards and security, fully into the design process from the very beginning, while integrating them sensibly and synergistically with the other project functions. In view of the recently established GNEP principles agreed to by the United States and at least eighteen other countries, this paper explores such an integrated approach, and its potential to help fulfill the new internationally driven design requirements with improved efficiencies and reduced costs.

  13. The motion after-effect reloaded

    PubMed Central

    Mather, George; Pavan, Andrea; Campana, Gianluca; Casco, Clara

    2011-01-01

    The motion after-effect is a robust illusion of visual motion resulting from exposure to a moving pattern. There is a widely accepted explanation of it in terms of changes in the response of cortical direction-selective neurons. Research has distinguished several variants of the effect. Converging recent evidence from different experimental techniques (psychophysics, single-unit recording, brain imaging, transcranial magnetic stimulation, and evoked potentials) reveals that adaptation is not confined to one or even two cortical areas, but involves up to five different sites, reflecting the multiple levels of processing involved in visual motion analysis. A tentative motion processing framework is described, based on motion after-effect research. Recent ideas on the function of adaptation see it as a form of gain control that maximises the efficiency of information transmission. PMID:18951829

  14. Reloading Continuous GPS in Northwest Mexico

    NASA Astrophysics Data System (ADS)

    Gonzalez-Garcia, J. J.; Suarez-Vidal, F.; Gonzalez-Ortega, J. A.

    2007-05-01

    For more than 10 years we have tried to follow in the steps of the Southern California Integrated GPS Network (SCIGN) and the Plate Boundary Observatory (PBO) in the USA, which puts us in a position to contribute to the development of a modern GPS network in Mexico. Between 1998 and 2001, three stations were deployed in Northwest Mexico in concert with the development of SCIGN: SPMX, in the north-central Baja California state at the National Astronomical Observatory (UNAM) in the Sierra San Pedro Martir; CORX, on Isla Coronados Sur, offshore San Diego, Ca./Tijuana, Mexico; and GUAX, on Guadalupe island 150 miles offshore the Baja California peninsula, which provides a unique site on the Pacific plate in the North America/Pacific boundary zone in Las Californias. The former IGS station at CICESE, Ensenada (CICE), installed in 1995, was replaced by CIC1 in 1999. In 2004 and 2005, with partial support from SCIGN and UNAVCO to the University of Arizona, a volunteer team from UNAVCO, Caltech, the U.S. Geological Survey, Universidad de la Sierra at Moctezuma, Sonora, and CICESE built two new shallow-braced GPS sites in northwest Mexico. The first site, USMX, is located in east-central Sonora; the second, YESX, is located high in the Sierra Madre Occidental at Yecora, near the southern border of Sonora and Chihuahua. All data are openly available at SOPAC and/or UNAVCO. The existing information has been valuable for resolving the "total" plate motion between the Pacific plate (GUAX) and the North America plate (USMX and YESX) in the north-central Gulf of California. Since last year we have had the capability of GPS data processing using GAMIT/GLOBK, and after gaining some practice with survey-mode data processing we can become a GPS processing center in Mexico. Currently only 2 sites are operational: CIC1 and USMX. With new energy we are ready to contribute to the establishment of a modern GPS network in Mexico for science, hazard monitoring, and infrastructure.

  15. Digital Earth reloaded - Beyond the next generation

    NASA Astrophysics Data System (ADS)

    Ehlers, M.; Woodgate, P.; Annoni, A.; Schade, S.

    2014-02-01

    Digital replicas (or 'mirror worlds') of complex entities and systems are now routine in many fields such as aerospace engineering; archaeology; medicine; or even fashion design. The Digital Earth (DE) concept as a digital replica of the entire planet occurs in Al Gore's 1992 book Earth in the Balance and was popularized in his speech at the California Science Center in January 1998. It played a pivotal role in stimulating the development of a first generation of virtual globes, typified by Google Earth that achieved many elements of this vision. Almost 15 years after Al Gore's speech, the concept of DE needs to be re-evaluated in the light of the many scientific and technical developments in the fields of information technology, data infrastructures, citizens' participation, and earth observation that have taken place since. This paper intends to look beyond the next generation predominantly based on the developments of fields outside the spatial sciences, where concepts, software, and hardware with strong relationships to DE are being developed without referring to this term. It also presents a number of guiding criteria for future DE developments.

  16. The Snark was a Boojum - reloaded.

    PubMed

    Macrì, Simone; Richter, S Helene

    2015-01-01

    In this article, we refer to an original opinion paper written by Prof. Frank Beach in 1950 ("The Snark was a Boojum"). In his manuscript, Beach explicitly criticised the field of comparative psychology because of the disparity between the original understanding of comparativeness and its practical overly specialised implementation. Specialisation encompassed both experimental species (rats accounted for 70% of all subjects) and test paradigms (dominated by conditioning/learning experiments). Herein, we attempt to evaluate the extent to which these considerations apply to current behavioural neuroscience. Such evaluation is particularly interesting in the context of "translational research" that has recently gained growing attention. As a community, we believe that preclinical findings are intended to inform clinical practice at the level of therapies and knowledge advancements. Yet, limited reproducibility of experimental results and failures to translate preclinical research into clinical trials indicate that these expectations are not entirely fulfilled. Theoretical considerations suggest that, before concluding that a given phenomenon is of relevance to our species, it should be observed in more than a single experimental model (be it an animal strain or species) and tested in more than a single standardized test battery. Yet, current approaches appear limited in terms of variability and overspecialised in terms of operative procedures. Specifically, as in 1950, rodents (mice instead of rats) still constitute the vast majority of animal species investigated. Additionally, the scientific community strives to homogenise experimental test strategies, thereby not only limiting the generalizability of the findings, but also working against the design of innovative approaches. Finally, we discuss the importance of evolutionary-adaptive considerations within the field of laboratory research. Specifically, resting upon empirical evidence indicating that developing

  17. The Snark was a Boojum - reloaded

    PubMed Central

    2015-01-01

    In this article, we refer to an original opinion paper written by Prof. Frank Beach in 1950 (“The Snark was a Boojum”). In his manuscript, Beach explicitly criticised the field of comparative psychology because of the disparity between the original understanding of comparativeness and its practical overly specialised implementation. Specialisation encompassed both experimental species (rats accounted for 70% of all subjects) and test paradigms (dominated by conditioning/learning experiments). Herein, we attempt to evaluate the extent to which these considerations apply to current behavioural neuroscience. Such evaluation is particularly interesting in the context of “translational research” that has recently gained growing attention. As a community, we believe that preclinical findings are intended to inform clinical practice at the level of therapies and knowledge advancements. Yet, limited reproducibility of experimental results and failures to translate preclinical research into clinical trials indicate that these expectations are not entirely fulfilled. Theoretical considerations suggest that, before concluding that a given phenomenon is of relevance to our species, it should be observed in more than a single experimental model (be it an animal strain or species) and tested in more than a single standardized test battery. Yet, current approaches appear limited in terms of variability and overspecialised in terms of operative procedures. Specifically, as in 1950, rodents (mice instead of rats) still constitute the vast majority of animal species investigated. Additionally, the scientific community strives to homogenise experimental test strategies, thereby not only limiting the generalizability of the findings, but also working against the design of innovative approaches. Finally, we discuss the importance of evolutionary-adaptive considerations within the field of laboratory research. Specifically, resting upon empirical evidence indicating that

  18. Reengineering the project design process

    NASA Astrophysics Data System (ADS)

    Kane Casani, E.; Metzger, Robert M.

    1995-01-01

    In response to the National Aeronautics and Space Administration's goal of working faster, better, and cheaper, the Jet Propulsion Laboratory (JPL) has developed extensive plans to minimize cost, maximize customer and employee satisfaction, and implement small- and moderate-size missions. These plans include improved management structures and processes, enhanced technical design processes, the incorporation of new technology, and the development of more economical space- and ground-system designs. The Laboratory's new Flight Projects Implementation Development Office has been chartered to oversee these innovations and the reengineering of JPL's project design process, including establishment of the Project Design Center (PDC) and the Flight System Testbed (FST). Reengineering at JPL implies a cultural change whereby the character of the Laboratory's design process will change from sequential to concurrent and from hierarchical to parallel. The Project Design Center will support missions offering high science return, design to cost, demonstrations of new technology, and rapid development. Its computer-supported environment will foster high-fidelity project life-cycle development and more accurate cost estimating. These improvements signal JPL's commitment to meeting the challenges of space exploration in the next century.

  19. Process simulation and design '94

    SciTech Connect

    Not Available

    1994-06-01

    This first-of-a-kind report describes today's process simulation and design technology for specific applications. It includes process names, diagrams, applications, descriptions, objectives, economics, installations, licensors, and a complete list of process submissions. Processes include: alkylation, aromatics extraction, catalytic reforming, cogeneration, dehydration, delayed coking, distillation, energy integration, catalytic cracking, gas sweetening, glycol/methanol injection, hydrocracking, NGL recovery and stabilization, solvent dewaxing, visbreaking. Equipment simulations include: amine plant, ammonia plant, heat exchangers, cooling water network, crude preheat train, crude unit, ethylene furnace, nitrogen rejection unit, refinery, sulfur plant, and VCM furnace. By-product processes include: olefins, polyethylene terephthalate, and styrene.

  20. The Process Design Courses at Pennsylvania: Impact of Process Simulators.

    ERIC Educational Resources Information Center

    Seider, Warren D.

    1984-01-01

    Describes the use and impact of process design simulators in process design courses. Discusses topics covered, texts used, computer design simulations, and how they are integrated into the process survey course as well as in plant design projects. (JM)

  1. Human Integration Design Processes (HIDP)

    NASA Technical Reports Server (NTRS)

    Boyer, Jennifer

    2014-01-01

    The purpose of the Human Integration Design Processes (HIDP) document is to provide human-systems integration design processes, including methodologies and best practices that NASA has used to meet human systems and human rating requirements for developing crewed spacecraft. HIDP content is framed around human-centered design methodologies and processes in support of human-system integration requirements and human rating. NASA-STD-3001, Space Flight Human-System Standard, is a two-volume set of National Aeronautics and Space Administration (NASA) Agency-level standards established by the Office of the Chief Health and Medical Officer, directed at minimizing health and performance risks for flight crews in human space flight programs. Volume 1 of NASA-STD-3001, Crew Health, sets standards for fitness for duty, space flight permissible exposure limits, permissible outcome limits, levels of medical care, medical diagnosis, intervention, treatment and care, and countermeasures. Volume 2 of NASA-STD-3001, Human Factors, Habitability, and Environmental Health, focuses on human physical and cognitive capabilities and limitations and defines standards for spacecraft (including orbiters, habitats, and suits), internal environments, facilities, payloads, and related equipment, hardware, and software with which the crew interfaces during space operations. The NASA Procedural Requirements (NPR) 8705.2B, Human-Rating Requirements for Space Systems, specifies the Agency's human-rating processes, procedures, and requirements. The HIDP was written to share NASA's knowledge of processes directed toward achieving human certification of a spacecraft through implementation of human-systems integration requirements. Although the HIDP speaks directly to implementation of NASA-STD-3001 and NPR 8705.2B requirements, the human-centered design, evaluation, and design processes described in this document can be applied to any set of human-systems requirements and are independent of reference

  2. DESIGNING ENVIRONMENTALLY FRIENDLY CHEMICAL PROCESSES

    EPA Science Inventory

    The design of a chemical process involves many aspects: from profitability, flexibility and reliability to safety to the environment. While each of these is important, in this work, the focus will be on profitability and the environment. Key to the study of these aspects is the ...

  4. ESS Cryogenic System Process Design

    NASA Astrophysics Data System (ADS)

    Arnold, P.; Hees, W.; Jurns, J.; Su, X. T.; Wang, X. L.; Weisend, J. G., II

    2015-12-01

    The European Spallation Source (ESS) is a neutron-scattering facility funded and supported in collaboration with 17 European countries in Lund, Sweden. Cryogenic cooling at ESS is vital particularly for the linear accelerator, the hydrogen target moderators, a test stand for cryomodules, the neutron instruments and their sample environments. The paper will focus on specific process design criteria, design decisions and their motivations for the helium cryoplants and auxiliary equipment. Key issues for all plants and their process concepts are energy efficiency, reliability, smooth turn-down behaviour and flexibility. The accelerator cryoplant (ACCP) and the target moderator cryoplant (TMCP) in particular need to be prepared for a range of refrigeration capacities due to the intrinsic uncertainties regarding heat load definitions. Furthermore the paper addresses questions regarding process arrangement, 2 K cooling methodology, LN2 precooling, helium storage, helium purification and heat recovery.

  5. Development of Innovative Design Processor

    SciTech Connect

    Park, Y.S.; Park, C.O.

    2004-07-01

    Nuclear design analysis requires time-consuming and error-prone model-input preparation, code runs, output analysis, and a quality assurance process. To reduce human effort and improve design quality and productivity, the Innovative Design Processor (IDP) is being developed. The two basic principles of the IDP are document-oriented design and web-based design. In document-oriented design, the designer writes a design document called an active document and feeds it to a special program, which automatically produces the final document with complete analyses, tables, and plots. Active documents can be written with ordinary HTML editors or created automatically on the web, which is the other framework of the IDP. Using a proper mix of server-side and client-side programming under the LAMP (Linux/Apache/MySQL/PHP) environment, the design process on the web is modeled in a design-wizard style so that even a novice designer can produce the design document easily. This automation using the IDP is now being implemented for all reload designs of Korea Standard Nuclear Power Plant (KSNP) type PWRs. Its introduction will allow a large reduction in all KSNP reload design efforts and provide a platform for design and R&D tasks of KNFC. (authors)
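
    The document-oriented idea described above, an "active document" fed to a program that emits the final document with computed results filled in, can be sketched as a simple substitution pass. The `{{...}}` directive syntax and the design data below are invented for illustration; they are not the actual IDP format.

```python
import re

def process_active_document(text, context):
    """Replace {{expression}} directives with values computed from design data.

    The {{...}} syntax is a hypothetical stand-in for whatever markup the
    real IDP uses; eval is restricted to the supplied context for this sketch.
    """
    def expand(match):
        expression = match.group(1).strip()
        return str(eval(expression, {"__builtins__": {}}, context))
    return re.sub(r"\{\{(.*?)\}\}", expand, text)

# A toy active document: prose with embedded calculations.
active = "Core loading: {{assemblies}} assemblies, {{assemblies * pins}} fuel pins total."
final = process_active_document(active, {"assemblies": 177, "pins": 236})
print(final)  # -> Core loading: 177 assemblies, 41772 fuel pins total.
```

    The same substitution pass run server-side is what lets a web wizard assemble the finished design document from form inputs.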

  6. Automation of Design Engineering Processes

    NASA Technical Reports Server (NTRS)

    Torrey, Glenn; Sawasky, Gerald; Courey, Karim

    2004-01-01

    A method, and a computer program that helps to implement the method, have been developed to automate and systematize the retention and retrieval of all the written records generated during the process of designing a complex engineering system. It cannot be emphasized strongly enough that all the written records as used here is meant to be taken literally: it signifies not only final drawings and final engineering calculations but also such ancillary documents as minutes of meetings, memoranda, requests for design changes, approval and review documents, and reports of tests. One important purpose served by the method is to make the records readily available to all involved users via their computer workstations from one computer archive while eliminating the need for voluminous paper files stored in different places. Another important purpose served by the method is to facilitate the work of engineers who are charged with sustaining the system and were not involved in the original design decisions. The method helps the sustaining engineers to retrieve information that enables them to retrace the reasoning that led to the original design decisions, thereby helping them to understand the system better and to make informed engineering choices pertaining to maintenance and/or modifications of the system. The software used to implement the method is written in Microsoft Access. All of the documents pertaining to the design of a given system are stored in one relational database in such a manner that they can be related to each other via a single tracking number.
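
    The single-tracking-number scheme described above can be sketched with a minimal relational example. SQLite stands in for Microsoft Access here, and all table and column names are hypothetical.

```python
import sqlite3

# In-memory database for the sketch; the real system uses a Microsoft Access file.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE designs (tracking_no INTEGER PRIMARY KEY, title TEXT);
    CREATE TABLE documents (
        doc_id      INTEGER PRIMARY KEY,
        tracking_no INTEGER REFERENCES designs(tracking_no),
        doc_type    TEXT,   -- drawing, memo, meeting minutes, test report, ...
        body        TEXT
    );
""")
db.execute("INSERT INTO designs VALUES (1001, 'Thruster valve redesign')")
db.executemany(
    "INSERT INTO documents (tracking_no, doc_type, body) VALUES (?, ?, ?)",
    [(1001, "minutes", "Kickoff meeting notes"),
     (1001, "drawing", "Valve assembly rev B"),
     (1001, "test report", "Proof-pressure test results")])

# Every record for a design is reachable through its tracking number.
rows = db.execute(
    "SELECT doc_type FROM documents WHERE tracking_no = 1001 ORDER BY doc_id").fetchall()
print([r[0] for r in rows])  # -> ['minutes', 'drawing', 'test report']
```

    A single key per design is what lets sustaining engineers pull minutes, drawings, and test reports from one archive and retrace the original design reasoning.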

  7. One-dimensional kinetics modifications for BWR reload methods

    SciTech Connect

    Chandola, V.; Robichaud, J.D.

    1990-01-01

    Yankee Atomic Electric Company (YAEC) currently uses RETRAN-02 to analyze limiting transients and establish operating minimum critical power ratio (MCPR) limits for Vermont Yankee (VY) boiling water reactor (BWR) reload analysis. The US Nuclear Regulatory Commission-approved analysis methods, used in previous cycles, use the point-kinetics modeling option in RETRAN-02 to represent transient-induced neutronic feedback. RETRAN-02 also contains a one-dimensional (1-D) kinetics neutronic feedback model option that provides a more accurate transient power prediction than the point-kinetics model. In the past few fuel cycles, the thermal or MCPR operating margin at VY has eroded due to increases in fuel cycle length. To offset this decrease, YAEC has developed the capability to use the more accurate 1-D kinetics RETRAN option. This paper reviews the qualification effort for the YAEC BWR methods. This paper also presents a comparison between RETRAN-02 predictions using 1-D and point kinetics for the limiting transient, and demonstrates the typical gain in thermal margin from 1-D kinetics.
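
    For readers unfamiliar with the point-kinetics model mentioned above: it collapses the core to a single power amplitude driven by reactivity, with no spatial flux shape. A minimal one-delayed-group integration of the standard textbook equations follows; the parameters are illustrative and this is not YAEC's RETRAN-02 model.

```python
# One-delayed-group point-kinetics equations (standard textbook form):
#   dn/dt = ((rho - beta) / Lambda) * n + lam * C
#   dC/dt = (beta / Lambda) * n - lam * C
# Explicit Euler integration with illustrative parameter values.
beta = 0.0065      # delayed-neutron fraction
lam = 0.08         # effective precursor decay constant (1/s)
Lambda = 1.0e-4    # neutron generation time (s)
rho = 0.001        # step reactivity insertion (~0.15 dollars)

n = 1.0                        # relative power
C = beta * n / (lam * Lambda)  # equilibrium precursor concentration
dt = 1.0e-5
for _ in range(int(1.0 / dt)):  # integrate one second
    dn = ((rho - beta) / Lambda) * n + lam * C
    dC = (beta / Lambda) * n - lam * C
    n += dn * dt
    C += dC * dt
print(f"relative power after 1 s: {n:.3f}")
```

    The prompt jump to roughly beta/(beta - rho) followed by a slow rise is all the point-kinetics option can represent; the 1-D kinetics option additionally resolves the axial flux shape during the transient, which is the source of the extra thermal-margin accuracy.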

  8. Conceptual Chemical Process Design for Sustainability.

    EPA Science Inventory

    This chapter examines the sustainable design of chemical processes, with a focus on conceptual design, hierarchical and short-cut methods, and analyses of process sustainability for alternatives. The chapter describes a methodology for incorporating process sustainability analyse...

  9. Muscle regeneration during hindlimb unloading results in a reduction in muscle size after reloading

    NASA Technical Reports Server (NTRS)

    Mozdziak, P. E.; Pulvermacher, P. M.; Schultz, E.

    2001-01-01

    The hindlimb-unloading model was used to study the ability of muscle injured in a weightless environment to recover after reloading. Satellite cell mitotic activity and DNA unit size were determined in injured and intact soleus muscles from hindlimb-unloaded and age-matched weight-bearing rats at the conclusion of 28 days of hindlimb unloading, 2 wk after reloading, and 9 wk after reloading. The body weights of hindlimb-unloaded rats were significantly (P < 0.05) less than those of weight-bearing rats at the conclusion of hindlimb unloading, but they were the same (P > 0.05) as those of weight-bearing rats 2 and 9 wk after reloading. The soleus muscle weight, soleus muscle weight-to-body weight ratio, myofiber diameter, number of nuclei per millimeter, and DNA unit size were significantly (P < 0.05) smaller for the injured soleus muscles from hindlimb-unloaded rats than for the soleus muscles from weight-bearing rats at each recovery time. Satellite cell mitotic activity was significantly (P < 0.05) higher in the injured soleus muscles from hindlimb-unloaded rats than from weight-bearing rats 2 wk after reloading, but it was the same (P > 0.05) as in the injured soleus muscles from weight-bearing rats 9 wk after reloading. The injured soleus muscles from hindlimb-unloaded rats failed to achieve weight-bearing muscle size 9 wk after reloading, because incomplete compensation for the decrease in myonuclear accretion and DNA unit size expansion occurred during the unloading period.

  11. Reloading partly recovers bone mineral density and mechanical properties in hind limb unloaded rats

    NASA Astrophysics Data System (ADS)

    Zhao, Fan; Li, Dijie; Arfat, Yasir; Chen, Zhihao; Liu, Zonglin; Lin, Yu; Ding, Chong; Sun, Yulong; Hu, Lifang; Shang, Peng; Qian, Airong

    2014-12-01

    Skeletal unloading results in decreased bone formation and bone mass. During long-term spaceflight the lost bone mass is impossible to fully recover, so it is necessary to develop effective countermeasures against spaceflight-induced bone loss. Hindlimb unloading (HLU) simulates the effects of weightlessness and is utilized extensively to examine the response of musculoskeletal systems to certain aspects of space flight. The purpose of this study was to investigate the effects of 4 weeks of HLU in rats, and of subsequent reloading, on the bone mineral density (BMD) and mechanical properties of load-bearing bones. After HLU for 4 weeks, the rats were subjected to reloading for 1, 2, and 3 weeks, and the BMD of the femur, tibia, and lumbar spine was assessed by dual-energy X-ray absorptiometry (DXA) every week. The mechanical properties of the femur were determined by a three-point bending test. Dry bone and bone ash of the femur were obtained by the oven-drying method and weighed. Serum alkaline phosphatase (ALP) and serum calcium were examined by ELISA and atomic absorption spectrometry. The results showed that 4 weeks of HLU significantly decreased the body weight of the rats, and reloading for 1, 2, or 3 weeks did not reverse this loss. However, after 2 weeks of reloading, the BMD of the femur and tibia of HLU rats partly recovered (+10.4%, +2.3%). After 3 weeks of reloading, the HLU-induced reductions in BMD, energy absorption, bone mass, and mechanical properties recovered to some extent, as did the HLU-induced changes in serum ALP and serum calcium. Our results indicate that a short period of reloading cannot completely recover bone after a period of unloading; interventions such as mechanical vibration or pharmaceuticals are necessary to help bone recovery.

  12. Design Thinking in Elementary Students' Collaborative Lamp Designing Process

    ERIC Educational Resources Information Center

    Kangas, Kaiju; Seitamaa-Hakkarainen, Pirita; Hakkarainen, Kai

    2013-01-01

    Design and Technology education is potentially a rich environment for successful learning, if the management of the whole design process is emphasised, and students' design thinking is promoted. The aim of the present study was to unfold the collaborative design process of one team of elementary students, in order to understand their multimodal…

  13. ESS Accelerator Cryoplant Process Design

    NASA Astrophysics Data System (ADS)

    Wang, X. L.; Arnold, P.; Hees, W.; Hildenbeutel, J.; Weisend, J. G., II

    2015-12-01

    The European Spallation Source (ESS) is a neutron-scattering facility being built with extensive international collaboration in Lund, Sweden. The ESS accelerator will deliver protons with 5 MW of power to the target at 2.0 GeV, with a nominal current of 62.5 mA. The superconducting part of the accelerator is about 300 meters long and contains 43 cryomodules. The ESS accelerator cryoplant (ACCP) will provide the cooling for the cryomodules and the cryogenic distribution system that delivers the helium to the cryomodules. The ACCP will cover three cryogenic circuits: Bath cooling for the cavities at 2 K, the thermal shields at around 40 K and the power couplers thermalisation with 4.5 K forced helium cooling. The open competitive bid for the ACCP took place in 2014 with Linde Kryotechnik AG being selected as the vendor. This paper summarizes the progress in the ACCP development and engineering. Current status including final cooling requirements, preliminary process design, system configuration, machine concept and layout, main parameters and features, solution for the acceptance tests, exergy analysis and efficiency is presented.

  14. Optimal design of solidification processes

    NASA Technical Reports Server (NTRS)

    Dantzig, Jonathan A.; Tortorelli, Daniel A.

    1991-01-01

    An optimal design algorithm is presented for the analysis of general solidification processes, and is demonstrated for the growth of GaAs crystals in a Bridgman furnace. The system is optimal in the sense that the prespecified temperature distribution in the solidifying materials is obtained to maximize product quality. The optimization uses traditional numerical programming techniques which require the evaluation of cost and constraint functions and their sensitivities. The finite element method is incorporated to analyze the crystal solidification problem, evaluate the cost and constraint functions, and compute the sensitivities. These techniques are demonstrated in the crystal growth application by determining an optimal furnace wall temperature distribution to obtain the desired temperature profile in the crystal, and hence to maximize the crystal's quality. Several numerical optimization algorithms are studied to determine the proper convergence criteria, effective 1-D search strategies, appropriate forms of the cost and constraint functions, etc. In particular, we incorporate the conjugate gradient and quasi-Newton methods for unconstrained problems. The efficiency and effectiveness of each algorithm is presented in the example problem.
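
    As a toy illustration of the conjugate gradient method named above for unconstrained problems, the linear CG algorithm below minimizes a two-variable quadratic cost. The actual application minimizes a finite-element-evaluated cost over furnace wall temperatures, which this sketch does not attempt.

```python
def conjugate_gradient(A, b, tol=1e-12):
    """Linear conjugate gradient for a symmetric positive-definite matrix A,
    minimizing J(x) = 0.5 x^T A x - b^T x (equivalently solving A x = b)."""
    n = len(b)
    matvec = lambda v: [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
    x = [0.0] * n
    r = list(b)              # residual b - A x with x = 0
    d = list(r)              # initial search direction: steepest descent
    rs = sum(ri * ri for ri in r)
    for _ in range(n):       # exact convergence in at most n steps
        Ad = matvec(d)
        alpha = rs / sum(di * adi for di, adi in zip(d, Ad))   # exact line search
        x = [xi + alpha * di for xi, di in zip(x, d)]
        r = [ri - alpha * adi for ri, adi in zip(r, Ad)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new < tol:
            break
        d = [ri + (rs_new / rs) * di for ri, di in zip(r, d)]  # conjugate direction
        rs = rs_new
    return x

# Quadratic stand-in for the FEM-evaluated cost: minimum where A x = b.
x_opt = conjugate_gradient([[4.0, 1.0], [1.0, 3.0]], [1.0, 2.0])
print(x_opt)
```

    In the furnace problem the gradient of the cost with respect to the wall temperatures comes from the finite element sensitivities, but the search logic is the same.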

  15. 9 CFR 325.18 - Diverting of shipments, breaking of seals, and reloading by carrier in emergency; reporting to...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... seals, and reloading by carrier in emergency; reporting to Regional Director. 325.18 Section 325.18... CERTIFICATION TRANSPORTATION § 325.18 Diverting of shipments, breaking of seals, and reloading by carrier in...) In case of wreck or similar extraordinary emergency, the Department seals on a railroad car or...

  16. 9 CFR 325.18 - Diverting of shipments, breaking of seals, and reloading by carrier in emergency; reporting to...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... seals, and reloading by carrier in emergency; reporting to Regional Director. 325.18 Section 325.18... CERTIFICATION TRANSPORTATION § 325.18 Diverting of shipments, breaking of seals, and reloading by carrier in...) In case of wreck or similar extraordinary emergency, the Department seals on a railroad car or...

  17. The experience of using Endo GIA™ Radial Reload with Tri-Staple™ Technology for various lung surgery.

    PubMed

    Ema, Toshinari

    2014-10-01

    Endo GIA™ Radial Reload with Tri-Staple™ Technology (RR) is a device designed for colorectal surgery. However, its rounded staple line also makes it suitable for various lung surgeries. We use the device for lung wedge resection and for cutting the bronchus during lung lobectomy. In total, the device has been used for 56 fires, and all fires came out well.

  18. Intracellular Ca2+ transients in mouse soleus muscle after hindlimb unloading and reloading

    NASA Technical Reports Server (NTRS)

    Ingalls, C. P.; Warren, G. L.; Armstrong, R. B.; Hamilton, S. L. (Principal Investigator)

    1999-01-01

    The objective of this study was to determine whether altered intracellular Ca(2+) handling contributes to the specific force loss in the soleus muscle after unloading and/or subsequent reloading of mouse hindlimbs. Three groups of female ICR mice were studied: 1) unloaded mice (n = 11) that were hindlimb suspended for 14 days, 2) reloaded mice (n = 10) that were returned to their cages for 1 day after 14 days of hindlimb suspension, and 3) control mice (n = 10) that had normal cage activity. Maximum isometric tetanic force (P(o)) was determined in the soleus muscle from the left hindlimb, and resting free cytosolic Ca(2+) concentration ([Ca(2+)](i)), tetanic [Ca(2+)](i), and 4-chloro-m-cresol-induced [Ca(2+)](i) were measured in the contralateral soleus muscle by confocal laser scanning microscopy. Unloading and reloading increased resting [Ca(2+)](i) above control by 36% and 24%, respectively. Although unloading reduced P(o) and specific force by 58% and 24%, respectively, compared with control mice, there was no difference in tetanic [Ca(2+)](i). P(o), specific force, and tetanic [Ca(2+)](i) were reduced by 58%, 23%, and 23%, respectively, in the reloaded animals compared with control mice; however, tetanic [Ca(2+)](i) was not different between unloaded and reloaded mice. These data indicate that although hindlimb suspension results in disturbed intracellular Ca(2+) homeostasis, changes in tetanic [Ca(2+)](i) do not contribute to force deficits. Compared with unloading, 24 h of physiological reloading in the mouse do not result in further changes in maximal strength or tetanic [Ca(2+)](i).

  19. Myocardial reloading after extracorporeal membrane oxygenation alters substrate metabolism while promoting protein synthesis.

    PubMed

    Kajimoto, Masaki; O'Kelly Priddy, Colleen M; Ledee, Dolena R; Xu, Chun; Isern, Nancy; Olson, Aaron K; Des Rosiers, Christine; Portman, Michael A

    2013-08-19

    Extracorporeal membrane oxygenation (ECMO) unloads the heart, providing a bridge to recovery in children after myocardial stunning. ECMO also induces stress, which can adversely affect the ability to reload or wean the heart from the circuit. Metabolic impairments induced by altered loading and/or stress conditions may impact weaning. However, cardiac substrate and amino acid requirements upon weaning are unknown. We assessed the hypothesis that ventricular reloading with ECMO modulates both substrate entry into the citric acid cycle (CAC) and myocardial protein synthesis. Sixteen immature piglets (7.8 to 15.6 kg) were separated into 2 groups based on ventricular loading status: 8-hour ECMO (UNLOAD) and postwean from ECMO (RELOAD). We infused into the coronary artery [2-(13)C]-pyruvate as an oxidative substrate and [(13)C6]-L-leucine as an indicator for amino acid oxidation and protein synthesis. Upon RELOAD, the functional parameters, which were decreased substantially by ECMO, recovered to near-baseline levels with the exception of minimum dP/dt. Accordingly, myocardial oxygen consumption was also increased, indicating that overall mitochondrial metabolism was reestablished. At the metabolic level, when compared to UNLOAD, RELOAD altered the contribution of various substrates/pathways to tissue pyruvate formation, favoring exogenous pyruvate versus glycolysis, and acetyl-CoA formation, shifting away from pyruvate decarboxylation to endogenous substrate, presumably fatty acids. Furthermore, there was also a significant increase of tissue concentrations for all CAC intermediates (≈80%), suggesting enhanced anaplerosis, and of fractional protein synthesis rates (>70%). RELOAD alters both cytosolic and mitochondrial energy substrate metabolism, while favoring leucine incorporation into protein synthesis rather than oxidation in the CAC. Improved understanding of factors governing these metabolic perturbations may serve as a basis for interventions and thereby improve…

  20. Myocardial Reloading After Extracorporeal Membrane Oxygenation Alters Substrate Metabolism While Promoting Protein Synthesis

    PubMed Central

    Kajimoto, Masaki; O'Kelly Priddy, Colleen M.; Ledee, Dolena R.; Xu, Chun; Isern, Nancy; Olson, Aaron K.; Rosiers, Christine Des; Portman, Michael A.

    2013-01-01

    Background Extracorporeal membrane oxygenation (ECMO) unloads the heart, providing a bridge to recovery in children after myocardial stunning. ECMO also induces stress, which can adversely affect the ability to reload or wean the heart from the circuit. Metabolic impairments induced by altered loading and/or stress conditions may impact weaning. However, cardiac substrate and amino acid requirements upon weaning are unknown. We assessed the hypothesis that ventricular reloading with ECMO modulates both substrate entry into the citric acid cycle (CAC) and myocardial protein synthesis. Methods and Results Sixteen immature piglets (7.8 to 15.6 kg) were separated into 2 groups based on ventricular loading status: 8-hour ECMO (UNLOAD) and postwean from ECMO (RELOAD). We infused into the coronary artery [2-13C]-pyruvate as an oxidative substrate and [13C6]-L-leucine as an indicator for amino acid oxidation and protein synthesis. Upon RELOAD, the functional parameters, which were decreased substantially by ECMO, recovered to near-baseline levels with the exception of minimum dP/dt. Accordingly, myocardial oxygen consumption was also increased, indicating that overall mitochondrial metabolism was reestablished. At the metabolic level, when compared to UNLOAD, RELOAD altered the contribution of various substrates/pathways to tissue pyruvate formation, favoring exogenous pyruvate versus glycolysis, and acetyl-CoA formation, shifting away from pyruvate decarboxylation to endogenous substrate, presumably fatty acids. Furthermore, there was also a significant increase of tissue concentrations for all CAC intermediates (≈80%), suggesting enhanced anaplerosis, and of fractional protein synthesis rates (>70%). Conclusions RELOAD alters both cytosolic and mitochondrial energy substrate metabolism, while favoring leucine incorporation into protein synthesis rather than oxidation in the CAC. Improved understanding of factors governing these metabolic perturbations may…

  1. Temporal changes in sarcomere lesions of rat adductor longus muscles during hindlimb reloading

    NASA Technical Reports Server (NTRS)

    Krippendorf, B. B.; Riley, D. A.

    1994-01-01

    Focal sarcomere disruptions were previously observed in adductor longus muscles of rats flown approximately two weeks aboard the Cosmos 1887 and 2044 biosatellite flights. These lesions, characterized by breakage and loss of myofilaments and Z-line streaming, resembled damage induced by unaccustomed exercise that includes eccentric contractions in which muscles lengthen as they develop tension. We hypothesized that sarcomere lesions in atrophied muscles of spaceflight rats were not produced in microgravity by muscle unloading but resulted from muscle reloading upon re-exposure to terrestrial gravity. To test this hypothesis, we examined temporal changes in sarcomere integrity of adductor longus muscles from rats subjected to 12.5 days of hindlimb suspension unloading and subsequent reloading by return to vivarium cages for 0, 6, 12, or 48 hours of normal weightbearing. Our ultrastructural observations suggested that muscle unloading (0 h reloading) induced myofibril misalignment associated with myofiber atrophy. Muscle reloading for 6 hours induced focal sarcomere lesions in which cross striations were abnormally widened. Such lesions were electron lucent due to extensive myofilament loss. Lesions in reloaded muscles showed rapid restructuring. By 12 hours of reloading, lesions were moderately stained foci and by 48 hours darkly stained foci in which the pattern of cross striations was indistinct at the light and electron microscopic levels. These lesions were spanned by Z-line-like electron dense filamentous material. Our findings suggest a new role for Z-line streaming in lesion restructuring: rather than an antecedent to damage, this type of Z-line streaming may be indicative of rapid, early sarcomere repair.

  3. Hafnium transistor process design for neural interfacing.

    PubMed

    Parent, David W; Basham, Eric J

    2009-01-01

    A design methodology is presented that uses 1-D process simulations of Metal Insulator Semiconductor (MIS) structures to design the threshold voltage of hafnium oxide based transistors used for neural recording. The methodology comprises 1-D analytical equations for threshold voltage specification and doping profiles, and 1-D MIS Technology Computer Aided Design (TCAD) to design a process that implements a specific threshold voltage, which minimized simulation time. The process was then verified with a 2-D process/electrical TCAD simulation. Hafnium oxide (HfO2) films were grown and characterized for dielectric constant and fixed oxide charge at various annealing temperatures, two important design variables in threshold voltage design.
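The way the dielectric constant and fixed oxide charge enter threshold-voltage design can be sketched with the textbook long-channel MOS relation. This is a generic illustration, not the authors' process model; the film thickness, doping, work-function difference, and fixed-charge densities below are hypothetical:

```python
import math

# Physical constants (CGS-style units: cm, F/cm, C)
q = 1.602e-19        # elementary charge, C
eps0 = 8.854e-14     # vacuum permittivity, F/cm
k_T = 0.0259         # thermal voltage at 300 K, V
ni = 1.0e10          # Si intrinsic carrier concentration, cm^-3

def threshold_voltage(Na, t_ox_cm, k_ox, Qf_cm2, phi_ms=-0.9):
    """Long-channel NMOS Vt from doping Na, gate dielectric, and fixed charge Qf."""
    eps_si = 11.7 * eps0
    Cox = k_ox * eps0 / t_ox_cm                        # oxide capacitance, F/cm^2
    phi_f = k_T * math.log(Na / ni)                    # bulk Fermi potential, V
    Qdep = math.sqrt(2 * eps_si * q * Na * 2 * phi_f)  # depletion charge, C/cm^2
    Vfb = phi_ms - q * Qf_cm2 / Cox                    # flat-band voltage, V
    return Vfb + 2 * phi_f + Qdep / Cox

# Hypothetical stack: 10 nm high-k film (k ~ 20) on 1e17 cm^-3 p-type Si,
# comparing a low and a high positive fixed-charge density.
vt_low_qf  = threshold_voltage(1e17, 10e-7, 20.0, 1e11)
vt_high_qf = threshold_voltage(1e17, 10e-7, 20.0, 1e12)
```

The sketch shows the design lever described in the abstract: positive fixed oxide charge shifts Vt downward by q·Qf/Cox, and a higher dielectric constant raises Cox, shrinking both the fixed-charge and depletion-charge terms.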

  4. Managing Analysis Models in the Design Process

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2006-01-01

    Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.

  6. Design Expert's Participation in Elementary Students' Collaborative Design Process

    ERIC Educational Resources Information Center

    Kangas, Kaiju; Seitamaa-Hakkarainen, Pirita; Hakkarainen, Kai

    2013-01-01

    The main goal of the present study was to provide insights into how disciplinary expertise might be infused into Design and Technology classrooms and how authentic processes based on professional design practices might be constructed. We describe elementary students' collaborative lamp designing process, where the leadership was provided by a…

  7. Gaps in the Design Process

    SciTech Connect

    Veers, Paul

    2016-10-04

    The design of offshore wind plants is a relatively new field. The move into U.S. waters will involve unique environmental conditions, as well as expectations from the authorities responsible for managing the development. Wind turbines are certified against assumed design conditions, which must be checked against the site conditions of the plant. There are still outstanding issues about how to assure that the designs of both the turbine and the foundation are appropriate for the site and will carry an acceptable level of risk for the particular installation.

  8. Launch Vehicle Design Process Characterization Enables Design/Project Tool

    NASA Technical Reports Server (NTRS)

    Blair, J. C.; Ryan, R. S.; Schutzenhofer, L. A.; Robinson, Nancy (Technical Monitor)

    2001-01-01

    The objectives of the project described in this viewgraph presentation included the following: (1) Provide an overview characterization of the launch vehicle design process; and (2) Delineate design/project tool to identify, document, and track pertinent data.

  9. Graphic Design in Libraries: A Conceptual Process

    ERIC Educational Resources Information Center

    Ruiz, Miguel

    2014-01-01

    Providing successful library services requires efficient and effective communication with users; therefore, it is important that content creators who develop visual materials understand key components of design and, specifically, develop a holistic graphic design process. Graphic design, as a form of visual communication, is the process of…

  10. Reengineering the JPL Spacecraft Design Process

    NASA Technical Reports Server (NTRS)

    Briggs, C.

    1995-01-01

    This presentation describes the factors that have emerged in the evolved process of reengineering the unmanned spacecraft design process at the Jet Propulsion Laboratory in Pasadena, California. Topics discussed include: New facilities, new design factors, new system-level tools, complex performance objectives, changing behaviors, design integration, leadership styles, and optimization.

  12. NASA System Engineering Design Process

    NASA Technical Reports Server (NTRS)

    Roman, Jose

    2011-01-01

    This slide presentation reviews NASA's use of systems engineering for the complete life cycle of a project. Systems engineering is a methodical, disciplined approach for the design, realization, technical management, operations, and retirement of a system. Each phase of a NASA project is terminated with a Key Decision Point (KDP), which is supported by major reviews.

  13. 2D Mesoscale Simulations of Quasielastic Reloading and Unloading in Shock Compressed Aluminum

    NASA Astrophysics Data System (ADS)

    Dwivedi, S. K.

    2007-06-01

    2D mesoscale simulations of planar shock compression, followed by either reloading or unloading, are presented that predict quasi-elastic (QE) response observed experimentally in shocked polycrystalline aluminum. The representative volume element (RVE) of the plate impact experiments included a realistic representation of a grain ensemble with apparent heterogeneities in the polycrystalline sample. Simulations were carried out using a 2D updated Lagrangian finite element code ISP-TROTP incorporating elastic-plastic deformation in grain interior and contact/cohesive methodology to analyze finite strength grain boundaries. Local heterogeneous response was quantified by calculating appropriate material variables along in-situ Lagrangian tracer lines and comparing the temporal variation of their mean values with results from 2D continuum simulations. Simulations were carried out by varying a large number of individual heterogeneities to predict QE response on reloading and unloading from shock state. The heterogeneities important for simulating the QE response identified from these simulations were: hardened grain boundaries, hard inclusions, and micro-porosity. It is shown that the shock-deformed state of polycrystalline aluminum in the presence of these effects is strongly heterogeneous with considerable variations in lateral stresses. This distributed stress state unloads the shear stress from flow stress causing QE response on reloading as well as unloading. The simulated velocity profiles and calculated shear strength and shear stresses for a representative reloading and unloading experimental configuration were found to agree well with the reported experimental data. Work supported by DOE.

  14. The Architectural and Interior Design Planning Process.

    ERIC Educational Resources Information Center

    Cohen, Elaine

    1994-01-01

    Explains the planning process in designing effective library facilities and discusses library building requirements that result from electronic information technologies. Highlights include historical structures; Americans with Disabilities Act; resource allocation; electrical power; interior spaces; lighting; design development; the roles of…

  16. NANEX: Process design and optimization.

    PubMed

    Baumgartner, Ramona; Matić, Josip; Schrank, Simone; Laske, Stephan; Khinast, Johannes; Roblegg, Eva

    2016-06-15

    Previously, we introduced a one-step nano-extrusion (NANEX) process for transferring aqueous nano-suspensions into solid formulations directly in the liquid phase. Nano-suspensions were fed into molten polymers via a side-feeding device and excess water was eliminated via devolatilization. However, the drug content in nano-suspensions is restricted to 30% (w/w), and obtaining sufficiently high drug loadings in the final formulation requires the processing of large amounts of water and thus a fundamental process understanding. To this end, we investigated four polymers with different physicochemical characteristics (Kollidon(®) VA64, Eudragit(®) E PO, HPMCAS and PEG 20000) in terms of their maximum water uptake/removal capacity. Process parameters such as throughput and screw speed were adapted, and their effect on the mean residence time and filling degree was studied. Additionally, one-dimensional discretization modeling was performed to examine the complex interactions between the screw geometry and the process parameters during water addition/removal. It was established that polymers with a certain water miscibility/solubility can be manufactured via NANEX. Long residence times of the molten polymer in the extruder and low filling degrees in the degassing zone favored the addition/removal of significant amounts of water. The residual moisture content in the final extrudates was comparable to that of extrudates manufactured without water. Copyright © 2016 Elsevier B.V. All rights reserved.
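The coupling between throughput, screw speed, filling degree, and mean residence time described in the abstract can be approximated with a simple plug-flow estimate. This is a rough sketch under idealized assumptions (fully conveying elements, constant melt density); the screw geometry numbers are hypothetical, not NANEX machine data:

```python
def fill_degree(throughput_kg_h, melt_density_kg_m3, free_volume_m3,
                screw_speed_rpm, conveyed_vol_per_rev_m3):
    """Fraction of the free screw volume filled with melt (0..1)."""
    vol_flow_m3_s = throughput_kg_h / 3600.0 / melt_density_kg_m3
    conveying_capacity_m3_s = screw_speed_rpm / 60.0 * conveyed_vol_per_rev_m3
    return min(1.0, vol_flow_m3_s / conveying_capacity_m3_s)

def mean_residence_time_s(fill, free_volume_m3, throughput_kg_h, melt_density_kg_m3):
    """Mean residence time = filled melt volume / volumetric throughput."""
    vol_flow_m3_s = throughput_kg_h / 3600.0 / melt_density_kg_m3
    return fill * free_volume_m3 / vol_flow_m3_s

# Hypothetical small twin-screw: 60 ml free volume, 1.2 ml conveyed per revolution,
# 1 kg/h of melt at 1100 kg/m^3.
f1 = fill_degree(1.0, 1100.0, 60e-6, 200.0, 1.2e-6)
f2 = fill_degree(1.0, 1100.0, 60e-6, 400.0, 1.2e-6)  # doubling screw speed halves fill
t1 = mean_residence_time_s(f1, 60e-6, 1.0, 1100.0)   # ~15 s in this toy geometry
```

In this model, raising screw speed at fixed throughput lowers the filling degree, which is exactly the regime the abstract identifies as favorable for devolatilizing water in the degassing zone.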

  17. Biomimetic design processes in architecture: morphogenetic and evolutionary computational design.

    PubMed

    Menges, Achim

    2012-03-01

    Design computation has profound impact on architectural design methods. This paper explains how computational design enables the development of biomimetic design processes specific to architecture, and how they need to be significantly different from established biomimetic processes in engineering disciplines. The paper first explains the fundamental difference between computer-aided and computational design in architecture, as the understanding of this distinction is of critical importance for the research presented. Thereafter, the conceptual relation and possible transfer of principles from natural morphogenesis to design computation are introduced and the related developments of generative, feature-based, constraint-based, process-based and feedback-based computational design methods are presented. This morphogenetic design research is then related to exploratory evolutionary computation, followed by the presentation of two case studies focusing on the exemplary development of spatial envelope morphologies and urban block morphologies.

  18. Total Ship Design Process Modeling

    DTIC Science & Technology

    2012-04-30

    Microsoft Project® or Primavera®, and perform process simulations that can investigate risk, cost, and schedule trade-offs. Prior efforts to capture... planning in the face of disruption, delay, and late-changing requirements. ADePT is interfaced with Primavera, the AEC industry favorite program...

  19. Practicing universal design to actual hand tool design process.

    PubMed

    Lin, Kai-Chieh; Wu, Chih-Fu

    2015-09-01

    UD evaluation principles are difficult to implement in product design. This study proposes a methodology for implementing UD in the design process through user participation. The original UD principles and user experience were used to develop the evaluation items. Differences between product types were considered. Factor analysis and Quantification Theory Type I were used to eliminate evaluation items considered inappropriate and to examine the relationship between evaluation items and product design factors. Product design specifications were established for verification. The results showed that converting user evaluations into crucial design verification factors, via a generalized evaluation scale based on product attributes, and applying the design factors in product design can improve users' UD evaluations. The design process of this study is expected to contribute to user-centered UD application. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  20. Perceptions of Instructional Design Process Models.

    ERIC Educational Resources Information Center

    Branch, Robert Maribe

    Instructional design is a process that is creative, active, iterative and complex; however, many diagrams of instructional design are interpreted as stifling, passive, lock-step and simple because of the visual elements used to model the process. The purpose of this study was to determine the expressed perceptions of the types of flow diagrams…

  1. Hydrocarbon Processing's process design and optimization '96

    SciTech Connect

    1996-06-01

    This paper compiles information on hydrocarbon processes, describing the application, objective, economics, commercial installations, and licensor. Processes include: alkylation, ammonia, catalytic reformer, crude fractionator, crude unit, vacuum unit, dehydration, delayed coker, distillation, ethylene furnace, FCCU, polymerization, gas sweetening, hydrocracking, hydrogen, hydrotreating (naphtha, distillate, and resid desulfurization), natural gas processing, olefins, polyethylene terephthalate, refinery, styrene, sulfur recovery, and VCM furnace.

  2. Process Design Manual for Nitrogen Control.

    ERIC Educational Resources Information Center

    Parker, Denny S.; And Others

    This manual presents theoretical and process design criteria for the implementation of nitrogen control technology in municipal wastewater treatment facilities. Design concepts are emphasized through examination of data from full-scale and pilot installations. Design data are included on biological nitrification and denitrification, breakpoint…

  3. Reinventing The Design Process: Teams and Models

    NASA Technical Reports Server (NTRS)

    Wall, Stephen D.

    1999-01-01

    The future of space mission designing will be dramatically different from the past. Formerly, performance-driven paradigms emphasized data return with cost and schedule being secondary issues. Now and in the future, costs are capped and schedules fixed; these two variables must be treated as independent in the design process. Accordingly, JPL has redesigned its design process. At the conceptual level, design times have been reduced by properly defining the required design depth, improving the linkages between tools, and managing team dynamics. In implementation-phase design, system requirements will be held in crosscutting models, linked to subsystem design tools through a central database that captures the design and supplies needed configuration management and control. Mission goals will then be captured in timelining software that drives the models, testing their capability to execute the goals. Metrics are used to measure and control both processes and to ensure that design parameters converge through the design process within schedule constraints. This methodology manages margins controlled by acceptable risk levels. Thus, teams can evolve risk tolerance (and cost) as they would any engineering parameter. This new approach allows more design freedom for a longer time, which tends to encourage revolutionary and unexpected improvements in design.

  4. IMPLEMENTING THE SAFEGUARDS-BY-DESIGN PROCESS

    SciTech Connect

    Whitaker, J Michael; McGinnis, Brent; Laughter, Mark D; Morgan, Jim; Bjornard, Trond; Bean, Robert; Durst, Phillip; Hockert, John; DeMuth, Scott; Lockwood, Dunbar

    2010-01-01

    The Safeguards-by-Design (SBD) approach incorporates safeguards into the design and construction of nuclear facilities at the very beginning of the design process. It is a systematic and structured approach for fully integrating international and national safeguards for material control and accountability (MC&A), physical protection, and other proliferation barriers into the design and construction process for nuclear facilities. Implementing SBD is primarily a project management or project coordination challenge. This paper focuses specifically on the design process: the planning, definition, organization, coordination, scheduling, and interaction of the safeguards experts and stakeholders as they participate in the design and construction of a nuclear facility. It delineates the steps in a nuclear facility design and construction project in order to provide the project context within which the safeguards design activities take place; describes the involvement of the safeguards experts in the design process and the nature of their analyses, interactions, and decisions; and describes the documents created and how they are used. This report highlights the project context of safeguards activities and identifies what the safeguards community (nuclear facility operator, designer/builder, state regulator, SSAC, and IAEA) must accomplish in order to implement SBD within the project.

  5. Design Process Improvement for Electric CAR Harness

    NASA Astrophysics Data System (ADS)

    Sawatdee, Thiwarat; Chutima, Parames

    2017-06-01

    In an automobile parts design company, customer satisfaction is one of the most important factors in product design. Therefore, the company focuses its product design process on the various requirements of customers, resulting in a high number of design changes. The objective of this research is to improve the design process of the electric car harness, which affects production scheduling, by using Fault Tree Analysis (FTA) and Failure Mode and Effect Analysis (FMEA) as the main tools. FTA is employed for root cause analysis, and FMEA is used to rank factors by Risk Priority Number (RPN), identifying the factors in the electric car harness that have a high impact on its design. After the implementation, the improvement is significant: the design-change rate is reduced from 0.26% to 0.08%.
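The RPN ranking mentioned above follows the standard FMEA product of severity, occurrence, and detection scores, each conventionally rated 1-10. This is a generic sketch with made-up harness failure modes, not the study's actual data:

```python
def rpn(severity, occurrence, detection):
    """Risk Priority Number: product of three 1-10 FMEA scores."""
    return severity * occurrence * detection

# Hypothetical harness-design failure modes: (name, severity, occurrence, detection)
failure_modes = [
    ("late customer requirement change", 7, 8, 6),
    ("wrong connector footprint",        8, 3, 4),
    ("wire gauge undersized",            9, 2, 3),
]

# Rank failure modes from highest to lowest risk; the top item is addressed first.
ranked = sorted(failure_modes, key=lambda m: rpn(*m[1:]), reverse=True)
top = ranked[0][0]
```

Note that a high-severity mode ("wire gauge undersized", 9) can still rank last: RPN weights detection and occurrence equally with severity, which is why some FMEA variants treat severity as an overriding filter.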

  6. Affective Norms for 4900 Polish Words Reload (ANPW_R): Assessments for Valence, Arousal, Dominance, Origin, Significance, Concreteness, Imageability and, Age of Acquisition.

    PubMed

    Imbir, Kamil K

    2016-01-01

    In studies that combine understanding of emotions and language, there is growing demand for good-quality experimental materials. To meet this expectation, a large set of 4905 Polish words was assessed by 400 participants in order to provide a well-established research method for everyone interested in emotional word processing. The Affective Norms for Polish Words Reloaded (ANPW_R) is designed as an extension to the previously introduced ANPW dataset and provides assessments for eight different affective and psycholinguistic measures: Valence, Arousal, Dominance, Origin, Significance, Concreteness, Imageability, and subjective Age of Acquisition. The ANPW_R is now the largest available dataset of affective words for Polish, including affective scores that have not been measured in any other dataset (concreteness and age-of-acquisition scales). Additionally, the ANPW_R allows for testing hypotheses concerning dual-mind models of emotion and activation (origin and subjective significance scales). Participants in the current study assessed all 4905 words in the list within 1 week, at their own pace in home sessions, using eight different Self-Assessment Manikin (SAM) scales. Each measured dimension was evaluated by 25 women and 25 men. The ANPW_R norms appeared to be reliable in split-half estimation and congruent with previous normative studies in Polish. The quadratic relation between valence and arousal was found to be in line with previous findings. In addition, nine other relations appeared to be better described by quadratic rather than linear functions. The ANPW_R provides well-established research materials for use in psycholinguistic and affective studies in Polish-speaking samples.
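The split-half reliability estimate mentioned above is typically computed by correlating scores from two halves of the rater (or item) set and applying the Spearman-Brown correction for full test length. This is a generic sketch on toy ratings, not the ANPW_R data:

```python
import math

def pearson_r(x, y):
    """Pearson correlation between two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def split_half_reliability(half_a, half_b):
    """Spearman-Brown corrected correlation between two test halves."""
    r = pearson_r(half_a, half_b)
    return 2 * r / (1 + r)

# Toy example: mean valence ratings of the same words from odd- vs
# even-numbered raters (hypothetical values on a 1-9 SAM scale).
odd  = [2.1, 4.5, 6.8, 3.3, 5.0, 7.2]
even = [2.4, 4.3, 6.5, 3.6, 5.2, 7.0]
rel = split_half_reliability(odd, even)
```

The Spearman-Brown step corrects for the fact that each half contains only half the raters; without it, the raw half-to-half correlation would understate the reliability of the full norm.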

  8. An Integrated Course and Design Project in Chemical Process Design.

    ERIC Educational Resources Information Center

    Rockstraw, David A.; And Others

    1997-01-01

    Describes a chemical engineering course curriculum on process design, analysis, and simulation. Includes information regarding the sequencing of engineering design classes and the location of the classes within the degree program at New Mexico State University. Details of course content are provided. (DDR)

  9. Engineering design: A cognitive process approach

    NASA Astrophysics Data System (ADS)

    Strimel, Greg Joseph

    The intent of this dissertation was to identify the cognitive processes used by advanced pre-engineering students to solve complex engineering design problems. Students in technology and engineering education classrooms are often taught to use an ideal engineering design process that has been generated mostly by educators and curriculum developers. However, the review of literature showed that it is unclear as to how advanced pre-engineering students cognitively navigate solving a complex and multifaceted problem from beginning to end. Additionally, it was unclear how a student thinks and acts throughout their design process and how this affects the viability of their solution. Therefore, Research Objective 1 was to identify the fundamental cognitive processes students use to design, construct, and evaluate operational solutions to engineering design problems. Research Objective 2 was to determine identifiers within student cognitive processes for monitoring aptitude to successfully design, construct, and evaluate technological solutions. Lastly, Research Objective 3 was to create a conceptual technological and engineering problem-solving model integrating student cognitive processes for the improved development of problem-solving abilities. The methodology of this study included multiple forms of data collection. The participants were first given a survey to determine their prior experience with engineering and to provide a description of the subjects being studied. The participants were then presented an engineering design challenge to solve individually. While they completed the challenge, the participants verbalized their thoughts using an established "think aloud" method. These verbalizations were captured along with participant observational recordings using point-of-view camera technology. Additionally, the participant design journals, design artifacts, solution effectiveness data, and teacher evaluations were collected for analysis to help achieve the

  10. Effects of Unloading and Reloading on Expressions of Skeletal Muscle Membrane Proteins in Mice

    NASA Astrophysics Data System (ADS)

    Ohno, Y.; Ikuta, A.; Goto, A.; Sugiura, T.; Ohira, Y.; Yoshioka, T.; Goto, K.

    2013-02-01

    Effects of unloading and reloading on the expression levels of tripartite motif-containing 72 (TRIM72) and caveolin-3 (Cav-3) in mouse soleus muscle were investigated. Male C57BL/6J mice (11 weeks old) were randomly assigned to control and hindlimb-suspended groups. Some of the mice in the hindlimb-suspended group were subjected to continuous hindlimb suspension (HS) for 2 weeks, with or without 7 days of ambulation recovery. Following HS, muscle weight and the protein expression levels of TRIM72 and Cav-3 in the soleus were decreased. On the other hand, gradual increases in muscle mass, TRIM72, and Cav-3 were observed after reloading following HS. These results suggest that mechanical loading plays a key role in a regulatory system for the protein expression of TRIM72 and Cav-3.

  11. Real-Time Scheduling in Heterogeneous Systems Considering Cache Reload Time Using Genetic Algorithms

    NASA Astrophysics Data System (ADS)

    Miryani, Mohammad Reza; Naghibzadeh, Mahmoud

    Since optimal assignment of tasks in a multiprocessor system is, in almost all practical cases, an NP-hard problem, several genetic-algorithm-based approaches have been proposed in recent years. Some of these algorithms consider real-time applications with multiple objectives, such as total tardiness and completion time. Here, we propose a suboptimal static scheduler for nonpreemptable tasks in hard real-time heterogeneous multiprocessor systems that accounts for timing constraints and cache reload time. The approach uses a genetic algorithm to minimize total completion time and the number of processors used simultaneously. One important issue that distinguishes this research from previous work is the inclusion of cache reload time. The method is implemented and the results are compared against a similar method.
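The chromosome evaluation such a scheduler needs can be sketched as follows. This is a hedged illustration under assumed data, not the authors' implementation: `makespan` charges a cache reload penalty whenever a processor takes on an additional task, and `fitness` combines completion time with the number of processors used.

```python
def makespan(assignment, exec_time, reload_time):
    """assignment[i] = processor assigned to task i (one GA chromosome).

    exec_time[task][proc] gives the heterogeneous execution time.
    """
    finish = {}  # per-processor accumulated completion time
    for task, proc in enumerate(assignment):
        # A processor running a second (or later) task must reload its cache.
        penalty = reload_time if proc in finish else 0
        finish[proc] = finish.get(proc, 0) + exec_time[task][proc] + penalty
    return max(finish.values())

def fitness(assignment, exec_time, reload_time, alpha=0.7):
    """Weighted objective over completion time and processors used."""
    processors_used = len(set(assignment))
    return (alpha * makespan(assignment, exec_time, reload_time)
            + (1 - alpha) * processors_used)
```

In a full GA, `fitness` would score each chromosome before selection and crossover; `alpha` is an assumed weighting between the two objectives, not a value from the paper.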

  12. Numerical simulations supporting the process design of ring rolling processes

    NASA Astrophysics Data System (ADS)

    Jenkouk, V.; Hirt, G.; Seitz, J.

    2013-05-01

    In conventional Finite Element Analysis (FEA) of radial-axial ring rolling (RAR), the motions of all tools are usually defined prior to simulation, in the preprocessing step. However, the real process has up to 8 degrees of freedom (DOF) that are controlled by industrial control systems according to actual sensor values and preselected control strategies. Since the motion histories are unknown before the experiment and depend on sensor data, conventional FEA cannot represent the process in advance. In order to enable the use of FEA in the process design stage, this approach integrates the industrially applied control algorithms of the real process, including all relevant sensors and actuators, into the FE model of ring rolling. Additionally, the process design of a novel process, axial profiling, in which a profiled roll is used for rolling axially profiled rings, is supported by FEA. Using this approach, suitable control strategies can be tested in a virtual environment before processing.

  13. Molecular Thermodynamics for Chemical Process Design

    ERIC Educational Resources Information Center

    Prausnitz, J. M.

    1976-01-01

    Discusses that aspect of thermodynamics which is particularly important in chemical process design: the calculation of the equilibrium properties of fluid mixtures, especially as required in phase-separation operations. (MLH)

  15. Chemical Process Design: An Integrated Teaching Approach.

    ERIC Educational Resources Information Center

    Debelak, Kenneth A.; Roth, John A.

    1982-01-01

    Reviews a one-semester senior plant design/laboratory course, focusing on course structure, student projects, laboratory assignments, and course evaluation. Includes discussion of laboratory exercises related to process waste water and sludge. (SK)

  17. User-Centered Design (UCD) Process Description

    DTIC Science & Technology

    2014-12-01

    TECHNICAL REPORT 2061, December 2014: User-Centered Design (UCD) Process Description. Michael Cowen, Alan Lemon, Deborah... Cited references include http://www.dodccrp.org/events/15th_iccrts_2010/papers/033.pdf (accessed 11/25/2014) and A. G. Lemon and M. B. Cowen, 2012.

  18. The process road between requirements and design

    SciTech Connect

    Goedicke, M.; Nuseibeh, B.

    1996-12-31

    The software engineering literature contains many examples of methods, tools and techniques that claim to facilitate a variety of requirements engineering and design activities. Guidance on how these activities are related within a coherent software development process is much less apparent. A central problem that makes such guidance difficult to achieve is that requirements engineering addresses problem domains whereas design addresses solution domains. This difficulty is compounded by frequent changes in requirements, which contrast with the need for stable design solutions.

  19. The Engineering Process in Construction & Design

    ERIC Educational Resources Information Center

    Stoner, Melissa A.; Stuby, Kristin T.; Szczepanski, Susan

    2013-01-01

    Recent research suggests that high-impact activities in science and math classes promote positive attitudinal shifts in students. By implementing high-impact activities, such as designing a school and a skate park, mathematical thinking can be linked to the engineering design process. This hands-on approach, when possible, to demonstrate or…

  1. Action potential duration determines sarcoplasmic reticulum Ca2+ reloading in mammalian ventricular myocytes

    PubMed Central

    Bassani, Rosana A; Altamirano, Julio; Puglisi, José L; Bers, Donald M

    2004-01-01

    After sarcoplasmic reticulum (SR) Ca2+ depletion in intact ventricular myocytes, electrical activity promotes SR Ca2+ reloading and recovery of twitch amplitude. In ferret, recovery of twitch and caffeine-induced contracture required fewer twitches than in rabbit or rat. In rat, there was no difference in action potential duration at 90% repolarization (APD90) at steady state (SS) versus at the first post-depletion (PD) twitch. The SS APD90 was similar in ferret and rabbit (but longer than in rat). However, compared to SS, the PD APD90 was lengthened in ferret, but shortened in rabbit. When rabbit myocytes were subjected to AP-clamp patterns during SR Ca2+ reloading (ferret- or rabbit-type APs), reloading was much faster using the ferret AP templates. We conclude that the faster SR Ca2+ refilling in ferret is due to the increased Ca2+ influx during the longer PD AP. The PD versus SS APD90 difference was suppressed by thapsigargin in ferret (indicating Ca2+ dependence). In rabbit, the PD AP shortening depended on the preceding diastolic interval (rather than Ca2+), because rest produced the same AP shortening, and SS APD90 increased as a function of frequency (in contrast to ferret). Transient outward current (Ito) was larger and recovered from inactivation much faster in ferret than in rabbit. Moreover, slow Ito recovery (τ ∼ 3 s) in rabbit was a much larger fraction of Ito. Our data and a computational model (including two Ito components) suggest that in rabbit the slowly recovering Ito is responsible for short post-rest and PD APs, for the unusual frequency dependence of APD90, and ultimately for the slower post-depletion SR Ca2+ reloading. PMID:15243136
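A minimal sketch of how APD90 can be extracted from a sampled action-potential trace. The trace used below is synthetic, not data from the study, and measuring from the peak rather than the upstroke is a simplification.

```python
def apd90(t, v):
    """Time (ms) from the AP peak to 90% repolarization.

    t: sample times in ms; v: membrane potential in mV.
    """
    v_rest, v_peak = v[0], max(v)
    peak_i = v.index(v_peak)
    target = v_peak - 0.9 * (v_peak - v_rest)  # 90% of the way back to rest
    for i in range(peak_i, len(v)):
        if v[i] <= target:
            return t[i] - t[peak_i]
    return None  # trace never reached 90% repolarization
```

Comparing this value at steady state versus the first post-depletion beat is the kind of APD90 contrast the abstract describes.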

  2. Kinetics and Muscle Activity Patterns during Unweighting and Reloading Transition Phases in Running

    PubMed Central

    Sainton, Patrick; Nicol, Caroline; Cabri, Jan; Barthèlemy-Montfort, Joëlle; Chavet, Pascale

    2016-01-01

    Amongst reduced gravity simulators, the lower body positive pressure (LBPP) treadmill is emerging as an innovative tool for both rehabilitation and fundamental research purposes as it allows running while experiencing reduced vertical ground reaction forces. The appropriate use of such a treadmill requires an improved understanding of the associated neuromechanical changes. This study concentrates on the runner's adjustments to LBPP-induced unweighting and reloading during running. Nine healthy males performed two running series of nine minutes at natural speed. Each series comprised three sequences of three minutes at: 100% bodyweight (BW), 60 or 80% BW, and 100% BW. The progressive unweighting and reloading transitions lasted 10 to 15 s. The LBPP-induced unweighting level, vertical ground reaction force and center of mass accelerations were analyzed together with surface electromyographic activity from 6 major lower limb muscles. The analyses of stride-to-stride adjustments during each transition established highly linear relationships between the LBPP-induced progressive changes of BW and most mechanical parameters. However, the impact peak force and the loading rate systematically presented an initial 10% increase with unweighting, which could result from a passive mechanism of leg retraction. Another major insight lies in the distinct neural adjustments found amongst the recorded lower-limb muscles during the pre- and post-contact phases. The preactivation phase was characterized by an overall EMG stability, the braking phase by decreased quadriceps and soleus muscle activities, and the push-off phase by decreased activities of the shank muscles. These neural changes were mirrored during reloading. These neural adjustments can be attributed in part to the lack of visual cues on the foot touchdown. These findings highlight both the rapidity and the complexity of the neuromechanical changes associated with LBPP-induced unweighting and reloading during running.

  3. Agonist-sensitive calcium pool in the pancreatic acinar cell. II. Characterization of reloading

    SciTech Connect

    Muallem, S.; Schoeffield, M.S.; Fimmel, C.J.; Pandol, S.J.

    1988-08-01

    45Ca2+ fluxes and free cytosolic Ca2+ measurements in guinea pig pancreatic acini indicated that after agonist stimulation and the release of Ca2+ from the agonist-sensitive pool at least part of the Ca2+ is extruded from the cell, resulting in 45Ca2+ efflux. In the continued presence of agonist, the pool remains permeable to Ca2+ but partially refills with Ca2+. This reloading is dependent on the concentration of extracellular Ca2+. In the absence of extracellular Ca2+, the pool is completely depleted of Ca2+. However, with increasing concentrations of CaCl2 in the incubation solution (from 0.5 to 2.0 mM) there is increasing repletion of the pool with Ca2+ during agonist stimulation. With termination of agonist stimulation, the Ca2+ permeability of the agonist-sensitive pool is rapidly reduced to that measured in the unstimulated cell. As a result, the Ca2+ incorporated into the pool during the stimulation period is rapidly trapped within the pool and exchanges poorly with medium Ca2+. Subsequently, the pool completely refills with Ca2+. The rate of Ca2+ reloading at the termination of agonist stimulation is slower than the conversion of the pool to the impermeable state. In incubation media containing 1.3 mM CaCl2, the half-time for reloading at the termination of stimulation is 5 min. These observations demonstrate the characteristics of Ca2+ reloading of the agonist-sensitive pool both during stimulation and at the termination of stimulation.
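The reported 5-minute half-time corresponds to simple first-order kinetics. The sketch below, purely illustrative, converts that half-time into the fraction of the agonist-sensitive pool reloaded after a given interval:

```python
import math

def fraction_reloaded(t_min, half_time_min=5.0):
    """Fraction of the Ca2+ pool refilled after t_min minutes,
    assuming first-order kinetics with the given half-time."""
    k = math.log(2) / half_time_min  # first-order rate constant (1/min)
    return 1.0 - math.exp(-k * t_min)
```

With the 5-minute half-time from the abstract, half the pool is refilled at 5 min and three quarters at 10 min.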

  4. Process characterization and Design Space definition.

    PubMed

    Hakemeyer, Christian; McKnight, Nathan; St John, Rick; Meier, Steven; Trexler-Schmidt, Melody; Kelley, Brian; Zettl, Frank; Puskeiler, Robert; Kleinjans, Annika; Lim, Fred; Wurth, Christine

    2016-09-01

    Quality by design (QbD) is a global regulatory initiative with the goal of enhancing pharmaceutical development through the proactive design of pharmaceutical manufacturing processes and controls to consistently deliver the intended performance of the product. The principles of pharmaceutical development relevant to QbD are described in the ICH guidance documents (ICH Q8-11). An integrated set of risk assessments and their related elements developed at Roche/Genentech were designed to provide an overview of product and process knowledge for the production of a recombinant monoclonal antibody (MAb). This chapter describes the tools used for the characterization and validation of the MAb manufacturing process under the QbD paradigm. This comprises risk assessments for the identification of potential Critical Process Parameters (pCPPs), statistically designed experimental studies, as well as studies assessing the linkage of the unit operations. The outcome of the studies is the classification of process parameters according to their criticality and the definition of appropriate acceptable ranges of operation. The process and product knowledge gained in these studies can lead to the approval of a Design Space. Additionally, the information gained in these studies is used to define the 'impact' which the manufacturing process can have on the variability of the CQAs, which in turn is used to define the testing and monitoring strategy. Copyright © 2016 International Alliance for Biological Standardization. Published by Elsevier Ltd. All rights reserved.

  5. HYNOL PROCESS ENGINEERING: PROCESS CONFIGURATION, SITE PLAN, AND EQUIPMENT DESIGN

    EPA Science Inventory

    The report describes the design of the hydropyrolysis reactor system of the Hynol process. (NOTE: A bench scale methanol production facility is being constructed to demonstrate the technical feasibility of producing methanol from biomass using the Hynol process. The plant is bein...

  7. Logical Reloading. What is it and What is a Profit from it?

    NASA Astrophysics Data System (ADS)

    Rylov, Yuri A.

    2014-07-01

    Logical reloading is the replacement of the basic statements of a conception by equivalent statements of the same conception. Logical reloading does not change the conception, but it changes the mathematical formalism and the results of generalizing the conception. In this paper two examples of logical reloading are considered. (1) Generalization of deterministic particle dynamics to stochastic particle dynamics. As a result, a unified formalism for the description of particles of all kinds appears. This formalism allows one to explain quantum dynamics in terms of classical particle dynamics. In particular, one discovers the κ-field responsible for pair production. (2) Generalization of the proper Euclidean geometry to space-time geometries in which free particles move stochastically. As a result, a conception of elementary particle dynamics arises in which one can investigate the arrangement of elementary particles, rather than only systematize them by ascribing quantum numbers. Besides, one succeeds in extending general relativity to non-Riemannian space-time geometries.

  8. Osteocyte-viability-based simulations of trabecular bone loss and recovery in disuse and reloading.

    PubMed

    Wang, Hong; Ji, Baohua; Liu, X Sherry; van Oers, René F M; Guo, X Edward; Huang, Yonggang; Hwang, Keh-Chih

    2014-01-01

    Osteocyte apoptosis is known to trigger targeted bone resorption. In the present study, we developed an osteocyte-viability-based trabecular bone remodeling (OVBR) model. This novel remodeling model, combined with recent advanced simulation methods and analysis techniques, such as the element-by-element 3D finite element method and the ITS technique, was used to quantitatively study the dynamic evolution of bone mass and trabecular microstructure in response to various loading and unloading conditions. Different levels of unloading simulated the disuse condition of bed rest or microgravity in space. The amount of bone loss and microstructural deterioration correlated with the magnitude of unloading. The restoration of bone mass upon the reloading condition was achieved by thickening the remaining trabecular architecture, while the lost trabecular plates and rods could not be recovered by reloading. Compared to previous models, the predictions of bone resorption of the OVBR model are more consistent with physiological values reported from previous experiments. Whereas osteocytes suffer a lack of loading during disuse, they may suffer overloading during the reloading phase, which hampers recovery. The OVBR model is promising for quantitative studies of trabecular bone loss and microstructural deterioration of patients or astronauts during long-term bed rest or space flight and thereafter bone recovery.

  9. Chemical kinetics and oil shale process design

    SciTech Connect

    Burnham, A.K.

    1993-07-01

    Oil shale processes are reviewed with the goal of showing how chemical kinetics influences the design and operation of different processes for different types of oil shale. Reaction kinetics are presented for organic pyrolysis, carbon combustion, carbonate decomposition, and sulfur and nitrogen reactions.
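As a hedged sketch of the kind of kinetics such a review works with, the function below evaluates a first-order Arrhenius rate constant for organic pyrolysis. The pre-exponential factor and activation energy are hypothetical placeholders, not values from the review.

```python
import math

def pyrolysis_rate(temp_k, a_per_s=1e13, ea_kj_per_mol=220.0):
    """First-order Arrhenius rate constant k = A * exp(-Ea / (R*T)).

    temp_k: temperature in kelvin; returns k in 1/s.
    """
    r_gas = 8.314e-3  # gas constant in kJ/(mol*K)
    return a_per_s * math.exp(-ea_kj_per_mol / (r_gas * temp_k))
```

The strong temperature dependence of k is what couples the kinetics to retort design and operating temperature.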

  10. Planar Inlet Design and Analysis Process (PINDAP)

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Gruber, Christopher R.

    2005-01-01

    The Planar Inlet Design and Analysis Process (PINDAP) is a collection of software tools that allow the efficient aerodynamic design and analysis of planar (two-dimensional and axisymmetric) inlets. The aerodynamic analysis is performed using the Wind-US computational fluid dynamics (CFD) program. A major element in PINDAP is a Fortran 90 code named PINDAP that can establish the parametric design of the inlet and efficiently model the geometry and generate the grid for CFD analysis with design changes to those parameters. The use of PINDAP is demonstrated for subsonic, supersonic, and hypersonic inlets.

  11. Interactive graphics, the design process, and education

    SciTech Connect

    Norton, F.J.

    1980-09-01

    The field of design and drafting is changing continuously - its parameters are ever shifting and its applications are increasing. The use of Computer-Aided Design (CAD) and Computer-Aided Manufacturing (CAM) is becoming increasingly common in industry. However, instruction in CAD and CAM has in general not been incorporated into university curricula. This paper addresses the need for increased instruction in interactive graphics at the student level, and particularly in conjunction with the design process used by engineers, designers, and drafters. The development of three-dimensional graphical models using CAD is seen as a vital part of product development. Applications to printed circuit design and numerical control (NC) operations are discussed. Effective educational programs in the use of CAD must relate to designers, users, and managers and may be developed either by industry or academia. Possible approaches to new programs include coursework, projects involving CAD, and special collaborative efforts between industry and academic institutions. 1 figure.

  12. Inhibition of inflammation mediates the protective effect of atorvastatin reload in patients with coronary artery disease undergoing noncardiac emergency surgery.

    PubMed

    Qu, Yang; Wei, Lixin; Zhang, Haiqing

    2014-12-01

    This study aimed to (a) investigate whether atorvastatin reload protects against acute heart failure (AHF) in patients with stable coronary artery disease (CAD) undergoing noncardiac emergency surgery and decreases the incidence of major adverse cardiac events (MACE) during hospitalization and (b) elucidate its possible mechanism of action. In total, 500 patients with stable CAD before noncardiac emergency surgery were randomized either to the atorvastatin reload or to the placebo group. All patients received atorvastatin treatment postoperatively. The primary end point was the incidence of AHF during hospitalization, and the secondary end point was the incidence of MACE during hospitalization. Preoperative and 72 h postoperative changes in high-sensitivity C-reactive protein and interleukin-6 levels were compared between the two groups. AHF during hospitalization occurred in 5.2% of patients in the atorvastatin reload group and 11.2% in the placebo group (P=0.0225). MACE during hospitalization occurred in 2.4% of patients in the atorvastatin reload group and 8.0% in the placebo group (P=0.0088). According to multivariable analysis, atorvastatin reload conferred a 50% reduction in the risk of AHF during hospitalization (odds ratio, 0.50; 95% confidence interval, 0.2-0.8; P=0.005). The median decrease in the high-sensitivity C-reactive protein and interleukin-6 levels was significantly greater in the atorvastatin reload group (P<0.001). Atorvastatin reload may improve the clinical outcome of patients with stable CAD undergoing noncardiac emergency surgery by decreasing the incidence of AHF and MACE during hospitalization. The mechanism of this protective effect may involve inhibition of inflammation.

  13. Conceptual Chemical Process Design for Sustainability. ...

    EPA Pesticide Factsheets

    This chapter examines the sustainable design of chemical processes, with a focus on conceptual design, hierarchical and short-cut methods, and analyses of process sustainability for alternatives. The chapter describes a methodology for incorporating process sustainability analyses throughout the conceptual design. Hierarchical and short-cut decision-making methods will be used to approach sustainability. An example showing a sustainability-based evaluation of chlor-alkali production processes is presented with economic analysis and five pollutants described as emissions. These emissions are analyzed according to their human toxicity potential by ingestion using the Waste Reduction Algorithm and a method based on US Environmental Protection Agency reference doses, with the addition of biodegradation for suitable components. Among the emissions, mercury as an element will not biodegrade, and results show the importance of this pollutant to the potential toxicity results and therefore the sustainability of the process design. The dominance of mercury in determining the long-term toxicity results when energy use is included suggests that all process system evaluations should (re)consider the role of mercury and other non-/slow-degrading pollutants in sustainability analyses. The cycling of nondegrading pollutants through the biosphere suggests the need for a complete analysis based on the economic, environmental, and social aspects of sustainability. Chapter reviews

  14. Design of penicillin fermentation process simulation system

    NASA Astrophysics Data System (ADS)

    Qi, Xiaoyu; Yuan, Zhonghu; Qi, Xiaoxuan; Zhang, Wenqi

    2011-10-01

    Real-time monitoring of batch processes attracts increasing attention. It can ensure safety and provide products with consistent quality. The design of a simulation system for batch process fault diagnosis is therefore of great significance. In this paper, penicillin fermentation, a typical non-linear, dynamic, multi-stage batch production process, is taken as the research object. A visual human-machine interactive simulation software system based on the Windows operating system is developed. The simulation system can provide an effective platform for research on batch process fault diagnosis.

  15. Postseismic Reloading: A Mechanism for Temporal Clustering of Major Earthquakes on Individual Faults

    NASA Astrophysics Data System (ADS)

    Kenner, S. J.; Simons, M.

    2001-12-01

    On a single fault segment, geologic and paleoseismic evidence from locations such as the Basin and Range [Friedrich et al., JGR, submitted] and the Dead Sea Transform [Marco et al., JGR, 1996] indicates that the occurrence of major earthquakes in time is often extremely heterogeneous and may, in fact, exhibit temporal clustering. We consider major earthquake clustering as the occurrence of multiple-event sequences with intra-cluster inter-event times much shorter than the average time between clusters. Many factors may contribute to temporal clustering of major earthquakes. Over multiple-event time scales, time-dependent postseismic stress transfer may play an important role. After major earthquakes, time-varying deformation transients occur. These transients result from diffusion of stress away from zones of stress concentration generated during the coseismic rupture. As a consequence, the coseismic fault is reloaded at a rate that is initially much higher than the background rate derived from far-field plate motions. On a given fault, earthquake recurrence intervals are moderated by various sources of system noise, including stress perturbations due to neighboring earthquakes, crustal heterogeneity, and fault evolution. Depending on the relative timing and magnitude of earthquakes in a sequence, therefore, the postseismic stress available for transfer to the coseismic fault may be greater or less than average. This may lead to a situation in which postseismic stress transfer becomes a significant factor in controlling the time to the next event. To investigate these longer-term postseismic processes, we develop a spring-dashpot-slider model of time-dependent stress transfer in the earth. With this tool, we gain an understanding of how variations in rheology, fault slip-rate, and system noise affect a fault's behavior. In tectonic environments with a weak lower crust/upper mantle, we find that small random variations in the fault failure criteria generate temporally
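The reloading behavior described above can be caricatured with a one-line stress history: background tectonic loading plus a decaying postseismic transient. All parameter values below are hypothetical, chosen only to show the shape of the curve.

```python
import math

def fault_stress(t_yr, background_rate=0.01, transient_amp=0.5, tau_yr=20.0):
    """Stress accumulated on the fault t_yr years after an earthquake:
    steady far-field loading plus an exponentially decaying
    postseismic transient (a Maxwell-like relaxation)."""
    return background_rate * t_yr + transient_amp * (1.0 - math.exp(-t_yr / tau_yr))
```

Immediately after the event the reloading rate is `background_rate + transient_amp / tau_yr`, higher than the far-field rate, and it decays back toward the background over a few relaxation times.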

  16. Data processing boards design for CBM experiment

    NASA Astrophysics Data System (ADS)

    Zabołotny, Wojciech M.; Kasprowicz, Grzegorz

    2014-11-01

    This paper presents a concept of the Data Processing Boards (DPBs) for the Compressed Baryonic Matter (CBM) experiment. It describes the evolution of the concepts, leading from the functional requirements of the control and readout systems of the CBM experiment to the design of a prototype implementation of the DPB boards. Requirements are discussed at both the board level and the crate level. Finally, the paper presents the prototype design prepared for testing and verification of the proposed solutions and for selection of the final implementation.

  17. Automation of the aircraft design process

    NASA Technical Reports Server (NTRS)

    Heldenfels, R. R.

    1974-01-01

    The increasing use of the computer to automate the aerospace product development and engineering process is examined with emphasis on structural analysis and design. Examples of systems of computer programs in aerospace and other industries are reviewed and related to the characteristics of aircraft design in its conceptual, preliminary, and detailed phases. Problems with current procedures are identified, and potential improvements from optimum utilization of integrated disciplinary computer programs by a man/computer team are indicated.

  18. Conceptual design of industrial process displays.

    PubMed

    Pedersen, C R; Lind, M

    1999-11-01

    Today, process displays used in industry are often designed on the basis of piping and instrumentation diagrams without any method of ensuring that the needs of the operators are fulfilled. Therefore, a method for a systematic approach to the design of process displays is needed. This paper discusses aspects of process display design taking into account both the designer's and the operator's points of view. Three aspects are emphasized: the operator tasks, the display content and the display form. The distinction between these three aspects is the basis for proposing an outline for a display design method that matches the industrial practice of modular plant design and satisfies the needs of reusability of display design solutions. The main considerations in display design in the industry are to specify the operator's activities in detail, to extract the information the operators need from the plant design specification and documentation, and finally to present this information. The form of the display is selected from existing standardized display elements such as trend curves, mimic diagrams, ecological interfaces, etc. Further knowledge is required to invent new display elements. That is, knowledge about basic visual means of presenting information and how humans perceive and interpret these means and combinations. This knowledge is required in the systematic selection of graphical items for a given display content. The industrial part of the method is first illustrated in the paper by a simple example from a plant with batch processes. Later the method is applied to develop a supervisory display for a condenser system in a nuclear power plant. The differences between the continuous plant domain of power production and the batch processes from the example are analysed and broad categories of display types are proposed. The problems involved in specification and invention of a supervisory display are analysed and conclusions from these problems are made. It is

  19. Process design for Al backside contacts

    SciTech Connect

    Chalfoun, L.L.; Kimerling, L.C.

    1995-08-01

    It is known that properly alloyed aluminum backside contacts can improve silicon solar cell efficiency. To use this knowledge to fullest advantage, we have studied the gettering process that occurs during contact formation and the microstructure of the contact and backside junction region. With an understanding of the alloying step, optimized fabrication processes can be designed. To study gettering, single crystal silicon wafers were coated with aluminum on both sides and subjected to heat treatments. Results are described.

  20. Dynamic Process Simulation for Analysis and Design.

    ERIC Educational Resources Information Center

    Nuttall, Herbert E., Jr.; Himmelblau, David M.

    A computer program for the simulation of complex continuous process in real-time in an interactive mode is described. The program is user oriented, flexible, and provides both numerical and graphic output. The program has been used in classroom teaching and computer aided design. Typical input and output are illustrated for a sample problem to…

  1. An Exploration of Design Students' Inspiration Process

    ERIC Educational Resources Information Center

    Dazkir, Sibel S.; Mower, Jennifer M.; Reddy-Best, Kelly L.; Pedersen, Elaine L.

    2013-01-01

    Our purpose was to explore how different sources of inspiration influenced two groups of students' inspiration process and their attitudes toward their design projects. Assigned sources of inspiration and instructor's assistance in the search for inspiration varied for two groups of students completing a small culture inspired product design…

  2. Flexible Processing and the Design of Grammar

    ERIC Educational Resources Information Center

    Sag, Ivan A.; Wasow, Thomas

    2015-01-01

    We explore the consequences of letting the incremental and integrative nature of language processing inform the design of competence grammar. What emerges is a view of grammar as a system of local monotonic constraints that provide a direct characterization of the signs (the form-meaning correspondences) of a given language. This…

  4. Molecular thermodynamics for chemical process design.

    PubMed

    Prausnitz, J M

    1979-08-24

    Chemical process design requires quantitative information on the equilibrium properties of a variety of fluid mixtures. Since the experimental effort needed to provide this information is often prohibitive in cost and time, chemical engineers must utilize rational estimation techniques based on limited experimental data. The basis for such techniques is molecular thermodynamics, a synthesis of classical and statistical thermodynamics, molecular physics, and physical chemistry.
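
    As a concrete illustration of such an estimation technique (a hypothetical sketch, not taken from the paper; the vapor pressures and measured point below are invented), a one-parameter Margules activity-coefficient model can be calibrated to a single total-pressure datum and then used with modified Raoult's law to predict vapor-liquid equilibrium at other compositions:

```python
import math

def margules_gamma(A, x1):
    """One-parameter Margules model: ln g1 = A*x2^2, ln g2 = A*x1^2."""
    x2 = 1.0 - x1
    return math.exp(A * x2 ** 2), math.exp(A * x1 ** 2)

def total_pressure(A, x1, p1sat, p2sat):
    """Modified Raoult's law: P = x1*g1*P1sat + x2*g2*P2sat."""
    g1, g2 = margules_gamma(A, x1)
    return x1 * g1 * p1sat + (1.0 - x1) * g2 * p2sat

def fit_A(p_meas, x1, p1sat, p2sat, lo=-3.0, hi=3.0):
    """Back out the single Margules parameter from one P-x datum by
    bisection (total pressure increases monotonically with A for
    0 < x1 < 1)."""
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if total_pressure(mid, x1, p1sat, p2sat) < p_meas:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

    One measured mixture point thus parameterizes the whole composition range, which is exactly the kind of leverage over limited experimental data the abstract describes.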

  6. Interface design in the process industries

    NASA Technical Reports Server (NTRS)

    Beaverstock, M. C.; Stassen, H. G.; Williamson, R. A.

    1977-01-01

    Every operator runs his plant in accord with his own mental model of the process. In this sense, one characteristic of an ideal man-machine interface is that it be in harmony with that model. With this theme in mind, the paper first reviews the functions of the process operator and compares them with human operators involved in control situations previously studied outside the industrial environment (pilots, air traffic controllers, helmsmen, etc.). A brief history of the operator interface in the process industry and the traditional methodology employed in its design is then presented. Finally, a much more fundamental approach utilizing a model definition of the human operator's behavior is presented.

  7. Designing Instruction That Supports Cognitive Learning Processes

    PubMed Central

    Clark, Ruth; Harrelson, Gary L.

    2002-01-01

    Objective: To provide an overview of current cognitive learning processes, including a summary of research that supports the use of specific instructional methods to foster those processes. We have developed examples in athletic training education to help illustrate these methods where appropriate. Data Sources: Sources used to compile this information included knowledge base and oral and didactic presentations. Data Synthesis: Research in educational psychology within the past 15 years has provided many principles for designing instruction that mediates the cognitive processes of learning. These include attention, management of cognitive load, rehearsal in working memory, and retrieval of new knowledge from long-term memory. By organizing instruction in the context of tasks performed by athletic trainers, transfer of learning and learner motivation are enhanced. Conclusions/Recommendations: Scientific evidence supports instructional methods that can be incorporated into lesson design and improve learning by managing cognitive load in working memory, stimulating encoding into long-term memory, and supporting transfer of learning. PMID:12937537

  8. DESIGNING ENVIRONMENTAL, ECONOMIC AND ENERGY EFFICIENT CHEMICAL PROCESSES

    EPA Science Inventory

    The design and improvement of chemical processes can be very challenging. The earlier energy conservation, process economics and environmental aspects are incorporated into the process development, the easier and less expensive it is to alter the process design. Process emissio...

  10. Design of forging process variables under uncertainties

    NASA Astrophysics Data System (ADS)

    Repalle, Jalaja; Grandhi, Ramana V.

    2005-02-01

    Forging is a complex nonlinear process that is vulnerable to various manufacturing anomalies, such as variations in billet geometry, billet/die temperatures, material properties, and workpiece and forging equipment positional errors. A combination of these uncertainties could induce heavy manufacturing losses through premature die failure, final part geometric distortion, and reduced productivity. Identifying, quantifying, and controlling the uncertainties will reduce variability risk in a manufacturing environment, which will minimize the overall production cost. In this article, various uncertainties that affect the forging process are identified, and their cumulative effect on the forging tool life is evaluated. Because the forging process simulation is time-consuming, a response surface model is used to reduce computation time by establishing a relationship between the process performance and the critical process variables. A robust design methodology is developed by incorporating reliability-based optimization techniques to obtain sound forging components. A case study of an automotive-component forging-process design is presented to demonstrate the applicability of the method.
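
    The surrogate-plus-uncertainty-propagation idea can be sketched as follows (a minimal illustration, not the authors' forging model: the quadratic "simulation", the temperature distribution, and the allowable stress are all invented stand-ins):

```python
import numpy as np

rng = np.random.default_rng(0)

def expensive_forging_sim(temp):
    """Stand-in for a slow finite-element forging run: die stress (MPa)
    as a function of billet temperature (deg C)."""
    return 900.0 - 0.5 * temp + 0.0002 * temp ** 2

# 1. Fit a quadratic response surface to a handful of "simulation" runs.
temps = np.linspace(900.0, 1200.0, 7)
stresses = np.array([expensive_forging_sim(t) for t in temps])
surrogate = np.poly1d(np.polyfit(temps, stresses, 2))

# 2. Propagate billet-temperature uncertainty through the cheap surrogate.
temp_samples = rng.normal(1050.0, 25.0, 100_000)   # uncertain billet temp
allowable = 600.0                                  # assumed allowable die stress
p_fail = float(np.mean(surrogate(temp_samples) > allowable))
```

    The surrogate replaces each expensive simulation call inside the Monte Carlo loop; a reliability-based optimizer would then adjust the nominal process variables until `p_fail` meets a target.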

  11. Design of intelligent controllers for exothermal processes

    NASA Astrophysics Data System (ADS)

    Nagarajan, Ramachandran; Yaacob, Sazali

    2001-10-01

    Chemical industries such as resin or soap manufacturing have reaction systems that work with at least two chemicals. Mixing chemicals, even at room temperature, can initiate an exothermic reaction, which produces a sudden increase of heat energy within the mixture. The quantity of heat and the dynamics of heat generation are unknown, unpredictable, and time varying. Heat must be properly controlled to achieve a high-quality product: uncontrolled or poorly controlled heat can yield an unusable product, damage materials and equipment, and even endanger people. Control of heat due to exothermic reaction cannot be achieved using conventional methods such as PID or identification-based control, since all such methods require at least an approximate mathematical model of the exothermic process, and modeling an exothermal process is yet to be properly accomplished. This paper discusses a design methodology for controlling such a process. A pilot plant of a reaction system was constructed and used to design and incorporate the proposed fuzzy-logic-based intelligent controller. Both a conventional and then an adaptive form of fuzzy logic control were used in testing the performance. The test results confirm the effectiveness of the controllers in managing exothermic heat.
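
    A fuzzy controller of this kind needs no process model, only rules mapping the measured temperature error to a coolant action. The sketch below is not the authors' controller; the membership ranges and rule consequents are invented for illustration (a zero-order Sugeno controller with triangular membership functions):

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_coolant(temp_error):
    """Map temperature error (deg C above setpoint) to a coolant valve
    opening in [0, 1] as a weighted average of rule consequents."""
    # rule activation strengths
    cold = tri(temp_error, -20.0, -10.0, 0.0)       # below setpoint
    ok = tri(temp_error, -10.0, 0.0, 10.0)          # near setpoint
    hot = tri(temp_error, 0.0, 10.0, 20.0)          # exotherm onset
    very_hot = 1.0 if temp_error >= 20.0 else tri(temp_error, 10.0, 20.0, 30.0)
    # consequents: valve shut when cold, fully open when very hot
    w = cold + ok + hot + very_hot
    if w == 0.0:
        return 0.0 if temp_error < 0.0 else 1.0
    return (cold * 0.0 + ok * 0.2 + hot * 0.7 + very_hot * 1.0) / w
```

    An adaptive variant, as tested in the paper, would additionally tune the membership ranges or consequents online from the observed response.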

  12. The design of a nanolithographic process

    NASA Astrophysics Data System (ADS)

    Johannes, Matthew Steven

    This research delineates the design of a nanolithographic process for nanometer scale surface patterning. The process involves the combination of serial atomic force microscope (AFM) based nanolithography with the parallel patterning capabilities of soft lithography. The union of these two techniques provides for a unique approach to nanoscale patterning that establishes a research knowledge base and tools for future research and prototyping. To successfully design this process a number of separate research investigations were undertaken. A custom 3-axis AFM with feedback control on three positioning axes of nanometer precision was designed in order to execute nanolithographic research. This AFM system integrates a computer aided design/computer aided manufacturing (CAD/CAM) environment to allow for the direct synthesis of nanostructures and patterns using a virtual design interface. This AFM instrument was leveraged primarily to study anodization nanolithography (ANL), a nanoscale patterning technique used to generate local surface oxide layers on metals and semiconductors. Defining research focused on the automated generation of complex oxide nanoscale patterns as directed by CAD/CAM design as well as the implementation of tip-sample current feedback control during ANL to increase oxide uniformity. Concurrently, research was conducted concerning soft lithography, primarily in microcontact printing (µCP), and pertinent experimental and analytic techniques and procedures were investigated. Due to the masking abilities of the resulting oxide patterns from ANL, the results of AFM based patterning experiments are coupled with micromachining techniques to create higher aspect ratio structures at the nanoscale. These relief structures are used as master pattern molds for polymeric stamp formation to reproduce the original in a parallel fashion using µCP stamp formation and patterning. This new method of master fabrication provides for a useful alternative to

  13. A reload and startup plan for conversion of the NIST research reactor

    SciTech Connect

    D. J. Diamond

    2016-03-31

    The National Institute of Standards and Technology operates a 20 MW research reactor for neutron-based research. The heavy-water moderated and cooled reactor is fueled with high-enriched uranium (HEU) but a program to convert the reactor to low-enriched uranium (LEU) fuel is underway. Among other requirements, a reload and startup test plan must be submitted to the U.S. Nuclear Regulatory Commission (NRC) for their approval. The NRC provides guidance for what should be in the plan to ensure that the licensee has sufficient information to operate the reactor safely. Hence, a plan has been generated consisting of two parts. The reload portion of the plan specifies the fuel management whereby initially only two LEU fuel elements are in the core for eight fuel cycles. This is repeated until a point when the optimum approach is to place four fresh LEU elements into the reactor each cycle. This final transition is repeated and after eight cycles the reactor is completely fueled with LEU. By only adding two LEU fuel elements initially, the plan allows for the consumption of HEU fuel elements that are expected to be in storage at the time of conversion and provides additional qualification of production LEU fuel under actual operating conditions. Because the reload is to take place over many fuel cycles, startup tests will be done at different stages of the conversion. The tests, to be compared with calculations to show that the reactor will operate as planned, are the measurement of critical shim arm position and shim arm and regulating rod reactivity worths. An acceptance criterion for each test is specified based on technical specifications that relate to safe operation. Additional tests are being considered that have less safety significance but may be of interest to bolster the validation of analysis tools.

  14. BWR Reload Strategy Based on Fixing Once-Burnt Fuel Between Cycles

    SciTech Connect

    Maag, Elizebeth M.; Knott, Dave

    2001-12-15

    The feasibility of a reload strategy based on fixing the locations of once-burnt fuel between cycles has been evaluated for the Perry nuclear power plant (Perry). This strategy can reduce refueling shuffle critical path time by 3 days without penalty in fuel cycle economics. The scheme works well for Perry because of the extreme cycle energy requirements and the large feed batch size needed to meet those requirements. Cores requiring less energy and a smaller feed batch size have not been investigated.

  15. Composting process design criteria. II. Detention time

    SciTech Connect

    Haug, R.T.

    1986-09-01

    Attention has always been directed to detention time as a criterion for design and operation of composting systems. Perhaps this is a logical outgrowth of work on liquid phase systems, where detention time is a fundamental parameter of design. Unlike liquid phase systems, however, the interpretation of detention time and the actual values required for design have not been universally accepted in the case of composting. As a case in point, most compost systems incorporate facilities for curing the compost product. However, curing is often considered after the fact or as an add-on with little relationship to the first-stage, high-rate phase, whether reactor (in-vessel), static pile, or windrow. Design criteria for curing and the relationships between the first-stage, high-rate and second-stage, curing phases of a composting system have been unclear. In Part 2 of this paper, the concepts of hydraulic retention time (HRT) and solids residence time (SRT) are applied to the composting process. Definitions and design criteria for each are proposed. Based on these criteria, the first and second stages can be designed and integrated into a complete composting system.
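
    Both retention-time concepts reduce to simple ratios, sketched here with illustrative numbers (not values from the paper):

```python
def hydraulic_retention_time(volume_m3, feed_rate_m3_per_day):
    """HRT: average time the feed material spends in the reactor."""
    return volume_m3 / feed_rate_m3_per_day

def solids_residence_time(solids_mass_kg, solids_wasted_kg_per_day):
    """SRT: average time solids remain in the system; it exceeds the
    HRT whenever stabilized compost is recycled back as an amendment."""
    return solids_mass_kg / solids_wasted_kg_per_day

# e.g. a 500 m3 reactor fed 25 m3/day gives a 20-day HRT, while 200 t of
# solids with 8 t/day wasted gives a 25-day SRT
hrt = hydraulic_retention_time(500.0, 25.0)
srt = solids_residence_time(200_000.0, 8_000.0)
```

    Designing the high-rate and curing stages against separate HRT/SRT targets is what lets the two phases be integrated into one system.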

  16. A survey of the Oyster Creek reload licensing model

    SciTech Connect

    Alammar, M.A.

    1991-01-01

    The Oyster Creek RETRAN licensing model was submitted for approval by the U.S. Nuclear Regulatory Commission in September 1987. This paper discusses the technical issues and concerns that were raised during the review process and how they were resolved. The technical issues are grouped into three major categories: the adequacy of the model benchmark against plant data; uncertainty analysis and model convergence with respect to various critical parameters (code correlations, nodalization, time step, etc.); and model application and usage.

  17. Mimicry of natural material designs and processes

    NASA Astrophysics Data System (ADS)

    Bond, G. M.; Richman, R. H.; McNaughton, W. P.

    1995-06-01

    Biological structural materials, although composed of unremarkable substances synthesized at low temperatures, often exhibit superior mechanical properties. In particular, the quality in which nearly all biologically derived materials excel is toughness. The advantageous mechanical properties are attributable to the hierarchical, composite, structural arrangements common to biological systems. Materials scientists and engineers have increasingly recognized that biological designs or processing approaches applied to man-made materials (biomimesis) may offer improvements in performance over conventional designs and fabrication methods. In this survey, the structures and processing routes of marine shells, avian eggshells, wood, bone, and insect cuticle are briefly reviewed, and biomimesis research inspired by these materials is discussed. In addition, this paper describes and summarizes the applications of biomineralization, self-assembly, and templating with proteins to the fabrication of thin ceramic films and nanostructure devices.

  18. Mimicry of natural material designs and processes

    SciTech Connect

    Bond, G.M.; Richman, R.H.; McNaughton, W.P.

    1995-06-01

    Biological structural materials, although composed of unremarkable substances synthesized at low temperatures, often exhibit superior mechanical properties. In particular, the quality in which nearly all biologically derived materials excel is toughness. The advantageous mechanical properties are attributable to the hierarchical, composite, structural arrangements common to biological systems. Materials scientists and engineers have increasingly recognized that biological designs or processing approaches applied to man-made materials (biomimesis) may offer improvements in performance over conventional designs and fabrication methods. In this survey, the structures and processing routes of marine shells, avian eggshells, wood, bone, and insect cuticle are briefly reviewed, and biomimesis research inspired by these materials is discussed. In addition, this paper describes and summarizes the applications of biomineralization, self-assembly, and templating with proteins to the fabrication of thin ceramic films and nanostructure devices.

  19. The Processes Involved in Designing Software.

    DTIC Science & Technology

    1980-08-01

    body of relevant knowledge. There has been a limited amount of research on the process of design or on problems that are difficult enough to require the...refinement of those subproblems. Our results are therefore potentially limited to similar straightforward problems. In tasks for which the...They first break the problem into its major constituents, thus forming a solution model. During each iteration, subproblems from the previous cycle are

  20. Fuel management and reloading optimization at EdF

    SciTech Connect

    Rosset, F.D.; Barral, J.C.

    1993-01-01

    Technical and economical pressurized water reactor (PWR) performances are strongly influenced by fuel management, e.g., fuel utilization and core design. Because of the large number of standardized French PWR units, this question is of considerable importance for Electricite de France (EdF). At present, EdF operates two standardized types of PWR: thirty-four 900-MW and twenty 1300-MW PWRs. Economic optimization will lead to global management of the nuclear power plants in the coming years through three main types of fuel management: four-batch 3.7% UO2 management and plutonium recycling management for the 900-MW PWRs and extended cycle management for the 1300-MW PWRs. The best optimization is made on each reactor by computing a loading pattern that flattens the power map to ensure a certain flexibility of operation (early shutdown, stretch-out).

  1. Thinking and the Design Process. DIUL-RR-8414.

    ERIC Educational Resources Information Center

    Moulin, Bernard

    Designed to focus attention on the design process in such computer science activities as information systems design, database design, and expert systems design, this paper examines three main phases of the design process: understanding the context of the problem, identifying the problem, and finding a solution. The processes that these phases…

  2. Chip Design Process Optimization Based on Design Quality Assessment

    NASA Astrophysics Data System (ADS)

    Häusler, Stefan; Blaschke, Jana; Sebeke, Christian; Rosenstiel, Wolfgang; Hahn, Axel

    2010-06-01

    Nowadays, the managing of product development projects is increasingly challenging. Especially the IC design of ASICs with both analog and digital components (mixed-signal design) is becoming more and more complex, while the time-to-market window narrows at the same time. Still, high quality standards must be fulfilled. Projects and their status are becoming less transparent due to this complexity. This makes the planning and execution of projects rather difficult. Therefore, there is a need for efficient project control. A main challenge is the objective evaluation of the current development status. Are all requirements successfully verified? Are all intermediate goals achieved? Companies often develop special solutions that are not reusable in other projects. This makes the quality measurement process itself less efficient and produces too much overhead. The method proposed in this paper is a contribution to solve these issues. It is applied at a German design house for analog mixed-signal IC design. This paper presents the results of a case study and introduces an optimized project scheduling on the basis of quality assessment results.

  3. ABCMdb reloaded: updates on mutations in ATP binding cassette proteins.

    PubMed

    Tordai, Hedvig; Jakab, Kristóf; Gyimesi, Gergely; András, Kinga; Brózik, Anna; Sarkadi, Balázs; Hegedus, Tamás

    2017-01-01

    ABC (ATP-Binding Cassette) proteins with altered function are responsible for numerous human diseases. To aid the selection of positions and amino acids for ABC structure/function studies we have generated a database, ABCMdb (Gyimesi et al., ABCMdb: a database for the comparative analysis of protein mutations in ABC transporters, and a potential framework for a general application. Hum Mutat 2012; 33:1547-1556.), with interactive tools. The database has been populated with mentions of mutations extracted from full text papers, alignments and structural models. In the new version of the database we aimed to collect the effects of mutations from databases including ClinVar. Because of the limited amount of available data, even in the case of the widely studied disease-causing ABC proteins, we also included the possible effects of mutations based on SNAP2 and PROVEAN predictions. To aid the interpretation of variations in non-coding regions, the database was supplemented with related DNA level information. Our results emphasize the importance of in silico predictions because of the sparse information available on variants and suggest that mutations at analogous positions in homologous ABC proteins have a strong predictive power for the effects of mutations. Our improved ABCMdb advances the design of both experimental studies and meta-analyses in order to understand drug interactions of ABC proteins and the effects of mutations on functional expression.

  4. A century of ice retreat on Kilimanjaro: the mapping reloaded

    NASA Astrophysics Data System (ADS)

    Cullen, N. J.; Sirguey, P.; Mölg, T.; Kaser, G.; Winkler, M.; Fitzsimons, S. J.

    2013-03-01

    A new and consistent time series of glacier retreat on Kilimanjaro over the last century has been established by re-interpreting two historical maps and processing nine satellite images, which removes uncertainty about the location and extent of past and present ice bodies. Three-dimensional visualization techniques were used in conjunction with aerial and ground-based photography to facilitate the interpretation of ice boundaries over eight epochs between 1912 and 2011. The glaciers have retreated from their former extent of 11.40 km2 in 1912 to 1.76 km2 in 2011, which represents a total loss of about 85% of the ice cover over the last 100 yr. The total loss of ice cover is in broad agreement with previous estimates, but to further characterize the spatial and temporal variability of glacier retreat a cluster analysis using topographical information (elevation, slope and aspect) was performed to segment the ice cover as observed in 1912, which resulted in three glacier zones being identified. Linear extrapolation of the retreat in each of the three identified glacier assemblages implies the ice cover on the western slopes of Kilimanjaro will be gone before 2020, while the remaining ice bodies on the plateau and southern slopes will most likely disappear by 2040. It is highly unlikely that any body of ice will be present on Kilimanjaro after 2060 if present-day climatological conditions are maintained. Importantly, the geo-statistical approach developed in this study provides us with an additional tool to characterize the physical processes governing glacier retreat on Kilimanjaro. It remains clear that, to use glacier response to unravel past climatic conditions on Kilimanjaro, the transition from growth to decay of the plateau glaciers must be further resolved, in particular the mechanisms responsible for vertical cliff development.
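
    The linear-extrapolation step can be reproduced from the two endpoint areas quoted in the abstract (this treats the whole ice cover as one body for illustration; the paper itself extrapolates the three glacier zones separately, which yields the 2020-2060 range):

```python
def ice_free_year(t0, area0, t1, area1):
    """Extrapolate a straight line through two (year, area) epochs to
    the year the area reaches zero."""
    rate = (area1 - area0) / (t1 - t0)   # km^2 per year (negative here)
    return t1 - area1 / rate

# whole-mountain endpoints from the abstract: 11.40 km^2 (1912) -> 1.76 km^2 (2011)
year_zero = ice_free_year(1912.0, 11.40, 2011.0, 1.76)
```

    With these endpoints the whole cover extrapolates to zero around 2029, inside the per-zone range reported above.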

  5. A century of ice retreat on Kilimanjaro: the mapping reloaded

    NASA Astrophysics Data System (ADS)

    Cullen, N. J.; Sirguey, P.; Mölg, T.; Kaser, G.; Winkler, M.; Fitzsimons, S. J.

    2012-10-01

    A new and consistent time series of glacier retreat on Kilimanjaro over the last century has been established by re-interpreting two historical maps and processing nine satellite images, which removes uncertainty about the location and extent of past and present ice bodies. Three-dimensional visualization techniques were used in conjunction with aerial and ground-based photography to facilitate the interpretation of ice boundaries over eight epochs between 1912 and 2011. The glaciers have retreated from their former extent of 11.40 km2 in 1912 to 1.76 km2 in 2011, which represents a total loss of about 85% of the ice cover over the last 100 yr. The total loss of ice cover is in broad agreement with previous estimates, but to further characterize the spatial and temporal variability of glacier retreat a cluster analysis using topographical information (elevation, slope and aspect) was performed to segment the ice cover as observed in 1912, which resulted in three glacier zones being identified. Linear extrapolation of the retreat in each of the three identified glacier assemblages implies the ice cover on the western slopes of Kilimanjaro will be gone before 2020, while the remaining ice bodies on the plateau and southern slopes will most likely disappear by 2040. It is highly unlikely that any body of ice will be present on Kilimanjaro after 2060 if present-day climatological conditions are maintained. Importantly, the geo-statistical approach developed in this study provides us with an additional tool to characterize the physical processes governing glacier retreat on Kilimanjaro. It remains clear that, to use glacier response to unravel past climatic conditions on Kilimanjaro, the transition from growth to decay of the plateau glaciers must be further resolved, in particular the mechanisms responsible for vertical cliff development.

  6. Proxima Centauri reloaded: Unravelling the stellar noise in radial velocities

    NASA Astrophysics Data System (ADS)

    Damasso, M.; Del Sordo, F.

    2017-03-01

    Context. The detection and characterisation of Earth-like planets with Doppler signals of the order of 1 m s-1 currently represent one of the greatest challenges for extrasolar-planet hunters. As results for such findings are often controversial, it is desirable to provide independent confirmations of the discoveries. Testing different models for the suppression of non-Keplerian stellar signals usually plaguing radial velocity data is essential to ensuring findings are robust and reproducible. Aims: Using an alternative treatment of the stellar noise to that discussed in the discovery paper, we re-analyse the radial velocity dataset that led to the detection of a candidate terrestrial planet orbiting the star Proxima Centauri. We aim to confirm the existence of this outstanding planet, and test the existence of a second planetary signal. Methods: Our technique jointly modelled Keplerian signals and residual correlated signals in radial velocities using Gaussian processes. We analysed only radial velocity measurements without including other ancillary data in the fitting procedure. In a second step, we have compared our outputs with results coming from photometry, to provide a consistent physical interpretation. Our analysis was performed in a Bayesian framework to quantify the robustness of our findings. Results: We show that the correlated noise can be successfully modelled as a Gaussian process regression, and contains a periodic term modulated on the stellar rotation period and characterised by an evolutionary timescale of the order of one year. Both findings appear to be robust when compared with results obtained from archival photometry, thus providing a reliable description of the noise properties. We confirm the existence of a coherent signal described by a Keplerian orbit equation that can be attributed to the planet Proxima b, and provide an independent estimate of the planetary parameters.
Our Bayesian analysis dismisses the existence of a second planetary
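
    The joint modelling step described above lends itself to a compact numerical sketch: radial-velocity residuals fitted with a Gaussian-process regression whose quasi-periodic kernel combines a periodic term at the stellar rotation period with a squared-exponential decay on an evolutionary timescale of about one year. All times, amplitudes, and hyperparameter values below are invented for illustration; this is not the paper's actual pipeline.

```python
import numpy as np

def quasi_periodic_kernel(t1, t2, amp, P, l_evo, w, jitter=0.0):
    """Quasi-periodic covariance: a periodic term at rotation period P,
    damped on an evolutionary timescale l_evo (all values illustrative)."""
    dt = t1[:, None] - t2[None, :]
    k = amp**2 * np.exp(-dt**2 / (2 * l_evo**2)
                        - 2 * np.sin(np.pi * dt / P)**2 / w**2)
    return k + jitter**2 * np.eye(len(t1)) if t1 is t2 else k

def gp_mean(t_obs, y_obs, t_new, amp, P, l_evo, w, jitter):
    """Conditional (posterior) mean of the Gaussian process at t_new."""
    K = quasi_periodic_kernel(t_obs, t_obs, amp, P, l_evo, w, jitter)
    Ks = quasi_periodic_kernel(t_new, t_obs, amp, P, l_evo, w)
    return Ks @ np.linalg.solve(K, y_obs)

# toy residual RVs: activity signal at P = 83 d decaying over ~1 yr, plus noise
rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0.0, 400.0, 60))
y = 2.0 * np.exp(-t / 365.0) * np.sin(2 * np.pi * t / 83.0) \
    + 0.1 * rng.normal(size=60)
mu = gp_mean(t, y, t, amp=2.0, P=83.0, l_evo=365.0, w=1.0, jitter=0.1)
```

    Subtracting a Keplerian model before the GP fit, and sampling the kernel hyperparameters in a Bayesian framework, would mirror the approach summarised in the abstract.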

  7. Computer-aided software development process design

    NASA Technical Reports Server (NTRS)

    Lin, Chi Y.; Levary, Reuven R.

    1989-01-01

    The authors describe an intelligent tool designed to aid managers of software development projects in planning, managing, and controlling the development process of medium- to large-scale software projects. Its purpose is to reduce uncertainties in the budget, personnel, and schedule planning of software development projects. It is based on a dynamic model for the software development and maintenance life-cycle process. This dynamic process is composed of a number of time-varying, interacting developmental phases, each characterized by its intended functions and requirements. System dynamics is used as a modeling methodology. The resulting Software LIfe-Cycle Simulator (SLICS) and the hybrid expert simulation system of which it is a subsystem are described.
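
    System-dynamics models of this kind reduce to coupled stock-and-flow equations integrated over time. A minimal sketch in that spirit (not the SLICS model itself; all stocks, rates, and parameters are invented) shows the characteristic schedule stretch caused by undiscovered rework:

```python
def simulate_phase(total_tasks=1000.0, staff=10.0, productivity=1.0,
                   error_fraction=0.15, discovery_rate=0.1, dt=1.0,
                   horizon=400):
    """Stock-and-flow sketch of one development phase: a fraction of
    completed work is flawed, accumulates in an 'undiscovered rework'
    stock, and is later re-queued, stretching the schedule."""
    remaining, done, rework = total_tasks, 0.0, 0.0
    steps = 0
    while steps < horizon and (remaining > 1e-6 or rework > 1e-6):
        completed = min(staff * productivity * dt, remaining)
        flawed = error_fraction * completed          # flows into rework
        discovered = discovery_rate * rework * dt    # re-queued work
        remaining += discovered - completed
        done += completed - flawed
        rework += flawed - discovered
        steps += 1
    return done, rework, steps

done, rework, steps = simulate_phase()
# with no rework loop the phase would need 100 steps; rework stretches it
```

    Note that the three stocks conserve total work at every step, so the model's only free behaviour is how long the rework tail takes to drain.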

  9. Moral judgment reloaded: a moral dilemma validation study.

    PubMed

    Christensen, Julia F; Flexas, Albert; Calabrese, Margareta; Gut, Nadine K; Gomila, Antoni

    2014-01-01

    We propose a revised set of moral dilemmas for studies on moral judgment. We selected a total of 46 moral dilemmas available in the literature and fine-tuned them in terms of four conceptual factors (Personal Force, Benefit Recipient, Evitability, and Intention) and methodological aspects of the dilemma formulation (word count, expression style, question formats) that have been shown to influence moral judgment. Second, we obtained normative codings of arousal and valence for each dilemma showing that emotional arousal in response to moral dilemmas depends crucially on the factors Personal Force, Benefit Recipient, and Intentionality. Third, we validated the dilemma set confirming that people's moral judgment is sensitive to all four conceptual factors, and to their interactions. Results are discussed in the context of this field of research, also outlining the relevance of our RT effects for the Dual Process account of moral judgment. Finally, we suggest tentative theoretical avenues for future testing, particularly stressing the importance of the factor Intentionality in moral judgment. Additionally, due to the importance of cross-cultural studies in the quest for universals in human moral cognition, we provide the new set of dilemmas in six languages (English, French, German, Spanish, Catalan, and Danish). The norming values provided here refer to the Spanish dilemma set.

  10. Wiener filter reloaded: fast signal reconstruction without preconditioning

    NASA Astrophysics Data System (ADS)

    Kodi Ramanah, Doogesh; Lavaux, Guilhem; Wandelt, Benjamin D.

    2017-06-01

    We present a high-performance solution to the Wiener filtering problem via a formulation that is dual to the recently developed messenger technique. This new dual messenger algorithm, like its predecessor, efficiently calculates the Wiener filter solution of large and complex data sets without preconditioning and can account for inhomogeneous noise distributions and arbitrary mask geometries. We demonstrate the capabilities of this scheme in signal reconstruction by applying it on a simulated cosmic microwave background temperature data set. The performance of this new method is compared to that of the standard messenger algorithm and the preconditioned conjugate gradient (PCG) approach, using a series of well-known convergence diagnostics and their processing times, for the particular problem under consideration. This variant of the messenger algorithm matches the performance of the PCG method in terms of the effectiveness of reconstruction of the input angular power spectrum and converges smoothly to the final solution. The dual messenger algorithm outperforms the standard messenger and PCG methods in terms of execution time, running to completion around two times faster than the standard messenger method and three to four times faster than the PCG approach for the specific problem considered.
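
    The messenger idea is easy to demonstrate in one dimension. The sketch below is a toy re-implementation of the standard messenger scheme (Elsner & Wandelt style, not the authors' dual variant): a messenger field with uniform covariance tau = min(N) is introduced so that every iteration only inverts operators that are diagonal in either pixel or Fourier space, which is why no preconditioner is needed. The toy spectrum and noise levels are invented.

```python
import numpy as np

def messenger_wiener(d, S_k, N_diag, n_iter=300):
    """Wiener filter s = S (S + N)^{-1} d for a 1-D periodic signal,
    with S diagonal in Fourier space and N diagonal in pixel space."""
    tau = N_diag.min()                # uniform messenger covariance T = tau*I
    Nbar = N_diag - tau               # leftover inhomogeneous noise
    s = np.zeros_like(d)
    for _ in range(n_iter):
        # pixel-space step: messenger field t mixes data and current signal
        with np.errstate(divide='ignore', invalid='ignore'):
            t = (d / Nbar + s / tau) / (1.0 / Nbar + 1.0 / tau)
        t = np.where(Nbar == 0.0, d, t)   # limit Nbar -> 0: t equals the data
        # Fourier-space step: filter the messenger field against the prior S
        s = np.fft.ifft(np.fft.fft(t) * S_k / (S_k + tau)).real
    return s

# toy setup: red signal spectrum, inhomogeneous uncorrelated noise
n = 64
rng = np.random.default_rng(0)
k = np.abs(np.fft.fftfreq(n, d=1.0 / n))
S_k = 100.0 / (1.0 + k)**2
N_diag = rng.uniform(0.5, 2.0, n)
d = 2.0 * rng.normal(size=n)
s_m = messenger_wiener(d, S_k, N_diag)

# dense reference solution for the same problem
U = np.fft.fft(np.eye(n)) / np.sqrt(n)            # unitary DFT
S_pix = np.real(U.conj().T @ np.diag(S_k) @ U)    # prior covariance, pixel basis
s_direct = S_pix @ np.linalg.solve(S_pix + np.diag(N_diag), d)
```

    The fixed point of the two-step iteration is exactly the Wiener solution, which the dense reference computation confirms on this toy problem.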

  11. Moral judgment reloaded: a moral dilemma validation study

    PubMed Central

    Christensen, Julia F.; Flexas, Albert; Calabrese, Margareta; Gut, Nadine K.; Gomila, Antoni

    2014-01-01

    We propose a revised set of moral dilemmas for studies on moral judgment. We selected a total of 46 moral dilemmas available in the literature and fine-tuned them in terms of four conceptual factors (Personal Force, Benefit Recipient, Evitability, and Intention) and methodological aspects of the dilemma formulation (word count, expression style, question formats) that have been shown to influence moral judgment. Second, we obtained normative codings of arousal and valence for each dilemma showing that emotional arousal in response to moral dilemmas depends crucially on the factors Personal Force, Benefit Recipient, and Intentionality. Third, we validated the dilemma set confirming that people's moral judgment is sensitive to all four conceptual factors, and to their interactions. Results are discussed in the context of this field of research, also outlining the relevance of our RT effects for the Dual Process account of moral judgment. Finally, we suggest tentative theoretical avenues for future testing, particularly stressing the importance of the factor Intentionality in moral judgment. Additionally, due to the importance of cross-cultural studies in the quest for universals in human moral cognition, we provide the new set of dilemmas in six languages (English, French, German, Spanish, Catalan, and Danish). The norming values provided here refer to the Spanish dilemma set. PMID:25071621

  12. SETI reloaded: Next generation radio telescopes, transients and cognitive computing

    NASA Astrophysics Data System (ADS)

    Garrett, Michael A.

    2015-08-01

    The Search for Extra-terrestrial Intelligence (SETI) using radio telescopes is an area of research that is now more than 50 years old. Thus far, both targeted and wide-area surveys have yet to detect artificial signals from intelligent civilisations. In this paper, I argue that the incidence of co-existing intelligent and communicating civilisations is probably small in the Milky Way. While this makes successful SETI searches a very difficult pursuit indeed, the huge impact of even a single detection requires us to continue the search. A substantial increase in the overall performance of radio telescopes (and in particular future wide-field instruments such as the Square Kilometre Array - SKA), provide renewed optimism in the field. Evidence for this is already to be seen in the success of SETI researchers in acquiring observations on some of the world's most sensitive radio telescope facilities via open, peer-reviewed processes. The increasing interest in the dynamic radio sky, and our ability to detect new and rapid transient phenomena such as Fast Radio Bursts (FRB) is also greatly encouraging. While the nature of FRBs is not yet fully understood, I argue they are unlikely to be the signature of distant extra-terrestrial civilisations. As astronomers face a data avalanche on all sides, advances made in related areas such as advanced Big Data analytics, and cognitive computing are crucial to enable serendipitous discoveries to be made. In any case, as the era of the SKA fast approaches, the prospects of a SETI detection have never been better.

  13. Forging process design for risk reduction

    NASA Astrophysics Data System (ADS)

    Mao, Yongning

    In this dissertation, forging process design has been investigated with the primary concern on risk reduction. Different forged components have been studied, especially those that could cause catastrophic loss if failure occurs. As an effective modeling methodology, finite element analysis is applied extensively in this work. Three examples, titanium compressor disk, superalloy turbine disk, and titanium hip prosthesis, have been discussed to demonstrate this approach. Discrete defects such as hard alpha anomalies are known to cause disastrous failure if they are present in these stress-critical components. In this research, hard-alpha inclusion movement during forging of titanium compressor disk is studied by finite element analysis. By combining the results from Finite Element Method (FEM), regression modeling and Monte Carlo simulation, it is shown that changing the forging path is able to mitigate the failure risk of the components during service. The second example concerns a turbine disk made of superalloy IN 718. The effect of forging on microstructure is the main consideration in this study. Microstructure defines the as-forged disk properties. Considering specific forging conditions, preform has its own effect on the microstructure. Through a sensitivity study it is found that forging temperature and speed have significant influence on the microstructure. In order to choose the processing parameters to optimize the microstructure, the dependence of microstructure on die speed and temperature is thoroughly studied using design of numerical experiments. For various desired goals, optimal solutions are determined. The narrow processing window of titanium alloy makes isothermal forging a preferred way to produce forged parts without forging defects. However, the cost of isothermal forging (dies at the same temperature as the workpiece) limits its wide application. 
In this research, it has been demonstrated that with proper process design, the die
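
    The FEM-regression-Monte-Carlo chain described above can be sketched compactly. Here a made-up quadratic regression surrogate stands in for the FEM results, and Monte Carlo sampling over inclusion position and local material strength estimates the failure risk for two hypothetical forging paths (all numbers are illustrative, not the dissertation's data):

```python
import numpy as np

rng = np.random.default_rng(0)

def peak_stress(r, path_factor):
    """Hypothetical regression surrogate fitted to FEM output: peak local
    stress (MPa) at an inclusion versus its normalized radial position r."""
    return 400.0 + 350.0 * r**2 * path_factor

def failure_probability(path_factor, strength_mean=700.0, strength_sd=40.0,
                        n=100_000):
    """Monte Carlo over inclusion position and local material strength."""
    r = rng.uniform(0.0, 1.0, n)
    strength = rng.normal(strength_mean, strength_sd, n)
    return float(np.mean(peak_stress(r, path_factor) > strength))

p_baseline = failure_probability(1.0)  # original forging path
p_revised = failure_probability(0.7)   # path redesigned to cut stress amplification
```

    Comparing the two estimates is the essence of the "changing the forging path mitigates failure risk" conclusion; the real study replaces the surrogate with fitted regression models over FEM runs.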

  14. Reliability Methods for Shield Design Process

    NASA Technical Reports Server (NTRS)

    Tripathi, R. K.; Wilson, J. W.

    2002-01-01

    Providing protection against the hazards of space radiation is a major challenge to the exploration and development of space. The great cost of added radiation shielding is a potential limiting factor in deep space operations. In this enabling technology, we have developed methods for optimized shield design over multi-segmented missions involving multiple work and living areas in the transport and duty phase of space missions. The total shield mass over all pieces of equipment and habitats is optimized subject to career dose and dose rate constraints. An important component of this technology is the estimation of two most commonly identified uncertainties in radiation shield design, the shielding properties of materials used and the understanding of the biological response of the astronaut to the radiation leaking through the materials into the living space. The largest uncertainty, of course, is in the biological response to especially high charge and energy (HZE) ions of the galactic cosmic rays. These uncertainties are blended with the optimization design procedure to formulate reliability-based methods for shield design processes. The details of the methods will be discussed.
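
    The optimization described here (minimize total shield mass over mission segments subject to a career dose constraint) can be sketched with a simple exponential attenuation model. The segment data, attenuation coefficient, and dose limit below are all assumed values for illustration, not NASA's:

```python
import numpy as np
from scipy.optimize import minimize

# hypothetical mission segments: unshielded dose rate (cSv/day),
# duration (days), shielded wall area (m^2) -- not real mission numbers
segments = np.array([[0.10, 180.0, 60.0],   # transit habitat
                     [0.06, 300.0, 90.0],   # surface habitat
                     [0.15,  20.0, 10.0]])  # work module
mu = 0.045            # assumed effective attenuation per kg/m^2 of shield
career_limit = 20.0   # assumed career dose constraint, cSv

def total_mass(x):    # x[i]: shield areal density of segment i, kg/m^2
    return float(np.sum(segments[:, 2] * x))

def total_dose(x):
    rate, days, _ = segments.T
    return float(np.sum(rate * days * np.exp(-mu * x)))

res = minimize(total_mass, x0=np.full(3, 50.0), method='SLSQP',
               bounds=[(0.0, None)] * 3,
               constraints=[{'type': 'ineq',
                             'fun': lambda x: career_limit - total_dose(x)}])
x_opt = res.x
```

    Because the dose is a convex function of the areal densities and the mass is linear, the KKT conditions give each segment's optimal thickness in closed form, which is a useful cross-check on the numerical result.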

  15. Saving Material with Systematic Process Designs

    NASA Astrophysics Data System (ADS)

    Kerausch, M.

    2011-08-01

    Global competition is forcing the stamping industry to further increase quality, to shorten time-to-market and to reduce total cost. Continuous balancing between these classical time-cost-quality targets throughout the product development cycle is required to ensure future economical success. In today's industrial practice, die layout standards are typically assumed to implicitly ensure the balancing of company specific time-cost-quality targets. Although die layout standards are a very successful approach, there are two methodical disadvantages. First, the capabilities for tool design have to be continuously adapted to technological innovations; e.g. to take advantage of the full forming capability of new materials. Secondly, the great variety of die design aspects have to be reduced to a generic rule or guideline; e.g. binder shape, draw-in conditions or the use of drawbeads. Therefore, it is important to not overlook cost or quality opportunities when applying die design standards. This paper describes a systematic workflow with focus on minimizing material consumption. The starting point of the investigation is a full process plan for a typical structural part. All requirements are definedaccording to a predefined set of die design standards with industrial relevance are fulfilled. In a first step binder and addendum geometry is systematically checked for material saving potentials. In a second step, blank shape and draw-in are adjusted to meet thinning, wrinkling and springback targets for a minimum blank solution. Finally the identified die layout is validated with respect to production robustness versus splits, wrinkles and springback. For all three steps the applied methodology is based on finite element simulation combined with a stochastical variation of input variables. With the proposed workflow a well-balanced (time-cost-quality) production process assuring minimal material consumption can be achieved.

  16. Flexible processing and the design of grammar.

    PubMed

    Sag, Ivan A; Wasow, Thomas

    2015-02-01

    We explore the consequences of letting the incremental and integrative nature of language processing inform the design of competence grammar. What emerges is a view of grammar as a system of local monotonic constraints that provide a direct characterization of the signs (the form-meaning correspondences) of a given language. This "sign-based" conception of grammar has provided precise solutions to the key problems long thought to motivate movement-based analyses, has supported three decades of computational research developing large-scale grammar implementations, and is now beginning to play a role in computational psycholinguistics research that explores the use of underspecification in the incremental computation of partial meanings.

  17. Innovative machine designs for radiation processing

    NASA Astrophysics Data System (ADS)

    Vroom, David

    2007-12-01

    In the 1990s Raychem Corporation established a program to investigate the commercialization of several promising applications involving the combined use of its core competencies in materials science, radiation chemistry and e-beam radiation technology. The applications investigated included those that would extend Raychem's well known heat recoverable polymer and wire and cable product lines as well as new potential applications such as remediation of contaminated aqueous streams. A central part of the program was the development of new accelerator technology designed to improve quality, lower processing costs and efficiently process conformable materials such as liquids. A major emphasis with this new irradiation technology was to look at the accelerator and product handling systems as one integrated system, not as two complementary systems.

  18. Designer cell signal processing circuits for biotechnology.

    PubMed

    Bradley, Robert W; Wang, Baojun

    2015-12-25

    Microorganisms are able to respond effectively to diverse signals from their environment and internal metabolism owing to their inherent sophisticated information processing capacity. A central aim of synthetic biology is to control and reprogramme the signal processing pathways within living cells so as to realise repurposed, beneficial applications ranging from disease diagnosis and environmental sensing to chemical bioproduction. To date most examples of synthetic biological signal processing have been built based on digital information flow, though analogue computing is being developed to cope with more complex operations and larger sets of variables. Great progress has been made in expanding the categories of characterised biological components that can be used for cellular signal manipulation, thereby allowing synthetic biologists to more rationally programme increasingly complex behaviours into living cells. Here we present a current overview of the components and strategies that exist for designer cell signal processing and decision making, discuss how these have been implemented in prototype systems for therapeutic, environmental, and industrial biotechnological applications, and examine emerging challenges in this promising field.
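
    Digital information flow in such circuits is typically built from transcriptional logic gates. A minimal ODE sketch of a NOT gate (inverter) is given below; the rate constants and Hill parameters are invented for illustration, not characterised part values:

```python
def simulate_not_gate(input_level, hours=20.0, dt=0.01):
    """Euler integration of a toy transcriptional inverter: the input drives
    production of a repressor, which shuts off the output gene via a Hill
    function. All rate constants are illustrative."""
    k_rep, k_out, gamma = 2.0, 1.0, 0.5   # production and dilution rates (1/h)
    K, n_hill = 0.5, 2.0                  # Hill constant and coefficient
    rep = out = 0.0
    for _ in range(int(hours / dt)):
        d_rep = k_rep * input_level - gamma * rep
        d_out = k_out * K**n_hill / (K**n_hill + rep**n_hill) - gamma * out
        rep, out = rep + d_rep * dt, out + d_out * dt
    return out

out_low = simulate_not_gate(0.0)    # input absent  -> output high (ON)
out_high = simulate_not_gate(1.0)   # input present -> output repressed (OFF)
```

    Chaining such gates (and, for analogue computing, operating them in their graded intermediate regime) is the design space the review surveys.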

  19. Designer cell signal processing circuits for biotechnology

    PubMed Central

    Bradley, Robert W.; Wang, Baojun

    2015-01-01

    Microorganisms are able to respond effectively to diverse signals from their environment and internal metabolism owing to their inherent sophisticated information processing capacity. A central aim of synthetic biology is to control and reprogramme the signal processing pathways within living cells so as to realise repurposed, beneficial applications ranging from disease diagnosis and environmental sensing to chemical bioproduction. To date most examples of synthetic biological signal processing have been built based on digital information flow, though analogue computing is being developed to cope with more complex operations and larger sets of variables. Great progress has been made in expanding the categories of characterised biological components that can be used for cellular signal manipulation, thereby allowing synthetic biologists to more rationally programme increasingly complex behaviours into living cells. Here we present a current overview of the components and strategies that exist for designer cell signal processing and decision making, discuss how these have been implemented in prototype systems for therapeutic, environmental, and industrial biotechnological applications, and examine emerging challenges in this promising field. PMID:25579192

  20. Detrimental effects of reloading recovery on force, shortening velocity, and power of soleus muscles from hindlimb-unloaded rats.

    PubMed

    Widrick, J J; Maddalozzo, G F; Hu, H; Herron, J C; Iwaniec, U T; Turner, R T

    2008-11-01

    To better understand how atrophied muscles recover from prolonged non-weight bearing, we studied soleus muscles (in vitro at optimal length) from female rats subjected to normal weight bearing (WB), 15 days of hindlimb unloading (HU), or 15 days HU followed by 9 days of weight-bearing reloading (HU-R). HU reduced peak tetanic force (P(o)), increased maximal shortening velocity (V(max)), and lowered peak power/muscle volume. Nine days of reloading failed to improve P(o), while depressing V(max) and intrinsic power below WB levels. These functional changes appeared intracellular in origin as HU-induced reductions in soleus mass, fiber cross-sectional area, and physiological cross-sectional area were partially or completely restored by reloading. We calculated that HU-induced reductions in soleus fiber length were of sufficient magnitude to overextend sarcomeres onto the descending limb of their length-tension relationship upon the resumption of WB activity. In conclusion, the force, shortening velocity, and power deficits observed after 9 days of reloading are consistent with contraction-induced damage to the soleus. HU-induced reductions in fiber length indicate that sarcomere hyperextension upon the resumption of weight-bearing activity may be an important mechanism underlying this response.
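
    The interplay of force, shortening velocity, and power reported here can be illustrated with the classical Hill force-velocity relation. The 20% force and 10% velocity deficits below are hypothetical numbers chosen for illustration, not the paper's measurements:

```python
import numpy as np

def hill_force(v, P0, vmax, a_rel=0.25):
    """Hill force-velocity relation (P + a)(v + b) = (P0 + a) b,
    with a = a_rel*P0 and b = a_rel*vmax (illustrative curvature)."""
    a, b = a_rel * P0, a_rel * vmax
    return (P0 + a) * b / (v + b) - a

# peak power before vs. after hypothetical deficits: -20% P0, -10% vmax
v_pre = np.linspace(0.0, 1.0, 1000)
v_post = np.linspace(0.0, 0.9, 1000)
peak_pre = np.max(hill_force(v_pre, 1.0, 1.0) * v_pre)
peak_post = np.max(hill_force(v_post, 0.8, 0.9) * v_post)
```

    Because peak power scales as the product P0 * vmax for a fixed curvature, simultaneous deficits in force and velocity compound multiplicatively, which is why reloading-induced damage depresses power more than either deficit alone suggests.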

  1. Rolling Reloaded

    ERIC Educational Resources Information Center

    Jones, Simon A.; Nieminen, John M.

    2008-01-01

    Not so long ago a new observation about rolling motion was described: for a rolling wheel, there is a set of points with instantaneous velocities directed at or away from the centre of the wheel; these points form a circle whose diameter connects the centre of the wheel to the wheel's point of contact with the ground (Sharma 1996 "Eur. J. Phys."…
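
    The observation is easy to verify numerically. With the contact point at the origin, a material point (x, y) of a wheel of radius R rolling at speed v has velocity (v y/R, -v x/R); demanding that this be parallel to the line through the wheel's centre (0, R) gives x² + y² = R y, the circle described above. A quick check (R and v chosen arbitrarily):

```python
import numpy as np

R, v = 1.0, 2.0                         # wheel radius and rolling speed

def velocity(p):
    """Instantaneous velocity of material point p = (x, y); contact at origin."""
    return np.array([v * p[1] / R, -v * p[0] / R])

def cross2(a, b):                       # z-component of the 2-D cross product
    return a[0] * b[1] - a[1] * b[0]

centre = np.array([0.0, R])
# circle whose diameter joins the centre (0, R) to the contact point (0, 0)
theta = np.linspace(0.0, 2.0 * np.pi, 200, endpoint=False)
circle = np.stack([0.5 * R * np.sin(theta),
                   0.5 * R * (1.0 - np.cos(theta))], axis=1)

on_circle = max(abs(cross2(velocity(p), p - centre)) for p in circle)
off_point = np.array([0.3, 0.8])        # a wheel point not on that circle
off_circle = abs(cross2(velocity(off_point), off_point - centre))
```

    A zero cross product on the candidate circle, and a nonzero one elsewhere, confirms the geometric claim.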

  2. Jemboss reloaded.

    PubMed

    Mullan, Lisa

    2004-06-01

    Bioinformatics tools are freely available from websites all over the world. Often they are presented as web services, although there are many tools for download and use on a local machine. This tutorial section looks at Jemboss, a Java-based graphical user interface (GUI) for the EMBOSS bioinformatics suite, which combines the advantages of both web service and downloaded software.

  3. KPZ Reloaded

    NASA Astrophysics Data System (ADS)

    Gubinelli, Massimiliano; Perkowski, Nicolas

    2017-01-01

    We analyze the one-dimensional periodic Kardar-Parisi-Zhang equation in the language of paracontrolled distributions, giving an alternative viewpoint on the seminal results of Hairer. Apart from deriving a basic existence and uniqueness result for paracontrolled solutions to the KPZ equation we perform a thorough study of some related problems. We rigorously prove the links between the KPZ equation, stochastic Burgers equation, and (linear) stochastic heat equation and also the existence of solutions starting from quite irregular initial conditions. We also show that there is a natural approximation scheme for the nonlinearity in the stochastic Burgers equation. Interpreting the KPZ equation as the value function of an optimal control problem, we give a pathwise proof for the global existence of solutions and thus for the strict positivity of solutions to the stochastic heat equation. Moreover, we study Sasamoto-Spohn type discretizations of the stochastic Burgers equation and show that their limit solves the continuous Burgers equation possibly with an additional linear transport term. As an application, we give a proof of the invariance of the white noise for the stochastic Burgers equation that does not rely on the Cole-Hopf transform.
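
    The links between the three equations can be written compactly (constants normalised, ξ a space-time white noise; these are formal identities whose rigorous meaning is exactly what the paracontrolled analysis supplies):

```latex
\begin{aligned}
&\partial_t h = \partial_x^2 h + (\partial_x h)^2 + \xi
  && \text{(KPZ)}\\
&\partial_t u = \partial_x^2 u + \partial_x(u^2) + \partial_x \xi,
  \quad u = \partial_x h && \text{(stochastic Burgers)}\\
&\partial_t Z = \partial_x^2 Z + Z\,\xi,
  \quad Z = e^{h} && \text{(stochastic heat, via Cole--Hopf)}
\end{aligned}
```

    The strict positivity result quoted in the abstract is what allows the Cole-Hopf map to be inverted, so solutions can be passed back and forth between the three formulations.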

  4. Design and processing of organic electroluminescent devices

    NASA Astrophysics Data System (ADS)

    Pardo-Guzman, Dino Alejandro

    2000-11-01

    The present dissertation compiles three aspects of my Ph.D. work on OLED device design, fabrication and characterization. The first chapter is a review of the concepts and theories describing the mechanisms of organic electroluminescence. The second chapter makes use of these concepts to articulate some basic principles for the design of efficient and stable OLEDs. The third chapter describes the main characterization and sample preparation techniques used along this dissertation. Chapter IV describes the processing of efficient organic electroluminescent (EL) devices with ITO/TPD/AlQ3/Mg:Ag structures. The screen printing technique of a hole transport polymeric blend was used in an unusual mode to render thin films in the order of 60-80 nm. EL devices were then fabricated on top of these screen-printed films to provide ~0.9% quantum efficiencies, comparable to spin coating with the same structures. Various polymer:TPD and solvent combinations were studied to find the paste with the best rheological properties. The same technique was also used to deposit a patterned MEH-PPV film. Chapter V describes my research work on the wetting of TPD on ITO substrates. The wetting was monitored by following its surface morphology evolution as a function of temperature. The effect of these surface changes was then correlated to the I-V-L characteristics of devices made with these TPD films. The surface roughness, measured with tapping-mode AFM, showed island formation at temperatures as low as 50-60°C. I also investigated the effect of the purity of materials like AlQ3 on the device EL performance, as described in Chapter VI. In order to improve the purity of these environmentally degradable complexes a new in situ purification technique was developed with excellent enhancement of the EL cell properties. The in situ purification process was then used to purify/deposit organic dyes with improved film formation and EL characteristics.

  5. Design of Nanomaterial Synthesis by Aerosol Processes

    PubMed Central

    Buesser, Beat; Pratsinis, Sotiris E.

    2013-01-01

    Aerosol synthesis of materials is a vibrant field of particle technology and chemical reaction engineering. Examples include the manufacture of carbon blacks, fumed SiO2, pigmentary TiO2, ZnO vulcanizing catalysts, filamentary Ni, and optical fibers, materials that impact transportation, construction, pharmaceuticals, energy, and communications. Parallel to this, development of novel, scalable aerosol processes has enabled synthesis of new functional nanomaterials (e.g., catalysts, biomaterials, electroceramics) and devices (e.g., gas sensors). This review provides an access point for engineers to the multiscale design of aerosol reactors for the synthesis of nanomaterials using continuum, mesoscale, molecular dynamics, and quantum mechanics models spanning 10 and 15 orders of magnitude in length and time, respectively. Key design features are the rapid chemistry; the high particle concentrations but low volume fractions; the attainment of a self-preserving particle size distribution by coagulation; the ratio of the characteristic times of coagulation and sintering, which controls the extent of particle aggregation; and the narrowing of the aggregate primary particle size distribution by sintering. PMID:22468598
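
    The competition between the characteristic coagulation and sintering times highlighted above starts from the Smoluchowski collision rate. A monodisperse sketch with assumed flame conditions (the temperature, viscosity, and particle loading are illustrative, not taken from the review):

```python
import numpy as np

kB = 1.380649e-23          # Boltzmann constant, J/K
T = 1800.0                 # assumed flame temperature, K
mu_gas = 5.6e-5            # assumed gas viscosity at T, Pa*s
beta = 8.0 * kB * T / (3.0 * mu_gas)   # continuum Brownian collision kernel, m^3/s

def particle_concentration(N0, t):
    """Monodisperse Smoluchowski coagulation, dN/dt = -beta*N^2/2,
    which integrates to N(t) = N0 / (1 + t/tau) with tau = 2/(beta*N0)."""
    tau = 2.0 / (beta * N0)             # characteristic coagulation time, s
    return N0 / (1.0 + t / tau)

N0 = 1e18                               # assumed initial concentration, 1/m^3
t = np.linspace(0.0, 0.01, 101)         # ~10 ms residence time
N = particle_concentration(N0, t)
```

    Total particle volume is conserved, so the mean particle volume grows as N0/N; comparing tau with the sintering time at the same temperature indicates whether collisions yield compact spheres (fast sintering) or aggregates (slow sintering), the ratio named as a key design feature above.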

  6. High Lifetime Solar Cell Processing and Design

    NASA Technical Reports Server (NTRS)

    Swanson, R. M.

    1985-01-01

    In order to maximize efficiency a solar cell must: (1) absorb as much light as possible in electron-hole production, (2) transport as large a fraction as possible of the electrons to the n-type terminal and holes to the p-type terminal without their first recombining, and (3) produce as high as possible terminal voltage. Step (1) is largely fixed by the spectrum of sunlight and the fundamental absorption characteristics of silicon, although some improvements are possible through texturizing induced light trapping and back surface reflectors. Steps (2) and (3) are, however, dependent on the recombination mechanisms of the cell. The recombination, in turn, is strongly influenced by cell processing and design. Some of the lessons learned during the development of the point-contact cell are discussed. The cell's dependence on bulk recombination, surface recombination, and contact recombination is discussed. Results show the overwhelming influence of contact recombination on the operation of the cell when the other sources of recombination are reduced by careful processing.
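
    The overwhelming influence of contact recombination is visible already in an ideal-diode estimate: each recombination channel contributes to the total saturation current density, and the open-circuit voltage falls logarithmically with that sum. All current densities below are invented for illustration:

```python
import numpy as np

kT_q = 0.02585   # thermal voltage at 300 K, V

def open_circuit_voltage(J_sc, J0_bulk, J0_surface, J0_contact):
    """Ideal-diode estimate: every recombination channel adds to the total
    saturation current J0, and Voc falls logarithmically with that sum."""
    J0 = J0_bulk + J0_surface + J0_contact
    return kT_q * np.log(J_sc / J0 + 1.0)

J_sc = 0.040     # illustrative 1-sun short-circuit current density, A/cm^2
# illustrative saturation-current densities (A/cm^2), not measured values
voc_contact_limited = open_circuit_voltage(J_sc, 5e-15, 5e-15, 2e-12)
voc_passivated = open_circuit_voltage(J_sc, 5e-15, 5e-15, 2e-14)
```

    Once bulk and surface channels are suppressed by careful processing, the contact term dominates J0, and reducing it is what recovers the last tens of millivolts of Voc.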

  7. Parametric Design within an Atomic Design Process (ADP) applied to Spacecraft Design

    NASA Astrophysics Data System (ADS)

    Ramos Alarcon, Rafael

    This thesis describes research investigating the development of a model for the initial design of complex systems, with application to spacecraft design. The design model is called an atomic design process (ADP) and contains four fundamental stages (specifications, configurations, trade studies and drivers) that constitute the minimum steps of an iterative process that helps designers find a feasible solution. Representative design models from the aerospace industry are reviewed and are compared with the proposed model. The design model's relevance, adaptability and scalability features are evaluated through a focused design task exercise with two undergraduate teams and a long-term design exercise performed by a spacecraft payload team. The implementation of the design model is explained in the context in which the model has been researched. This context includes the organization (a student-run research laboratory at the University of Michigan), its culture (academically oriented), members who have used the design model and the description of the information technology elements meant to provide support while using the model. This support includes a custom-built information management system that consolidates relevant information that is currently being used in the organization. The information is divided in three domains: personnel development history, technical knowledge base and laboratory operations. The focused study with teams making use of the design model to complete an engineering design exercise consists of the conceptual design of an autonomous system, including a carrier and a deployable lander that form the payload of a rocket with an altitude range of over 1000 meters. Detailed results from each of the stages of the design process while implementing the model are presented, and an increase in awareness of good design practices in the teams while using the model is explained. A long-term investigation using the design model consisting of the

  8. Short- and Long-Term Hindlimb Immobilization and Reloading: Profile of Epigenetic Events in Gastrocnemius.

    PubMed

    Chacon-Cabrera, Alba; Gea, Joaquim; Barreiro, Esther

    2017-06-01

    Skeletal muscle dysfunction and atrophy are characteristic features accompanying chronic conditions. Epigenetic events regulate muscle mass and function maintenance. We hypothesized that the pattern of epigenetic events (muscle-enriched microRNAs and histone acetylation) and acetylation of transcription factors known to signal muscle wasting may differ between early- and late-time points in skeletal muscles of mice exposed to hindlimb immobilization (I) and recovery following I. Body and muscle weights, grip strength, muscle-enriched microRNAs, histone deacetylases (HDACs), acetylation of proteins, histones, and transcription factors (TF), myogenic TF factors, and muscle phenotype were assessed in gastrocnemius of mice exposed to periods (1, 2, 3, 7, 15, and 30 days, I groups) of hindlimb immobilization, and in those exposed to reloading for different periods of time (1, 3, 7, 15, and 30 days, R groups) following 7-day immobilization. Compared to non-immobilized controls, muscle weight, limb strength, microRNAs, especially miR-486, SIRT1 levels, and slow- and fast-twitch cross-sectional areas were decreased in mice of I groups, whereas Pax7 and acetylated FoxO1 and FoxO3 levels were increased. Muscle reloading following splint removal improved muscle mass loss, strength, and fiber atrophy, by increasing microRNAs, particularly miR-486, and SIRT1 content, while decreasing acetylated FoxO1 and FoxO3 levels. In this mouse model of disuse muscle atrophy, muscle-enriched microRNAs, especially miR-486, through Pax7 regulation delayed muscle cell differentiation following unloading of gastrocnemius muscle. Acetylation of FoxO1 and 3 seemed to drive muscle mass loss and atrophy, while deacetylation of these factors through SIRT1 would enable the muscle fibers to regenerate. J. Cell. Physiol. 232: 1415-1427, 2017. © 2016 Wiley Periodicals, Inc.

  9. Universal Design: Process, Principles, and Applications

    ERIC Educational Resources Information Center

    Burgstahler, Sheryl

    2009-01-01

    Designing any product or environment involves the consideration of many factors, including aesthetics, engineering options, environmental issues, safety concerns, industry standards, and cost. Typically, designers focus their attention on the average user. In contrast, universal design (UD), according to the Center for Universal Design," is…

  10. Conceptual design of clean processes: Tools and methods

    SciTech Connect

    Hurme, M.

    1996-12-31

    Design tools available for implementing clean design into practice are discussed. The application areas together with the methods of comparison of clean process alternatives are presented. Environmental principles are becoming increasingly important in the whole life cycle of products from design, manufacturing and marketing to disposal. The hindrance to implementing clean technology in design has been the necessity of applying it in all phases of design from the very beginning, since it deals with the major selections made in conceptual process design. Therefore both a modified design approach and new tools are needed for process design to make the application of clean technology practical. The first item, extended process design methodologies, has been presented by Hurme, Douglas, Rossiter and Klee, Hilaly and Sikdar. The aim of this paper is to discuss the latter topic: the process design tools which assist in implementing clean principles into process design. 22 refs., 2 tabs.

  11. Mechanical Design Support System Based on Thinking Process Development Diagram

    NASA Astrophysics Data System (ADS)

    Mase, Hisao; Kinukawa, Hiroshi; Morii, Hiroshi; Nakao, Masayuki; Hatamura, Yotaro

    This paper describes a system that directly supports a design process in a mechanical domain. This system is based on a thinking process development diagram that draws distinctions between requirement, tasks, solutions, and implementation, which enables designers to expand and deepen their thoughts of design. The system provides five main functions that designers require in each phase of the proposed design process: (1) thinking process description support which enables designers to describe their thoughts, (2) creativity support by term association with thesauri, (3) timely display of design knowledge including know-how obtained through earlier failures, general design theories, standard-parts data, and past designs, (4) design problem solving support using 46 kinds of thinking operations, and (5) proper technology transfer support which accumulates not only design conclusions but also the design process. Though this system is applied to mechanical engineering as the first target domain, it can be easily expanded to many other domains such as architecture and electricity.

  12. Feasibility study of an Integrated Program for Aerospace vehicle Design (IPAD). Volume 2: The design process

    NASA Technical Reports Server (NTRS)

    Gillette, W. B.; Turner, M. J.; Southall, J. W.; Whitener, P. C.; Kowalik, J. S.

    1973-01-01

    The extent to which IPAD is to support the design process is identified. Case studies of representative aerospace products were developed as models to characterize the design process and to provide design requirements for the IPAD computing system.

  13. Creativity Processes of Students in the Design Studio

    ERIC Educational Resources Information Center

    Huber, Amy Mattingly; Leigh, Katharine E.; Tremblay, Kenneth R., Jr.

    2012-01-01

    The creative process is a multifaceted and dynamic path of thinking required to execute a project in design-based disciplines. The goal of this research was to test a model outlining the creative design process by investigating student experiences in a design project assignment. The study used an exploratory design to collect data from student…

  15. POLLUTION PREVENTION IN THE EARLY STAGES OF HIERARCHICAL PROCESS DESIGN

    EPA Science Inventory

    Hierarchical methods are often used in the conceptual stages of process design to synthesize and evaluate process alternatives. In this work, the methods of hierarchical process design will be focused on environmental aspects. In particular, the design methods will be coupled to ...

  17. Launch Vehicle Design Process: Characterization, Technical Integration, and Lessons Learned

    NASA Technical Reports Server (NTRS)

    Blair, J. C.; Ryan, R. S.; Schutzenhofer, L. A.; Humphries, W. R.

    2001-01-01

    Engineering design is a challenging activity for any product. Since launch vehicles are highly complex and interconnected and have extreme energy densities, their design represents a challenge of the highest order. The purpose of this document is to delineate and clarify the design process associated with the launch vehicle for space flight transportation. The goal is to define and characterize a baseline for the space transportation design process. This baseline can be used as a basis for improving effectiveness and efficiency of the design process. The baseline characterization is achieved via compartmentalization and technical integration of subsystems, design functions, and discipline functions. First, a global design process overview is provided in order to show responsibility, interactions, and connectivity of overall aspects of the design process. Then design essentials are delineated in order to emphasize necessary features of the design process that are sometimes overlooked. Finally the design process characterization is presented. This is accomplished by considering project technical framework, technical integration, process description (technical integration model, subsystem tree, design/discipline planes, decision gates, and tasks), and the design sequence. Also included in the document are a snapshot relating to process improvements, illustrations of the process, a survey of recommendations from experienced practitioners in aerospace, lessons learned, references, and a bibliography.

  18. Process of system design and analysis

    SciTech Connect

    Gardner, B.

    1995-09-01

    The design of an effective physical protection system includes the determination of the physical protection system objectives, the initial design of a physical protection system, the evaluation of the design, and, probably, a redesign or refinement of the system. To develop the objectives, the designer must begin by gathering information about facility operations and conditions, such as a comprehensive description of the facility, operating states, and the physical protection requirements. The designer then needs to define the threat. This involves considering factors about potential adversaries: class of adversary, adversary's capabilities, and range of adversary's tactics. Next, the designer should identify targets. Determination of whether or not nuclear materials are attractive targets is based mainly on the ease or difficulty of acquisition and desirability of the material. The designer now knows the objectives of the physical protection system, that is, "what to protect against whom." The next step is to design the system by determining how best to combine such elements as fences, vaults, sensors, procedures, communication devices, and protective force personnel to meet the objectives of the system. Once a physical protection system is designed, it must be analyzed and evaluated to ensure it meets the physical protection objectives. Evaluation must allow for features working together to assure protection rather than regarding each feature separately. Due to the complexity of protection systems, an evaluation usually requires modeling techniques. If any vulnerabilities are found, the initial system must be redesigned to correct the vulnerabilities and a reevaluation conducted.
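    The objectives-design-evaluation-redesign cycle described above can be sketched as a simple iteration. This is a minimal illustration only: the `evaluate` and `redesign` callables stand in for the modeling techniques the abstract mentions, and the toy implementations below are hypothetical.

```python
def design_protection_system(objectives, initial_design, evaluate, redesign,
                             max_rounds=10):
    """Iterate design -> evaluation -> redesign until the evaluation
    reports no vulnerabilities, or the iteration budget runs out."""
    design = initial_design
    for _ in range(max_rounds):
        vulnerabilities = evaluate(design, objectives)
        if not vulnerabilities:
            return design
        design = redesign(design, vulnerabilities)
    raise RuntimeError("design still has vulnerabilities after max_rounds")

# Toy stand-ins: objectives are required protection elements; the evaluation
# reports missing ones; the redesign adds them.
evaluate = lambda design, objectives: [o for o in objectives if o not in design]
redesign = lambda design, vulns: design + vulns

final = design_protection_system(["fence", "vault", "sensors"], [], evaluate, redesign)
print(final)  # ['fence', 'vault', 'sensors']
```

The point of the sketch is the loop structure, not the placeholder logic: in practice the evaluation step is the modeling-based analysis the abstract calls for, and it must consider features working together rather than one at a time.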

  19. Hynol Process Engineering: Process Configuration, Site Plan, and Equipment Design

    DTIC Science & Technology

    1996-02-01

    Methanol is produced from wood, and natural gas is used as a co-feedstock. Compared with other methanol production processes, direct emissions of carbon dioxide (CO2) can be substantially reduced. The use of natural gas provides for reduced CO2 emissions per unit of fossil fuel carbon processed compared with separate natural gas and biomass processes.

  20. Review of primary spaceflight-induced and secondary reloading-induced changes in slow antigravity muscles of rats.

    PubMed

    Riley, D A

    1998-01-01

    We have examined the light and electron microscopic properties of hindlimb muscles of rats flown in space for 1-2 weeks on Cosmos biosatellite flights 1887 and 2044 and Space Shuttle missions Spacelab-3, Spacelab Life Sciences-1 and Spacelab Life Sciences-2. Tissues were obtained both inflight and postflight permitting definition of primary microgravity-induced changes and secondary reentry and gravity reloading-induced alterations. Spaceflight causes atrophy and expression of fast fiber characteristics in slow antigravity muscles. The stresses of reentry and reloading reveal that atrophic muscles show increased susceptibility to interstitial edema and ischemic-anoxic necrosis as well as muscle fiber tearing with disruption of contractile proteins. These results demonstrate that the effects of spaceflight on skeletal muscle are multifaceted, and major changes occur both inflight and following return to Earth's gravity.

  1. Review of primary spaceflight-induced and secondary reloading-induced changes in slow antigravity muscles of rats

    NASA Astrophysics Data System (ADS)

    Riley, D. A.

    We have examined the light and electron microscopic properties of hindlimb muscles of rats flown in space for 1-2 weeks on Cosmos biosatellite flights 1887 and 2044 and Space Shuttle missions Spacelab-3, Spacelab Life Sciences-1 and Spacelab Life Sciences-2. Tissues were obtained both inflight and postflight permitting definition of primary microgravity-induced changes and secondary reentry and gravity reloading-induced alterations. Spaceflight causes atrophy and expression of fast fiber characteristics in slow antigravity muscles. The stresses of reentry and reloading reveal that atrophic muscles show increased susceptibility to interstitial edema and ischemic-anoxic necrosis as well as muscle fiber tearing with disruption of contractile proteins. These results demonstrate that the effects of spaceflight on skeletal muscle are multifaceted, and major changes occur both inflight and following return to Earth's gravity.

  2. Computer Applications in the Design Process.

    ERIC Educational Resources Information Center

    Winchip, Susan

    Computer Assisted Design (CAD) and Computer Assisted Manufacturing (CAM) are emerging technologies now being used in home economics and interior design applications. A microcomputer in a computer network system is capable of executing computer graphic functions such as three-dimensional modeling, as well as utilizing office automation packages to…

  3. Space bioreactor: Design/process flow

    NASA Technical Reports Server (NTRS)

    Cross, John H.

    1987-01-01

    The design of the space bioreactor stems from three considerations. First, and foremost, it must sustain cells in microgravity. Closely related is the ability to take advantage of the weightlessness and microgravity. Lastly, it should fit into a bioprocess. The design of the space bioreactor is described in view of these considerations. A flow chart of the bioreactor is presented and discussed.

  4. Rapid Prototyping in the Instructional Design Process.

    ERIC Educational Resources Information Center

    Nixon, Elizabeth Krick; Lee, Doris

    2001-01-01

    Discusses instructional design models and examines rapid prototyping, a model that combines computer design strategies, constructivist learning theory, and cognitive psychology. Highlights include limitations of linear models; instructional problems appropriate and those not appropriate for rapid prototyping; and rapid prototyping as a paradigm…

  5. 77 FR 41248 - Disaster Designation Process

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-13

    ... substantially affected by a natural disaster in a designated disaster county. Disaster designations have been... would be considered a disaster area. This rule also revises the definition of ``natural disaster'' to be... conditions from the definition of ``natural disaster'' could lead to potential program abuse and fraud. It...

  6. Lunar fiberglass: Properties and process design

    NASA Technical Reports Server (NTRS)

    Dalton, Robert; Nichols, Todd

    1987-01-01

    A Clemson University ceramic engineering design for a lunar fiberglass plant is presented. The properties of glass fibers and metal-matrix composites are examined. Lunar geology is also discussed. A raw material and site are selected based on this information. A detailed plant design is presented, and summer experiments to be carried out at Johnson Space Center are reviewed.

  8. Instructional Design and Directed Cognitive Processing.

    ERIC Educational Resources Information Center

    Bovy, Ruth Colvin

    This paper argues that the information processing model provides a promising basis on which to build a comprehensive theory of instruction. Characteristics of the major information processing constructs are outlined including attention, encoding and rehearsal, working memory, long term memory, retrieval, and metacognitive processes, and a unifying…

  10. Wireless Participant Incentives Using Reloadable Bank Cards to Increase Clinical Trial Retention With Abused Women Drinkers: A Natural Experiment.

    PubMed

    Rodgers, Melissa; Meisel, Zachary; Wiebe, Douglas; Crits-Christoph, Paul; Rhodes, Karin V

    2016-08-07

    Retaining participants in longitudinal studies is a unique methodological challenge in many areas of investigation, and specifically for researchers aiming to identify effective interventions for women experiencing intimate partner violence (IPV). Individuals in abusive relationships are often transient and have logistical, confidentiality, and safety concerns that limit future contact. A natural experiment occurred during a large randomized clinical trial enrolling women in abusive relationships who were also heavy drinkers, which allowed for the comparison of two incentive methods to promote longitudinal retention: cash payment versus reloadable wireless bank cards. In all, 600 patients were enrolled in the overall trial, which aimed to incentivize participants using a reloadable bank card system to promote the completion of 11 weekly interactive voice response system (IVRS) phone surveys and 3-, 6-, and 12-month follow-up phone or in person interviews. The first 145 participants were paid with cash as a result of logistical delays in setting up the bank card system. At 12 weeks, participants receiving the bank card incentive completed significantly more IVRS phone surveys, odds ratio (OR) = 2.4, 95% confidence interval (CI) = [0.01, 1.69]. There were no significant differences between the two groups related to satisfaction or safety and/or privacy. The bank card system delivered lower administrative burden for tracking payments for study staff. Based on these and other results, our large medical research university is implementing reloadable bank card as the preferred method of participant incentive payments. © The Author(s) 2016.

  11. Application of Process Modeling Tools to Ship Design

    DTIC Science & Technology

    2011-05-01

    APPLICATION OF PROCESS MODELING TOOLS TO SHIP DESIGN. NAVSEA; Frank Waldman; Lattix; May 2011. Ship design is characterized by large design teams, long design schedules, and complicated acquisition procedures; commercial process modeling techniques are being applied to address these challenges.

  12. Adaptation of the proximal femur to skeletal reloading after long-duration spaceflight.

    PubMed

    Lang, Thomas F; Leblanc, Adrian D; Evans, Harlan J; Lu, Ying

    2006-08-01

    We studied the effect of re-exposure to Earth's gravity on the proximal femoral BMD and structure of astronauts 1 year after missions lasting 4-6 months. We observed that the readaptation of the proximal femur to Earth's gravity entailed an increase in bone size and an incomplete recovery of volumetric BMD. Bone loss is a well-known result of skeletal unloading in long-duration spaceflight, with the most severe losses occurring in the proximal femur. However, there is little information about the recovery of bone loss after mission completion and no information about effect of reloading on the structure of load-bearing bone. To address these questions, we carried out a study of the effect of re-exposure to Earth's gravity on the BMD and structure of the proximal femur 1 year after missions lasting 4-6 months. In 16 crew members of the International Space Station (ISS) making flights of 4.5-6 months, we used QCT imaging to measure the total, trabecular, and cortical volumetric BMD (vBMD) of the proximal femur. In addition to vBMD, we also quantified BMC, bone volume, femoral neck cross-sectional area (CSA), and femoral neck indices of compressive and bending strength at three time-points: preflight, postflight, and 1 year after mission. Proximal femoral bone mass was substantially recovered in the year after spaceflight, but measures of vBMD and estimated bone strength showed only partial recovery. The recovery of BMC, in the absence of a comparable increase in vBMD, was explained by increases in bone volume and CSA during the year after spaceflight. Adaptation of the proximal femur to reloading entailed an increase in bone size and an incomplete recovery of vBMD. The data indicate that recovery of skeletal density after long-duration space missions may exceed 1 year and supports the evidence in the aging literature for periosteal apposition as a compensatory response for bone loss. 
The extent to which this compensatory effect protects against fracture remains to be determined.

  13. Integrated Language Design and Implementation Process

    DTIC Science & Technology

    2000-03-28

    ... and Technology, PO Box 91000, Portland, OR 97291, USA; Institute for Experimental Software Engineering, Sauerwiesen 6, D-67661 Kaiserslautern, Germany. Approved for public release; distribution unlimited. ... the language technology of the target environment (MSL) [28], developed as part of the Software Design for Reliability and Reuse (SDRR) project [4

  14. Study on Product Innovative Design Process Driven by Ideal Solution

    NASA Astrophysics Data System (ADS)

    Zhang, Fuying; Lu, Ximei; Wang, Ping; Liu, Hui

    Product innovative design in companies today relies heavily on individual members' experience and creative ideation, as well as their skill in agilely integrating creativity and innovation tools with design methods. Creative ideation and inventive idea generation are two crucial stages in the product innovative design process. The ideal solution is the desired final idea for a given problem and the target toward which product design strives. In this paper, a product innovative design process driven by the ideal solution is proposed. This design process encourages designers to overcome their psychological inertia and to foster creativity in a systematic way, acquiring breakthrough creative and innovative solutions within a progressively narrowing solution-seeking space, and results in rapid, effective product innovative design. A case study example is also presented to illustrate the effectiveness of the proposed design process.

  15. Learning Objects: A User-Centered Design Process

    ERIC Educational Resources Information Center

    Branon, Rovy F., III

    2011-01-01

    Design research systematically creates or improves processes, products, and programs through an iterative progression connecting practice and theory (Reinking, 2008; van den Akker, 2006). Developing a new instructional systems design (ISD) processes through design research is necessary when new technologies emerge that challenge existing practices…

  16. Designing Educative Curriculum Materials: A Theoretically and Empirically Driven Process

    ERIC Educational Resources Information Center

    Davis, Elizabeth A.; Palincsar, Annemarie Sullivan; Arias, Anna Maria; Bismack, Amber Schultz; Marulis, Loren M.; Iwashyna, Stefanie K.

    2014-01-01

    In this article, the authors argue for a design process in the development of educative curriculum materials that is theoretically and empirically driven. Using a design-based research approach, they describe their design process for incorporating educative features intended to promote teacher learning into existing, high-quality curriculum…

  17. VCM Process Design: An ABET 2000 Fully Compliant Project

    ERIC Educational Resources Information Center

    Benyahia, Farid

    2005-01-01

    A long experience in undergraduate vinyl chloride monomer (VCM) process design projects is shared in this paper. The VCM process design is shown to be fully compliant with ABET 2000 criteria by virtue of its abundance in chemical engineering principles, integration of interpersonal and interdisciplinary skills in design, safety, economics, and…

  18. Knowledge and Processes in Design. DPS Final Report.

    ERIC Educational Resources Information Center

    Pirolli, Peter

    Four papers from a project concerning information-processing characterizations of the knowledge and processes involved in design are presented. The project collected and analyzed verbal protocols from instructional designers, architects, and mechanical engineers. A framework was developed for characterizing the problem spaces of design that…

  20. Glucose uptake in rat soleus - Effect of acute unloading and subsequent reloading

    NASA Technical Reports Server (NTRS)

    Henriksen, Eric J.; Tischler, Marc E.

    1988-01-01

    The effect of acutely reduced weight bearing (unloading) on the in vitro uptake of 2-[1,2-3H]deoxy-D-glucose was studied in the soleus muscle by tail casting and suspending rats. After just 4 h, the uptake of 2-deoxy-D-glucose fell (-19 percent) and declined further after an additional 20 h of unloading. This diminution at 24 h was associated with slower oxidation of [14C]glucose and slower incorporation of [14C]glucose into glycogen. At 3 days of unloading, basal uptake of 2-deoxy-D-glucose did not differ from control. Reloading of the soleus after 1 or 3 days of unloading increased uptake of 2-deoxy-D-glucose above control and returned it to normal within 6 h and 4 days, respectively. These effects of unloading and recovery were caused by local changes in the soleus, because the extensor digitorum longus from the same hindlimbs did not display any alterations in uptake of 2-deoxy-D-glucose or metabolism of glucose.

  1. Processes and Knowledge in Designing Instruction.

    ERIC Educational Resources Information Center

    Greeno, James G.; And Others

    Results from a study of problem solving in the domain of instructional design are presented. Subjects were eight teacher trainees who were recent graduates of or were enrolled in the Stanford Teacher Education Program at Stanford University (California). Subjects studied a computer-based tutorial--the VST2000--about a fictitious vehicle. The…

  2. Design Criteria for Process Wastewater Pretreatment Facilities

    DTIC Science & Technology

    1988-05-01

    Pretreatment processes described include reverse osmosis, oil/water separation, air stripping, chemical reduction, granular activated carbon adsorption, and biological treatment, along with utilizing and combining waste streams prior to treatment and pretreatment process removal efficiency ranges (average achievable effluent). Treatment options evaluated by Donovan et al. included reverse osmosis, activated carbon adsorption, biological treatment, air stripping, and chemical precipitation.

  3. Understanding the Processes behind Student Designing: Cases from Singapore

    ERIC Educational Resources Information Center

    Lim, Susan Siok Hiang; Lim-Ratnam, Christina; Atencio, Matthew

    2013-01-01

    A common perception of designing is that it represents a highly complex activity that is manageable by only a few. However it has also been argued that all individuals are innately capable of designing. Taking up this latter view, we explored the processes behind student designing in the context of Design and Technology (D&T), a subject taught…

  4. Reducing Design Cycle Time and Cost Through Process Resequencing

    NASA Technical Reports Server (NTRS)

    Rogers, James L.

    2004-01-01

    In today's competitive environment, companies are under enormous pressure to reduce the time and cost of their design cycle. One method for reducing both time and cost is to develop an understanding of the flow of the design processes and the effects of the iterative subcycles that are found in complex design projects. Once these aspects are understood, the design manager can make decisions that take advantage of decomposition, concurrent engineering, and parallel processing techniques to reduce the total time and the total cost of the design cycle. One software tool that can aid in this decision-making process is the Design Manager's Aid for Intelligent Decomposition (DeMAID). The DeMAID software minimizes the feedback couplings that create iterative subcycles, groups processes into iterative subcycles, and decomposes the subcycles into a hierarchical structure. The real benefits of producing the best design in the least time and at a minimum cost are obtained from sequencing the processes in the subcycles.
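    The resequencing idea behind DeMAID, ordering processes so that as few dependencies as possible point backward to later processes, can be illustrated with a brute-force sketch. The task names and dependency table below are hypothetical, and DeMAID itself uses heuristics on a design structure matrix rather than exhaustive search.

```python
from itertools import permutations

def feedback_count(deps, order):
    """Count feedback couplings: dependencies on tasks scheduled later."""
    pos = {task: i for i, task in enumerate(order)}
    return sum(1 for task, needs in deps.items()
                 for n in needs if pos[n] > pos[task])

def best_sequence(deps):
    """Brute-force the ordering with the fewest feedback couplings
    (fine for toy examples; real tools use heuristic decomposition)."""
    return min(permutations(list(deps)),
               key=lambda order: feedback_count(deps, order))

# Toy design process: each task lists the tasks whose output it needs.
deps = {
    "geometry": [],
    "loads": ["geometry"],
    "aero": ["loads"],
    "structure": ["loads", "aero"],
}
order = best_sequence(deps)
print(order, feedback_count(deps, order))
# → ('geometry', 'loads', 'aero', 'structure') 0
```

A sequence with zero feedback couplings has no iterative subcycles at all; when feedbacks cannot be eliminated, minimizing them shrinks the subcycles that drive design-cycle time and cost.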

  5. NASA Now: Engineering Design Process: Hubble Space Telescope

    NASA Image and Video Library

    In this episode of NASA Now, NASA engineer Russ Werneth discusses the continuous nature of the engineering design process and shares what it was like to design and plan the spacewalks that were key...

  6. Integrating Thermal Tools Into the Mechanical Design Process

    NASA Technical Reports Server (NTRS)

    Tsuyuki, Glenn T.; Siebes, Georg; Novak, Keith S.; Kinsella, Gary M.

    1999-01-01

    The intent of mechanical design is to deliver a hardware product that meets or exceeds customer expectations, while reducing cycle time and cost. To this end, an integrated mechanical design process enables the idea of parallel development (concurrent engineering). This represents a shift from the traditional mechanical design process. With such a concurrent process, there are significant issues that have to be identified and addressed before re-engineering the mechanical design process to facilitate concurrent engineering. These issues also assist in the integration and re-engineering of the thermal design sub-process since it resides within the entire mechanical design process. With these issues in mind, a thermal design sub-process can be re-defined in a manner that has a higher probability of acceptance, thus enabling an integrated mechanical design process. However, the actual implementation is not always problem-free. Experience in applying the thermal design sub-process to actual situations provides the evidence for improvement, but more importantly, for judging the viability and feasibility of the sub-process.

  8. A Design Methodology for Medical Processes.

    PubMed

    Ferrante, Simona; Bonacina, Stefano; Pozzi, Giuseppe; Pinciroli, Francesco; Marceglia, Sara

    2016-01-01

    Healthcare processes, especially those belonging to the clinical domain, are acknowledged as complex and characterized by the dynamic nature of the diagnosis, the variability of the decisions made by experts driven by their experiences, the local constraints, the patient's needs, the uncertainty of the patient's response, and the indeterminacy of patient's compliance to treatment. Also, the multiple actors involved in patient's care need clear and transparent communication to ensure care coordination. In this paper, we propose a methodology to model healthcare processes in order to break down complexity and provide transparency. The model is grounded on a set of requirements that make the healthcare domain unique with respect to other knowledge domains. The modeling methodology is based on three main phases: the study of the environmental context, the conceptual modeling, and the logical modeling. The proposed methodology was validated by applying it to the case study of the rehabilitation process of stroke patients in the specific setting of a specialized rehabilitation center. The resulting model was used to define the specifications of a software artifact for the digital administration and collection of assessment tests that was also implemented. Despite being only an example, our case study showed the ability of process modeling to address the actual needs in healthcare practices. Independently from the medical domain in which the modeling effort is done, the proposed methodology is useful to create high-quality models, and to detect and take into account relevant and tricky situations that can occur during process execution.

  9. Conceptual design of distillation-based hybrid separation processes.

    PubMed

    Skiborowski, Mirko; Harwardt, Andreas; Marquardt, Wolfgang

    2013-01-01

    Hybrid separation processes combine different separation principles and constitute a promising design option for the separation of complex mixtures. Particularly, the integration of distillation with other unit operations can significantly improve the separation of close-boiling or azeotropic mixtures. Although the design of single-unit operations is well understood and supported by computational methods, the optimal design of flowsheets of hybrid separation processes is still a challenging task. The large number of operational and design degrees of freedom requires a systematic and optimization-based design approach. To this end, a structured approach, the so-called process synthesis framework, is proposed. This article reviews available computational methods for the conceptual design of distillation-based hybrid processes for the separation of liquid mixtures. Open problems are identified that must be addressed to finally establish a structured process synthesis framework for such processes.

  10. The Process of Soviet Weapons Design

    DTIC Science & Technology

    1978-03-01

    system on the BMP from an early 1940s German design. But the validity and usefulness of a theory, especially one that makes predictions about the future...when the 1940 publication of a highly significant Soviet discovery of spontaneous fission resulted in a complete lack of an American response, the...taken from I. N. Golovin , I. V. Khurchatov, Atomizdat, Moscow, 1973, and from Herbert York, The Advisors. Oppenheimer, Teller, and the Superbomb, W. H

  11. The application of image processing software: Photoshop in environmental design

    NASA Astrophysics Data System (ADS)

    Dong, Baohua; Zhang, Chunmi; Zhuo, Chen

    2011-02-01

    In the process of environmental design and creation, the design sketch holds a very important position: it not only illuminates the design's idea and concept but also shows the design's visual effects to the client. In the field of environmental design, computer-aided design has brought significant improvements. Many types of specialized design software for rendering environmental drawings and for artistic post-processing have been implemented. With this software, working efficiency has greatly increased and drawings have become more specific and more specialized. By analyzing the application of the Photoshop image-processing software in environmental design, and by comparing and contrasting traditional hand drawing with drawing using modern technology, this essay explores how computer technology can play a bigger role in environmental design.

  12. A design optimization process for Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Chamberlain, Robert G.; Fox, George; Duquette, William H.

    1990-01-01

    The Space Station Freedom Program is used to develop and implement a process for design optimization. Because the relative worth of arbitrary design concepts cannot be assessed directly, comparisons must be based on designs that provide the same performance from the point of view of station users; such designs can be compared in terms of life cycle cost. Since the technology required to produce a space station is widely dispersed, a decentralized optimization process is essential. A formulation of the optimization process is provided and the mathematical models designed to facilitate its implementation are described.
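The comparison rule described in this abstract, namely filtering candidates to those that meet the same user performance and then ranking by life cycle cost, can be sketched as a small selection routine. The design records and cost figures below are hypothetical, not Space Station Freedom data.

```python
# Toy sketch of the comparison rule above: candidate designs are first
# filtered to those meeting the same user-performance requirement, then
# ranked by life cycle cost (LCC). All numbers are illustrative.

def select_design(candidates, required_performance):
    """Return the lowest-LCC design that meets the performance requirement."""
    feasible = [d for d in candidates if d["performance"] >= required_performance]
    if not feasible:
        raise ValueError("no candidate meets the performance requirement")
    return min(feasible, key=lambda d: d["life_cycle_cost"])

designs = [
    {"name": "A", "performance": 100, "life_cycle_cost": 950},
    {"name": "B", "performance": 120, "life_cycle_cost": 875},
    {"name": "C", "performance": 90,  "life_cycle_cost": 600},  # under-performs
]

best = select_design(designs, required_performance=100)
```

Design C is excluded despite its low cost, which is the point of the equal-performance normalization: only designs delivering the required capability are compared on cost.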

  13. CAD tool environment for MEMS process design support

    NASA Astrophysics Data System (ADS)

    Schmidt, T.; Wagener, A.; Popp, J.; Hahn, K.; Bruck, R.

    2005-07-01

    MEMS fabrication processes are characterized by numerous usable process steps, materials, and effects for fabricating the intended microstructure. Up to now, CAD support in this domain has concentrated mainly on structural design (e.g., simulation programs on an FEM basis). These tools often assume fixed interfaces to the fabrication process, such as material parameters or design rules. Taking into account that MEMS design requires structural design (defining the lateral two-dimensional shapes) concurrently with process design (responsible for the third dimension), it turns out that technology interfaces consisting only of sets of static data are no longer sufficient. For successful design flows in these areas it is necessary to incorporate a higher degree of process-related data; a broader interface between process configuration on one side and application design on the other is needed. This paper proposes a novel approach: a process management system that allows the specification of processes for specific applications. The system is based on a dedicated database environment able to store and manage all process-related design constraints linked to the fabrication process data itself. The interdependencies between application-specific processes and all stages of the design flow are discussed, and the complete software system PRINCE, which meets the requirements of this new approach, is introduced. Based on the concurrent design methodology presented at the beginning of the paper, a system that supports application-specific process design is presented. The paper highlights the incorporated tools and the present status of the software system. A complete configuration of an Si thin-film process example demonstrates the usage of PRINCE.
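The core idea, storing fabrication steps together with the design constraints they impose so that a design flow can query them, can be sketched as follows. This is not PRINCE itself; the class, step names, and constraint values are invented for illustration.

```python
# Minimal sketch (not the actual PRINCE system) of a process management
# store that links fabrication steps to the design constraints they impose,
# so a design flow can query the constraints of an application-specific
# process. Step names and parameter values are invented.

class ProcessStore:
    def __init__(self):
        self.steps = {}          # step name -> dict of process parameters
        self.constraints = {}    # step name -> list of (rule, value) pairs

    def add_step(self, name, parameters, constraints=()):
        self.steps[name] = dict(parameters)
        self.constraints[name] = list(constraints)

    def process_constraints(self, step_names):
        """Collect all design constraints imposed by a chosen step sequence."""
        rules = []
        for name in step_names:
            rules.extend(self.constraints[name])
        return rules

store = ProcessStore()
store.add_step("Si thin-film deposition",
               {"material": "poly-Si", "thickness_um": 2.0},
               [("min_feature_um", 1.5)])
store.add_step("wet etch",
               {"etchant": "KOH"},
               [("min_spacing_um", 3.0)])

rules = store.process_constraints(["Si thin-film deposition", "wet etch"])
```

The design flow would then check lateral geometry against `rules` instead of against a static design-rule file, which is the shift from static to process-linked interfaces that the paper argues for.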

  14. Adding Users to the Website Design Process

    ERIC Educational Resources Information Center

    Tomeo, Megan L.

    2012-01-01

    Alden Library began redesigning its website over a year ago. Throughout the redesign process the students, faculty, and staff that make up the user base were added to the conversation by utilizing several usability test methods. This article focuses on the usability testing conducted at Alden Library and delves into future usability testing, which…

  15. Design and Processing of Electret Structures

    DTIC Science & Technology

    2009-10-31

    AND PROCESSING OF ELECTRET STRUCTURES Guiding Colloidal Crystallization in a Galvanic Micro Reactor   Figure 4: Guided colloidal aggregation...using galvanic micro reactor arrays. Scale bars = 50 µm. (A) Schematic of the micro reactor . The patterned electrode consists of a 100 nm thick gold...of fields, from nano- and microfluidics , to cloud seeding in the atmosphere, corrosion inhibition, and heterogeneous catalysis. Interesting

  17. Launch Vehicle Design Process Description and Training Formulation

    NASA Technical Reports Server (NTRS)

    Atherton, James; Morris, Charles; Settle, Gray; Teal, Marion; Schuerer, Paul; Blair, James; Ryan, Robert; Schutzenhofer, Luke

    1999-01-01

    A primary NASA priority is to reduce the cost and improve the effectiveness of launching payloads into space. As a consequence, significant improvements are being sought in the effectiveness, cost, and schedule of the launch vehicle design process. In order to provide a basis for understanding and improving the current design process, a model has been developed for this complex, interactive process, as reported in the references. This model requires further expansion in some specific design functions. Also, a training course for less-experienced engineers is needed to provide understanding of the process, to provide guidance for its effective implementation, and to provide a basis for major improvements in launch vehicle design process technology. The objective of this activity is to expand the description of the design process to include all pertinent design functions, and to develop a detailed outline of a training course on the design process for launch vehicles for use in educating engineers whose experience with the process has been minimal. Building on a previously-developed partial design process description, parallel sections have been written for the Avionics Design Function, the Materials Design Function, and the Manufacturing Design Function. Upon inclusion of these results, the total process description will be released as a NASA TP. The design function sections herein include descriptions of the design function responsibilities, interfaces, interactive processes, decisions (gates), and tasks. Associated figures include design function planes, gates, and tasks, along with other pertinent graphics. Also included is an expanded discussion of how the design process is divided, or compartmentalized, into manageable parts to achieve efficient and effective design. A detailed outline for an intensive two-day course on the launch vehicle design process has been developed herein, and is available for further expansion. The course is in an interactive lecture

  18. A Design Methodology for Medical Processes

    PubMed Central

    Bonacina, Stefano; Pozzi, Giuseppe; Pinciroli, Francesco; Marceglia, Sara

    2016-01-01

    Summary Background Healthcare processes, especially those belonging to the clinical domain, are acknowledged as complex and characterized by the dynamic nature of diagnosis, the variability of the decisions made by experts driven by their experiences, the local constraints, the patient’s needs, the uncertainty of the patient’s response, and the indeterminacy of the patient’s compliance to treatment. Also, the multiple actors involved in the patient’s care need clear and transparent communication to ensure care coordination. Objectives In this paper, we propose a methodology to model healthcare processes in order to break down complexity and provide transparency. Methods The model is grounded on a set of requirements that make the healthcare domain unique with respect to other knowledge domains. The modeling methodology is based on three main phases: the study of the environmental context, the conceptual modeling, and the logical modeling. Results The proposed methodology was validated by applying it to the case study of the rehabilitation process of stroke patients in the specific setting of a specialized rehabilitation center. The resulting model was used to define the specifications of a software artifact, which was also implemented, for the digital administration and collection of assessment tests. Conclusions Despite being only an example, our case study showed the ability of process modeling to answer actual needs in healthcare practices. Independently of the medical domain in which the modeling effort is done, the proposed methodology is useful for creating high-quality models and for detecting and taking into account relevant and tricky situations that can occur during process execution. PMID:27081415

  19. Ceramic processing: Experimental design and optimization

    NASA Technical Reports Server (NTRS)

    Weiser, Martin W.; Lauben, David N.; Madrid, Philip

    1992-01-01

    The objectives of this paper are to: (1) gain insight into the processing of ceramics and how green processing can affect the properties of ceramics; (2) investigate the technique of slip casting; (3) learn how heat treatment and temperature contribute to density and strength, and how under- and over-firing affect ceramic properties; (4) experience some of the problems inherent in testing brittle materials and learn about the statistical nature of the strength of ceramics; (5) investigate orthogonal arrays as tools to examine the effect of many experimental parameters using a minimum number of experiments; (6) recognize appropriate uses for clay-based ceramics; and (7) measure several different properties important to ceramic use and optimize them for a given application.
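The orthogonal-array idea in objective (5) can be illustrated with the standard L8 two-level array, which screens up to seven factors in eight runs. The factor names assigned below are invented for illustration; the array itself is the standard L8.

```python
# Standard L8 (2-level, up to 7 factors) orthogonal array, written out
# explicitly. Each column is balanced (four runs at each level), and every
# pair of columns contains each level combination equally often, so main
# effects can be estimated from only eight experiments.
L8 = [
    [1, 1, 1, 1, 1, 1, 1],
    [1, 1, 1, 2, 2, 2, 2],
    [1, 2, 2, 1, 1, 2, 2],
    [1, 2, 2, 2, 2, 1, 1],
    [2, 1, 2, 1, 2, 1, 2],
    [2, 1, 2, 2, 1, 2, 1],
    [2, 2, 1, 1, 2, 2, 1],
    [2, 2, 1, 2, 1, 1, 2],
]

# Hypothetical assignment of three ceramic-processing factors to columns;
# the remaining columns would estimate interactions or experimental error.
factors = {"slip solids": 0, "firing temperature": 1, "soak time": 2}

def run_settings(run_index):
    """Level (1 or 2) of each assigned factor for one experimental run."""
    row = L8[run_index]
    return {name: row[col] for name, col in factors.items()}
```

A full factorial over seven two-level factors would need 128 runs; the orthogonal array trades interaction resolution for an eight-run screening design, which is exactly the economy objective (5) refers to.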

  20. DESIGNING ENVIRONMENTALLY FRIENDLY CHEMICAL PROCESSES WITH FUGITIVE AND OPEN EMISSIONS

    EPA Science Inventory

    Designing a chemical process normally includes aspects of economic and environmental disciplines. In this work we describe methods to quickly and easily evaluate the economics and potential environmental impacts of a process, with the hydrodealkylation of toluene as an example. ...

  2. H-Coal process and plant design

    DOEpatents

    Kydd, Paul H.; Chervenak, Michael C.; DeVaux, George R.

    1983-01-01

    A process for converting coal and other hydrocarbonaceous materials into useful and more valuable liquid products. The process comprises: feeding coal and/or other hydrocarbonaceous materials with a hydrogen-containing gas into an ebullated catalyst bed reactor; passing the reaction products from the reactor to a hot separator where the vaporous and distillate products are separated from the residuals; introducing the vaporous and distillate products from the separator directly into a hydrotreater where they are further hydrogenated; passing the residuals from the separator successively through flash vessels at reduced pressures where distillates are flashed off and combined with the vaporous and distillate products to be hydrogenated; transferring the unseparated residuals to a solids concentrating and removal means to remove a substantial portion of solids therefrom and recycling the remaining residual oil to the reactor; and passing the hydrogenated vaporous and distillate products to an atmospheric fractionator where the combined products are fractionated into separate valuable liquid products. The hydrogen-containing gas is generated from sources within the process.

  3. Clutter suppression interferometry system design and processing

    NASA Astrophysics Data System (ADS)

    Knight, Chad; Deming, Ross; Gunther, Jake

    2015-05-01

    Clutter suppression interferometry (CSI) has received extensive attention due to its multi-modal capability to detect slow-moving targets and concurrently form high-resolution synthetic aperture radar (SAR) images from the same data. The ability to continuously augment SAR images with geo-located ground moving target indicators (GMTI) provides valuable real-time situational awareness that is important for many applications. CSI can be accomplished with minimal hardware and processing resources. This makes CSI a natural candidate for applications where size, weight and power (SWaP) are constrained, such as unmanned aerial vehicles (UAVs) and small satellites. This paper discusses the theory for optimal CSI system configuration, focusing on sparse time-varying transmit and receive array manifolds arising from SWaP considerations. The underlying signal model is presented and discussed, as well as the potential benefits that a sparse time-varying transmit/receive manifold provides. The high-level processing objectives are detailed and examined on simulated data. Then actual SAR data collected with the Space Dynamics Laboratory (SDL) FlexSAR radar system are analyzed. Contrasting the simulated data with actual SAR data helps illustrate the challenges and limitations found in practice versus theory. A novel approach incorporating sparse signal processing is discussed that has the potential to reduce false-alarm rates and improve detections.
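The basic clutter-cancellation principle behind such systems can be caricatured with two channels: a stationary scatterer produces the same complex return in both, so it cancels on subtraction, while a moving target picks up a Doppler phase between the channels and survives. This is a schematic two-sample illustration, not the FlexSAR processing chain; all amplitudes and phases are invented.

```python
import cmath

# Two-channel clutter cancellation caricature: stationary clutter is
# identical in both channels; a mover acquires an extra Doppler phase
# between them. Subtracting the channels suppresses the clutter and
# leaves a residual proportional to the mover's return.
clutter = 5.0 + 0.0j               # strong stationary return (same in both)
mover = 0.5 + 0.0j                 # weak moving-target return
doppler_phase = 0.8                # mover's phase advance between channels

channel_1 = clutter + mover
channel_2 = clutter + mover * cmath.exp(1j * doppler_phase)

residual = channel_2 - channel_1   # clutter cancels exactly in this toy model
# |residual| = |mover| * |exp(j*phi) - 1|, nonzero only if the target moves
```

In practice the two channels are never perfectly matched, which is why the abstract emphasizes array configuration and sparse-processing refinements rather than this idealized subtraction.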

  4. In-design process hotspot repair using pattern matching

    NASA Astrophysics Data System (ADS)

    Jang, Daehyun; Ha, Naya; Jeon, Junsu; Kang, Jae-Hyun; Paek, Seung Weon; Choi, Hungbok; Kim, Kee Sup; Lai, Ya-Chieh; Hurat, Philippe; Luo, Wilbur

    2012-03-01

    As patterning for advanced processes becomes more challenging, designs must become more process-aware. The conventional approach of running lithography simulation on designs to detect process hotspots is prohibitive in terms of runtime for designers, and also requires the release of highly confidential process information. Therefore, a more practical approach is required to make the In-Design process-aware methodology more affordable in terms of maintenance, confidentiality, and runtime. In this study, a pattern-based approach is chosen for Process Hotspot Repair (PHR) because it accurately captures the manufacturability challenges without releasing sensitive process information. Moreover, the pattern-based approach is fast and well integrated in the design flow. Further, this type of approach is very easy to maintain and extend. Once a new process weak pattern has been discovered (caused by Chemical Mechanical Polishing (CMP), etch, lithography, and other process steps), the pattern library can be quickly and easily updated and released to check and fix subsequent designs. This paper presents the pattern matching flow and discusses its advantages. It explains how a pattern library is created from the process weak patterns found on silicon wafers. The paper also discusses the PHR flow that fixes process hotspots in a design, specifically through the use of pattern matching and routing repair.
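The pattern-library idea can be sketched as a scan of a layout for known weak geometries. Real PHR flows use far richer pattern descriptions than the exact-match binary window below; the pattern name, the 3x3 window, and the layout are all invented for illustration.

```python
# Toy sketch of pattern-based hotspot detection: known process-weak
# patterns are stored as small binary windows, and the layout is scanned
# for exact matches. The single 3x3 "isolated via" pattern is illustrative.

WEAK_PATTERNS = {
    "isolated-via": (
        (0, 0, 0),
        (0, 1, 0),
        (0, 0, 0),
    ),
}

def find_hotspots(layout, patterns=WEAK_PATTERNS):
    """Return (pattern name, row, col) for every exact window match."""
    hits = []
    rows, cols = len(layout), len(layout[0])
    for name, pat in patterns.items():
        pr, pc = len(pat), len(pat[0])
        for r in range(rows - pr + 1):
            for c in range(cols - pc + 1):
                window = tuple(tuple(layout[r + i][c:c + pc])
                               for i in range(pr))
                if window == pat:
                    hits.append((name, r, c))
    return hits

layout = [
    [0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0],
    [0, 0, 1, 0, 0],   # a lone feature matching the weak pattern
    [0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0],
]
hits = find_hotspots(layout)
```

The repair half of a PHR flow would then reroute or resize the geometry at each hit, and the library can be extended with new weak patterns as they are found on silicon, which is the maintainability argument the abstract makes.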

  5. A systems-based approach for integrated design of materials, products and design process chains

    NASA Astrophysics Data System (ADS)

    Panchal, Jitesh H.; Choi, Hae-Jin; Allen, Janet K.; McDowell, David L.; Mistree, Farrokh

    2007-12-01

    The concurrent design of materials and products provides designers with flexibility to achieve design objectives that were not previously accessible. However, the improved flexibility comes at a cost of increased complexity of the design process chains and the materials simulation models used for executing the design chains. Efforts to reduce the complexity generally result in increased uncertainty. We contend that a systems-based approach is essential for managing both the complexity and the uncertainty in design process chains and simulation models in concurrent material and product design. Our approach is based on simplifying the design process chains systematically such that the resulting uncertainty does not significantly affect the overall system performance. Similarly, instead of striving for accurate models for multiscale systems (which are inherently complex), we rely on making design decisions that are robust to uncertainties in the models. Accordingly, we pursue hierarchical modeling in the context of design of multiscale systems. In this paper our focus is on design process chains. We present a systems-based approach, premised on the assumption that complex systems can be designed efficiently by managing the complexity of design process chains. The approach relies on (a) the use of reusable interaction patterns to model design process chains, and (b) consideration of design process decisions using value-of-information-based metrics. The approach is illustrated using a Multifunctional Energetic Structural Material (MESM) design example. Energetic materials store considerable energy which can be released through shock-induced detonation; conventionally, they are not engineered for strength properties. The design objectives for the MESM in this paper include both sufficient strength and energy release characteristics. The design is carried out by using models at different length and time scales that simulate different aspects of the system. Finally, by

  6. Bioreactor and process design for biohydrogen production.

    PubMed

    Show, Kuan-Yeow; Lee, Duu-Jong; Chang, Jo-Shu

    2011-09-01

    Biohydrogen is regarded as an attractive future clean energy carrier due to its high energy content and environmentally friendly conversion. It has the potential as a renewable biofuel to replace current hydrogen production, which relies heavily on fossil fuels. While biohydrogen production is still in the early stage of development, a variety of laboratory- and pilot-scale systems with promising potential have been developed. This work presents a review of advances in bioreactor and bioprocess design for biohydrogen production. The state of the art of biohydrogen production is discussed, with emphasis on production pathways, factors affecting biohydrogen production, and bioreactor configuration and operation. Challenges and prospects of biohydrogen production are also outlined.

  7. Channel Rehabilitation: Processes, Design, and Implementation

    DTIC Science & Technology

    1999-07-01

    the U.S. Army Corps of Engineers flood channel on the San Lorenzo River at Santa Cruz in California, 350,000 cubic meters of sediment had been...Practical Guide to Effective Discharge Calculations 290 Burkham, D. E. (1972). Channel changes of the Gila River in Safford Valley, Arizona , 1846-1970... Channel Processes 21 If we change a river we usually do some good somewhere and “good” in quotation marks. That means we achieve some kind of a result

  8. Development of Integrated Programs for Aerospace-vehicle design (IPAD): Reference design process

    NASA Technical Reports Server (NTRS)

    Meyer, D. D.

    1979-01-01

    The airplane design process and its interfaces with manufacturing and customer operations are documented to be used as criteria for the development of integrated programs for the analysis, design, and testing of aerospace vehicles. Topics cover: design process management, general purpose support requirements, design networks, and technical program elements. Design activity sequences are given for both supersonic and subsonic commercial transports, naval hydrofoils, and military aircraft.

  9. New Vistas in Chemical Product and Process Design.

    PubMed

    Zhang, Lei; Babi, Deenesh K; Gani, Rafiqul

    2016-06-07

    Design of chemicals-based products is broadly classified into those that are process centered and those that are product centered. In this article, the designs of both classes of products are reviewed from a process systems point of view; developments related to the design of the chemical product, its corresponding process, and its integration are highlighted. Although significant advances have been made in the development of systematic model-based techniques for process design (also for optimization, operation, and control), much work is needed to reach the same level for product design. Timeline diagrams illustrating key contributions in product design, process design, and integrated product-process design are presented. The search for novel, innovative, and sustainable solutions must be matched by consideration of issues related to the multidisciplinary nature of problems, the lack of data needed for model development, solution strategies that incorporate multiscale options, and reliability versus predictive power. The need for an integrated model-experiment-based design approach is discussed together with benefits of employing a systematic computer-aided framework with built-in design templates.

  10. An Analysis of Algorithmic Processes and Instructional Design.

    ERIC Educational Resources Information Center

    Schmid, Richard F.; Gerlach, Vernon S.

    1986-01-01

    Describes algorithms and shows how they can be applied to the design of instructional systems by relating them to a standard information processing model. Two studies are briefly described which tested serial and parallel processing in learning and offered guidelines for designers. Future research needs are also discussed. (LRW)

  11. Debating Professional Designations for Evaluators: Reflections on the Canadian Process

    ERIC Educational Resources Information Center

    Cousins, J. Bradley; Cullen, Jim; Malik, Sumbal; Maicher, Brigitte

    2009-01-01

    This paper provides a reflective account of a consultation process on professional designations for evaluators initiated and coordinated by the Canadian Evaluation Society (CES). Described are: (1) the forces leading CES to generate discussion and debate about professional designations for Canadian evaluators, (2) the process of developing and…

  13. Process Design Manual for Land Treatment of Municipal Wastewater.

    ERIC Educational Resources Information Center

    Crites, R.; And Others

    This manual presents a procedure for the design of land treatment systems. Slow rate, rapid infiltration, and overland flow processes for the treatment of municipal wastewaters are given emphasis. The basic unit operations and unit processes are discussed in detail, and the design concepts and criteria are presented. The manual includes design…

  14. Solid propellant processing factor in rocket motor design

    NASA Technical Reports Server (NTRS)

    1971-01-01

    The ways in which propellant processing is affected by choices made in designing rocket engines are described. Tradeoff studies, design proof or scaleup studies, and special design features required to obtain high product quality and optimum processing costs are presented. Processing is considered to include the operational steps involved with the lining and preparation of the motor case for the grain; the procurement of propellant raw materials; and propellant mixing, casting or extrusion, curing, machining, and finishing. The design criteria, recommended practices, and propellant formulations are included.

  15. Fuel and Core Design Experiences in Cofrentes NPP

    SciTech Connect

    Garcia-Delgado, L.; Lopez-Carbonell, M.T.; Gomez-Bernal, I.

    2002-07-01

    The electricity market deregulation in Spain is increasing the need for innovations in nuclear power generation, which can be achieved in the fuel area by improving fuel and core designs and by introducing vendor competition. Iberdrola has developed the GIRALDA methodology for the design and licensing of Cofrentes reloads, and has introduced mixed cores with fuel from different vendors. The application of GIRALDA is giving satisfactory results and is showing its capability to adequately reproduce the core behaviour. The nuclear design team is acquiring invaluable experience and a deep knowledge of the core, which is very useful in supporting cycle operation. Continuous improvements are expected in the future, in design strategies as well as in the application of new technologies to redesign the methodology's processes. (authors)

  16. Programming-Free Form Conversion, Design, and Processing

    PubMed Central

    Fan, Ting-Jun; Machlin, Rona S.; Wang, Christopher P.; Chang, Ifay F.

    1990-01-01

    In this paper, we present the requirements and design considerations for programming-free form conversion, design, and processing. A set of object-oriented software tools are also presented to help users convert a paper form into an electronic form, design an electronic form, and fill in an electronic form directly on screen.

  17. The Use of Computer Graphics in the Design Process.

    ERIC Educational Resources Information Center

    Palazzi, Maria

    This master's thesis examines applications of computer technology to the field of industrial design and ways in which technology can transform the traditional process. Following a statement of the problem, the history and applications of the fields of computer graphics and industrial design are reviewed. The traditional industrial design process…

  18. Laser processing with specially designed laser beam

    NASA Astrophysics Data System (ADS)

    Asratyan, A. A.; Bulychev, N. A.; Feofanov, I. N.; Kazaryan, M. A.; Krasovskii, V. I.; Lyabin, N. A.; Pogosyan, L. A.; Sachkov, V. I.; Zakharyan, R. A.

    2016-04-01

    The possibility of using laser systems to form beams with special spatial configurations has been studied. The laser systems applied had a self-conjugate cavity based on the elements of copper vapor lasers (LT-5Cu, LT-10Cu, LT-30Cu) with an average power of 5, 10, or 30 W. The active elements were pumped by current pulses of duration 80-100 ns. The duration of laser generation pulses was up to 25 ns. The generator unit included an unstable cavity, where one reflector was a special mirror with a reflecting coating. Various original optical schemes were used to explore the spatial configurations and energy characteristics of output laser beams in their interaction with micro- and nanoparticles fabricated from various materials. In these experiments, the beam dimensions of the obtained zones varied from 0.3 to 5 µm, which is comparable with the minimum permissible dimensions determined by the optical elements applied. This method is useful in transforming a large amount of information at the laser pulse repetition rate of 10-30 kHz. It was possible to realize high-precision micromachining and microfabrication of microscale details by direct writing, cutting and drilling (with cutting widths and through-hole diameters ranging from 3 to 100 µm) and to produce microscale, deep, intricate and narrow grooves on substrate surfaces of metals and nonmetal materials. This system is used for producing high-quality microscale details without moving the object under treatment. It can also be used for microcutting and microdrilling in a variety of metals, such as molybdenum, copper and stainless steel, with a thickness of up to 300 µm, and in nonmetals such as silicon, sapphire and diamond, with a thickness ranging from 10 µm to 1 mm and different thermal parameters, using the specially designed laser beam.

  19. The concepts of energy, environment, and cost for process design

    SciTech Connect

    Abu-Khader, M.M.; Speight, J.G.

    2004-05-01

    The process industries (specifically, energy and chemicals) are characterized by a variety of reactors and reactions to bring about successful process operations. The design of energy-related and chemical processes and their evolution is a complex process that determines the competitiveness of these industries, as well as their environmental impact. Thus, we have developed an Enviro-Energy Concept designed to facilitate sustainable industrial development. The Complete Onion Model represents a complete methodology for chemical process design and illustrates all of the requirements to achieve the best possible design within the accepted environmental standards. Because NOx emissions from industrial processes continue to receive maximum attention, the problem of NOx emissions from industrial sources such as power stations and nitric acid plants is considered. Selective Catalytic Reduction (SCR) is one of the most promising and effective commercial technologies, and is considered the Best Available Control Technology (BACT) for NOx reduction. The solution to the NOx emissions problem lies in modifying the chemical process design and/or installing an end-of-pipe technology. The degree of integration between the process design and the installed technology plays a critical role in the capital cost evaluation. Therefore, integrating process units and then optimizing the design has a vital effect on the total cost. Both the environmental regulations and the cost evaluation are the boundary constraints of the optimum solution.

  20. Innovation Design of Persimmon Processing Equipment Driven by Future Scenarios

    NASA Astrophysics Data System (ADS)

    Duan, Xiao-fei; Su, Xiu-juan; Guan, Lei; Zhang, Wei-she

    2017-07-01

    This article discusses methods of innovation design driven by future scenarios, to help designers work more effectively on persimmon processing machinery. By analyzing the traditional persimmon processing process, conceiving future persimmon processing scenarios, and using UXD and a morphological matrix, comprehensive function schemes can be obtained. The scheme that best matches the future scenarios is then selected, as illustrated by the schematic design of a rotary-light dried-persimmon processing machine. It is feasible and effective to carry out scenario design research, construct reasonable future scenarios, and combine them with function analysis methods to drive product innovation and development.

  1. XML-based product information processing method for product design

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen Yu

    2012-01-01

    The design knowledge of modern mechatronics products is knowledge-intensive engineering centered on information processing; thus product design innovation is essentially knowledge and information processing innovation. After analyzing the role of mechatronics product design knowledge and the features of information management, a unified model of an XML-based product information processing method is proposed. The information processing model of product design includes functional knowledge, structural knowledge, and their relationships. XML-based models are proposed for the expression of product function elements, product structure elements, and the mapping relationship between product function and structure. The information processing of a parallel friction roller is given as an example, which demonstrates that this method is helpful for knowledge-based design systems and product innovation.
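
The function/structure/mapping model described above can be sketched in XML. The element names, attributes, and the example product below are assumptions for illustration, not the paper's actual schema:

```python
# Minimal sketch (assumed schema) of representing product function elements,
# structure elements, and the function-structure mapping in XML.
import xml.etree.ElementTree as ET

product = ET.Element("product", name="parallel_friction_roller")
functions = ET.SubElement(product, "functions")
structures = ET.SubElement(product, "structures")
mappings = ET.SubElement(product, "mappings")

# One function element, one structure element, and the link between them.
ET.SubElement(functions, "function", id="F1", description="transmit torque")
ET.SubElement(structures, "structure", id="S1", description="roller pair")
ET.SubElement(mappings, "map", function="F1", structure="S1")

xml_text = ET.tostring(product, encoding="unicode")
print(xml_text)
```

A design system could then query the mapping layer to find which structures realize a given function, which is the kind of knowledge reuse the abstract describes.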

  3. Defining process design space for monoclonal antibody cell culture.

    PubMed

    Abu-Absi, Susan Fugett; Yang, LiYing; Thompson, Patrick; Jiang, Canping; Kandula, Sunitha; Schilling, Bernhard; Shukla, Abhinav A

    2010-08-15

    The concept of design space has been taking root as a foundation of in-process control strategies for biopharmaceutical manufacturing processes. During mapping of the process design space, the multidimensional combination of operational variables is studied to quantify the impact on process performance in terms of productivity and product quality. An efficient methodology to map the design space for a monoclonal antibody cell culture process is described. A failure modes and effects analysis (FMEA) was used as the basis for the process characterization exercise. This was followed by an integrated study of the inoculum stage of the process which includes progressive shake flask and seed bioreactor steps. The operating conditions for the seed bioreactor were studied in an integrated fashion with the production bioreactor using a two stage design of experiments (DOE) methodology to enable optimization of operating conditions. A two level Resolution IV design was followed by a central composite design (CCD). These experiments enabled identification of the edge of failure and classification of the operational parameters as non-key, key or critical. In addition, the models generated from the data provide further insight into balancing productivity of the cell culture process with product quality considerations. Finally, process and product-related impurity clearance was evaluated by studies linking the upstream process with downstream purification. Production bioreactor parameters that directly influence antibody charge variants and glycosylation in CHO systems were identified.
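
The two-stage DOE described above can be sketched by generating coded design points: a two-level factorial for screening, followed by central composite axial and center points. The factor names are hypothetical bioreactor parameters; the alpha value is the standard rotatable choice for three factors:

```python
# Hedged sketch: coded design points for a two-level factorial screening study
# followed by central composite design (CCD) augmentation.
from itertools import product

factors = ["temperature", "pH", "DO"]          # assumed operational parameters
factorial = list(product([-1, 1], repeat=len(factors)))   # 2^3 corner points

alpha = 1.682                                   # rotatable alpha for 3 factors
axial = []
for i in range(len(factors)):
    for a in (-alpha, alpha):
        point = [0.0] * len(factors)
        point[i] = a                            # one factor at +/- alpha
        axial.append(tuple(point))

center = [(0.0, 0.0, 0.0)] * 3                  # replicated center points
ccd = [tuple(map(float, p)) for p in factorial] + axial + center
print(len(factorial), len(axial), len(ccd))
```

The factorial stage identifies which parameters matter; the axial and center points then allow a quadratic response-surface model to be fit around the operating point.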

  4. Process-based design of dynamical biological systems

    PubMed Central

    Tanevski, Jovan; Todorovski, Ljupčo; Džeroski, Sašo

    2016-01-01

    The computational design of dynamical systems is an important emerging task in synthetic biology. Given desired properties of the behaviour of a dynamical system, the task of design is to build an in-silico model of a system whose simulated behaviour meets these properties. We introduce a new, process-based, design methodology for addressing this task. The new methodology combines a flexible process-based formalism for specifying the space of candidate designs with multi-objective optimization approaches for selecting the most appropriate among these candidates. We demonstrate that the methodology is general enough to both formulate and solve tasks of designing deterministic and stochastic systems, successfully reproducing plausible designs reported in previous studies and proposing new designs that meet the design criteria, but have not been previously considered. PMID:27686219
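
Selecting the most appropriate candidates under multiple objectives is commonly done with Pareto dominance. The following is a generic minimal sketch of that idea (not the authors' implementation), with both objectives to be minimized:

```python
# Generic sketch of multi-objective candidate selection by Pareto dominance.
# Objectives are illustrative: (simulation fit error, model complexity).

def dominates(a, b):
    """True if candidate a is at least as good as b on all objectives
    and strictly better on at least one (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(candidates):
    """Keep only the candidates not dominated by any other."""
    return [c for c in candidates
            if not any(dominates(o, c) for o in candidates if o is not c)]

designs = [(0.2, 5), (0.1, 9), (0.3, 3), (0.25, 6)]   # hypothetical scores
print(pareto_front(designs))
```

Here (0.25, 6) is dropped because (0.2, 5) is better on both objectives; the remaining candidates represent different trade-offs a designer can choose among.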

  6. Perspectives on the design of safer nanomaterials and manufacturing processes

    PubMed Central

    Geraci, Charles; Heidel, Donna; Sayes, Christie; Hodson, Laura; Schulte, Paul; Eastlake, Adrienne; Brenner, Sara

    2015-01-01

    A concerted effort is being made to insert Prevention through Design principles into discussions of sustainability, occupational safety and health, and green chemistry related to nanotechnology. Prevention through Design is a set of principles that includes solutions to design out potential hazards in nanomanufacturing, including the design of nanomaterials, and strategies to eliminate exposures and minimize risks that may be related to the manufacturing processes and equipment at various stages of the lifecycle of an engineered nanomaterial. PMID:26435688

  9. [Embedded system design of color-blind image processing].

    PubMed

    Wang, Eric; Ma, Yu; Wang, Yuanyuan

    2011-01-01

    An ARM-based embedded system design scheme is proposed for a color-blind image processing system. The hardware and software of the embedded color-blind image processing system are designed around an ARM core processor, and a simple and convenient user interface is implemented. The system supplies a general hardware platform for the application of color-blind image processing algorithms, easing the testing and rectification of color blindness.
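
One common strategy in color-blind image processing is to redistribute red-green contrast into channels the viewer can distinguish (lightness and blue-yellow). The following is an illustrative sketch of that general idea, not the paper's algorithm:

```python
# Illustrative daltonization-style sketch (not the paper's method): shift the
# red-green opponent signal into lightness and the blue channel.

def correct_pixel(r, g, b, strength=0.5):
    """Redistribute red-green contrast; channel values in [0, 255]."""
    rg = r - g                       # red-green opponent signal
    b2 = b + strength * rg           # push the difference into blue-yellow
    l2 = strength * rg               # and into lightness (added to r and g)
    clip = lambda v: max(0, min(255, int(v)))
    return clip(r + l2), clip(g + l2), clip(b2)

print(correct_pixel(200, 100, 50))
```

On an embedded target this per-pixel function would run over the whole frame buffer, which is why a capable core (such as the ARM processor mentioned above) is a natural platform choice.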

  10. Design and Implementation of a Multimedia DBMS: Complex Query Processing

    DTIC Science & Technology

    1991-09-01

    Design and Implementation of a Multimedia DBMS: Complex Query Processing, by Huseyin Aygun, September 1991; thesis advisor Vincent Y. Lum. Approved for public release. … IV. Design of Complex Query Processing: Chapter II of this thesis presents the general architecture of the MDBMS … More detailed information about the modification can be found in the next chapter of this thesis.

  11. Process Model Construction and Optimization Using Statistical Experimental Design,

    DTIC Science & Technology

    1988-04-01

    Memo No. 88-442, March 1988. Process Model Construction and Optimization Using Statistical Experimental Design, by Emanuel Sachs (Assistant Professor) and George Prueger. Abstract: A methodology is presented for the construction of process models by the combination of physically based mechanistic …

  12. Cognitive Design for Learning: Cognition and Emotion in the Design Process

    ERIC Educational Resources Information Center

    Hasebrook, Joachim

    2016-01-01

    We are so used to accepting new technologies as the driver of change and innovation in human-computer interfaces (HCI). In our research we focus on the development of innovations as a design process--or design, for short. We also refer to the entire process of creating innovations and putting them to use as "cognitive processes"--or…

  13. Designing a process for executing projects under an international agreement

    NASA Technical Reports Server (NTRS)

    Mohan, S. N.

    2003-01-01

    Projects executed under an international agreement require special arrangements in order to operate within confines of regulations issued by the State Department and the Commerce Department. In order to communicate enterprise-level guidance and procedural information uniformly to projects based on interpretations that carry the weight of institutional authority, a process was developed. This paper provides a script for designing processes in general, using this particular process for context. While the context is incidental, the method described is applicable to any process in general. The paper will expound on novel features utilized for dissemination of the procedural details over the Internet following such process design.

  15. Ensuring competitive advantage with semantic design process management

    SciTech Connect

    Quazzani, A.; Bernard, A.; Bocquet, J.C.

    1996-12-31

    In the field of design assistance, it is important to improve the recording of design history and the management of the design process. We therefore propose a modelling approach for the design process that focuses on the representation of semantic actions. We have identified two types of actions: physical design actions focusing on the product (e.g., parameter creation, shaft dimensioning) and management actions that allow management of the process from the planning and control viewpoint (e.g., synchronization actions, resource allocation for a task). A taxonomy of these actions has been established according to several criteria (granularity, fields of action ...) selected in consideration of our process-management interests. Linkage with objectives and rationale is also discussed.

  16. Evaluating two process scale chromatography column header designs using CFD.

    PubMed

    Johnson, Chris; Natarajan, Venkatesh; Antoniou, Chris

    2014-01-01

    Chromatography is an indispensable unit operation in the downstream processing of biomolecules. Scaling of chromatographic operations typically involves a significant increase in the column diameter. At this scale, the flow distribution within a packed bed could be severely affected by the distributor design in process scale columns. Different vendors offer process scale columns with varying design features. The effect of these design features on the flow distribution in packed beds and the resultant effect on column efficiency and cleanability needs to be properly understood in order to prevent unpleasant surprises on scale-up. Computational Fluid Dynamics (CFD) provides a cost-effective means to explore the effect of various distributor designs on process scale performance. In this work, we present a CFD tool that was developed and validated against experimental dye traces and tracer injections. Subsequently, the tool was employed to compare and contrast two commercially available header designs.

  17. Nucleoprotein supplementation enhances the recovery of rat soleus mass with reloading after hindlimb unloading-induced atrophy via myonuclei accretion and increased protein synthesis.

    PubMed

    Nakanishi, Ryosuke; Hirayama, Yusuke; Tanaka, Minoru; Maeshige, Noriaki; Kondo, Hiroyo; Ishihara, Akihiko; Roy, Roland R; Fujino, Hidemi

    2016-12-01

    Hindlimb unloading results in muscle atrophy and a period of reloading has been shown to partially recover the lost muscle mass. Two of the mechanisms involved in this recovery of muscle mass are the activation of protein synthesis pathways and an increase in myonuclei number. The additional myonuclei are provided by satellite cells that are activated by the mechanical stress associated with the reloading of the muscles and eventually incorporated into the muscle fibers. Amino acid supplementation with exercise also can increase skeletal muscle mass through enhancement of protein synthesis and nucleotide supplements can promote cell cycle activity. Therefore, we hypothesized that nucleoprotein supplementation, a combination of amino acids and nucleotides, would enhance the recovery of muscle mass to a greater extent than reloading alone after a period of unloading. Adult rats were assigned to 4 groups: control, hindlimb unloaded (HU; 14 days), reloaded (5 days) after hindlimb unloading (HUR), and reloaded after hindlimb unloading with nucleoprotein supplementation (HUR + NP). Compared with the HUR group, the HUR + NP group had larger soleus muscles and fiber cross-sectional areas, higher levels of phosphorylated rpS6, and higher numbers of myonuclei and myogenin-positive cells. These results suggest that nucleoprotein supplementation has a synergistic effect with reloading in recovering skeletal muscle properties after a period of unloading via rpS6 activation and satellite cell differentiation and incorporation into the muscle fibers. Therefore, this supplement may be an effective therapeutic regimen to include in rehabilitative strategies for a variety of muscle wasting conditions such as aging, cancer cachexia, muscular dystrophy, bed rest, and cast immobilization. Copyright © 2016 Elsevier Inc. All rights reserved.

  18. Design, processing, and testing of LSI arrays for space station

    NASA Technical Reports Server (NTRS)

    Ipri, A. C.

    1976-01-01

    The applicability of a particular process for the fabrication of large scale integrated circuits is described. Test arrays were designed, built, and tested, and then utilized. A set of optimum dimensions for LSI arrays was generated. The arrays were applied to yield improvement through process innovation, and additional applications were suggested in the areas of yield prediction, yield modeling, and process reliability.

  19. Context-Aware Design for Process Flexibility and Adaptation

    ERIC Educational Resources Information Center

    Yao, Wen

    2012-01-01

    Today's organizations face continuous and unprecedented changes in their business environment. Traditional process design tools tend to be inflexible and can only support rigidly defined processes (e.g., order processing in the supply chain). This considerably restricts their real-world applications value, especially in the dynamic and…

  1. Concurrent materials and process selection in conceptual design

    SciTech Connect

    Kleban, S.D.

    1998-07-01

    The sequential manner in which materials and processes for a manufactured product are selected is inherently less than optimal. Designers' tendency to choose processes and materials with which they are familiar exacerbates this problem. A method for the concurrent selection of materials and a joining process based on product requirements, using a knowledge-based constraint-satisfaction approach, is presented.
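
Concurrent selection by constraint satisfaction can be sketched as filtering (material, process) pairs against product requirements simultaneously, rather than picking a material first. The knowledge base below is a toy illustration, not the paper's system:

```python
# Toy sketch of concurrent material / joining-process selection as constraint
# satisfaction: only pairs compatible with each other AND with the product
# requirement survive. All entries are illustrative, not a real knowledge base.
from itertools import product

materials = {"aluminum": {"max_temp": 300},
             "steel":    {"max_temp": 800},
             "nylon":    {"max_temp": 120}}
processes = {"welding":  {"works_on": {"aluminum", "steel"}},
             "adhesive": {"works_on": {"aluminum", "steel", "nylon"}}}

def feasible(required_temp):
    """All (material, process) pairs satisfying both constraints at once."""
    return [(m, p) for (m, mp), (p, pp) in
            product(materials.items(), processes.items())
            if mp["max_temp"] >= required_temp and m in pp["works_on"]]

print(feasible(400))   # only steel survives the temperature constraint
```

Because both choices are constrained together, a material that no compatible process can join is never selected in the first place, which is the advantage over sequential selection noted in the abstract.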

  2. Dynamic Characteristics Analysis of Analogue Networks Design Process

    NASA Astrophysics Data System (ADS)

    Zemliak, Alexander M.

    The process of designing analogue circuits is formulated as a controlled dynamic system. To analyze the properties of such a system, we suggest using the concept of a Lyapunov function for a dynamic system. Various forms of the Lyapunov function are suggested. Analyzing the behavior of the Lyapunov function and its first derivative allowed us to determine a significant correlation between the function's properties and the processor time used to design the circuit. Numerical results demonstrate the possibility of forecasting the behavior of various design strategies, and the processor time required, based on the properties of the Lyapunov function for the circuit design process.
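
The idea of monitoring a Lyapunov function along a design trajectory can be sketched on a toy optimization problem with a known optimum: for a convergent strategy, V(x) = |x - x*|^2 decreases monotonically along the iterations. This is a generic illustration, not the paper's formulation:

```python
# Sketch: treat an iterative design (optimization) trajectory as a dynamic
# system and monitor a Lyapunov-like function V(x) = |x - x*|^2.
# Monotone decay of V indicates a convergent design strategy.

def gradient_step(x, grad, lr=0.1):
    return [xi - lr * gi for xi, gi in zip(x, grad)]

def lyapunov(x, x_star):
    return sum((xi - si) ** 2 for xi, si in zip(x, x_star))

# Toy quadratic objective f(x) = |x - x*|^2 with a known optimum x*.
x_star = [1.0, -2.0]
x = [4.0, 3.0]
values = []
for _ in range(5):
    values.append(lyapunov(x, x_star))
    grad = [2 * (xi - si) for xi, si in zip(x, x_star)]
    x = gradient_step(x, grad)
print(all(a > b for a, b in zip(values, values[1:])))  # strictly decreasing V
```

The rate at which V decays is exactly the kind of property the abstract correlates with processor time: a faster-decaying V corresponds to a cheaper design strategy.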

  3. Fuel ethanol production: process design trends and integration opportunities.

    PubMed

    Cardona, Carlos A; Sánchez, Oscar J

    2007-09-01

    Current fuel ethanol research and development deals with process engineering trends for improving biotechnological production of ethanol. In this work, the key role that process design plays during the development of cost-effective technologies is recognized through the analysis of major trends in process synthesis, modeling, simulation and optimization related to ethanol production. Main directions in techno-economical evaluation of fuel ethanol processes are described as well as some prospecting configurations. The most promising alternatives for compensating ethanol production costs by the generation of valuable co-products are analyzed. Opportunities for integration of fuel ethanol production processes and their implications are underlined. Main ways of process intensification through reaction-reaction, reaction-separation and separation-separation processes are analyzed in the case of bioethanol production. Some examples of energy integration during ethanol production are also highlighted. Finally, some concluding considerations on current and future research tendencies in fuel ethanol production regarding process design and integration are presented.

  4. Integrating rock mechanics issues with repository design through design process principles and methodology

    SciTech Connect

    Bieniawski, Z.T.

    1996-04-01

    A good designer needs not only knowledge for designing (the technical know-how used to generate alternative design solutions) but also knowledge about designing (appropriate principles and a systematic methodology to follow). Concepts such as "design for manufacture" or "concurrent engineering" are widely used in industry. In the field of rock engineering, only limited attention has been paid to the design process, because the design of structures in rock masses presents unique challenges to designers as a result of the uncertainties inherent in the characterization of geologic media. However, a stage has now been reached where we are able to sufficiently characterize rock masses for engineering purposes and identify the rock mechanics issues involved, but we still lack engineering design principles and a methodology to maximize design performance. This paper discusses the principles and methodology of the engineering design process directed at integrating site characterization activities with the design, construction, and performance of an underground repository. Using the latest information from the Yucca Mountain Project on geology, rock mechanics, and starter tunnel design, the current lack of integration is pointed out, and it is shown how rock mechanics issues can be effectively interwoven with repository design through a systematic design process methodology leading to improved repository performance. In essence, the design process is seen as the use of design principles within an integrating design methodology, leading to innovative problem solving. In particular, a new concept of "Design for Constructibility and Performance" is introduced. This is discussed with respect to ten rock mechanics issues identified for repository design and performance.

  5. Semantic Service Design for Collaborative Business Processes in Internetworked Enterprises

    NASA Astrophysics Data System (ADS)

    Bianchini, Devis; Cappiello, Cinzia; de Antonellis, Valeria; Pernici, Barbara

    Modern collaborating enterprises can be seen as borderless organizations whose processes are dynamically transformed and integrated with those of their partners (Internetworked Enterprises, IE), thus enabling the design of collaborative business processes. The adoption of Semantic Web and service-oriented technologies for implementing collaboration in such distributed and heterogeneous environments promises significant benefits. IEs can model their own processes independently by using the Software as a Service (SaaS) paradigm. Each enterprise maintains a catalog of available services, and these can be shared across IEs and reused to build up complex collaborative processes. Moreover, each enterprise can adopt its own terminology and concepts to describe business processes and component services. This brings requirements to manage semantic heterogeneity in process descriptions that are distributed across different enterprise systems. To enable effective service-based collaboration, IEs have to standardize their process descriptions and model them through component services using the same approach and principles. To enable collaborative business processes across IEs, services should be designed following a homogeneous approach, possibly maintaining a uniform level of granularity. In this paper we propose an ontology-based semantic modeling approach suited to enriching and reconciling the semantics of process descriptions, to facilitate process knowledge management and to enable semantic service design (by discovery, reuse, and integration of process elements/constructs). The approach brings together Semantic Web technologies and techniques in process modeling, ontology building, and semantic matching in order to provide a comprehensive semantic modeling framework.
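
Semantic matching between service descriptions can be sketched under a simplifying assumption: each service is annotated with a set of ontology concept names, and candidates are ranked by set overlap. This is an illustration of the general idea, not the paper's matcher:

```python
# Minimal sketch (assumed representation): services annotated with ontology
# concept names, candidates ranked for a request by Jaccard similarity.

def jaccard(a: set, b: set) -> float:
    """Overlap between two concept sets, 0.0 when both are empty."""
    return len(a & b) / len(a | b) if a | b else 0.0

catalog = {                                   # hypothetical service catalog
    "CheckInvoice": {"invoice", "validation", "payment"},
    "ShipOrder":    {"order", "shipping", "logistics"},
    "PayInvoice":   {"invoice", "payment", "transfer"},
}
request = {"invoice", "payment"}              # concepts sought by a partner

ranked = sorted(catalog, key=lambda s: jaccard(catalog[s], request),
                reverse=True)
print(ranked)
```

In a real IE setting the concept sets would come from a shared (or reconciled) ontology rather than plain strings, which is precisely the heterogeneity problem the proposed framework addresses.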

  6. Case study: Lockheed-Georgia Company integrated design process

    NASA Technical Reports Server (NTRS)

    Waldrop, C. T.

    1980-01-01

    A case study of the development of an Integrated Design Process is presented. The approach taken in preparing for the development of an integrated design process includes some of the IPAD approaches such as developing a Design Process Model, cataloging Technical Program Elements (TPE's), and examining data characteristics and interfaces between contiguous TPE's. The implementation plan is based on an incremental development of capabilities over a period of time with each step directed toward, and consistent with, the final architecture of a total integrated system. Because of time schedules and different computer hardware, this system will not be the same as the final IPAD release; however, many IPAD concepts will no doubt prove applicable as the best approach. Full advantage will be taken of the IPAD development experience. A scenario that could be typical for many companies, even outside the aerospace industry, in developing an integrated design process for an IPAD-type environment is represented.

  7. Using GREENSCOPE for Sustainable Process Design: An Educational Opportunity

    EPA Science Inventory

    Increasing sustainability can be approached through the education of those who design, construct, and operate facilities. As chemical engineers learn elements of process systems engineering, they can be introduced to sustainability concepts. The EPA’s GREENSCOPE methodology and...

  10. SYSTEMATIC PROCEDURE FOR DESIGNING PROCESSES WITH MULTIPLE ENVIRONMENTAL OBJECTIVES

    EPA Science Inventory

    Evaluation of multiple objectives is very important in designing environmentally benign processes. It requires a systematic procedure for solving multiobjective decision-making problems, due to the complex nature of the problems, the need for complex assessments, and complicated ...

  11. The New Digital Engineering Design and Graphics Process.

    ERIC Educational Resources Information Center

    Barr, R. E.; Krueger, T. J.; Aanstoos, T. A.

    2002-01-01

    Summarizes the digital engineering design process using software widely available for the educational setting. Points out that newer technology used in the field is not used in engineering graphics education. (DDR)

  13. Structure and Functional Characteristics of Rat's Left Ventricle Cardiomyocytes under Antiorthostatic Suspension of Various Duration and Subsequent Reloading

    PubMed Central

    Ogneva, I. V.; Mirzoev, T. M.; Biryukov, N. S.; Veselova, O. M.; Larina, I. M.

    2012-01-01

    The goal of the research was to identify the structural and functional characteristics of the rat's left ventricle under antiorthostatic suspension of 1, 3, 7, and 14 days, and during subsequent 3- and 7-day reloading after a 14-day suspension. The transversal stiffness of the cardiomyocytes was determined by atomic force microscopy, cell respiration by polarography, and protein content by Western blotting. Stiffness of the cortical cytoskeleton increases as soon as one day after suspension, continues increasing up to the 14th day, and starts decreasing during reloading, reaching the control level after 7 days. The stiffness of the contractile apparatus and the intensity of cell respiration also increase. The content of non-muscle isoforms of actin in the cytoplasmic fraction of proteins does not change during the whole experiment, nor does the beta-actin content in the membrane fraction. The content of gamma-actin in the membrane fraction correlates with the change in the transversal stiffness of the cortical cytoskeleton. The increased content of alpha-actinin-1 and alpha-actinin-4 in the membrane fraction of proteins during suspension is consistent with the increased gamma-actin content there. The opposite directions of change of alpha-actinin-1 and alpha-actinin-4 content suggest their involvement in the signaling pathways. PMID:23093854

  14. The Influence of Toy Design Activities on Middle School Students' Understanding of the Engineering Design Processes

    NASA Astrophysics Data System (ADS)

    Zhou, Ninger; Pereira, Nielsen L.; George, Tarun Thomas; Alperovich, Jeffrey; Booth, Joran; Chandrasegaran, Senthil; Tew, Jeffrey David; Kulkarni, Devadatta M.; Ramani, Karthik

    2017-10-01

    The societal demand for inspiring and engaging science, technology, engineering, and mathematics (STEM) students and preparing our workforce for the emerging creative economy has necessitated developing students' self-efficacy and understanding of engineering design processes from as early as elementary school levels. Hands-on engineering design activities have shown the potential to promote middle school students' self-efficacy and understanding of engineering design processes. However, traditional classrooms often lack hands-on engineering design experiences, leaving students unprepared to solve real-world design problems. In this study, we introduce the framework of a toy design workshop and investigate the influence of the workshop activities on students' understanding of and self-efficacy beliefs in engineering design. Using a mixed method approach, we conducted quantitative analyses to show changes in students' engineering design self-efficacy and qualitative analyses to identify students' understanding of the engineering design processes. Findings show that among the 24 participants, there is a significant increase in students' self-efficacy beliefs after attending the workshop. We also identified major themes such as design goals and prototyping in students' understanding of engineering design processes. This research provides insights into the key elements of middle school students' engineering design learning and the benefits of engaging middle school students in hands-on toy design workshops.
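
The pre/post comparison of self-efficacy described above can be sketched as a paired t statistic on difference scores. The ratings below are invented placeholder data, not the study's measurements:

```python
# Sketch (hypothetical data) of a paired pre/post comparison of self-efficacy
# ratings: compute the paired t statistic from the difference scores.
from math import sqrt
from statistics import mean, stdev

pre  = [3.1, 2.8, 3.5, 3.0, 2.6, 3.2]   # assumed pre-workshop ratings
post = [3.8, 3.4, 3.9, 3.6, 3.1, 3.7]   # assumed post-workshop ratings

diffs = [b - a for a, b in zip(pre, post)]          # per-student change
t = mean(diffs) / (stdev(diffs) / sqrt(len(diffs))) # paired t statistic
print(round(t, 2))
```

With n - 1 degrees of freedom, the t statistic would then be compared against a critical value (or converted to a p-value) to decide whether the increase is significant, as reported for the 24 participants in the study.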

  16. An investigation of radiometer design using digital processing techniques

    NASA Technical Reports Server (NTRS)

    Lawrence, R. W.

    1981-01-01

    The use of digital signal processing techniques in Dicke-switching radiometer design was investigated. The general approach was to develop an analytical model of the existing analog radiometer and identify factors which adversely affect its performance. A digital processor was then proposed to verify the feasibility of using digital techniques to minimize these adverse effects and improve radiometer performance. Analysis and preliminary test results comparing the digital and analog processing approaches to radiometer design are presented.

  17. METHODS FOR INTEGRATING ENVIRONMENTAL CONSIDERATIONS INTO CHEMICAL PROCESS DESIGN DECISIONS

    EPA Science Inventory

    The objective of this cooperative agreement was to postulate a means by which an engineer could routinely include environmental considerations in day-to-day conceptual design problems; a means that could easily integrate with existing design processes, and thus avoid massive retr...

  18. Risk Informed Design as Part of the Systems Engineering Process

    NASA Technical Reports Server (NTRS)

    Deckert, George

    2010-01-01

    This slide presentation reviews the importance of Risk Informed Design (RID) as an important feature of the systems engineering process. RID is based on the principle that risk is a design commodity such as mass, volume, cost or power. It also reviews Probabilistic Risk Assessment (PRA) as it is used in the product life cycle in the development of NASA's Constellation Program.

  19. PROCESS DESIGN MANUAL: LAND TREATMENT OF MUNICIPAL WASTEWATER

    EPA Science Inventory

    The manual presents a rational procedure for the design of land treatment systems. Slow rate, rapid infiltration, and overland flow processes for the treatment of municipal wastewaters are discussed in detail, and the design concepts and criteria are presented. A two-phased plann...

  1. Designing School Accountability Systems: Towards a Framework and Process.

    ERIC Educational Resources Information Center

    Gong, Brian

    This document presents three different views of accountability to address state needs as their departments of education design, improve, or review their state accountability and reporting systems. The first of three sections presents the system-design decision process as a linear sequence of ten steps from defining the purposes of the…

  3. Relating Right Brain Studies to the Design Process.

    ERIC Educational Resources Information Center

    Hofland, John

    Intended for teachers of theatrical design who need to describe a design process for their students, this paper begins by giving a brief overview of recent research that has described the different functions of the right and left cerebral hemispheres. It then notes that although the left hemisphere tends to dominate the right hemisphere, it is the…

  4. Applying the ID Process to the Guided Design Teaching Strategy.

    ERIC Educational Resources Information Center

    Coscarelli, William C.; White, Gregory P.

    1982-01-01

    Describes the application of the instructional development process to a teaching technique called Guided Design in a Production-Operations Management course. In Guided Design, students are self-instructed in course content and use class time to apply this knowledge to self-instruction; in-class problem-solving is stressed. (JJD)

  5. Rates of reaction and process design data for the Hydrocarb Process

    SciTech Connect

    Steinberg, M.; Kobayashi, Atsushi; Tung, Yuanki

    1992-08-01

    In support of studies for developing the coprocessing of fossil fuels with biomass by the Hydrocarb Process, experimental and process design data are reported. The experimental work includes the hydropyrolysis of biomass and the thermal decomposition of methane in a tubular reactor. The rates of reaction and conversion were obtained at temperature and pressure conditions pertaining to a Hydrocarb Process design. A Process Simulation Computer Model was used to design the process and obtain complete energy and mass balances. Multiple feedstocks, including biomass with natural gas and biomass with coal, were evaluated. Additional feedstocks, including green waste, sewage sludge and digester gas, were also evaluated for a pilot plant unit.
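
    The rate and conversion measurements described above lend themselves to a simple back-of-the-envelope model. The sketch below assumes a first-order reaction in an ideal plug-flow tubular reactor and uses purely hypothetical Arrhenius parameters, not values fitted to the paper's data:

```python
from math import exp

def arrhenius(A, Ea, T, R=8.314):
    """Rate constant from the Arrhenius law, k = A * exp(-Ea / (R * T)).
    A [1/s], Ea [J/mol], T [K]."""
    return A * exp(-Ea / (R * T))

def plug_flow_conversion(k, tau):
    """Conversion of a first-order reaction (e.g., CH4 -> C + 2 H2) in an
    ideal plug-flow tubular reactor: X = 1 - exp(-k * tau), where tau is
    the residence time [s]."""
    return 1 - exp(-k * tau)

# Hypothetical parameters for illustration only (methane-pyrolysis scale).
k = arrhenius(A=1.0e13, Ea=370e3, T=1500.0)
X = plug_flow_conversion(k, tau=2.0)
```

    Fitting such expressions to measured conversions at several temperatures and pressures is one way rate data of this kind feed into a process simulation model.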

  7. Design, control and in situ visualization of gas nitriding processes.

    PubMed

    Ratajski, Jerzy; Olik, Roman; Suszko, Tomasz; Dobrodziej, Jerzy; Michalski, Jerzy

    2010-01-01

    The article presents a complex system of design, in situ visualization and control of the commonly used surface treatment process: the gas nitriding process. In the computer design conception, analytical mathematical models and artificial intelligence methods were used. As a result, possibilities were obtained of the poly-optimization and poly-parametric simulations of the course of the process combined with a visualization of the value changes of the process parameters in the function of time, as well as possibilities to predict the properties of nitrided layers. For in situ visualization of the growth of the nitrided layer, computer procedures were developed which make use of the results of the correlations of direct and differential voltage and time runs of the process result sensor (magnetic sensor), with the proper layer growth stage. Computer procedures make it possible to combine, in the duration of the process, the registered voltage and time runs with the models of the process.

  9. Manufacturing process design for multi commodities in agriculture

    NASA Astrophysics Data System (ADS)

    Prasetyawan, Yudha; Santosa, Andrian Henry

    2017-06-01

    High-potential commodities within particular agricultural sectors should yield the maximum benefit attainable by both local farmers and business players. In several cases, the business players are small-medium enterprises (SMEs) which have limited resources to perform the added-value processing of local commodities into potential products. The weaknesses of SMEs include a manual production process with low productivity, limited capacity to maintain prices, and unattractive packaging due to conventional production. Agricultural commodities are commonly turned into products such as flour, chips, crackers, oil, juice, and other products. This research began by collecting data through interviews, particularly to obtain the perspectives of SMEs as the business players. Subsequently, the information was processed using Quality Function Deployment (QFD) to determine the House of Quality from the first to the fourth level. A proposed design resulting from the QFD was produced, evaluated with the Technology Assessment Model (TAM), and followed by a revised design. Finally, the revised design was analyzed from a financial perspective to obtain the cost structure of investment, operation, maintenance, and workers. The machine that performs the manufacturing process, as the result of the revised design, was prototyped and tested to determine the initial production process. The designed manufacturing process offers a Net Present Value (NPV) of IDR 337,897,651, compared with IDR 9,491,522 for the existing process, based on similar production input.
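
    The closing NPV comparison rests on standard discounted-cash-flow arithmetic. A minimal sketch, with hypothetical cash flows and discount rate (the paper's actual figures and horizon are not given in the abstract):

```python
def npv(rate, cashflows):
    """Net present value of yearly cash flows; cashflows[0] is the
    year-0 (initial) flow, typically a negative investment."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical figures for illustration only (in millions of IDR):
# an up-front machine investment followed by five years of net income.
flows = [-150, 80, 90, 95, 100, 100]
value = npv(0.10, flows)
```

    A positive NPV at the chosen discount rate indicates the investment outperforms the do-nothing alternative; the abstract's comparison of IDR 337,897,651 against IDR 9,491,522 is this kind of calculation applied to the proposed and existing processes.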

  10. Sketching in Design Journals: An Analysis of Visual Representations in the Product Design Process

    ERIC Educational Resources Information Center

    Lau, Kimberly; Oehlberg, Lora; Agogino, Alice

    2009-01-01

    This paper explores the sketching behavior of designers and the role of sketching in the design process. Observations from a descriptive study of sketches provided in design journals, characterized by a protocol measuring sketching activities, are presented. A distinction is made between journals that are entirely tangible and those that contain…

  12. Reducing the complexity of the software design process with object-oriented design

    NASA Technical Reports Server (NTRS)

    Schuler, M. P.

    1991-01-01

    Designing software is a complex process. How object-oriented design (OOD), coupled with formalized documentation and tailored object diagraming techniques, can reduce the complexity of the software design process is described and illustrated. The described OOD methodology uses a hierarchical decomposition approach in which parent objects are decomposed into layers of lower level child objects. A method of tracking the assignment of requirements to design components is also included. Increases in the reusability, portability, and maintainability of the resulting products are also discussed. This method was built on a combination of existing technology, teaching experience, consulting experience, and feedback from design method users. The discussed concepts are applicable to hierarchical OOD processes in general. Emphasis is placed on improving the design process by documenting the details of the procedures involved and incorporating improvements into those procedures as they are developed.
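
    The hierarchical decomposition and requirement-tracking ideas above can be illustrated with a small data structure. This is a hypothetical sketch of the general technique, not the notation of the method described in the paper:

```python
class DesignObject:
    """Node in a hierarchical object-oriented decomposition: each parent
    object is refined into lower-level child objects, and requirements are
    assigned to components so their allocation can be tracked."""

    def __init__(self, name, requirements=()):
        self.name = name
        self.requirements = set(requirements)
        self.children = []

    def decompose(self, name, requirements=()):
        """Refine this object by adding a lower-level child object."""
        child = DesignObject(name, requirements)
        self.children.append(child)
        return child

    def unallocated(self):
        """Requirements held here but not yet assigned to any child."""
        if not self.children:
            return set()
        covered = set().union(*(c.requirements for c in self.children))
        return self.requirements - covered

# Hypothetical example: a telemetry subsystem refined into two children.
root = DesignObject("Telemetry", {"R1", "R2", "R3"})
root.decompose("Encoder", {"R1"})
root.decompose("Transmitter", {"R2"})
# root.unallocated() flags R3 as not yet assigned to a design component.
```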

  13. Theory and Practice Meets in Industrial Process Design -Educational Perspective-

    NASA Astrophysics Data System (ADS)

    Aramo-Immonen, Heli; Toikka, Tarja

    The software engineer should see himself as a business process designer in an enterprise resource planning system (ERP) re-engineering project, and software engineers and managers should engage in design dialogue. The objective of this paper is to discuss the motives for studying design research in connection with management education, in order to envision and understand the soft human issues in the management context. A second goal is to develop means of practicing social skills between designers and managers. This article explores the affective components of design thinking in the industrial management domain. The conceptual part of the paper discusses the concepts of the network and project economy, creativity, communication, the use of metaphors, and design thinking. Finally, an empirical research plan and the first empirical results from design method experiments among multi-disciplined groups of master-level students of industrial engineering and management and software engineering are introduced.

  14. The start up as a phase of architectural design process.

    PubMed

    Castro, Iara Sousa; Lima, Francisco de Paula Antunes; Duarte, Francisco José de Castro Moura

    2012-01-01

    Alterations made to an architectural design can be considered a continuous process, from conception to the moment the built environment is already in use. This article focuses on the "moving phase", the initial moment of environment occupation and the start-up of services. It aims to show that continuing ergonomics interventions during the moving phase, or start-up, may reveal the built environment's inadequacies, clearly showing needs not met by the design and allowing instant decisions to solve unforeseen problems. The results reveal some lessons experienced by users during a critical stage not usually included in the design process.

  15. Development of the multichannel data processing ASIC design flow

    NASA Astrophysics Data System (ADS)

    Ivanov, P. Y.; Atkin, E. V.; Normanov, D. D.; Shumkin, O. V.

    2017-01-01

    In modern multichannel data processing digital systems, the number of channels ranges from several hundred thousand to millions. ASICs form the core of the component base of these systems; their most important characteristics are performance, power consumption and occupied area. ASIC design is a time- and labor-consuming process. In order to improve performance and reduce design time, it is proposed to supplement the standard design flow with an optimization stage for the channel parameters, based on the most efficient use of chip area and power consumption.

  16. Application of hazard assessment techniques in the CISF design process

    SciTech Connect

    Thornton, J.R.; Henry, T.

    1997-10-29

    The Department of Energy has submitted to the NRC staff for review a topical safety analysis report (TSAR) for a Centralized Interim Storage Facility (CISF). The TSAR will be used in licensing the CISF when and if a site is designated. CISF design events are identified based on a thorough review of design basis events (DBEs) previously identified by dry storage system suppliers and licensees, and through the application of hazard assessment techniques. A Preliminary Hazards Assessment (PHA) is performed to identify design events applicable to a Phase 1 non-site-specific CISF. A PHA is deemed necessary since the Phase 1 CISF is distinguishable from previous dry storage applications in several significant operational scope and design basis aspects. In addition to assuring that all design events applicable to the Phase 1 CISF are identified, the PHA served as an integral part of the CISF design process by identifying potential important-to-safety and defense-in-depth facility design and administrative control features. This paper describes the Phase 1 CISF design event identification process and summarizes significant PHA contributions to the CISF design.

  17. DESIGNING CHEMICAL PROCESSES WITH OPEN AND FUGITIVE EMISSIONS

    EPA Science Inventory

    Designing a chemical process normally includes aspects of economic and environmental disciplines. In this work we describe methods to quickly and easily evaluate the economics and potential environmental impacts of a process, with the hydrodealkylation of toluene as an example. Th...

  19. Optimization of composite wood structural components : processing and design choices

    Treesearch

    Theodore L. Laufenberg

    1985-01-01

    Decreasing size and quality of the world's forest resources are responsible for interest in producing composite wood structural components. Process and design optimization methods are offered in this paper. Processing concepts for wood composite structural products are reviewed to illustrate manufacturing boundaries and areas of high potential. Structural...

  20. Design requirements for operational earth resources ground data processing

    NASA Technical Reports Server (NTRS)

    Baldwin, C. J.; Bradford, L. H.; Burnett, E. S.; Hutson, D. E.; Kinsler, B. A.; Kugle, D. R.; Webber, D. S.

    1972-01-01

    Realistic tradeoff data and evaluation techniques were studied that permit conceptual design of operational earth resources ground processing systems. Methodology for determining user requirements that utilize the limited information available from users is presented along with definitions of sensor capabilities projected into the shuttle/station era. A tentative method is presented for synthesizing candidate ground processing concepts.

  1. Process monitoring interface for studying the metamorphism in a design

    SciTech Connect

    Bayrak, C.

    1996-12-31

    The efforts to improve the system design methodology, which provides the designer with the ability to exercise the prototype at a high abstraction level and to delay implementation-level activity as far into the development activity as possible, have led us to study the process monitoring issue. In particular, we are interested in three fundamental issues in process monitoring: the abstract sphere of the design, the practical sphere of the user interface, and the gap between these two spheres. Therefore, an integrated graphical user interface architecture, called the Process Monitoring Interface (PMI), is introduced, not only to bridge the gap between the abstract sphere and the practical affair of building an appropriate user-integrated interface, but also to supervise the hierarchical human design notion so that the abstract design is exercised more efficiently at the highest possible levels of the development. Using PMI, the user/developer can monitor the evolution of highly abstract building blocks that are created at the very beginning of the design process and are later refined into different levels of the design, representing different levels of abstraction of the system.

  2. 3D Printed Surgical Instruments: The Design and Fabrication Process.

    PubMed

    George, Mitchell; Aroom, Kevin R; Hawes, Harvey G; Gill, Brijesh S; Love, Joseph

    2017-01-01

    3D printing is an additive manufacturing process allowing the creation of solid objects directly from a digital file. We believe recent advances in additive manufacturing may be applicable to surgical instrument design. This study investigates the feasibility, design and fabrication process of usable 3D printed surgical instruments. The computer-aided design package SolidWorks (Dassault Systemes SolidWorks Corp., Waltham MA) was used to design a surgical set including hemostats, needle driver, scalpel handle, retractors and forceps. These designs were then printed on a selective laser sintering (SLS) Sinterstation HiQ (3D Systems, Rock Hill SC) using DuraForm EX plastic. The final printed products were evaluated by practicing general surgeons for ergonomic functionality and performance; this included simulated surgery and inguinal hernia repairs on human cadavers. Improvements were identified and addressed by adjusting design and build metrics. Repeated manufacturing processes and redesigns led to the creation of multiple functional and fully reproducible surgical sets utilizing the user feedback of surgeons. Iterative cycles including design, production and testing took an average of 3 days. Each surgical set was built using the SLS Sinterstation HiQ with an average build time of 6 h per set. Functional 3D printed surgical instruments are feasible. Advantages compared to traditional manufacturing methods include no increase in cost for increased complexity, accelerated design-to-production times and surgeon-specific modifications.

  3. The shielding design process--new plants to decommissioning.

    PubMed

    Jeffries, Graham; Cooper, Andrew; Hobson, John

    2005-01-01

    BNFL have over 25 years' experience of designing nuclear plant for the whole fuel cycle. In the UK, a Nuclear Decommissioning Authority (NDA) is to be set up to ensure that Britain's nuclear legacy is cleaned up safely, securely and cost-effectively. The resulting challenges and opportunities for shielding design will be substantial, as the shielding design process was originally devised for the design of new plants. Although its underlying principles are equally applicable to the decommissioning and remediation of old plants, there are many aspects of detailed application that need to adapt to this radically different operating environment. The paper describes both the common issues and the different challenges of shielding design at different operational phases. Sample applications are presented from both new plant and decommissioning projects that illustrate not only the robust nature of the processes being used, but also how they lead to cost-effective solutions making a substantive and appropriate contribution to radiological protection goals.

  4. Computer-aided design tools for economical MEMS fabrication processes

    NASA Astrophysics Data System (ADS)

    Schneider, Christian; Priebe, Andreas; Brueck, Rainer; Hahn, Kai

    1999-03-01

    Since the early 70s, when microsystem technology was first introduced, an enormous market for MST products has developed: airbag sensors, micro pumps, ink-jet nozzles, etc., and the market is just starting up. Establishing these products at a reasonable price requires mass production. Meanwhile, computer-based design tools have also been developed in order to reduce the expense of MST design. In contrast to other physical design processes, e.g. in microelectronics, MEMS physical design is characterized by the fact that each product requires a tailored sequence of fabrication steps, usually selected from a variety of processing alternatives. The selection from these alternatives is based on economical constraints. Therefore, the design has a strong influence on the money and time spent to take an MST product to market.

  5. System Design Support by Optimization Method Using Stochastic Process

    NASA Astrophysics Data System (ADS)

    Yoshida, Hiroaki; Yamaguchi, Katsuhito; Ishikawa, Yoshio

    We propose a new optimization method based on a stochastic process. Its characteristic is that it obtains an approximation of the optimum solution as an expected value. In the numerical calculation, a kind of Monte Carlo method is used to obtain the solution because of the stochastic process. The method can also obtain the probability distribution of the design variables, because design variables are generated with probability proportional to the evaluation function value. This probability distribution shows the influence of the design variables on the evaluation function value, and is thus information that is very useful for system design. In this paper, it is shown that the proposed method is useful not only for optimization but also for system design. The flight trajectory optimization problem for a hang-glider is shown as an example of the numerical calculation.
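
    The sampling idea in this abstract (design variables generated with probability proportional to the evaluation function value) can be sketched with a simple Metropolis-style Monte Carlo loop. The evaluation function and tuning constants below are hypothetical, not taken from the paper:

```python
import random

def sample_designs(evaluate, lo, hi, n=20000, step=0.1, seed=0):
    """Metropolis-style sampler over one design variable: candidates are
    accepted with probability proportional to the ratio of evaluation
    function values, so samples approximate a distribution proportional
    to the (non-negative) evaluation function."""
    rng = random.Random(seed)
    x = (lo + hi) / 2
    samples = []
    for _ in range(n):
        cand = min(hi, max(lo, x + rng.uniform(-step, step)))
        if evaluate(cand) >= evaluate(x) or rng.random() < evaluate(cand) / evaluate(x):
            x = cand
        samples.append(x)
    return samples

# Hypothetical evaluation function peaked at x = 0.7 (illustration only).
f = lambda x: 1e-9 + max(0.0, 1 - abs(x - 0.7) / 0.3)
xs = sample_designs(f, 0.0, 1.0)
mean = sum(xs) / len(xs)   # expected value approximates the optimum location
```

    The histogram of `xs` is the probability distribution the abstract mentions: it shows how strongly each region of the design space influences the evaluation function.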

  6. Information Flow in the Launch Vehicle Design/Analysis Process

    NASA Technical Reports Server (NTRS)

    Humphries, W. R., Sr.; Holland, W.; Bishop, R.

    1999-01-01

    This paper describes the results of a team effort aimed at defining the information flow between disciplines at the Marshall Space Flight Center (MSFC) engaged in the design of space launch vehicles. The information flow is modeled at a first level and is described using three types of templates: an N x N diagram, discipline flow diagrams, and discipline task descriptions. It is intended to provide engineers with an understanding of the connections between what they do and where it fits in the overall design process of the project. It is also intended to provide design managers with a better understanding of information flow in the launch vehicle design cycle.

  7. A new design concept for an automated peanut processing facility

    SciTech Connect

    Ertas, A.; Tanju, B.T.; Fair, W.T.; Butts, C.

    1996-12-31

    Peanut quality is a major concern in all phases of the peanut industry, from production to manufacturing. Postharvest processing of peanuts can have profound effects on the quality and safety of peanut food products. Curing is a key step in postharvest processing; curing peanuts improperly can significantly reduce quality and result in significant losses to both farmers and processors. The conventional drying system designed in the 1960s is still being used in the processing of peanuts today. The objective of this paper is to design and develop a new automated peanut drying system for dry climates capable of handling approximately 20 million lbm of peanuts per harvest season.

  8. Design of a distributed CORBA based image processing server.

    PubMed

    Giess, C; Evers, H; Heid, V; Meinzer, H P

    2000-01-01

    This paper presents the design and implementation of a distributed image processing server based on CORBA. Existing image processing tools were encapsulated in a common way with this server. Data exchange and conversion is done automatically inside the server, hiding these tasks from the user. The different image processing tools are visible as one large collection of algorithms and due to the use of CORBA are accessible via intra-/internet.

  9. Economic design of control charts considering process shift distributions

    NASA Astrophysics Data System (ADS)

    Vommi, Vijayababu; Kasarapu, Rukmini V.

    2014-09-01

    Process shift is an important input parameter in the economic design of control charts. Earlier control chart designs assumed that a given assignable cause produces a constant shift in the process mean. This assumption has been criticized by many researchers, since it may not be realistic for an assignable cause to always produce the same shift. To overcome this difficulty, the present work considers a distribution for the shift parameter, instead of a single value, for a given assignable cause. Duncan's economic design model for the control chart has been extended to incorporate the distribution of the process shift parameter, and the control chart parameters are obtained by minimizing the total expected loss-cost. Further, three types of process shift distributions, namely positively skewed, uniform and negatively skewed, are considered, and the situations where it is appropriate to use the suggested methodology are recommended.
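
    A minimal sketch of the core idea: average the out-of-control average run length over a discrete shift distribution rather than assuming one fixed shift. The chart parameters and the positively skewed shift distribution below are hypothetical, and the full Duncan loss-cost model contains many more cost terms than this:

```python
from math import erf, sqrt

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1 + erf(z / sqrt(2)))

def detection_power(delta, n, k):
    """Probability a subgroup mean of size n falls outside k-sigma limits
    after the process mean shifts by delta process standard deviations."""
    return 1 - phi(k - delta * sqrt(n)) + phi(-k - delta * sqrt(n))

def expected_arl1(shifts, weights, n, k):
    """Out-of-control ARL averaged over a discrete shift distribution,
    instead of the single fixed shift of the classical design."""
    return sum(w / detection_power(d, n, k) for d, w in zip(shifts, weights))

# Hypothetical positively skewed shift distribution (illustration only).
shifts  = [0.5, 1.0, 1.5, 2.0]
weights = [0.4, 0.3, 0.2, 0.1]
arl = expected_arl1(shifts, weights, n=5, k=3)
```

    In an economic design, a term like `arl` feeds the expected time spent running out of control, and the chart parameters (n, k, and the sampling interval) are searched to minimize the total expected loss-cost.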

  10. Designing persuasive health materials using processing fluency: a literature review.

    PubMed

    Okuhara, Tsuyoshi; Ishikawa, Hirono; Okada, Masahumi; Kato, Mio; Kiuchi, Takahiro

    2017-06-08

    Health materials to promote health behaviors should be readable and generate favorable evaluations of the message. Processing fluency (the subjective experience of ease with which people process information) has been increasingly studied over the past decade. In this review, we explore effects and instantiations of processing fluency and discuss the implications for designing effective health materials. We searched seven online databases using "processing fluency" as the key word. In addition, we gathered relevant publications using reference snowballing. We included published records that were written in English and applicable to the design of health materials. We found 40 articles that were appropriate for inclusion. Various instantiations of fluency have a uniform effect on human judgment: fluently processed stimuli generate positive judgments (e.g., liking, confidence). Processing fluency is used to predict the effort needed for a given task; accordingly, it has an impact on willingness to undertake the task. Physical perceptual, lexical, syntactic, phonological, retrieval, and imagery fluency were found to be particularly relevant to the design of health materials. Health-care professionals should consider the use of a perceptually fluent design, plain language, numeracy with an appropriate degree of precision, a limited number of key points, and concrete descriptions that make recipients imagine healthy behavior. Such fluently processed materials that are easy to read and understand have enhanced perspicuity and persuasiveness.

  11. Design flow for implementing image processing in FPGAs

    NASA Astrophysics Data System (ADS)

    Trakalo, M.; Giles, G.

    2007-04-01

    A design flow for implementing a dynamic gamma algorithm in an FPGA is described. Real-time video processing makes enormous demands on processing resources. An FPGA solution offers some advantages over commercial video chip and DSP implementation alternatives. The traditional approach to FPGA development involves a system engineer designing, modeling and verifying an algorithm and writing a specification. A hardware engineer uses the specification as a basis for coding in VHDL and testing the algorithm in the FPGA with supporting electronics. This process is work intensive and the verification of the image processing algorithm executing on the FPGA does not occur until late in the program. The described design process allows the system engineer to design and verify a true VHDL version of the algorithm, executing in an FPGA. This process yields reduced risk and development time. The process is achieved by using Xilinx System Generator in conjunction with Simulink® from The MathWorks. System Generator is a tool that bridges the gap between the high level modeling environment and the digital world of the FPGA. System Generator is used to develop the dynamic gamma algorithm for the contrast enhancement of a candidate display product. The results of this effort are to increase the dynamic range of the displayed video, resulting in a more useful image for the user.

  12. Space Station Freedom Program preliminary design review process

    NASA Technical Reports Server (NTRS)

    Carlise, R. F.; Adair, Billy

    1989-01-01

    To conduct the Program Requirements Review of the Space Station Freedom, a Preliminary Design Review Board (PDR) has been established. The PDR will assess the preliminary design of the assembled manned base including the assembly process, the launch, and on-orbit stage configuration, the design approach, the on-orbit verification plans, supportability, reliability, safety, interfaces with the NASA infrastructure (the NSTS, TDRSS, and Ground operations) and international partners. Issues such as the coordination of a common interpretation of design requirements, coordination of interfaces, and convergence of design perspectives vs. proper allocation of resources are discussed. The impact of the resolution of the secondary ripple effect of design decisions which may cause programmatic difficulties is also addressed.

  13. POLLUTION PREVENTION IN THE DESIGN OF CHEMICAL PROCESSES USING HIERARCHICAL DESIGN AND SIMULATION

    EPA Science Inventory

    The design of chemical processes is normally an interactive process of synthesis and analysis. When one also desires or needs to limit the amount of pollution generated by the process the difficulty of the task can increase substantially. In this work, we show how combining hier...

  14. Expectation changes and team characteristics in a participatory design process.

    PubMed

    Bazley, Conne Mara; De Jong, Annelise; Vink, Peter

    2012-01-01

    A human factors specialist researched the expectations of a culturally and professionally diverse team throughout a year long participatory design process of a large processing facility. For a deeper understanding of high-level team expectations and characteristics, the specialist collected data and information through in-situ ethnography and traditional case study methods, personal interviews, and a questionnaire that included a likert scale rating for expectation levels. Results found that expectation levels rated extremely satisfied for individual team members and the overall team itself before and during the participatory process. In contrast, expectations for upper management from the team were satisfied before the participatory process, but changed to uncertain, to unsatisfied, to extremely unsatisfied during the process. Additionally, the participatory design team exhibited high-level team characteristics to include honesty, competence, commitment, communication, creativity, and clear expectations.

  15. Expectation changes and team characteristics in a participatory design process.

    PubMed

    Bazley, Conne Mara; De Jong, Annelise; Vink, Peter

    2012-01-01

    A human factors specialist researched the expectations of a culturally and professionally diverse team throughout a year long participatory design process of a large processing facility. For a deeper understanding of high-level team expectations and characteristics, the specialist collected data and information through in-situ ethnography and traditional case study methods, personal interviews, and a questionnaire that included a likert scale rating for expectation levels. Results found that expectation levels rated extremely satisfied for individual team members and the overall team itself before and during the participatory process. In contrast, expectations for upper management from the team were satisfied before the participatory process, but changed to uncertain, to unsatisfied, to extremely unsatisfied during the process. Additionally, the participatory design team exhibited high-level team characteristics to include honesty, competence, commitment, communication, creativity, and clear expectations.

  16. Design of freeze-drying processes for pharmaceuticals: practical advice.

    PubMed

    Tang, Xiaolin; Pikal, Michael J

    2004-02-01

    Design of freeze-drying processes is often approached with a "trial and error" experimental plan or, worse yet, the protocol used in the first laboratory run is adopted without further attempts at optimization. Consequently, commercial freeze-drying processes are often neither robust nor efficient. It is our thesis that design of an "optimized" freeze-drying process is not particularly difficult for most products, as long as some simple rules based on well-accepted scientific principles are followed. It is the purpose of this review to discuss the scientific foundations of the freeze-drying process design and then to consolidate these principles into a set of guidelines for rational process design and optimization. General advice is given concerning common stability issues with proteins, but unusual and difficult stability issues are beyond the scope of this review. Control of ice nucleation and crystallization during the freezing step is discussed, and the impact of freezing on the rest of the process and final product quality is reviewed. Representative freezing protocols are presented. The significance of the collapse temperature and the thermal transition, denoted Tg', are discussed, and procedures for the selection of the "target product temperature" for primary drying are presented. Furthermore, guidelines are given for selection of the optimal shelf temperature and chamber pressure settings required to achieve the target product temperature without thermal and/or mass transfer overload of the freeze dryer. Finally, guidelines and "rules" for optimization of secondary drying and representative secondary drying protocols are presented.

  17. Conceptual process designs: Lurgi-Ruhrgas and superior circular grate

    SciTech Connect

    Not Available

    1980-09-01

    Based on preliminary data on retort yields and previous conceptual designs, a Design Basis has been prepared, and an upgrading scheme developed against which all five retorting processes can be evaluated. Licensors for retorting technologies (American Lurgi Corporation for the Lurgi retort, Davy-McKee for the Superior Circular Grate retort, and Union Oil Company for the Union B retort); hydrotreaters (Gulf Oil and Chevron); wastewater treatment units (Chevron); and sulfur plants (Parsons) have been contacted for data related to their processes. Preliminary balances for the Lurgi and Superior processes have been developed and will be compared against the vendor information when received. A preliminary design basis is presented. This presents design assumptions and conditions to be used in developing the process designs, heat and material balances, and process flow diagrams for all cases. The shale oil upgrading scheme selected to be used in all evaluations consists of delayed coking the 850/sup 0/F plus fraction from the shale oil, and hydrotreating all virgin and coker naphthas and gas oils in separate hydrotreaters. This scheme was selected because it is simple and each of the units has proven to be reliable in refining conventional crude oils. Also, this upgrading scheme is not expected to penalize any specific retort system. The material and utility balances, along with process flow diagrams for Case I, the Lurgi-Ruhrgas process are given. In this case, 46,500 bpsd of 29.4 /sup 0/API upgraded shale oil are produced. The Superior Circular Grate material and utility balances and process flow diagrams are also given. The liquid product from this case is 40,500 bpsd of 27.4 /sup 0/API upgraded shale oil.

  18. Advanced Test Reactor Design Basis Reconstitution Project Issue Resolution Process

    SciTech Connect

    Steven D. Winter; Gregg L. Sharp; William E. Kohn; Richard T. McCracken

    2007-05-01

    The Advanced Test Reactor (ATR) Design Basis Reconstitution Program (DBRP) is a structured assessment and reconstitution of the design basis for the ATR. The DBRP is designed to establish and document the ties between the Document Safety Analysis (DSA), design basis, and actual system configurations. Where the DBRP assessment team cannot establish a link between these three major elements, a gap is identified. Resolutions to identified gaps represent configuration management and design basis recovery actions. The proposed paper discusses the process being applied to define, evaluate, report, and address gaps that are identified through the ATR DBRP. Design basis verification may be performed or required for a nuclear facility safety basis on various levels. The process is applicable to large-scale design basis reconstitution efforts, such as the ATR DBRP, or may be scaled for application on smaller projects. The concepts are applicable to long-term maintenance of a nuclear facility safety basis and recovery of degraded safety basis components. The ATR DBRP assessment team has observed numerous examples where a clear and accurate link between the DSA, design basis, and actual system configuration was not immediately identifiable in supporting documentation. As a result, a systematic approach to effectively document, prioritize, and evaluate each observation is required. The DBRP issue resolution process provides direction for consistent identification, documentation, categorization, and evaluation, and where applicable, entry into the determination process for a potential inadequacy in the safety analysis (PISA). The issue resolution process is a key element for execution of the DBRP. Application of the process facilitates collection, assessment, and reporting of issues identified by the DBRP team. Application of the process results in an organized database of safety basis gaps and prioritized corrective action planning and resolution. 
The DBRP team follows the ATR

  19. Prodrugs design based on inter- and intramolecular chemical processes.

    PubMed

    Karaman, Rafik

    2013-12-01

    This review provides the reader a concise overview of the majority of prodrug approaches with the emphasis on the modern approaches to prodrug design. The chemical approach catalyzed by metabolic enzymes which is considered as widely used among all other approaches to minimize the undesirable drug physicochemical properties is discussed. Part of this review will shed light on the use of molecular orbital methods such as DFT, semiempirical and ab initio for the design of novel prodrugs. This novel prodrug approach implies prodrug design based on enzyme models that were utilized for mimicking enzyme catalysis. The computational approach exploited for the prodrug design involves molecular orbital and molecular mechanics (DFT, ab initio, and MM2) calculations and correlations between experimental and calculated values of intramolecular processes that were experimentally studied to assign the factors determining the reaction rates in certain processes for better understanding on how enzymes might exert their extraordinary catalysis.

  20. The design of bearing processing technology and fixture

    NASA Astrophysics Data System (ADS)

    Liu, Sibo

    2017-03-01

    This paper is designed for bearing processing technology and fixture. The main task is to work out the half fine milling under 36mm, Φ18 holes, bearing the processing technology of the rules of procedure, and write CARDS. Its parts are casting, which is small and of simple structure. Moreover, the components of the hole processing is higher than that of the surface, so the processing order of surface first is taken. The fixture special jig is adopted in each working procedure, among which in a drill Φ18 holes, the hydraulic clamping is used, which is simple, convenient and can meet the requirements.

  1. Rethinking the Systems Engineering Process in Light of Design Thinking

    DTIC Science & Technology

    2016-04-30

    qÜáêíÉÉåíÜ=^ååì~ä= ^Åèìáëáíáçå=oÉëÉ~êÅÜ= póãéçëáìã= tÉÇåÉëÇ~ó=pÉëëáçåë= sçäìãÉ=f= = Rethinking the Systems Engineering Process in Light of Design Thinking...Rethinking the Systems Engineering Process in Light of Design Thinking Ronald Giachetti, Chair and Professor, NPS Clifford Whitcomb, Professor, NPS...Engineering Process in Light of Design Thinking Ronald E. Giachetti—is the Chair and Professor of the Systems Engineering Department at the Naval

  2. Product and Process Improvement Using Mixture-Process Variable Designs and Robust Optimization Techniques

    SciTech Connect

    Sahni, Narinder S.; Piepel, Gregory F.; Naes, Tormod

    2009-04-01

    The quality of an industrial product depends on the raw material proportions and the process variable levels, both of which need to be taken into account in designing a product. This article presents a case study from the food industry in which both kinds of variables were studied by combining a constrained mixture experiment design and a central composite process variable design. Based on the natural structure of the situation, a split-plot experiment was designed and models involving the raw material proportions and process variable levels (separately and combined) were fitted. Combined models were used to study: (i) the robustness of the process to variations in raw material proportions, and (ii) the robustness of the raw material recipes with respect to fluctuations in the process variable levels. Further, the expected variability in the robust settings was studied using the bootstrap.

  3. Process-induced bias: a study of resist design, device node, illumination conditions, and process implications

    NASA Astrophysics Data System (ADS)

    Carcasi, Michael; Scheer, Steven; Fonseca, Carlos; Shibata, Tsuyoshi; Kosugi, Hitoshi; Kondo, Yoshihiro; Saito, Takashi

    2009-03-01

    Critical dimension uniformity (CDU) has both across field and across wafer components. CD error generated by across wafer etching non-uniformity and other process variations can have a significant impact on CDU. To correct these across wafer systematic variations, compensation by exposure dose and/or post exposure bake (PEB) temperature have been proposed. These compensation strategies often focus on a specific structure without evaluating how process compensation impacts the CDU of all structures to be printed in a given design. In one previous study limited to a single resist and minimal coater/developer and scanner variations, the authors evaluated the relative merits of across wafer dose and PEB temperature compensation on the process induced CD bias and CDU. For the process studied, it was found that using PEB temperature to control CD across wafer was preferable to using dose compensation. In another previous study, the impact of resist design was explored to understand how resist design, as well as coater/developer and scanner processing, impact process induced bias (PIB). The previous PIB studies were limited to a single illumination case and explore the effect of PIB on only L/S structures. It is the goal of this work to understand additionally how illumination design and mask design, as well as resist design and coater/developer and scanner processing, impact process induced bias (PIB)/OPC integrity.

  4. Design of launch systems using continuous improvement process

    NASA Technical Reports Server (NTRS)

    Brown, Richard W.

    1995-01-01

    The purpose of this paper is to identify a systematic process for improving ground operations for future launch systems. This approach is based on the Total Quality Management (TQM) continuous improvement process. While the continuous improvement process is normally identified with making incremental changes to an existing system, it can be used on new systems if they use past experience as a knowledge base. In the case of the Reusable Launch Vehicle (RLV), the Space Shuttle operations provide many lessons. The TQM methodology used for this paper will be borrowed from the United States Air Force 'Quality Air Force' Program. There is a general overview of the continuous improvement process, with concentration on the formulation phase. During this phase critical analyses are conducted to determine the strategy and goals for the remaining development process. These analyses include analyzing the mission from the customers point of view, developing an operations concept for the future, assessing current capabilities and determining the gap to be closed between current capabilities and future needs and requirements. A brief analyses of the RLV, relative to the Space Shuttle, will be used to illustrate the concept. Using the continuous improvement design concept has many advantages. These include a customer oriented process which will develop a more marketable product and a better integration of operations and systems during the design phase. But, the use of TQM techniques will require changes, including more discipline in the design process and more emphasis on data gathering for operational systems. The benefits will far outweigh the additional effort.

  5. System design considerations for free-fall materials processing

    NASA Technical Reports Server (NTRS)

    Seidensticker, R. G.

    1974-01-01

    The design constraints for orbiting materials processing systems are dominated by the limitations of the flight vehicle/crew and not by the processes themselves. Although weight, size and power consumption are all factors in the design of normal laboratory equipment, their importance is increased orders of magnitude when the equipment must be used in an orbital facility. As a result, equipment intended for space flight may have little resemblance to normal laboratory apparatus although the function to be performed may be identical. The same considerations influence the design of the experiment itself. The processing requirements must be carefully understood in terms of basic physical parameters rather than defined in terms of equipment operation. Preliminary experiments and analysis are much more vital to the design of a space experiment than they are on earth where iterative development is relatively easy. Examples of these various considerations are illustrated with examples from the M518 and MA-010 systems. While these are specific systems, the conclusions apply to the design of flight materials processing systems both present and future.

  6. Bates solar industrial process-steam application: preliminary design review

    SciTech Connect

    Not Available

    1980-01-07

    The design is analyzed for a parabolic trough solar process heat system for a cardboard corrugation fabrication facility in Texas. The program is briefly reviewed, including an analysis of the plant and process. The performance modeling for the system is discussed, and the solar system structural design, collector subsystem, heat transport and distribution subsystem are analyzed. The selection of the heat transfer fluid, and ullage and fluid maintenance are discussed, and the master control system and data acquisition system are described. Testing of environmental degradation of materials is briefly discussed. A brief preliminary cost analysis is included. (LEW)

  7. Designing future products: what difficulties do designers encounter and how can their creative process be supported?

    PubMed

    Bonnardel, Nathalie

    2012-01-01

    To remain competitive, companies must regularly offer new products to consumers. A major challenge for designers is therefore to come up with design solutions and define products that are both new and adapted to future users and usages. Although classic methods and ergonomic recommendations are useful in most run-of-the-mill design contexts, they are of limited benefit when the design situation requires greater creativity. This paper therefore addresses issues related to product design by pursuing a triple objective: (1) highlight the difficulties encountered by designers in imagining and conceiving new products, (2) find out which conditions could help designers come up with creative ideas for innovative products, and (3) suggest methods and tools to support designers' creative process and help them take other stakeholders' needs and expectations into consideration.

  8. Waste receiving and processing facility module 1, detailed design report

    SciTech Connect

    Not Available

    1993-10-01

    WRAP 1 baseline documents which guided the technical development of the Title design included: (a) A/E Statement of Work (SOW) Revision 4C: This DOE-RL contractual document specified the workscope, deliverables, schedule, method of performance and reference criteria for the Title design preparation. (b) Functional Design Criteria (FDC) Revision 1: This DOE-RL technical criteria document specified the overall operational criteria for the facility. The document was a Revision 0 at the beginning of the design and advanced to Revision 1 during the tenure of the Title design. (c) Supplemental Design Requirements Document (SDRD) Revision 3: This baseline criteria document prepared by WHC for DOE-RL augments the FDC by providing further definition of the process, operational safety, and facility requirements to the A/E for guidance in preparing the design. The document was at a very preliminary stage at the onset of Title design and was revised in concert with the results of the engineering studies that were performed to resolve the numerous technical issues that the project faced when Title I was initiated, as well as, by requirements established during the course of the Title II design.

  9. Aerospace structural design process improvement using systematic evolutionary structural modeling

    NASA Astrophysics Data System (ADS)

    Taylor, Robert Michael

    2000-10-01

    A multidisciplinary team tasked with an aircraft design problem must understand the problem requirements and metrics to produce a successful design. This understanding entails not only knowledge of what these requirements and metrics are, but also how they interact, which are most important (to the customer as well as to aircraft performance), and who in the organization can provide pertinent knowledge for each. In recent years, product development researchers and organizations have developed and successfully applied a variety of tools such as Quality Function Deployment (QFD) to coordinate multidisciplinary team members. The effectiveness of these methods, however, depends on the quality and fidelity of the information that team members can input. In conceptual aircraft design, structural information is of lower quality compared to aerodynamics or performance because it is based on experience rather than theory. This dissertation shows how advanced structural design tools can be used in a multidisciplinary team setting to improve structural information generation and communication through a systematic evolution of structural detail. When applied to conceptual design, finite element-based structural design tools elevate structural information to the same level as other computationally supported disciplines. This improved ability to generate and communicate structural information enables a design team to better identify and meet structural design requirements, consider producibility issues earlier, and evaluate structural concepts. A design process experiment of a wing structural layout in collaboration with an industrial partner illustrates and validates the approach.

  10. Advanced computational research in materials processing for design and manufacturing

    SciTech Connect

    Zacharia, T.

    1994-12-31

    The computational requirements for design and manufacture of automotive components have seen dramatic increases for producing automobiles with three times the mileage. Automotive component design systems are becoming increasingly reliant on structural analysis requiring both overall larger analysis and more complex analyses, more three-dimensional analyses, larger model sizes, and routine consideration of transient and non-linear effects. Such analyses must be performed rapidly to minimize delays in the design and development process, which drives the need for parallel computing. This paper briefly describes advanced computational research in superplastic forming and automotive crash worthiness.

  11. Process Improvement Through Tool Integration in Aero-Mechanical Design

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2010-01-01

    Emerging capabilities in commercial design tools promise to significantly improve the multi-disciplinary and inter-disciplinary design and analysis coverage for aerospace mechanical engineers. This paper explores the analysis process for two example problems of a wing and flap mechanical drive system and an aircraft landing gear door panel. The examples begin with the design solid models and include various analysis disciplines such as structural stress and aerodynamic loads. Analytical methods include CFD, multi-body dynamics with flexible bodies and structural analysis. Elements of analysis data management, data visualization and collaboration are also included.

  12. Process Improvement Through Tool Integration in Aero-Mechanical Design

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2010-01-01

    Emerging capabilities in commercial design tools promise to significantly improve the multi-disciplinary and inter-disciplinary design and analysis coverage for aerospace mechanical engineers. This paper explores the analysis process for two example problems of a wing and flap mechanical drive system and an aircraft landing gear door panel. The examples begin with the design solid models and include various analysis disciplines such as structural stress and aerodynamic loads. Analytical methods include CFD, multi-body dynamics with flexible bodies and structural analysis. Elements of analysis data management, data visualization and collaboration are also included.

  13. Rethinking ASIC design with next generation lithography and process integration

    NASA Astrophysics Data System (ADS)

    Vaidyanathan, Kaushik; Liu, Renzhi; Liebmann, Lars; Lai, Kafai; Strojwas, Andrzej; Pileggi, Larry

    2013-03-01

    Given the deployment delays for EUV, several next generation lithography (NGL) options are being actively researched. Several cost-effective NGL solutions, such as self-aligned double patterning through sidewall image transfer (SIT) and directed self-assembly (DSA), in conjunction with process integration challenges, mandate grating-like pattern design. As part of the GRATEdd project, we have evaluated the design cost of grating-based design for ASICs (application specific ICs). Based on our observations we have engineered fundamental changes to the primary ASIC design components to make scaling affordable and useful in deeply scaled sub-20 nm technologies: unidirectional-M1 based standard cells, application-specific smart SRAM synthesis, and statistical and self-healing analog design.

  14. Rethinking behavioral health processes by using design for six sigma.

    PubMed

    Lucas, Anthony G; Primus, Kelly; Kovach, Jamison V; Fredendall, Lawrence D

    2015-02-01

    Clinical evidence-based practices are strongly encouraged and commonly utilized in the behavioral health community. However, evidence-based practices that are related to quality improvement processes, such as Design for Six Sigma, are often not used in behavioral health care. This column describes the unique partnership formed between a behavioral health care provider in the greater Pittsburgh area, a nonprofit oversight and monitoring agency for behavioral health services, and academic researchers. The authors detail how the partnership used the multistep process outlined in Design for Six Sigma to completely redesign the provider's intake process. Implementation of the redesigned process increased access to care, decreased bad debt and uncollected funds, and improved cash flow--while consumer satisfaction remained high.

  15. Visualization System Requirements for Data Processing Pipeline Design and Optimization.

    PubMed

    von Landesberger, Tatiana; Fellner, Dieter; Ruddle, Roy

    2016-08-25

    The rising quantity and complexity of data creates a need to design and optimize data processing pipelines - the set of data processing steps, parameters and algorithms that perform operations on the data. Visualization can support this process but, although there are many examples of systems for visual parameter analysis, there remains a need to systematically assess users' requirements and match those requirements to exemplar visualization methods. This article presents a new characterization of the requirements for pipeline design and optimization. This characterization is based on both a review of the literature and first-hand assessment of eight application case studies. We also match these requirements with exemplar functionality provided by existing visualization tools. Thus, we provide end-users and visualization developers with a way of identifying functionality that addresses data processing problems in an application. We also identify seven future challenges for visualization research that are not met by the capabilities of today's systems.

  16. A Digital Methodology for the Design Process of Aerospace Assemblies with Sustainable Composite Processes & Manufacture

    NASA Astrophysics Data System (ADS)

    McEwan, W.; Butterfield, J.

    2011-05-01

    The well established benefits of composite materials are driving a significant shift in design and manufacture strategies for original equipment manufacturers (OEMs). Thermoplastic composites have advantages over the traditional thermosetting materials with regards to sustainability and environmental impact, features which are becoming increasingly pertinent in the aerospace arena. However, when sustainability and environmental impact are considered as design drivers, integrated methods for part design and product development must be developed so that any benefits of sustainable composite material systems can be assessed during the design process. These methods must include mechanisms to account for process induced part variation and techniques related to re-forming, recycling and decommissioning, which are in their infancy. It is proposed in this paper that predictive techniques related to material specification, part processing and product cost of thermoplastic composite components, be integrated within a Through Life Management (TLM) product development methodology as part of a larger strategy of product system modeling to improve disciplinary concurrency, realistic part performance, and to place sustainability at the heart of the design process. This paper reports the enhancement of digital manufacturing tools as a means of drawing simulated part manufacturing scenarios, real time costing mechanisms, and broader lifecycle performance data capture into the design cycle. The work demonstrates predictive processes for sustainable composite product manufacture and how a Product-Process-Resource (PPR) structure can be customised and enhanced to include design intent driven by `Real' part geometry and consequent assembly. your paper.

  17. The engineering design process as a model for STEM curriculum design

    NASA Astrophysics Data System (ADS)

    Corbett, Krystal Sno

    Engaging pedagogics have been proven to be effective in the promotion of deep learning for science, technology, engineering, and mathematics (STEM) students. In many cases, academic institutions have shown a desire to improve education by implementing more engaging techniques in the classroom. The research framework established in this dissertation has been governed by the axiom that students should obtain a deep understanding of fundamental topics while being motivated to learn through engaging techniques. This research lays a foundation for future analysis and modeling of the curriculum design process where specific educational research questions can be considered using standard techniques. Further, a clear curriculum design process is a key step towards establishing an axiomatic approach for engineering education. A danger is that poor implementation of engaging techniques will counteract the intended effects. Poor implementation might provide students with a "fun" project, but not the desired deep understanding of the fundamental STEM content. Knowing that proper implementation is essential, this dissertation establishes a model for STEM curriculum design, based on the well-established engineering design process. Using this process as a perspective to model curriculum design allows for a structured approach. Thus, the framework for STEM curriculum design, established here, provides a guided approach for seamless integration of fundamental topics and engaging pedagogics. The main steps, or phases, in engineering design are: Problem Formulation, Solution Generation, Solution Analysis, and Solution Implementation. Layering engineering design with education curriculum theory, this dissertation establishes a clear framework for curriculum design. Through ethnographic engagement by this researcher, several overarching themes are revealed through the creation of curricula using the design process. The application of the framework to specific curricula was part of this…

  18. Design of self-processing antimicrobial peptides for plant protection.

    PubMed

    Powell, W A; Catranis, C M; Maynard, C A

    2000-08-01

    Small antimicrobial peptides are excellent candidates for inclusion in self-processing proteins that could be used to confer pathogen resistance in transgenic plants. Antimicrobial peptides as small as 22 amino acids in length have been designed to incorporate the residual amino acids left from protein processing by the tobacco etch virus (TEV) NIa protease. Also, by minimizing the length of these peptides and the number of highly hydrophobic residues, haemolytic activity was reduced without affecting the peptides' antimicrobial activity.

  19. Smart Projectiles: Design Guidelines and Development Process Keys to Success

    DTIC Science & Technology

    2010-10-01

    inspections or improper component torque, etc. If these are to be avoided, visibility of the manufacturing process is essential. For a design... material properties at even room temperature are unknown or depend upon the mixing of two or more ingredients. The only solution is to create dog... minute of spaces. With the cautions previously mentioned on wet processes in place, every joint that will be exposed to gun gases must be sealed by

  20. Optimization of Forming Processes in Microstructure Sensitive Design

    NASA Astrophysics Data System (ADS)

    Garmestani, H.; Li, D. S.

    2004-06-01

    Optimization of the forming processes from initial microstructures of raw materials to desired microstructures of final products is an important topic in materials design. Processing path model proposed in this study gives an explicit mathematical solution about how the microstructure evolves during thermomechanical processing. Based on a conservation principle in the orientation space (originally proposed by Bunge), this methodology is independent of the underlying deformation mechanisms. The evolutions of texture coefficients are modeled using a texture evolution matrix calculated from the experimental results. For the same material using the same processing method, the texture evolution matrix is the same. It does not change with the initial texture. This processing path model provides functions of processing paths and streamlines.
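The processing path model described above can be sketched numerically: assuming a linear evolution of texture coefficients, an evolution matrix can be fitted from measured snapshots and then applied to a different initial texture, reflecting the claim that the matrix does not depend on the initial texture. The coefficient values and dimensions below are invented for illustration, not taken from the study.

```python
import numpy as np

# Hypothetical texture coefficient vectors measured at successive
# deformation steps (3 coefficients, 4 snapshots) -- illustrative data only.
snapshots = np.array([
    [1.00, 0.10, 0.00],
    [0.80, 0.30, 0.10],
    [0.70, 0.40, 0.20],
    [0.65, 0.45, 0.30],
])

# Fit a single linear evolution matrix M with c_{k+1} ~= M c_k
# (least squares over all consecutive snapshot pairs).
X, Y = snapshots[:-1], snapshots[1:]
M = np.linalg.lstsq(X, Y, rcond=None)[0].T

# Apply the same M to a *different* initial texture to trace its
# processing path through several steps.
c = np.array([0.8, 0.0, 0.2])
path = [c]
for _ in range(3):
    c = M @ c
    path.append(c)
```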

  1. Design Considerations for the Construction and Operation of Flour Milling Facilities. Part II: Process Design Considerations

    USDA-ARS?s Scientific Manuscript database

    Flour milling facilities have been the cornerstone of agricultural processing for centuries. Like most agri-industrial production facilities, flour milling facilities have a number of unique design requirements. Design information, to date, has been limited. In an effort to summarize state of the ...

  2. Motivating the Notion of Generic Design within Information Processing Theory: The Design Problem Space.

    ERIC Educational Resources Information Center

    Goel, Vinod; Pirolli, Peter

    The notion of generic design, while it has been around for 25 years, is not often articulated, especially within Newell and Simon's (1972) Information Processing Theory framework. Design is merely lumped in with other forms of problem solving activity. Intuitively it is felt that there should be a level of description of the phenomenon which…

  3. Which Events Can Cause Iteration in Instructional Design? An Empirical Study of the Design Process

    ERIC Educational Resources Information Center

    Verstegen, D. M. L.; Barnard, Y. F.; Pilot, A.

    2006-01-01

    Instructional design is not a linear process: designers have to weigh the advantages and disadvantages of alternative solutions, taking into account different kinds of conflicting and changing constraints. To make sure that they eventually choose the most optimal one, they have to keep on collecting information, reconsidering continuously whether…

  5. Time-Course of Muscle Mass Loss, Damage, and Proteolysis in Gastrocnemius following Unloading and Reloading: Implications in Chronic Diseases

    PubMed Central

    Chacon-Cabrera, Alba; Lund-Palau, Helena; Gea, Joaquim; Barreiro, Esther

    2016-01-01

    Background Disuse muscle atrophy is a major comorbidity in patients with chronic diseases including cancer. We sought to explore the kinetics of molecular mechanisms shown to be involved in muscle mass loss throughout time in a mouse model of disuse muscle atrophy and recovery following immobilization. Methods Body and muscle weights, grip strength, muscle phenotype (fiber type composition and morphometry and muscle structural alterations), proteolysis, contractile proteins, systemic troponin I, and mitochondrial content were assessed in gastrocnemius of mice exposed to periods (1, 2, 3, 7, 15 and 30 days) of non-invasive hindlimb immobilization (plastic splint, I cohorts) and in those exposed to reloading for different time-points (1, 3, 7, 15, and 30 days, R cohorts) following a seven-day period of immobilization. Groups of control animals were also used. Results Compared to non-exposed controls, muscle weight, limb strength, slow- and fast-twitch cross-sectional areas, mtDNA/nDNA, and myosin content were decreased in mice of I cohorts, whereas tyrosine release, ubiquitin-proteasome activity, muscle injury and systemic troponin I levels were increased. Gastrocnemius reloading following splint removal improved muscle mass loss, strength, fiber atrophy, injury, myosin content, and mtDNA/nDNA, while reducing ubiquitin-proteasome activity and proteolysis. Conclusions A consistent program of molecular and cellular events leading to reduced gastrocnemius muscle mass and mitochondrial content and reduced strength, enhanced proteolysis, and injury, was seen in this non-invasive mouse model of disuse muscle atrophy. Reloading of the muscle following removal of the splint significantly improved the alterations seen during unloading, characterized by a specific kinetic profile of molecular events involved in muscle regeneration. These findings have implications in patients with chronic diseases including cancer in whom physical activity may be severely compromised.

  6. Akt-dependent and Akt-independent pathways are involved in protein synthesis activation during reloading of disused soleus muscle.

    PubMed

    Mirzoev, Timur M; Tyganov, Sergey A; Shenkman, Boris S

    2017-03-01

    The purpose of our study was to assess the contribution of insulin-like growth factor-1-dependent and phosphatidic acid-dependent signaling pathways to activation of protein synthesis (PS) in rat soleus muscle during early recovery from unloading. Wistar rats were divided into: Control, 14HS [14-day hindlimb suspension (HS)], 3R+placebo (3-day reloading + saline administration), 3R+Wort (3-day reloading + wortmannin administration), 3R+But (3-day reloading + 1-butanol administration). SUnSET and Western blot analyses were used in this study. Wortmannin and 1-butanol induced a decrease in protein kinase B (phospho-Akt) and the rate of PS (P < 0.05) versus Control. In 3R+placebo and 3R+Wort, phosphorylation of glycogen synthase kinase 3 beta (phospho-GSK-3β) was increased versus Control (P < 0.05). Wortmannin administration during reloading did not alter phospho-p70S6K (70 kDa ribosomal protein S6 kinase) versus 3R+placebo. In 3R+But, there was a decline in phospho-GSK-3β versus 3R+placebo and Control. In 3R+But, there was a decrease in phospho-p70S6K (P < 0.05) versus 3R+placebo. These results suggest that PS activation during 3-day reloading following 14HS involves both Akt-dependent and Akt-independent pathways. Muscle Nerve 55: 393-399, 2017. © 2016 Wiley Periodicals, Inc.

  7. A FRAMEWORK TO DESIGN AND OPTIMIZE CHEMICAL FLOODING PROCESSES

    SciTech Connect

    Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori

    2005-07-01

    The goal of this proposed research is to provide an efficient and user-friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user-friendly interface to identify the variables that have the most impact on oil recovery using the concept of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objective of Task 1 is to develop three primary modules representing reservoir, chemical, and well data. The modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables and to develop the response surface maps to identify the significant variables from each module. The objective of Task 3 is to develop an economic model designed specifically for the chemical processes targeted in this proposal and to interface it with UTCHEM production output. Task 4 covers validation of the framework and simulations of oil reservoirs to screen, design, and optimize the chemical processes.
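The experimental-design step this abstract describes can be illustrated with a minimal sketch: a two-level factorial design over a few flooding variables, a stand-in function playing the role of the reservoir simulator, and main-effect estimates to flag which variable most impacts recovery. The variable names and response function are invented for illustration; the actual framework runs UTCHEM and fits full response surfaces.

```python
import numpy as np
from itertools import product

# Two-level full factorial design over three hypothetical variables
# (e.g. surfactant slug size, salinity, polymer concentration).
levels = [-1, 1]
design = np.array(list(product(levels, repeat=3)), dtype=float)

# Stand-in for running the reservoir simulator at each design point:
# a made-up recovery response with one dominant variable.
def simulated_recovery(x):
    return 50 + 8 * x[0] + 2 * x[1] - 1 * x[2] + 0.5 * x[0] * x[1]

y = np.array([simulated_recovery(x) for x in design])

# Main-effect estimate per variable: mean response at +1 minus at -1.
effects = [y[design[:, j] == 1].mean() - y[design[:, j] == -1].mean()
           for j in range(3)]

# The largest |effect| flags the variable with the most impact on recovery.
ranked = sorted(range(3), key=lambda j: -abs(effects[j]))
```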

  8. A Framework to Design and Optimize Chemical Flooding Processes

    SciTech Connect

    Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori

    2006-08-31

    The goal of this proposed research is to provide an efficient and user-friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user-friendly interface to identify the variables that have the most impact on oil recovery using the concept of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objective of Task 1 is to develop three primary modules representing reservoir, chemical, and well data. The modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables and to develop the response surface maps to identify the significant variables from each module. The objective of Task 3 is to develop an economic model designed specifically for the chemical processes targeted in this proposal and to interface it with UTCHEM production output. Task 4 covers validation of the framework and simulations of oil reservoirs to screen, design, and optimize the chemical processes.

  9. A formulation of metamodel implementation processes for complex systems design

    NASA Astrophysics Data System (ADS)

    Daberkow, Debora Daniela

    Complex systems design poses an interesting as well as demanding information management problem for system level integration and design. The high interconnectivity of disciplines combined with the specific knowledge and expertise in each of these calls for a system level view that is broad, as in spanning across all disciplines, while at the same time detailed enough to do the disciplinary knowledge justice. The treatment of this requires highly evolved information management and decision approaches, which result in design methodologies that can handle this high degree of complexity. The solution is to create models within the design process, which predict meaningful metrics representative of the various disciplinary analyses that can be quickly evaluated and thus serve in system level decision making and optimization. Such models approximate the physics-based analysis codes used in each of the disciplines and are called metamodels since effectively, they model the (physics-based) models on which the disciplinary analysis codes are based. The thesis formulates a new metamodel implementation process to be used in complex systems design, utilizing a Gaussian Process prediction method. It is based on a Bayesian probability and inference approach and as such returns a variance prediction along with the most likely value, thus giving an estimate also for the confidence in the prediction. Within this thesis, the applicability and appropriateness at the theoretical as well as practical level are investigated, and proof-of-concept implementations at the disciplinary and system levels are provided.
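The Gaussian Process prediction method the thesis builds on can be sketched in a few lines, assuming a squared-exponential (RBF) kernel and noise-free samples; the 1-D training data here merely stand in for expensive disciplinary analysis runs. The key behavior is the one described above: along with the most likely value, the metamodel returns a variance that grows away from the training data, giving an estimate of confidence in the prediction.

```python
import numpy as np

# Minimal Gaussian Process metamodel sketch with an RBF kernel.
def rbf(A, B, length=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length ** 2)

X = np.array([[0.0], [1.0], [2.0], [3.0]])   # sampled design points
y = np.sin(X[:, 0])                          # stand-in analysis output

def gp_predict(Xs, jitter=1e-9):
    K = rbf(X, X) + jitter * np.eye(len(X))  # kernel on training points
    Ks = rbf(Xs, X)
    Kss = rbf(Xs, Xs)
    alpha = np.linalg.solve(K, y)
    mean = Ks @ alpha                        # most likely prediction
    cov = Kss - Ks @ np.linalg.solve(K, Ks.T)
    return mean, np.sqrt(np.clip(np.diag(cov), 0.0, None))

# Near the data the predictive std is small; far from it, the std
# reverts toward the prior -- the confidence estimate described above.
mean, std = gp_predict(np.array([[1.5], [10.0]]))
```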

  11. A FRAMEWORK TO DESIGN AND OPTIMIZE CHEMICAL FLOODING PROCESSES

    SciTech Connect

    Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori

    2004-11-01

    The goal of this proposed research is to provide an efficient and user-friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user-friendly interface to identify the variables that have the most impact on oil recovery using the concept of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objective of Task 1 is to develop three primary modules representing reservoir, chemical, and well data. The modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables and to develop the response surface maps to identify the significant variables from each module. The objective of Task 3 is to develop an economic model designed specifically for the chemical processes targeted in this proposal and to interface it with UTCHEM production output. Task 4 covers validation of the framework and simulations of oil reservoirs to screen, design, and optimize the chemical processes.

  12. Noise control, sound, and the vehicle design process

    NASA Astrophysics Data System (ADS)

    Donavan, Paul

    2005-09-01

    For many products, noise and sound are viewed as necessary evils that need to be dealt with in order to bring the product successfully to market. They are generally not product "exciters," although some vehicle manufacturers do tune and advertise specific sounds to enhance the perception of their products. In this paper, influencing the design process for the "evils," such as wind noise and road noise, is considered in more detail. There are three ingredients to successfully dealing with the evils in the design process. The first is knowing how excess noise affects the end customer in a tangible manner, and how that affects customer satisfaction and, ultimately, sales. The second is having and delivering the knowledge of what is required of the design to achieve a satisfactory or even better level of noise performance. The third is having the commitment of the designers to incorporate that knowledge into their part, subsystem, or system. In this paper, the elements of each of these ingredients are discussed in some detail and the attributes of a successful design process are enumerated.

  13. The Role of Dialogic Processes in Designing Career Expectations

    ERIC Educational Resources Information Center

    Bangali, Marcelline; Guichard, Jean

    2012-01-01

    This article examines the role played by dialogic processes in the designing or redesigning of future expectations during a career guidance intervention. It discusses a specific method ("Giving instruction to a double") developed and used during career counseling sessions with two recent doctoral graduates. It intends both to help them outline or…

  14. Portfolio Assessment on Chemical Reactor Analysis and Process Design Courses

    ERIC Educational Resources Information Center

    Alha, Katariina

    2004-01-01

    Assessment determines what students regard as important: if a teacher wants to change students' learning, he/she should change the methods of assessment. This article describes the use of portfolio assessment on five courses dealing with chemical reactor and process design during the years 1999-2001. Although the use of portfolio was a new…

  15. DESIGNING EFFICIENT, ECONOMIC AND ENVIRONMENTALLY FRIENDLY CHEMICAL PROCESSES

    EPA Science Inventory

    A catalytic reforming process has been studied using hierarchical design and simulation calculations. Approximations for the fugitive emissions indicate which streams allow the most value to be lost and which have the highest potential environmental impact. One can use this inform...

  16. A Process Chart to Design Experiential Learning Projects

    ERIC Educational Resources Information Center

    Zhu, Suning; Wu, Yun; Sankar, Chetan S.

    2016-01-01

    A high-impact practice is to incorporate experiential learning projects when teaching difficult subject matters so as to enhance students' understanding and interest in the course content. But there is limited research on how to design and execute such projects. Therefore, we propose a framework based on the processes described by the Project…

  17. A SYSTEMATIC PROCEDURE FOR DESIGNING PROCESSES WITH MULTIPLE ENVIRONMENTAL OBJECTIVES

    EPA Science Inventory

    Evaluation and analysis of multiple objectives are very important in designing environmentally benign processes. They require a systematic procedure for solving multi-objective decision-making problems due to the complex nature of the problems and the need for complex assessment....

  18. Developing 21st Century Process Skills through Project Design

    ERIC Educational Resources Information Center

    Yoo, Jeong-Ju; MacDonald, Nora M.

    2014-01-01

    The goal of this paper is to illustrate how the promotion of 21st Century process skills can be used to enhance student learning and workplace skill development: thinking, problem solving, collaboration, communication, leadership, and management. As an illustrative case, fashion merchandising and design students conducted research for a…

  19. Examining Teacher Thinking: Constructing a Process to Design Curricular Adaptations.

    ERIC Educational Resources Information Center

    Udvari-Solner, Alice

    1996-01-01

    This description of a curricular adaptation decision-making process focuses on tenets of reflective practice as teachers design instruction for students in heterogeneous classrooms. A case example illustrates how an elementary teaching team transformed lessons to accommodate a wide range of learners in a multiage first- and second-grade classroom.…

  20. Quality Control through Design and Process: Gambrel Roof Truss Challenge

    ERIC Educational Resources Information Center

    Ward, Dell; Jones, James

    2011-01-01

    Customers determine whether a product fulfills their needs or satisfies them. "Quality control", then, is the process of finding out what the customer wants, along with designing, producing, delivering, and servicing the product--and ultimately satisfying the customer's expectations. For many years, people considered a product to be of good…

  1. Experiential Learning: A Course Design Process for Critical Thinking

    ERIC Educational Resources Information Center

    Hamilton, Janet G.; Klebba, Joanne M.

    2011-01-01

    This article describes a course design process to improve the effectiveness of using experiential learning techniques to foster critical thinking skills. The authors examine prior research to identify essential dimensions of experiential learning in relation to higher order thinking. These dimensions provide key insights for the selection of…

  5. INCORPORATING INDUSTRIAL ECOLOGY INTO HIERARCHICAL CHEMICAL PROCESS DESIGN

    EPA Science Inventory

    Incorporating Industrial Ecology into Hierarchical Chemical Process Design: Determining Targets for the Exchange of Waste

    The exchange of waste to be used as a recycled feed has long been encouraged by practitioners of industrial ecology. Industrial ecology is a field t...

  6. GREENING OF OXIDATION CATALYSIS THROUGH IMPROVED CATALYST AND PROCESS DESIGN

    EPA Science Inventory


    Greening of Oxidation Catalysis Through Improved Catalysts and Process Design
    Michael A. Gonzalez*, Thomas Becker, and Raymond Smith

    United States Environmental Protection Agency, Office of Research and Development, National Risk Management Research Laboratory, 26 W...

  9. USING GENETIC ALGORITHMS TO DESIGN ENVIRONMENTALLY FRIENDLY PROCESSES

    EPA Science Inventory

    Genetic algorithm calculations are applied to the design of chemical processes to achieve improvements in environmental and economic performance. By finding the set of Pareto (i.e., non-dominated) solutions one can see how different objectives, such as environmental and economic ...
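The Pareto (non-dominated) filtering step this abstract refers to can be sketched directly: keep every candidate process design that no other candidate dominates on both objectives. The designs and objective values below are made up for illustration; a genetic algorithm would generate and evolve such candidates rather than enumerate them.

```python
# Each candidate process design has two objectives to minimize,
# here (cost, environmental impact) -- values are illustrative only.
designs = {
    "A": (3.0, 9.0),
    "B": (5.0, 4.0),
    "C": (6.0, 5.0),   # dominated by B (worse on both objectives)
    "D": (8.0, 2.0),
}

def dominates(p, q):
    """p dominates q if p is no worse in every objective and better in one."""
    return (all(a <= b for a, b in zip(p, q))
            and any(a < b for a, b in zip(p, q)))

# The Pareto set: candidates not dominated by any other candidate.
pareto = {name for name, p in designs.items()
          if not any(dominates(q, p) for q in designs.values())}
# pareto is the trade-off front between the two objectives
```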

  13. Ingenuity in Action: Connecting Tinkering to Engineering Design Processes

    ERIC Educational Resources Information Center

    Wang, Jennifer; Werner-Avidon, Maia; Newton, Lisa; Randol, Scott; Smith, Brooke; Walker, Gretchen

    2013-01-01

    The Lawrence Hall of Science, a science center, seeks to replicate real-world engineering at the "Ingenuity in Action" exhibit, which consists of three open-ended challenges. These problems encourage children to engage in engineering design processes and problem-solving techniques through tinkering. We observed and interviewed 112…

  17. The Design Process of Corporate Universities: A Stakeholder Approach

    ERIC Educational Resources Information Center

    Patrucco, Andrea Stefano; Pellizzoni, Elena; Buganza, Tommaso

    2017-01-01

    Purpose: Corporate universities (CUs) have been experiencing tremendous growth during the past years and can represent a real driving force for structured organizations. The paper aims to define the process of CU design shaped around company strategy. For each step, the authors propose specific roles, activities and methods…

  18. Design Exploration of Engineered Materials, Products, and Associated Manufacturing Processes

    NASA Astrophysics Data System (ADS)

    Shukla, Rishabh; Kulkarni, Nagesh H.; Gautham, B. P.; Singh, Amarendra K.; Mistree, Farrokh; Allen, Janet K.; Panchal, Jitesh H.

    2015-01-01

    In the past few years, ICME-related research has been directed towards the study of multi-scale materials design. However, relatively little has been reported on model-based methods that are of relevance to industry for the realization of engineered materials, products, and associated industrial manufacturing processes. Computational models used in the realization of engineered materials and products are fraught with uncertainty, have different levels of fidelity, are incomplete and are even likely to be inaccurate. In light of this, we adopt a robust design strategy that facilitates the exploration of the solution space thereby providing decision support to a design engineer. In this paper, we describe a foundational construct embodied in our method for design exploration, namely, the compromise Decision Support Problem. We introduce a problem that we are using to establish the efficacy of our method. It involves the integrated design of steel and gears, traversing the chain of steel making, mill production, and evolution of the material during these processes, and linking this to the mechanical design and manufacture of the gear. We provide an overview of our method to determine the operating set points for the ladle, tundish and caster operations necessary to manufacture steel of a desired set of properties. Finally, we highlight the efficacy of our method.

  19. Tunable photonic filters: a digital signal processing design approach.

    PubMed

    Binh, Le Nguyen

    2009-05-20

    Digital signal processing techniques are used for synthesizing tunable optical filters with variable bandwidth and centered reference frequency, including tunability of the low-pass, high-pass, bandpass, and bandstop optical filters. Potential applications of such filters are discussed, and the design techniques and properties of recursive digital filters are outlined. The basic filter structures, namely, the first-order all-pole optical filter (FOAPOF) and the first-order all-zero optical filter (FOAZOF), are described, and finally the design process of tunable optical filters and the designs of the second-order Butterworth low-pass, high-pass, bandpass, and bandstop tunable optical filters are presented. We further identify that the all-zero and all-pole networks correspond to the well-known optical principles of interference and resonance, respectively, which makes tunable optical filters very straightforward to implement.
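    The two basic structures named in this abstract can be sketched as discrete-time difference equations. This is a minimal numeric illustration, assuming a single coupling coefficient k; it is not the optical implementation from the paper.

```python
# Discrete-time sketches of the two basic structures named in the
# abstract: a first-order all-pole (recursive/resonant) filter and a
# first-order all-zero (feed-forward/interference) filter.
# The coefficient k (0 < k < 1) is an illustrative coupling value,
# not a parameter taken from the paper.

def foapof(x, k=0.5):
    """First-order all-pole filter: y[n] = (1-k)*x[n] + k*y[n-1]."""
    y, prev = [], 0.0
    for xn in x:
        prev = (1.0 - k) * xn + k * prev
        y.append(prev)
    return y

def foazof(x, k=0.5):
    """First-order all-zero filter: y[n] = (1-k)*x[n] + k*x[n-1]."""
    y, prev = [], 0.0
    for xn in x:
        y.append((1.0 - k) * xn + k * prev)
        prev = xn
    return y
```

    Both sketches have unity DC gain, so a constant input settles to the input value: the all-pole (resonance) response converges geometrically, while the all-zero (interference) response settles after one sample.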

  20. Design characteristics for facilities which process hazardous particulate

    SciTech Connect

    Abeln, S.P.; Creek, K.; Salisbury, S.

    1998-12-01

    Los Alamos National Laboratory is establishing a research and processing capability for beryllium. The unique properties of beryllium, including light weight, rigidity, thermal conductivity, heat capacity, and nuclear properties, make it critical to a number of US defense and aerospace programs. Concomitant with the unique engineering properties are the health hazards associated with processing beryllium in a particulate form and the potential for worker inhalation of aerosolized beryllium. Beryllium has the lowest airborne standard for worker protection compared to all other nonradioactive metals by more than an order of magnitude. This paper describes the design characteristics of the new beryllium facility at Los Alamos as they relate to protection of the workforce. Design characteristics to be reviewed include: facility layout, support systems to minimize aerosol exposure and spread, and detailed review of the ventilation system design for general room air cleanliness and extraction of particulate at the source.

  1. Architectural design of heterogeneous metallic nanocrystals--principles and processes.

    PubMed

    Yu, Yue; Zhang, Qingbo; Yao, Qiaofeng; Xie, Jianping; Lee, Jim Yang

    2014-12-16

    CONSPECTUS: Heterogeneous metal nanocrystals (HMNCs) are a natural extension of simple metal nanocrystals (NCs), but as a research topic, they have been much less explored until recently. HMNCs are formed by integrating metal NCs of different compositions into a common entity, similar to the way atoms are bonded to form molecules. HMNCs can be built to exhibit an unprecedented architectural diversity and complexity by programming the arrangement of the NC building blocks ("unit NCs"). The architectural engineering of HMNCs involves the design and fabrication of the architecture-determining elements (ADEs), i.e., unit NCs with precise control of shape and size, and their relative positions in the design. Similar to molecular engineering, where structural diversity is used to create more property variations for application explorations, the architectural engineering of HMNCs can similarly increase the utility of metal NCs by offering a suite of properties to support multifunctionality in applications. The architectural engineering of HMNCs calls for processes and operations that can execute the design. Some enabling technologies already exist in the form of classical micro- and macroscale fabrication techniques, such as masking and etching. These processes, when used singly or in combination, are fully capable of fabricating nanoscopic objects. What is needed is a detailed understanding of the engineering control of ADEs and the translation of these principles into actual processes. For simplicity of execution, these processes should be integrated into a common reaction system and yet retain independence of control. The key to architectural diversity is therefore the independent controllability of each ADE in the design blueprint. The right chemical tools must be applied under the right circumstances in order to achieve the desired outcome. In this Account, after a short illustration of the infinite possibility of combining different ADEs to create HMNC design

  2. Analog integrated circuits design for processing physiological signals.

    PubMed

    Li, Yan; Poon, Carmen C Y; Zhang, Yuan-Ting

    2010-01-01

    Analog integrated circuits (ICs) designed for processing physiological signals are important building blocks of wearable and implantable medical devices used for health monitoring or restoring lost body functions. Due to the nature of physiological signals and the corresponding application scenarios, the ICs designed for these applications should have low power consumption, low cutoff frequency, and low input-referred noise. In this paper, techniques for designing the analog front-end circuits with these three characteristics will be reviewed, including subthreshold circuits, bulk-driven MOSFETs, floating gate MOSFETs, and log-domain circuits to reduce power consumption; methods for designing fully integrated low cutoff frequency circuits; as well as chopper stabilization (CHS) and other techniques that can be used to achieve a high signal-to-noise performance. Novel applications using these techniques will also be discussed.

  3. Optimal design of upstream processes in biotransformation technologies.

    PubMed

    Dheskali, Endrit; Michailidi, Katerina; de Castro, Aline Machado; Koutinas, Apostolis A; Kookos, Ioannis K

    2017-01-01

    In this work a mathematical programming model for the optimal design of the bioreaction section of biotechnological processes is presented. Equations for the estimation of the equipment cost derived from a recent publication by the US National Renewable Energy Laboratory (NREL) are also summarized. The cost-optimal design of process units and the optimal scheduling of their operation can be obtained using the proposed formulation that has been implemented in software available from the journal web page or the corresponding author. The proposed optimization model can be used to quantify the effects of decisions taken at a lab scale on the industrial scale process economics. It is of paramount importance to note that this can be achieved at the early stage of the development of a biotechnological project. Two case studies are presented that demonstrate the usefulness and potential of the proposed methodology.
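    The NREL-derived equipment-cost equations mentioned above are typically power-law correlations. A minimal sketch, assuming the classic six-tenths-rule exponent and hypothetical reference values (the actual correlations use equipment-specific parameters):

```python
def equipment_cost(size, ref_size, ref_cost, exponent=0.6):
    """Power-law equipment cost estimate, C = C_ref * (S / S_ref)**n.
    The 0.6 default is the classic 'six-tenths rule'; real NREL
    correlations use equipment-specific exponents and reference points."""
    return ref_cost * (size / ref_size) ** exponent

# Illustrative numbers only: a 200 m^3 bioreactor scaled from a
# hypothetical 100 m^3 / $500k reference point.
cost = equipment_cost(200.0, 100.0, 500_000.0)
```

    Doubling the vessel size raises the estimated cost by a factor of 2^0.6 ≈ 1.52 rather than 2, which is the economy of scale such correlations capture.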

  4. Information processing systems, reasoning modules, and reasoning system design methods

    SciTech Connect

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2016-08-23

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  5. Information processing systems, reasoning modules, and reasoning system design methods

    SciTech Connect

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2015-08-18

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  6. Information processing systems, reasoning modules, and reasoning system design methods

    SciTech Connect

    Hohimer, Ryan E; Greitzer, Frank L; Hampton, Shawn D

    2014-03-04

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.
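    The dispatch idea described in these three patent records — reasoning modules registered per ontology classification type, each processing only the abstractions of its type — can be sketched as follows. All class, type, and field names here are hypothetical illustrations, not the patented implementation.

```python
# Sketch: a reasoning system routes each abstraction in a semantic graph
# to the reasoning module registered for that abstraction's ontology
# classification type. Abstractions are modeled as plain dicts.

class ReasoningSystem:
    def __init__(self):
        self.modules = {}  # classification type -> reasoning module (callable)

    def register(self, classification, handler):
        self.modules[classification] = handler

    def process(self, semantic_graph):
        results = []
        for abstraction in semantic_graph:
            handler = self.modules.get(abstraction["type"])
            if handler:  # abstractions with no registered module are skipped
                results.append(handler(abstraction))
        return results

system = ReasoningSystem()
system.register("Person", lambda a: f"person:{a['name']}")
system.register("Event",  lambda a: f"event:{a['name']}")

graph = [{"type": "Person", "name": "alice"},
         {"type": "Event",  "name": "login"},
         {"type": "Place",  "name": "lab"}]   # no module for "Place"
out = system.process(graph)
```

    Each module sees only abstractions of its own classification type, which is the separation of concerns the claims describe.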

  7. The Gains Design Process: How to do Structured Design of User Interfaces in Any Software Environment

    NASA Astrophysics Data System (ADS)

    Lindeman, Martha J.

    This paper describes a user-interaction design process created and used by a consultant to solve two challenges: (1) how to decrease the need for changes in the user interface by subsequent system releases without doing big design up-front and (2) how to apply a structured user-interaction design process no matter when brought into a project or what software methodology was being used. The four design levels in the process parallel Beck and Fowler’s four planning levels described in their book Planning Extreme Programming. The design process is called “GAINS” because the user-interaction designer has only Attraction, Information and Navigation to connect users’ Goals with the project sponsors’ criteria for Success. Thus there are five questions, one for each letter of the acronym GAINS, asked at each of four levels of design: The first two design levels, Rough Plan and Big Plan, focus on business-process actions and objects that define users’ goals. The next two levels, Release Planning and Iteration Planning, focus on the user interface objects that support the tasks necessary to achieve those goals. Release Planning identifies the displays the user sees for each goal included in that release, and also the across-display navigation for the proposed functionality. Iteration Planning focuses at a lower level of interaction, such as the within-display navigation among controls. For a voice system, the word “sees” would be changed to “hears,” but the design process and the levels of focus are the same for user interfaces that are vision output (e.g., GUIs), voice output (e.g., VRs), or multimodal.

  8. Penetrator reliability investigation and design exploration : from conventional design processes to innovative uncertainty-capturing algorithms.

    SciTech Connect

    Martinez-Canales, Monica L.; Heaphy, Robert; Gramacy, Robert B.; Taddy, Matt; Chiesa, Michael L.; Thomas, Stephen W.; Swiler, Laura Painton; Hough, Patricia Diane; Lee, Herbert K. H.; Trucano, Timothy Guy; Gray, Genetha Anne

    2006-11-01

    This project focused on research and algorithmic development in optimization under uncertainty (OUU) problems driven by earth penetrator (EP) designs. While taking into account uncertainty, we addressed three challenges in current simulation-based engineering design and analysis processes. The first challenge required leveraging small local samples, already constructed by optimization algorithms, to build effective surrogate models. We used Gaussian Process (GP) models to construct these surrogates. We developed two OUU algorithms using 'local' GPs (OUU-LGP) and one OUU algorithm using 'global' GPs (OUU-GGP) that appear competitive or better than current methods. The second challenge was to develop a methodical design process based on multi-resolution, multi-fidelity models. We developed a Multi-Fidelity Bayesian Auto-regressive process (MF-BAP). The third challenge involved the development of tools that are computationally feasible and accessible. We created MATLAB® and initial DAKOTA implementations of our algorithms.
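    The surrogate-building step can be illustrated with a minimal Gaussian Process regression in plain NumPy. The squared-exponential kernel, length scale, and noise level below are illustrative assumptions, not the project's OUU-LGP/OUU-GGP settings.

```python
import numpy as np

# Minimal GP surrogate: fit a posterior mean to a handful of expensive
# "simulation" samples, then predict cheaply at new design points.

def rbf_kernel(A, B, length_scale=1.0):
    """Squared-exponential kernel between row-vector sample sets A and B."""
    d2 = (A[:, None, :] - B[None, :, :]) ** 2
    return np.exp(-d2.sum(axis=2) / (2.0 * length_scale ** 2))

def gp_posterior_mean(X_train, y_train, X_test, noise=1e-6):
    """Posterior mean of a zero-mean GP conditioned on (X_train, y_train)."""
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    alpha = np.linalg.solve(K, y_train)
    return rbf_kernel(X_test, X_train) @ alpha

# Surrogate for a toy objective f(x) = x^2 sampled at five points.
X = np.array([[0.0], [0.5], [1.0], [1.5], [2.0]])
y = (X ** 2).ravel()
pred = gp_posterior_mean(X, y, np.array([[1.0], [1.25]]))
```

    In an OUU loop, such a surrogate replaces the expensive penetrator simulation inside the optimizer; the "local" variants fit it only to samples near the current iterate.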

  9. Process-induced bias: a study of resist design and process implications

    NASA Astrophysics Data System (ADS)

    Fonseca, Carlos; Scheer, Steven; Carcasi, Michael; Shibata, Tsuyoshi; Otsuka, Takahisa

    2008-03-01

    Critical dimension uniformity (CDU) has both across field and across wafer components. CD error generated by across wafer etching non-uniformity and other process variations can have a significant impact on CDU. To correct these across wafer variations, compensation by exposure dose and/or PEB temperature, have been proposed. These compensation strategies often focus on a specific structure without evaluating how process compensation impacts the CDU of all structures to be printed in a given design. In a previous study, the authors evaluated the relative merits of across wafer dose and PEB temperature compensation on the process induced CD bias and CDU. For the process studied, both metrics demonstrated that using PEB temperature to control across wafer CD variation was preferable to using dose compensation. The previous study was limited to a single resist and variations to track and scanner processing were kept to a minimum. Further examination of additional resist materials has indicated that significant variation in dose and PEB temperature induced CD biases exist from material to material. It is the goal of this work to understand how resist design, as well as track and scanner processing, impact process induced bias (PIB). This is accomplished by analyzing full resist models for a range of resists that exhibit different dose and PEB temperature PIB behavior. From these models, the primary resist design contributors to PIB are isolated. A sensitivity analysis of the primary resist design as well as track and scanner processing effects will also be simulated and presented.

  10. RATES OF REACTION AND PROCESS DESIGN DATA FOR THE HYDROCARB PROCESS

    EPA Science Inventory

    The report provides experimental and process design data in support of studies for developing the coprocessing of fossil fuels with biomass by the Hydrocarb process. The experimental work includes the hydropyrolysis of biomass and the thermal decomposition of methane in a 2.44 m ...

  11. California State Library: Processing Center Design and Specifications. Volume I, System Description and Input Processing.

    ERIC Educational Resources Information Center

    Sherman, Don; Shoffner, Ralph M.

    The scope of the California State Library-Processing Center (CSL-PC) project is to develop the design and specifications for a computerized technical processing center to provide services to a network of participating California libraries. Immediate objectives are: (1) retrospective conversion of card catalogs to a machine-form data base,…

  13. CHO gene expression profiling in biopharmaceutical process analysis and design.

    PubMed

    Schaub, Jochen; Clemens, Christoph; Schorn, Peter; Hildebrandt, Tobias; Rust, Werner; Mennerich, Detlev; Kaufmann, Hitto; Schulz, Torsten W

    2010-02-01

    Increase in both productivity and product yields in biopharmaceutical process development with recombinant protein producing mammalian cells can be mainly attributed to the advancements in cell line development, media, and process optimization. Only recently, genome-scale technologies enable a system-level analysis to elucidate the complex biomolecular basis of protein production in mammalian cells promising an increased process understanding and the deduction of knowledge-based approaches for further process optimization. Here, the use of gene expression profiling for the analysis of a low titer (LT) and high titer (HT) fed batch process using the same IgG producing CHO cell line was investigated. We found that gene expression (i) significantly differed in HT versus LT process conditions due to differences in applied chemically defined, serum-free media, (ii) changed over the time course of the fed batch processes, and that (iii) both metabolic pathways and 14 biological functions such as cellular growth or cell death were affected. Furthermore, detailed analysis of metabolism in a standard process format revealed the potential use of transcriptomics for rational media design as is shown for the case of lipid metabolism where the product titer could be increased by about 20% based on a lipid modified basal medium. The results demonstrate that gene expression profiling can be an important tool for mammalian biopharmaceutical process analysis and optimization.

  14. A symbolic methodology to improve disassembly process design.

    PubMed

    Rios, Pedro; Blyler, Leslie; Tieman, Lisa; Stuart, Julie Ann; Grant, Ed

    2003-12-01

    Millions of end-of-life electronic components are retired annually due to the proliferation of new models and their rapid obsolescence. The recovery of resources such as plastics from these goods requires their disassembly. The time required for each disassembly and its associated cost is defined by the operator's familiarity with the product design and its complexity. Since model proliferation serves to complicate an operator's learning curve, it is worthwhile to investigate the benefits to be gained in a disassembly operator's preplanning process. Effective disassembly process design demands the application of green engineering principles, such as those developed by Anastas and Zimmerman (Environ. Sci. Technol. 2003, 37, 94A-101A), which include regard for product complexity, structural commonality, separation energy, material value, and waste prevention. This paper introduces the concept of design symbols to help the operator more efficiently survey product complexity with respect to location and number of fasteners to remove a structure that is common to all electronics: the housing. With a sample of 71 different computers, printers, and monitors, we demonstrate that appropriate symbols reduce the total disassembly planning time by 13.2 min. Such an improvement could well make efficient the separation of plastic that would otherwise be destined for waste-to-energy or landfill. The symbolic methodology presented may also improve Design for Recycling and Design for Maintenance and Support.

  15. A Taguchi study of the aeroelastic tailoring design process

    NASA Technical Reports Server (NTRS)

    Bohlmann, Jonathan D.; Scott, Robert C.

    1991-01-01

    A Taguchi study was performed to determine the important players in the aeroelastic tailoring design process and to find the best composition of the optimization's objective function. The Wing Aeroelastic Synthesis Procedure (TSO) was used to ascertain the effects that factors such as composite laminate constraints, roll effectiveness constraints, and built-in wing twist and camber have on the optimum, aeroelastically tailored wing skin design. The results show the Taguchi method to be a viable engineering tool for computational inquiries, and provide some valuable lessons about the practice of aeroelastic tailoring.
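    The kind of screening a Taguchi study performs can be sketched with an L4 (2^3) orthogonal array and main-effect estimates for each factor. The factors and response values below are synthetic illustrations, not data from the TSO study.

```python
# Taguchi-style screening sketch: run an L4 orthogonal array over three
# two-level factors, then estimate each factor's main effect as the mean
# response at level 1 minus the mean response at level 0.

L4 = [  # three two-level factors, four runs
    (0, 0, 0),
    (0, 1, 1),
    (1, 0, 1),
    (1, 1, 0),
]
response = [10.0, 12.0, 20.0, 22.0]  # synthetic results of the four runs

def main_effects(array, y):
    """Per-factor main effect: mean(y | level 1) - mean(y | level 0)."""
    effects = []
    for f in range(len(array[0])):
        hi = [yi for run, yi in zip(array, y) if run[f] == 1]
        lo = [yi for run, yi in zip(array, y) if run[f] == 0]
        effects.append(sum(hi) / len(hi) - sum(lo) / len(lo))
    return effects

effects = main_effects(L4, response)
```

    With these synthetic responses, factor 0 dominates (effect 10 versus 2 and 0), which is exactly the "important players" question such a study answers with only four runs instead of a full factorial's eight.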

  16. Design of the HTGR for process heat applications

    SciTech Connect

    Vrable, D.L.; Quade, R.N.

    1980-05-01

    This paper discusses a design study of an advanced 842-MW(t) HTGR with a reactor outlet temperature of 850°C (1562°F), coupled with a chemical process whose product is hydrogen (or a mixture of hydrogen and carbon monoxide) generated by steam reforming of a light hydrocarbon mixture. It describes the plant layout and design for the major components of the primary and secondary heat transfer systems. Typical parametric system study results illustrate the capability of a computer code developed to model the plant performance and economics.

  17. Operation and design of selected industrial process heat field tests

    SciTech Connect

    Kearney, D. W.

    1981-02-01

    The DOE program of solar industrial process heat field tests has shown solar energy to be compatible with numerous industrial needs. Both the operational projects and the detailed designs of systems that are not yet operational have resulted in valuable insights into design and hardware practice. Typical of these insights are the experiences discussed for the four projects reviewed. Future solar IPH systems should benefit greatly not only from the availability of present information, but also from the wealth of operating experience from projects due to start up in 1981.

  18. A Review of the Design Process for Implantable Orthopedic Medical Devices

    PubMed Central

    Aitchison, G.A; Hukins, D.W.L; Parry, J.J; Shepherd, D.E.T; Trotman, S.G

    2009-01-01

    The design process for medical devices is highly regulated to ensure the safety of patients. This paper will present a review of the design process for implantable orthopedic medical devices. It will cover the main stages of feasibility, design reviews, design, design verification, manufacture, design validation, design transfer and design changes. PMID:19662153

  19. Remote Maintenance Design Guide for Compact Processing Units

    SciTech Connect

    Draper, J.V.

    2000-07-13

    Oak Ridge National Laboratory (ORNL) Robotics and Process Systems Division (RPSD) personnel have extensive experience working with remotely operated and maintained systems. These systems require expert knowledge in teleoperation, human factors, telerobotics, and other robotic devices so that remote equipment may be manipulated, operated, serviced, surveyed, and moved about in a hazardous environment. The RPSD staff has a wealth of experience in this area, including knowledge in the broad topics of human factors, modular electronics, modular mechanical systems, hardware design, and specialized tooling. Examples of projects that illustrate and highlight RPSD's unique experience in remote systems design and application include the following: (1) design of remote shear and remote dissolver systems in support of U.S. Department of Energy (DOE) fuel recycling research and nuclear power missions; (2) building remotely operated mobile systems for metrology and characterizing hazardous facilities in support of remote operations within those facilities; (3) construction of modular robotic arms, including the Laboratory Telerobotic Manipulator, which was designed for the National Aeronautics and Space Administration (NASA), and the Advanced ServoManipulator, which was designed for the DOE; (4) design of remotely operated laboratories, including chemical analysis and biochemical processing laboratories; (5) construction of remote systems for environmental clean up and characterization, including underwater, buried waste, underground storage tank (UST) and decontamination and dismantlement (D&D) applications. Remote maintenance has played a significant role in fuel reprocessing because of combined chemical and radiological contamination. Furthermore, remote maintenance is expected to play a strong role in future waste remediation. The compact processing units (CPUs) being designed for use in underground waste storage tank remediation are examples of improvements in systems processing

  20. Automating the packing heuristic design process with genetic programming.

    PubMed

    Burke, Edmund K; Hyde, Matthew R; Kendall, Graham; Woodward, John

    2012-01-01

    The literature shows that one-, two-, and three-dimensional bin packing and knapsack packing are difficult problems in operational research. Many techniques, including exact, heuristic, and metaheuristic approaches, have been investigated to solve these problems and it is often not clear which method to use when presented with a new instance. This paper presents an approach which is motivated by the goal of building computer systems which can design heuristic methods. The overall aim is to explore the possibilities for automating the heuristic design process. We present a genetic programming system to automatically generate a good quality heuristic for each instance. It is not necessary to change the methodology depending on the problem type (one-, two-, or three-dimensional knapsack and bin packing problems), and it therefore has a level of generality unmatched by other systems in the literature. We carry out an extensive suite of experiments and compare with the best human designed heuristics in the literature. Note that our heuristic design methodology uses the same parameters for all the experiments. The contribution of this paper is to present a more general packing methodology than those currently available, and to show that, by using this methodology, it is possible for a computer system to design heuristics which are competitive with the human designed heuristics from the literature. This represents the first packing algorithm in the literature able to claim human competitive results in such a wide variety of packing domains.
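    For context, a classic human-designed heuristic of the kind the evolved heuristics are benchmarked against is first-fit decreasing for one-dimensional bin packing:

```python
def first_fit_decreasing(items, capacity=1.0):
    """First-fit decreasing: sort items largest-first, place each into
    the first open bin with room, opening a new bin when none fits.
    One of the classic human-designed packing heuristics that
    GP-evolved heuristics are compared against."""
    bins = []
    for item in sorted(items, reverse=True):
        for b in bins:
            if sum(b) + item <= capacity:
                b.append(item)
                break
        else:
            bins.append([item])
    return bins

bins = first_fit_decreasing([0.5, 0.7, 0.5, 0.2, 0.4, 0.2, 0.5, 0.1, 0.6])
```

    A genetic programming system of the kind described here evolves the placement rule itself (the scoring expression that picks a bin), rather than hand-fixing it as "first bin that fits".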

  1. Design and implementation of a distributed Complex Event Processing system

    NASA Astrophysics Data System (ADS)

    Li, Yan; Shang, Yanlei

    2017-01-01

    Making use of the massive events from event sources such as sensors and bank transactions and extracting valuable information from them is of significant importance. Complex Event Processing (CEP), a method of detecting complex events from simple event streams, provides a solution for processing data in real time, fast and efficiently. However, a single-node CEP system cannot satisfy the requirements of processing massive event streams from multitudinous event sources. Therefore, this article designs a distributed CEP system, which combines Siddhi, a CEP engine, and Storm, a distributed real-time computation architecture. This system can construct topologies automatically based on the event streams and execution plans provided by users and process the event streams in parallel. Compared with a single-node CEP system, the distributed system achieves better performance.
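    The core CEP idea — detecting a complex event from a stream of simple events — can be sketched on a single node as follows. The event schema and the rule (three large withdrawals inside a 60-second window) are illustrative; the system described here would express such rules in Siddhi and distribute the stream across Storm workers.

```python
from collections import deque

# Single-node CEP sketch: raise an alert when `count` events with
# amount >= threshold occur within a sliding time window. Events are
# (timestamp, amount) pairs, assumed ordered by timestamp.

def detect_bursts(events, threshold=1000.0, window=60.0, count=3):
    recent, alerts = deque(), []
    for ts, amount in events:
        if amount >= threshold:
            recent.append(ts)
            while recent and ts - recent[0] > window:
                recent.popleft()  # drop matches outside the window
            if len(recent) >= count:
                alerts.append(ts)
    return alerts

stream = [(0, 1500.0), (10, 50.0), (20, 1200.0), (45, 2000.0), (200, 1700.0)]
alerts = detect_bursts(stream)
```

    The distributed version partitions the stream (e.g., by account) so that each worker runs this per-key matching independently, which is what makes the parallelization straightforward.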

  2. Mechanistic Fermentation Models for Process Design, Monitoring, and Control.

    PubMed

    Mears, Lisa; Stocks, Stuart M; Albaek, Mads O; Sin, Gürkan; Gernaey, Krist V

    2017-08-21

    Mechanistic models require a significant investment of time and resources, but their application to multiple stages of fermentation process development and operation can make this investment highly valuable. This Opinion article discusses how an established fermentation model may be adapted for application to different stages of fermentation process development: planning, process design, monitoring, and control. Although a longer development time is required for such modeling methods in comparison to purely data-based model techniques, the wide range of applications makes them a highly valuable tool for fermentation research and development. In addition, in a research environment, where collaboration is important, developing mechanistic models provides a platform for knowledge sharing and consolidation of existing process understanding.

  3. Process-based organization design and hospital efficiency.

    PubMed

    Vera, Antonio; Kuntz, Ludwig

    2007-01-01

    The central idea of process-based organization design is that organizing a firm around core business processes leads to cost reductions and quality improvements. We investigated theoretically and empirically whether the implementation of a process-based organization design is advisable in hospitals. The data came from a database compiled by the Statistical Office of the German federal state of Rheinland-Pfalz and from a written questionnaire, which was sent to the chief executive officers (CEOs) of all 92 hospitals in this federal state. We used data envelopment analysis (DEA) to measure hospital efficiency, and factor analysis and regression analysis to test our hypothesis. Our principal finding is that a high degree of process-based organization has a moderate but significant positive effect on the efficiency of hospitals. The main implication is that hospitals should implement a process-based organization to improve their efficiency. However, to actually achieve positive effects on efficiency, it is of paramount importance to observe some implementation rules, in particular to mobilize physician participation and to create an adequate organizational culture.

  4. Virtual Welded - Joint Design Integrating Advanced Materials and Processing Technology

    SciTech Connect

    Yang, Zhishang; Ludewig, Howard W.; Babu, S. Suresh

    2005-06-30

    Virtual Welded-Joint Design, a systematic modeling approach, has been developed in this project to predict the relationship of welding process, microstructure, properties, residual stress, and the ultimate weld fatigue strength. This systematic modeling approach was applied in the welding of high strength steel. A special welding wire was developed in this project to introduce compressive residual stress at the weld toe. The results from both modeling and experiments demonstrated that more than 10x fatigue life improvement can be achieved in high strength steel welds by the combination of compressive residual stress from the special welding wire and the desired weld bead shape from a unique welding process. The results indicate a technology breakthrough in the design of lightweight and high fatigue performance welded structures using high strength steels.

  5. A quantitative approach to nonlinear IC process design rule scaling

    NASA Astrophysics Data System (ADS)

    Gold, Spencer Montgomery

    As minimum dimensions in integrated circuit technologies are reduced beyond 0.1 µm, linear process scaling becomes more difficult and costly. Exponentially rising manufacturing facility and process scaling costs can be better managed by performing nonlinear process shrinks. Nonlinear scaling allows the horizontal design rules to be reduced by different factors according to their ability to provide area and performance improvement in a cost-effective manner. This thesis describes a methodology and CAD tools for use in selecting nonlinear design rule reduction ratios that make effective tradeoffs between die cost and performance. The cost effectiveness of nonlinear scaling is demonstrated for a complementary GaAs (CGaAs™) process. CGaAs is a young technology with coarse design rules that would benefit significantly from a nonlinear shrink. The cost/benefit analysis for scaling the design rules is based on a process-independent optimizing SRAM compiler which was developed as part of this work. The methodology for nonlinear scaling includes identifying the rules which have the greatest impact on circuit area and analyzing the area and performance improvements as these rules are scaled through a range of practical scale factors. Benefit data (product of power and delay improvement ratios) is then combined with die cost estimates at each step to yield the cost/benefit ratio, a quantitative metric for design rule reduction. The slopes and inflection points of cost/benefit vs. scale factor plots guide process engineers in selecting reduction ratios for the various design rules. This procedure should be repeated, using the results of one pass as the starting point for the next. The cost/benefit analysis methodology compares embedded static RAMs that are generated by the PUMA process-independent SRAM compiler. This compiler, which is based on Duet's MasterPort™ layout compactor, can create optimized SRAM cell libraries for any complementary technology. It is capable of
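
    The cost/benefit metric described in the abstract, benefit as the product of power and delay improvement ratios divided by a die cost estimate, can be sketched as follows. The scaling relations and the yield penalty are simplifying assumptions for illustration, not the thesis's actual models.

```python
# Hypothetical sketch of a cost/benefit ratio vs. scale factor. Benefit
# is (power improvement) x (delay improvement); cost is a relative die
# cost, modeled here as area plus a yield penalty that grows as rules
# shrink. All relations and numbers are illustrative assumptions.
def cost_benefit(scale):
    power_improvement = 1.0 / scale**2        # power assumed to track area
    delay_improvement = 1.0 / scale           # delay assumed to track feature size
    benefit = power_improvement * delay_improvement
    area = scale**2
    yield_penalty = 1.0 + 0.5 * (1.0 - scale)  # tighter rules -> lower yield
    die_cost = area * yield_penalty
    return benefit / die_cost

# Evaluate over a range of practical scale factors (1.0 = no shrink).
curve = {round(s, 2): cost_benefit(s) for s in (1.0, 0.9, 0.8, 0.7)}
```

    Plotting such a curve and inspecting its slope and inflection points is the kind of analysis the thesis uses to pick reduction ratios per design rule.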

  6. IFR fuel cycle process equipment design environment and objectives

    SciTech Connect

    Rigg, R.H.

    1993-03-01

    Argonne National Laboratory (ANL) is refurbishing the hot cell facility originally constructed with the EBR-II reactor. When refurbishment is complete, the facility will demonstrate the complete fuel cycle for current generation high burnup metallic fuel elements. These are sodium bonded, stainless steel clad fuel pins of U-Zr or U-Pu-Zr composition typical of the fuel type proposed for a future Integral Fast Reactor (IFR) design. To the extent possible, the process equipment is being built at full commercial scale, and the facility is being modified to incorporate current DOE facility design requirements and modern remote maintenance principles. The current regulatory and safety environment has affected the design of the fuel fabrication equipment, most of which will be described in greater detail in subsequent papers in this session.

  8. Applications of fault tree analysis to the design process

    NASA Astrophysics Data System (ADS)

    Youngblood, R. W.

    1988-07-01

    Fault tree analysis of a system can provide a complete characterization of system failure modes, i.e., what combinations of component failures can give rise to system failure. This can be applied to the design process at several levels: (1) confirmatory analysis, in which a fault tree development is used to verify design adequacy, (2) importance analysis, in which fault tree analysis is used to highlight system vulnerabilities, and (3) design optimization, in which fault tree analysis is used to pick the least expensive configuration from a collection of possibilities satisfying a given constraint. Experience shows that the complexity of real systems warrants the systematic and structured development of fault trees for systems whose failure can have severe consequences.
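
    The failure-mode characterization described above amounts to computing the fault tree's minimal cut sets. A minimal sketch, assuming a toy tree of AND/OR gates over named basic events (the tree and event names are invented for illustration):

```python
from itertools import product

# A tree node is ("AND", children), ("OR", children), or a basic-event name.
def cut_sets(node):
    if isinstance(node, str):
        return [frozenset([node])]
    gate, children = node
    child_sets = [cut_sets(c) for c in children]
    if gate == "OR":                 # any one child's cut set fails the gate
        sets = [s for cs in child_sets for s in cs]
    else:                            # AND: need one cut set from every child
        sets = [frozenset().union(*combo) for combo in product(*child_sets)]
    # keep only minimal sets (drop any proper superset of another set)
    return [s for s in sets if not any(t < s for t in sets)]

# Toy top event: system fails if the pump fails, or both power feeds fail.
tree = ("OR", ["pump", ("AND", ["feedA", "feedB"])])
mcs = cut_sets(tree)
```

    Each resulting set is one combination of component failures sufficient for system failure, which is exactly the input needed for the importance and optimization analyses the abstract mentions.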

  9. Improving Tools and Processes in Mechanical Design Collaboration

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2009-01-01

    Cooperative product development projects in the aerospace and defense industry are held hostage to high cost and risk due to poor alignment of collaborative design tools and processes. This impasse can be broken if companies will jointly develop implementation approaches and practices in support of high value working arrangements. The current tools can be used to better advantage in many situations and there is reason for optimism that tool vendors will provide significant support.

  11. Energy codes and the building design process: Opportunities for improvement

    SciTech Connect

    Sandahl, L.J.; Shankle, D.L.; Rigler, E.J.

    1994-05-01

    The Energy Policy Act (EPAct), passed by Congress in 1992, requires states to adopt building energy codes for new commercial buildings that meet or exceed the American Society of Heating, Refrigerating, and Air-Conditioning Engineers (ASHRAE) and Illuminating Engineering Society of North America (IES) Standard 90.1-1989 by October 24, 1994. In response to EPAct, many states will be adopting a state-wide energy code for the first time. Understanding the role of stakeholders in the building design process is key to the successful implementation of these codes. In 1993, the Pacific Northwest Laboratory (PNL) conducted a survey of architects and designers to determine how much they know about energy codes, to what extent energy-efficiency concerns influence the design process, and how they convey information about energy-efficient designs and products to their clients. Findings of the PNL survey, together with related information from a survey by the American Institute of Architects (AIA) and other reports, are presented in this report. This information may be helpful for state and utility energy program managers and others who will be involved in promoting the adoption and implementation of state energy codes that meet the requirements of EPAct.

  12. Design of multichannel image processing on the Space Solar Telescope

    NASA Astrophysics Data System (ADS)

    Zhang, Bin

    2000-07-01

    The multi-channel image processing system on the Space Solar Telescope (SST) is described in this paper. This system is the main part of the science data unit (SDU), which is designed to handle the science data from every payload on the SST. First, each payload on the SST and its scientific objective are introduced: the main optical telescope, four soft X-ray telescopes, an H-alpha and white-light (full-disc) telescope, a coronagraph, a wide-band X-ray and gamma-ray spectrometer, and a solar and interplanetary radio spectrometer. Then the structure of the SDU is presented. In this part, we discuss the hardware and software structure of the SDU, which is designed for multiple payloads. The science data stream of every payload is summarized, too. Solar magnetic and velocity field processing, which occupies more than 90% of the data processing of the SDU, is discussed; it includes the polarizing unit, image receiver, and image adding unit. Last, the plan for image data compression and the mass memory designed for science data storage are presented.

  13. High performance cluster system design for remote sensing data processing

    NASA Astrophysics Data System (ADS)

    Shi, Yuanli; Shen, Wenming; Xiong, Wencheng; Fu, Zhuo; Xiao, Rulin

    2012-10-01

    In recent years, cluster systems have played an increasingly important role in high-performance computing architecture; they are cost-effective and efficient parallel computing systems able to satisfy specific computational requirements of the earth and space science communities. This paper presents a powerful cluster system built by the Satellite Environment Center, Ministry of Environmental Protection of China, designed to process massive remote sensing data from the HJ-1 satellites automatically every day. The architecture of this cluster system, including the hardware device layer, network layer, OS/FS layer, middleware layer, and application layer, is given. To verify the performance of the cluster system, image registration was tested on one scene from the HJ-1 CCD sensor. The registration experiments show that the system effectively improves the efficiency of data processing and can respond rapidly in applications that demand it, such as wildland fire monitoring and tracking, oil spill monitoring, and military target detection. Further work will focus on comprehensive parallel design and implementation of remote sensing data processing.

  14. Space Station Freedom pressurized element interior design process

    NASA Technical Reports Server (NTRS)

    Hopson, George D.; Aaron, John; Grant, Richard L.

    1990-01-01

    The process used to develop the on-orbit working and living environment of Space Station Freedom had unique constraints and conditions to satisfy. The goal is to provide maximum efficiency and utilization of the available space, in on-orbit, zero-G conditions, establishing a comfortable, productive, and safe working environment for the crew. The Space Station Freedom on-orbit living and working space can be divided into support for three major functions: (1) operations, maintenance, and management of the station; (2) conduct of experiments, both directly in the laboratories and remotely for experiments outside the pressurized environment; and (3) crew-related functions for food preparation, housekeeping, storage, personal hygiene, health maintenance, zero-G environment conditioning, and individual privacy and rest. The process used to implement these functions, the major requirements driving the design, unique considerations and constraints that influence the design, and summaries of the analyses performed to establish the current configurations are described. Sketches and pictures showing the layout and internal arrangement of the Nodes and the U.S. Laboratory and Habitation modules identify the current design relationships of the common and unique station housekeeping subsystems. The crew facilities, workstations, food preparation and eating areas (galley and wardroom), exercise/health maintenance configurations, and waste management and personal hygiene area configuration are shown. U.S. Laboratory experiment facilities and maintenance work areas planned to support the wide variety and mixtures of life science and materials processing payloads are described.

  15. Application of concept selection methodology in IC process design

    NASA Astrophysics Data System (ADS)

    Kim, Myung-Kul

    1993-01-01

    The search for an effective methodology practical in IC manufacturing process development led to a trial of quantitative 'concept selection' methodology for choosing the 'best' alternative among interlevel dielectric (ILD) processes. A cross-functional team selected multiple criteria with scoring guidelines to be used in the definition of the 'best'. The project targeted the 3-level-metal backend process for a sub-micron gate array product. The outcome of the project showed that the maturity of the alternatives has a strong influence on the scores, because scores on the adopted criteria, such as yield, reliability, and maturity, depend on the maturity of a particular process. At the same time, the project took longer than expected, since it required data for the multiple criteria. These observations suggest that adopting a simpler procedure that analyzes the total inherent controllability of a process would be more effective. The methodology of the DFS (design for simplicity) tools used in analyzing the manufacturability of electronics products such as computers, phones, and other consumer electronics could be used as an 'analogy' in constructing an evaluation method for the IC processes that produce the devices used in those products. This could be done by focusing on the basic process operation elements rather than the layers being built.
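
    A quantitative concept-selection matrix of the kind described can be sketched as a weighted sum of criterion scores; the criteria weights and alternative scores below are invented for illustration, not the project's data.

```python
# Hypothetical weighted-scoring (concept selection) matrix for two ILD
# process alternatives. Weights sum to 1; scores are on a 1-10 scale.
criteria = {"yield": 0.4, "reliability": 0.35, "maturity": 0.25}
alternatives = {
    "ILD-A": {"yield": 7, "reliability": 6, "maturity": 9},  # mature process
    "ILD-B": {"yield": 8, "reliability": 7, "maturity": 5},  # newer process
}

totals = {name: sum(criteria[c] * score for c, score in scores.items())
          for name, scores in alternatives.items()}
best = max(totals, key=totals.get)
```

    Note how the mature alternative can win overall despite lower yield and reliability scores, which mirrors the abstract's observation that maturity strongly influences the outcome.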

  16. Process Cost Modeling for Multi-Disciplinary Design Optimization

    NASA Technical Reports Server (NTRS)

    Bao, Han P.; Freeman, William (Technical Monitor)

    2002-01-01

    For early design concepts, the conventional approach to cost is normally some kind of parametric weight-based cost model. There is now ample evidence that this approach can be misleading and inaccurate. By the nature of its development, a parametric cost model requires historical data and is valid only if the new design is analogous to those for which the model was derived. Advanced aerospace vehicles have no historical production data and are nowhere near the vehicles of the past. Using an existing weight-based cost model would only lead to errors and distortions of the true production cost. This report outlines the development of a process-based cost model in which the physical elements of the vehicle are costed according to a first-order dynamics model. This theoretical cost model, first advocated by early work at MIT, has been expanded to cover the basic structures of an advanced aerospace vehicle. Elemental costs based on the geometry of the design can be summed up to provide an overall estimation of the total production cost for a design configuration. This capability to directly link any design configuration to realistic cost estimation is a key requirement for high payoff MDO problems. Another important consideration in this report is the handling of part or product complexity. Here the concept of cost modulus is introduced to take into account variability due to different materials, sizes, shapes, precision of fabrication, and equipment requirements. The most important implication of the development of the proposed process-based cost model is that different design configurations can now be quickly related to their cost estimates in a seamless calculation process easily implemented on any spreadsheet tool. In successive sections, the report addresses the issues of cost modeling as follows. First, an introduction is presented to provide the background for the research work. Next, a quick review of cost estimation techniques is made with the intention to
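
    The elemental-cost summation with a complexity modulus might look like the following sketch; the element names, areas, moduli, and base rate are all illustrative assumptions, not figures from the report.

```python
# Hypothetical process-based cost sketch: each structural element is
# costed from its geometry, then scaled by a "cost modulus" capturing
# material, size, shape, fabrication precision, and equipment needs.
BASE_RATE = 120.0      # assumed cost per unit area at nominal complexity

elements = [
    # (name, surface_area_m2, cost_modulus) -- all values illustrative
    ("wing skin panel", 18.0, 1.4),   # composite, tight tolerance
    ("fuselage frame",   6.5, 1.1),
    ("floor beam",       4.0, 0.9),   # simple aluminum extrusion
]

def element_cost(area, modulus, rate=BASE_RATE):
    return rate * area * modulus

total = sum(element_cost(a, m) for _, a, m in elements)
```

    Because every term is driven directly by geometry, a change to the design configuration updates the cost estimate immediately, which is the seamless-spreadsheet property the report emphasizes for MDO.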

  17. Development of the Planar Inlet Design and Analysis Process (PINDAP)

    NASA Technical Reports Server (NTRS)

    Gruber, Christopher R.

    2004-01-01

    The aerodynamic development of an engine inlet requires a comprehensive program of both wind tunnel testing and Computational Fluid Dynamics (CFD) simulations. To save time and resources, much "testing" is done using CFD before any design ever enters a wind tunnel. The focus of my project this summer is on CFD analysis tool development. In particular, I am working to further develop the capabilities of the Planar Inlet Design and Analysis Process (PINDAP). "PINDAP" is a collection of computational tools that allow for efficient and accurate design and analysis of the aerodynamics about and through inlets that can make use of a planar (two-dimensional or axisymmetric) geometric and flow assumption. PINDAP utilizes the WIND CFD flow solver, which is capable of simulating the turbulent, compressible flow field. My project this summer is a continuation of work that I performed for two previous summers. Two years ago, I used basic features of the PINDAP to design a Mach 5 hypersonic scramjet engine inlet and to demonstrate the feasibility of the PINDAP. The following summer, I worked to develop its geometry and grid generation capabilities to include subsonic and supersonic inlets, complete bodies and cowls, conic leading and trailing edges, as well as airfoils. These additions allowed for much more design flexibility when using the program.

  18. Using process mining for automatic support of clinical pathways design.

    PubMed

    Fernandez-Llatas, Carlos; Valdivieso, Bernardo; Traver, Vicente; Benedi, Jose Miguel

    2015-01-01

    The creation of tools supporting the automation of the standardization and continuous control of healthcare processes can become a significant aid for clinical experts and healthcare systems seeking to reduce variability in clinical practice. Reducing the complexity of designing and deploying standard clinical pathways can enhance the possibilities for effective use of computer-assisted guidance systems for professionals and assure the quality of the provided care. Several technologies have been used in the past to try to support these activities, but they have not been able to generate the disruptive change required to foster the general adoption of standardization in this domain, due to the high volume of work, resources, and knowledge required to create protocols that are practical in use. This chapter proposes the use of the PALIA algorithm, based on activity-based process mining techniques, as a new technology to infer actual processes from real execution logs, for use in the design and quality control of healthcare processes.
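
    Though PALIA itself is not reproduced here, the first step shared by most process-mining approaches, deriving a directly-follows graph from execution logs, can be sketched briefly; the clinical activities and traces below are invented examples.

```python
from collections import Counter

# Toy execution log: each trace is one patient's ordered activity list.
logs = [
    ["admit", "triage", "lab", "discharge"],
    ["admit", "triage", "discharge"],
    ["admit", "lab", "triage", "discharge"],
]

# Directly-follows graph: count how often activity a is immediately
# followed by activity b across all traces.
dfg = Counter((a, b) for trace in logs for a, b in zip(trace, trace[1:]))
```

    The counted edges are the raw material from which a pathway model (and deviations from it) can be inferred and monitored.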

  19. System Design For A Dental Image Processing System

    NASA Astrophysics Data System (ADS)

    Cady, Fredrick M.; Stover, John C.; Senecal, William J.

    1988-12-01

    An image processing system for a large clinic dental practice has been designed and tested. An analysis of spatial resolution requirements and field tests by dentists show that a system built with presently available, PC-based, image processing equipment can provide diagnostic quality images without special digital image processing. By giving the dentist a tool to digitally enhance x-ray images, increased diagnostic capabilities can be achieved. Very simple image processing procedures such as linear and non-linear contrast expansion, edge enhancement, and image zooming can be shown to be very effective. In addition to providing enhanced imagery in the dentist's treatment room, the system is designed to be a fully automated, dental records management system. It is envisioned that a patient's record, including x-rays and tooth charts, may be retrieved from optical disk storage as the patient enters the office. Dental procedures undertaken during the visit may be entered into the record via the imaging workstation by the dentist or the dental assistant. Patient billing and records keeping may be generated automatically.
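
    The linear contrast expansion mentioned above can be sketched in a few lines: pixel values are stretched so the darkest maps to 0 and the brightest to the display maximum. The pixel row below is an invented example.

```python
# Linear contrast expansion (contrast stretch) for an 8-bit display:
# map the darkest input pixel to 0 and the brightest to out_max.
def stretch(pixels, out_max=255):
    lo, hi = min(pixels), max(pixels)
    return [round((p - lo) * out_max / (hi - lo)) for p in pixels]

row = [60, 90, 120, 180]       # a dim, low-contrast row of pixel values
stretched = stretch(row)       # now spans the full 0-255 range
```

    Non-linear variants simply replace the linear mapping with a curve (e.g., gamma or histogram equalization); the point of the abstract is that even such simple operations are diagnostically useful.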

  20. Moving bed biofilm reactor technology: process applications, design, and performance.

    PubMed

    McQuarrie, James P; Boltz, Joshua P

    2011-06-01

    The moving bed biofilm reactor (MBBR) can operate as a 2-phase (anoxic) or 3-phase (aerobic) system with buoyant, free-moving plastic biofilm carriers. These systems can be used for municipal and industrial wastewater treatment, aquaculture, and potable water denitrification, in roughing, secondary, tertiary, and sidestream applications. The system includes a submerged biofilm reactor and a liquid-solids separation unit. The benefits of the MBBR process include the following: (1) the capacity to meet treatment objectives similar to activated sludge systems with respect to carbon oxidation and nitrogen removal, but with a smaller tank volume than a clarifier-coupled activated sludge system; (2) biomass retention that is clarifier-independent, with solids loading to the liquid-solids separation unit reduced significantly compared with activated sludge systems; (3) a continuous-flow process that does not require a special operational cycle for biofilm thickness, L(F), control (e.g., biologically active filter backwashing); and (4) liquid-solids separation achievable with a variety of processes, including conventional and compact high-rate processes. Information related to system design is fragmented and poorly documented. This paper seeks to address this issue by summarizing state-of-the-art MBBR design procedures and providing the reader with an overview of some commercially available systems and their components.
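
    One common MBBR sizing calculation, reactor volume from a design surface area loading rate (SALR), can be sketched as follows; the loading figure, carrier specific surface area, and fill fraction are illustrative assumptions, not recommendations from the paper.

```python
# Hypothetical MBBR sizing sketch: required biofilm area follows from
# the pollutant load and a design surface area loading rate (SALR);
# reactor volume follows from the carrier's specific surface area and
# the carrier fill fraction. All numbers are illustrative.
def mbbr_volume(load_g_per_day, salr_g_per_m2_day,
                carrier_area_m2_per_m3=500.0, fill_fraction=0.5):
    required_area = load_g_per_day / salr_g_per_m2_day             # m2 of biofilm
    return required_area / (carrier_area_m2_per_m3 * fill_fraction)  # m3

# Example: 50 kg BOD/day at an assumed SALR of 5 g/m2/day.
vol = mbbr_volume(load_g_per_day=50000.0, salr_g_per_m2_day=5.0)
```

    Real designs adjust the SALR for temperature, dissolved oxygen, and the treatment objective, which is part of the design guidance the paper consolidates.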

  1. Epigallocatechin-3-gallate increases autophagy signaling in resting and unloaded plantaris muscles but selectively suppresses autophagy protein abundance in reloaded muscles of aged rats.

    PubMed

    Takahashi, Hideyuki; Suzuki, Yutaka; Mohamed, Junaith S; Gotoh, Takafumi; Pereira, Suzette L; Alway, Stephen E

    2017-03-07

    We have previously found that epigallocatechin-3-gallate (EGCg), an abundant catechin in green tea, reduced apoptotic signaling and improved muscle recovery in response to reloading after hindlimb suspension (HS). In this study, we investigated if EGCg altered autophagy signaling in skeletal muscle of old rats in response to HS or reloading after HS. Fischer 344×Brown Norway inbred rats (age 34 months) were given 1 ml/day of purified EGCg (50 mg/kg body weight), or the same sample volume of the vehicle, by gavage. One group of animals received HS for 14 days, and a second group of rats received 14 days of HS, after which the HS was removed and they were allowed to recover by ambulating normally around the cage for two weeks. EGCg decreased a small number of autophagy genes in control muscles, but it increased the expression of other autophagy genes (e.g., ATG16L2, SNCA, TM9SF1, Pink1, PIM-2), and HS did not attenuate these increases. HS increased Beclin1, ATG7 and LC3-II/I protein abundance in hindlimb muscles. Relative to vehicle treatment, EGCg treatment resulted in greater ATG12 protein abundance (35.8%, P<0.05) but decreased Beclin1 protein levels (-101.1%, P<0.05) after HS. However, in reloaded muscles, EGCg suppressed Beclin1 and LC3-II/I protein abundance as compared to vehicle-treated muscles. EGCg appeared to "prime" autophagy signaling before, and enhance autophagy gene expression and protein levels during, unloading in muscles of aged rats, perhaps to improve the clearance of damaged organelles. However, EGCg suppressed autophagy signaling after reloading, potentially to increase the recovery of hindlimb muscle mass and function after loading is restored.

  2. System design and performances of ASTER Level-1 data processing

    NASA Astrophysics Data System (ADS)

    Nishida, Sumiyuki; Hachiya, Jun; Matsumoto, Ken; Fujisada, Hiroyuki; Kato, Masatane

    1998-12-01

    ASTER is a multispectral imager which covers a wide spectral region from visible to thermal infrared with 14 spectral bands, and will fly on EOS-AM1 in 1999. To provide this wide spectral coverage, ASTER has three optical sensing subsystems (a multi-telescope system): VNIR, SWIR, and TIR. This multi-telescope configuration requires highly refined ground processing for the generation of Level-1 data products that are radiometrically calibrated and geometrically corrected. A prototype Level-1 processing software system was developed to satisfy these requirements. The system design concepts adopted include: (1) 'AUTOMATIC PROCESSING'; (2) 'ALL-IN-ONE CONCEPT', in which processing is carried out using information included in the Level-0 data product only; (3) 'MODULE INDEPENDENCE', in which only the process control module controls the other modules, so operational conditions can be changed independently; and (4) 'FLEXIBILITY', in which important operation parameters are set from an external component to make processing-condition changes easier. The adaptability and performance of the developed software system are evaluated using simulation data.

  3. Lessons from nature: computational design of biomimetic compounds and processes.

    PubMed

    Bozkurt, Esra; Ashari, Negar; Browning, Nicholas; Brunk, Elizabeth; Campomanesa, Pablo; Perez, Marta A S; Rothlisberger, Ursula

    2014-09-01

    Through millions of years of evolution, Nature has accomplished the development of highly efficient and sustainable processes, and the idea of understanding and copying natural strategies is therefore very appealing. However, in spite of intense experimental and computational research, it has turned out to be a difficult task to design efficient biomimetic systems. Here we discuss a novel strategy for the computational design of biomimetic compounds and processes that consists of: i) target selection; ii) atomistic and electronic characterization of the wild-type system and the biomimetic compounds; iii) identification of key descriptors through feature selection; iv) choice of a biomimetic template; and v) efficient search of chemical and sequence space for optimization of the biomimetic system. As a proof-of-principle study, this general approach is illustrated for the computational design of a 'green' catalyst mimicking the action of the zinc metalloenzyme Human Carbonic Anhydrase (HCA). HCA is a natural model for CO2 fixation, since the enzyme is able to convert CO2 into bicarbonate. Very recently, a weakly active HCA mimic based on a trihelical peptide bundle was synthesized. We have used quantum mechanical/molecular mechanical (QM/MM) Car-Parrinello simulations to study the mechanisms of action of HCA and its peptidic mimic, and employed the obtained information to guide the design of improved biomimetic analogues. Applying a genetic-algorithm-based optimization procedure, we were able to re-engineer and optimize the biomimetic system toward its natural counterpart. In a second example, we discuss a similar strategy for the design of biomimetic sensitizers for use in dye-sensitized solar cells.

  4. Quality-by-Design (QbD): An integrated process analytical technology (PAT) approach for a dynamic pharmaceutical co-precipitation process characterization and process design space development.

    PubMed

    Wu, Huiquan; White, Maury; Khan, Mansoor A

    2011-02-28

    The aim of this work was to develop an integrated process analytical technology (PAT) approach for dynamic pharmaceutical co-precipitation process characterization and design space development. A dynamic co-precipitation process, in which water was gradually introduced to the ternary system of naproxen-Eudragit L100-alcohol, was monitored in real time in situ via Lasentec FBRM and PVM. A 3D map of count-time-chord length revealed three distinguishable process stages: incubation, transition, and steady state. The effects of high-risk process variables (slurry temperature, stirring rate, and water addition rate) on both the derived co-precipitation process rates and the final chord length distribution were evaluated systematically using a 3³ full factorial design. Critical process variables were identified via ANOVA for both the transition and steady states. General linear models (GLM) were then used for parameter estimation for each critical variable. Clear trends in the effects of each critical variable during transition and steady state were found by GLM and were interpreted using fundamental process principles and Nyvlt's transfer model. Neural network models were able to link process variables with response variables at transition and steady state with R² of 0.88-0.98. PVM images evidenced nucleation and crystal growth. Contour plots illustrated the design space via the critical process variables' ranges. This work demonstrated the utility of an integrated PAT approach for QbD development. Published by Elsevier B.V.
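
    The full factorial design used above enumerates all 27 combinations of the three process variables at three levels each; a minimal sketch (variable names follow the abstract, coded levels are the conventional -1/0/+1):

```python
from itertools import product

# 3^3 full factorial design: three process variables, three coded
# levels each, giving 27 experimental runs.
factors = {
    "slurry_temp":    [-1, 0, 1],
    "stirring_rate":  [-1, 0, 1],
    "water_addition": [-1, 0, 1],
}

runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
```

    Responses measured at each run then feed the ANOVA and GLM steps that identify which variables are critical at transition and steady state.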

  5. Mechanical design and design processes for the Telescope Optical Assembly of the Optical Communications Demonstrator

    NASA Astrophysics Data System (ADS)

    von Lossberg, Bryan R.

    1994-08-01

    A mechanical design has been developed for the Telescope Optical Assembly (TOA) of the Optical Communications Demonstrator (OCD). The TOA is the portion of the OCD instrument that integrates all the optical elements of the system, with the exception of the Laser Transmitter Assembly (LXA), which is fiber-coupled to the TOA. The TOA structure is composed primarily of aluminum components with some use of steel and Invar. The assembly is contained within a 16 cm × 20 cm × 33 cm envelope and has an estimated mass of 5.5 kg. The mechanical design was developed using Computervision's CADDS 5 computer-aided design software. Code V optical design data was used as a primary input and was efficiently and accurately transferred from the optical designer to the mechanical designer through the use of IGES files. In addition to enabling rapid transfer of the initial optical design, as well as subsequent optical design refinements, the IGES transfer process was also used to expedite preliminary thermal and dynamic analyses.

  6. Concurrent Image Processing Executive (CIPE). Volume 1: Design overview

    NASA Technical Reports Server (NTRS)

    Lee, Meemong; Groom, Steven L.; Mazer, Alan S.; Williams, Winifred I.

    1990-01-01

    The design and implementation of a Concurrent Image Processing Executive (CIPE), which is intended to become the support system software for a prototype high performance science analysis workstation, are described. The target machine for this software is a JPL/Caltech Mark 3fp Hypercube hosted by either a MASSCOMP 5600 or a Sun-3 or Sun-4 workstation; however, the design will accommodate other concurrent machines of similar architecture, i.e., local memory, multiple-instruction-multiple-data (MIMD) machines. The CIPE system provides both a multimode user interface and an applications programmer interface, and has been designed around four loosely coupled modules: user interface, host-resident executive, hypercube-resident executive, and application functions. The loose coupling between modules allows modification of a particular module without significantly affecting the other modules in the system. In order to enhance hypercube memory utilization and to allow expansion of image processing capabilities, a specialized program management method, incremental loading, was devised. To minimize data transfer between host and hypercube, a data management method which distributes, redistributes, and tracks data set information was implemented. The data management also allows data sharing among application programs. The CIPE software architecture provides a flexible environment for scientific analysis of complex remote sensing image data, such as planetary data and imaging spectrometry, utilizing state-of-the-art concurrent computation capabilities.

  7. Co-Simulation for Advanced Process Design and Optimization

    SciTech Connect

    Stephen E. Zitney

    2009-01-01

    Meeting the increasing demand for clean, affordable, and secure energy is arguably the most important challenge facing the world today. Fossil fuels can play a central role in a portfolio of carbon-neutral energy options provided CO{sub 2} emissions can be dramatically reduced by capturing CO{sub 2} and storing it safely and effectively. The fossil energy industry faces the challenge of meeting aggressive design goals for next-generation power plants with carbon capture and storage (CCS). Process designs will involve large, highly integrated, and multipurpose systems with advanced equipment items featuring complex geometries and multiphysics. APECS is enabling software that facilitates effective integration, solution, and analysis of high-fidelity process/equipment (CFD) co-simulations. APECS helps to optimize fluid flow and related phenomena that impact overall power plant performance. APECS offers many advanced capabilities, including reduced-order models (ROMs), design optimization, parallel execution, stochastic analysis, and virtual plant co-simulations. NETL and its collaborative R&D partners are using APECS to reduce the time, cost, and technical risk of developing high-efficiency, zero-emission power plants with CCS.

  8. Risk-based process safety assessment and control measures design for offshore process facilities.

    PubMed

    Khan, Faisal I; Sadiq, Rehan; Husain, Tahir

    2002-09-02

    Process operation is the most hazardous activity next to transportation and drilling operations on an offshore oil and gas (OOG) platform. Past experience of onshore and offshore oil and gas activities has revealed that a small mishap in the process operation can escalate into a catastrophe. This is of special concern on an OOG platform due to the limited space and compact geometry of the process area, limited ventilation, and difficult escape routes. On an OOG platform, each extra control measure that is implemented not only occupies space on the platform and increases congestion but also adds extra load to the platform. Eventualities in OOG platform process operation can be avoided by incorporating the appropriate control measures at the early design stage. In this paper, the authors describe a methodology for risk-based process safety decision making for OOG activities. The methodology is applied to various offshore process units, that is, the compressor, separators, flash drum, and driers of an OOG platform. Based on the risk potential, appropriate safety measures are designed for each unit. This paper also illustrates that implementation of the designed safety measures reduces high fatal accident rate (FAR) values to an acceptable level.

  9. Design of educational artifacts as support to learning process.

    PubMed

    Resende, Adson Eduardo; Vasconcelos, Flávio Henrique

    2012-01-01

    The aim of this paper is to identify utilization schemes developed by students and teachers in their interaction with educational workstations in the electronic measurement and instrumentation laboratory at the Department of Electrical Engineering in the Federal University of Minas Gerais (UFMG), Brazil. After that, these schemes were used to design a new workstation. For this, it was important to bear in mind that the mentioned artifacts contain two key characteristics: (1) one from the designers themselves, resulting from their experience and their technical knowledge of what they are designing and (2) the experience from users and the means through which they take advantage of and develop these artifacts, in turn rendering them appropriate to perform the proposed task - the utilization schemes developed in the process of mediation between the user and the artifact. The satisfactory fusion of these two points makes these artifacts a functional unit - the instruments. This research aims to demonstrate that identifying the utilization schemes by taking advantage of user experience and incorporating this within the design, facilitates its appropriation and, consequently, its efficiency as an instrument of learning.

  10. The FEYNMAN tools for quantum information processing: Design and implementation

    NASA Astrophysics Data System (ADS)

    Fritzsche, S.

    2014-06-01

    The FEYNMAN tools have been re-designed with the goal of establishing and implementing a high-level (computer) language capable of dealing with the physics of finite, n-qubit systems, from frequently required computations to mathematically advanced tasks in quantum information processing. In particular, emphasis has been placed on introducing a small but powerful set of keystring-driven commands in order to support both symbolic and numerical computations. Though the current design is implemented again within the framework of MAPLE, it is general and flexible enough to be utilized and combined with other languages and computational environments. The present implementation facilitates a large number of computational tasks, including the definition, manipulation and parametrization of quantum states, the evaluation of quantum measures and quantum operations, the evolution of quantum noise in discrete models, quantum measurements and state estimation, and several others. The design is based on a few high-level commands, with a syntax close to the mathematical notation and its use in the literature, which can be generalized quite readily in order to solve computational tasks at an even higher degree of complexity. In this work, I present and discuss the (re-)design of the FEYNMAN tools and make major parts of the code available for public use. Moreover, a few selected examples demonstrate possible applications of this toolbox. The FEYNMAN tools are provided as a MAPLE library and can hence be used on all platforms on which this computer-algebra system is accessible.
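
    As a rough illustration of the kind of finite n-qubit computation such a toolbox automates (here in Python with NumPy rather than MAPLE; this is a generic textbook construction, not part of the FEYNMAN tools):

```python
import numpy as np

def apply_gate(state, gate, target, n_qubits):
    """Apply a single-qubit gate to qubit `target` of an n-qubit state vector
    by building the full 2^n x 2^n operator with Kronecker products."""
    op = np.array([[1.0]])
    for q in range(n_qubits):
        op = np.kron(op, gate if q == target else np.eye(2))
    return op @ state

# Hadamard on qubit 0 of the two-qubit state |00>
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)
state = np.zeros(4)
state[0] = 1.0
plus = apply_gate(state, H, 0, 2)  # (|00> + |10>) / sqrt(2)
```

    A symbolic environment such as MAPLE performs the same manipulation with exact coefficients, which is what makes mixed symbolic/numerical support valuable.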

  11. Designing Multi-target Compound Libraries with Gaussian Process Models.

    PubMed

    Bieler, Michael; Reutlinger, Michael; Rodrigues, Tiago; Schneider, Petra; Kriegl, Jan M; Schneider, Gisbert

    2016-05-01

    We present the application of machine learning models to selecting G protein-coupled receptor (GPCR)-focused compound libraries. The library design process was realized by ant colony optimization. A proprietary Boehringer-Ingelheim reference set consisting of 3519 compounds tested in dose-response assays at 11 GPCR targets served as training data for machine learning and activity prediction. We compared the usability of the proprietary data with a public data set from ChEMBL. Gaussian process models were trained to prioritize compounds from a virtual combinatorial library. We obtained meaningful models for three of the targets (5-HT2c , MCH, A1), which were experimentally confirmed for 12 of 15 selected and synthesized or purchased compounds. Overall, the models trained on the public data predicted the observed assay results more accurately. The results of this study motivate the use of Gaussian process regression on public data for virtual screening and target-focused compound library design. © 2016 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA. This is an open access article under the terms of the Creative Commons Attribution Non-Commercial NoDerivs License, which permits use and distribution in any medium, provided the original work is properly cited, the use is non-commercial and no modifications or adaptations are made.
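
    A minimal sketch of the Gaussian process regression underlying such compound prioritization, written with plain NumPy; the RBF kernel, hyperparameters, and one-dimensional descriptor are illustrative assumptions, not the models or descriptors from the study.

```python
import numpy as np

def rbf(a, b, length=1.0):
    """Squared-exponential kernel matrix between 1-D input vectors a and b."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def gp_predict(x_train, y_train, x_test, noise=1e-6):
    """Posterior mean and variance of a zero-mean GP with an RBF kernel."""
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    K_s = rbf(x_test, x_train)
    mean = K_s @ np.linalg.solve(K, y_train)
    var = 1.0 - np.sum(K_s * np.linalg.solve(K, K_s.T).T, axis=1)
    return mean, var
```

    The predicted variance is what lets a library design rank candidates by both expected activity and model confidence.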

  12. Safeguards design strategies: designing and constructing new uranium and plutonium processing facilities in the United States

    SciTech Connect

    Scherer, Carolynn P; Long, Jon D

    2010-09-28

    In the United States, the Department of Energy (DOE) is transforming its outdated and oversized complex of aging nuclear material facilities into a smaller, safer, and more secure National Security Enterprise (NSE). Environmental concerns, worker health and safety risks, material security, and the goal of reducing the role of nuclear weapons in the national security strategy while maintaining the capability for an effective nuclear deterrent are influencing this transformation. As part of the nation's Uranium Center of Excellence (UCE), the Uranium Processing Facility (UPF) at the Y-12 National Security Complex in Oak Ridge, Tennessee, will advance the U.S. capability to meet all of these concerns when processing uranium; it is located adjacent to the Highly Enriched Uranium Materials Facility (HEUMF), designed for consolidated storage of enriched uranium. The HEUMF became operational in March 2010, and the UPF is currently entering its final design phase. Both facilities are designed to meet anticipated security challenges of the 21st century. For plutonium research, development, and manufacturing, the Chemistry and Metallurgy Research Replacement (CMRR) building at the Los Alamos National Laboratory (LANL) in Los Alamos, New Mexico is now under construction. The first phase of the CMRR Project is the design and construction of a Radiological Laboratory/Utility/Office Building. The second phase consists of the design and construction of the Nuclear Facility (NF). The National Nuclear Security Administration (NNSA) selected these two sites as part of the national plan to consolidate nuclear materials and provide for nuclear deterrence and nonproliferation mission requirements. This work examines these two projects' independent approaches to design requirements and objectives for safeguards, security, and safety (3S) systems, as well as the subsequent construction of these modern processing facilities. Emphasis is on the use of Safeguards-by-Design (SBD).

  13. Demo III processing architecture trades and preliminary design

    NASA Astrophysics Data System (ADS)

    Gothard, Benny M.; Cory, Phil; Peterman, Pete

    1999-01-01

    This paper provides a summary of the methodology, metrics, analysis, and trade-study efforts for the preliminary design of the Vetronics Processing Architecture (PA) system, based on the Demo III Experimental Unmanned Ground Vehicle (XUV) program requirements. We document and describe both the provided system requirements and those analytically derived from the proposal. Our experience with previous mobility and reconnaissance, surveillance, targeting, and acquisition systems designed and implemented for the Demo II Semi-Autonomous Surrogate Vehicle and the Mobile Detection, Assessment and Response System is used to describe lessons learned as applied to the XUV, covering the PA architecture, single-board computers, card-cage buses, real-time and non-real-time processors, card-cage-to-card-cage communications, and the selection of imaging and radar pre-processors. We have selected an initial architecture methodology.

  14. Waste receiving and processing plant control system; system design description

    SciTech Connect

    LANE, M.P.

    1999-02-24

    The Plant Control System (PCS) is a heterogeneous computer system composed of numerous sub-systems. The PCS represents every major computer system that is used to support operation of the Waste Receiving and Processing (WRAP) facility. This document, the System Design Description (PCS SDD), includes several chapters and appendices. Each chapter is devoted to a separate PCS sub-system. Typically, each chapter includes an overview description of the system, a list of associated documents related to operation of that system, and a detailed description of relevant system features. Each appendix provides configuration information for selected PCS sub-systems. The appendices are designed as separate sections to assist in maintaining this document due to frequent changes in system configurations. This document is intended to serve as the primary reference for configuration of PCS computer systems. The use of this document is further described in the WRAP System Configuration Management Plan, WMH-350, Section 4.1.

  15. A process for free-space laser communications system design

    NASA Astrophysics Data System (ADS)

    Walther, Frederick G.; Moores, John D.; Murphy, Robert J.; Michael, Steven; Nowak, George A.

    2009-08-01

    We present a design methodology for free-space laser communications systems. The first phase includes a characterization through numerical simulations of the channel to evaluate the range of extinction and scintillation. The second phase is the selection of fade mitigation schemes, which would incorporate pointing, acquisition, tracking, and communication system parameters specifically tailored to the channel. Ideally, the process would include sufficient flexibility to adapt to a wide range of channel conditions. We provide an example of the successful application of this design approach to a recent set of field experiments. This work was sponsored by the Department of Defense, RRCO DDR&E, under Air Force Contract FA8721-05-C-0002. Opinions, interpretations, conclusions and recommendations are those of the authors and are not necessarily endorsed by the United States Government.
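
    The first-phase channel characterization typically yields a scintillation statistic from which fade probability, and hence the required link margin, can be estimated. A common log-normal irradiance model (an assumption here, not the paper's specific method) can be sketched as:

```python
import math

def fade_probability(margin_db, sigma_ln=0.5):
    """Probability that received irradiance drops more than `margin_db` below
    its mean, assuming log-normal scintillation with log-irradiance standard
    deviation sigma_ln (irradiance normalized so its mean is 1)."""
    i_th = 10 ** (-margin_db / 10.0)  # linear fade threshold
    z = (math.log(i_th) + 0.5 * sigma_ln ** 2) / sigma_ln
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
```

    Increasing the designed-in margin drives the fade probability down, which is the basic trade the fade-mitigation phase works against.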

  16. Process design of press hardening with gradient material property influence

    SciTech Connect

    Neugebauer, R.; Schieck, F.; Rautenstrauch, A.

    2011-05-04

    Press hardening is currently used in the production of automotive structures that require very high strength and controlled deformation during crash tests. Press hardening can achieve significant reductions of sheet thickness at constant strength and is therefore a promising technology for the production of lightweight and energy-efficient automobiles. The manganese-boron steel 22MnB5 has been implemented in sheet press hardening owing to its excellent hot formability, high hardenability, and good temperability even at low cooling rates. However, press-hardened components have shown poor ductility and cracking at relatively small strains. A possible solution to this problem is a selective increase of steel sheet ductility by press hardening process design in areas where the component is required to deform plastically during crash tests. To this end, process designers require information about microstructure and mechanical properties as a function of the wide spectrum of cooling rates and sequences and austenitizing treatment conditions that can be encountered in production environments. In the present work, a Continuous Cooling Transformation (CCT) diagram with corresponding material properties of sheet steel 22MnB5 was determined for a wide spectrum of cooling rates. Heating and cooling programs were conducted in a quenching dilatometer. Motivated by the importance of residual elasticity in crash test performance, this property was measured using a micro-bending test and the results were integrated into the CCT diagrams to complement the hardness testing results. This information is essential for the process design of press hardening of sheet components with gradient material properties.

  17. Design of experiments for thermal protection system process optimization

    NASA Astrophysics Data System (ADS)

    Longani, Hans R.

    2000-01-01

    Solid Rocket Booster (SRB) structures were protected from heating due to aeroshear, radiation, and plume impingement by a Thermal Protection System (TPS) known as Marshall Sprayable Ablative (MSA-2). MSA-2 contains volatile organic compounds (VOCs) and, owing to strict environmental legislation, was eliminated. MSA-2 was also classified as hazardous waste, which makes its disposal very costly. Marshall Convergent Coating (MCC-1) replaced MSA-2 and eliminated the use of solvents by delivering the dry filler materials and the fluid resin system to a patented spray gun that utilizes the Convergent Spray Technologies spray process. The selection of the TPS material was based on risk assessment, performance comparisons, processing, application, and cost. The Design of Experiments technique was used to optimize the spraying parameters.
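
    A full factorial design of the kind used to optimize spray parameters can be enumerated in a few lines; the parameter names and levels below are hypothetical, not the actual MCC-1 process values.

```python
from itertools import product

def full_factorial(factors):
    """Enumerate every run of a full factorial design.
    `factors` maps each process parameter name to its candidate levels."""
    names = list(factors)
    return [dict(zip(names, combo))
            for combo in product(*(factors[n] for n in names))]

# Hypothetical spray parameters and levels: 2 x 2 x 3 = 12 runs
design = full_factorial({
    "gun_distance_cm": [20, 30],
    "resin_flow_gpm": [0.5, 1.0],
    "fan_pressure_psi": [40, 60, 80],
})
```

    Fractional designs prune this run list when a full factorial is too expensive, at the cost of confounding some interactions.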

  18. Risk-based decision making for staggered bioterrorist attacks : resource allocation and risk reduction in "reload" scenarios.

    SciTech Connect

    Lemaster, Michelle Nicole; Gay, David M.; Ehlen, Mark Andrew; Boggs, Paul T.; Ray, Jaideep

    2009-10-01

    Staggered bioterrorist attacks with aerosolized pathogens on population centers present a formidable challenge to resource allocation and response planning. Response and planning must commence immediately after detection of the first attack, with little or no information about the second attack. In this report, we outline a method by which resource allocation may be performed. It involves probabilistic reconstruction of the bioterrorist attack from partial observations of the outbreak, followed by an optimization-under-uncertainty approach to perform resource allocation. We consider both single-site and time-staggered multi-site attacks (i.e., a reload scenario) under conditions when resources (personnel and equipment, which are difficult to gather and transport) are insufficient. Both communicable (plague) and non-communicable (anthrax) diseases are addressed, and we also consider cases when the data, the time-series of people reporting with symptoms, are confounded by a reporting delay. We demonstrate how our approach develops allocation profiles that have the potential to reduce the probability of an extremely adverse outcome in exchange for a more certain, but less adverse, outcome. We explore the effect of placing limits on daily allocations. Further, since our method is data-driven, the resource allocation progressively improves as more data become available.
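
    The optimization-under-uncertainty step can be caricatured as a brute-force split of scarce units between two candidate sites; the scenario representation and shortfall objective below are deliberate simplifications of the report's method, for illustration only.

```python
def best_allocation(total, scenarios):
    """Exhaustively split `total` response units between two sites to
    minimize expected unmet need over probabilistic second-attack scenarios.
    Each scenario is (probability, need_at_site_1, need_at_site_2)."""
    def expected_shortfall(a1):
        a2 = total - a1
        return sum(p * (max(0, n1 - a1) + max(0, n2 - a2))
                   for p, n1, n2 in scenarios)
    return min(range(total + 1), key=expected_shortfall)
```

    With asymmetric scenario weights the optimizer hedges toward the likelier attack site rather than splitting evenly.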

  19. Calderon coal gasification Process Development Unit design and test program

    SciTech Connect

    Calderon, A.; Madison, E.; Probert, P.

    1992-11-01

    The Process Development Unit (PDU) was designed and constructed to demonstrate the novel Calderon gasification/hot gas cleanup process. In this process, run-of-mine high-sulfur coal is first pyrolyzed to recover a rich gas (medium-Btu gas), after which the resulting char is subjected to air-blown gasification to yield a lean gas (low-Btu gas). The process incorporates a proprietary integrated system for the conversion of coal to gases and for the hot cleanup of the gases, which removes both particulate and sulfur components of the gaseous products. The yields are: a syngas (CO and H{sub 2} mix) suitable for further conversion to liquid fuel (e.g., methanol/gasoline), and a lean gas suitable to fuel the combustion turbine of a combined cycle power generation plant with very low levels of NO{sub x} (15 ppmv). The fused slag (from the gasified char ash content) and the sulfur recovered during the hot gas cleanup will be sold as by-products. The small quantity of spent sorbent generated will be combined with the coal feed as a fluxing agent for the slag. The small quantity of wastewater from slag drainings and steam generation blowdown will be mixed with the coal feed for disposal. The Calderon gasification/hot gas cleanup process, which is a completely closed system, operates at a pressure suitable for combined cycle power generation.

  1. [Design of an HACCP program for a cocoa processing facility].

    PubMed

    López D'Sola, Patrizia; Sandia, María Gabriela; Bou Rached, Lizet; Hernández Serrano, Pilar

    2012-12-01

    The HACCP plan is a food safety management tool used to control physical, chemical, and biological hazards associated with food processing through the whole processing chain. The aim of this work is to design an HACCP plan for a Venezuelan cocoa processing facility. The production of safe food products requires that the HACCP system be built upon a solid foundation of prerequisite programs such as Good Manufacturing Practices (GMP) and Sanitation Standard Operating Procedures (SSOP). The existence and effectiveness of these prerequisite programs were previously assessed. Good Agricultural Practices (GAP) audits of cocoa nib suppliers were performed. To develop the HACCP plan, the five preliminary tasks and the seven HACCP principles were accomplished according to Codex Alimentarius procedures. Three Critical Control Points (CCPs) were identified using a decision tree: winnowing (control of ochratoxin A), roasting (Salmonella control), and metallic particle detection. For each CCP, critical limits were established, along with monitoring procedures, corrective actions, procedures for verification, and documentation concerning all procedures and records appropriate to these principles and their application. Implementation and maintenance of an HACCP plan for this processing plant is suggested. Recently, ochratoxin A (OTA) has been associated with cocoa beans. Although separation of the shell from the nib has been reported as an effective measure to control this chemical hazard, a study of ochratoxin prevalence in cocoa beans produced in the country is recommended, as well as validation of the winnowing step.

  2. Using instructional design process to improve design and development of Internet interventions.

    PubMed

    Hilgart, Michelle M; Ritterband, Lee M; Thorndike, Frances P; Kinzie, Mable B

    2012-06-28

    Given the wide reach and extensive capabilities of the Internet, it is increasingly being used to deliver comprehensive behavioral and mental health intervention and prevention programs. Their goals are to change user behavior, reduce unwanted complications or symptoms, and improve health status and health-related quality of life. Internet interventions have been found efficacious in addressing a wide range of behavioral and mental health problems, including insomnia, nicotine dependence, obesity, diabetes, depression, and anxiety. Despite the existence of many Internet-based interventions, there is little research to inform their design and development. A model for behavior change in Internet interventions has been published to help guide future Internet intervention development and to help predict and explain behavior changes and symptom improvement outcomes through the use of Internet interventions. An argument is made for grounding the development of Internet interventions within a scientific framework. To that end, the model highlights a multitude of design-related components, areas, and elements, including user characteristics, environment, intervention content, level of intervention support, and targeted outcomes. However, more discussion is needed regarding how the design of the program should be developed to address these issues. While there is little research on the design and development of Internet interventions, there is a rich, related literature in the field of instructional design (ID) that can be used to inform Internet intervention development. ID models are prescriptive models that describe a set of activities involved in the planning, implementation, and evaluation of instructional programs. Using ID process models has been shown to increase the effectiveness of learning programs in a broad range of contexts. ID models specify a systematic method for assessing the needs of learners (intervention users) to determine the gaps between current

  3. Using Instructional Design Process to Improve Design and Development of Internet Interventions

    PubMed Central

    Hilgart, Michelle M; Thorndike, Frances P; Kinzie, Mable B

    2012-01-01

    Given the wide reach and extensive capabilities of the Internet, it is increasingly being used to deliver comprehensive behavioral and mental health intervention and prevention programs. Their goals are to change user behavior, reduce unwanted complications or symptoms, and improve health status and health-related quality of life. Internet interventions have been found efficacious in addressing a wide range of behavioral and mental health problems, including insomnia, nicotine dependence, obesity, diabetes, depression, and anxiety. Despite the existence of many Internet-based interventions, there is little research to inform their design and development. A model for behavior change in Internet interventions has been published to help guide future Internet intervention development and to help predict and explain behavior changes and symptom improvement outcomes through the use of Internet interventions. An argument is made for grounding the development of Internet interventions within a scientific framework. To that end, the model highlights a multitude of design-related components, areas, and elements, including user characteristics, environment, intervention content, level of intervention support, and targeted outcomes. However, more discussion is needed regarding how the design of the program should be developed to address these issues. While there is little research on the design and development of Internet interventions, there is a rich, related literature in the field of instructional design (ID) that can be used to inform Internet intervention development. ID models are prescriptive models that describe a set of activities involved in the planning, implementation, and evaluation of instructional programs. Using ID process models has been shown to increase the effectiveness of learning programs in a broad range of contexts. ID models specify a systematic method for assessing the needs of learners (intervention users) to determine the gaps between current

  4. Design, processing, and testing of LSI arrays for space station

    NASA Technical Reports Server (NTRS)

    Lile, W. R.; Hollingsworth, R. J.

    1972-01-01

    The design of a MOS 256-bit Random Access Memory (RAM) is discussed. Technological achievements comprise computer simulations that accurately predict performance; aluminum-gate COS/MOS devices including a 256-bit RAM with current sensing; and a silicon-gate process that is being used in the construction of a 256-bit RAM with voltage sensing. The Si-gate process increases speed by reducing the overlap capacitance between gate and source-drain, thus reducing the crossover capacitance and allowing shorter interconnections. The design of a Si-gate RAM, which is pin-for-pin compatible with an RCA bulk silicon COS/MOS memory (type TA 5974), is discussed in full. The Integrated Circuit Tester (ICT) is limited to dc evaluation, but the diagnostics and data collecting are under computer control. The Silicon-on-Sapphire Memory Evaluator (SOS-ME, previously called SOS Memory Exerciser) measures power supply drain and performs a minimum number of tests to establish operation of the memory devices. The Macrodata MD-100 is a microprogrammable tester which has capabilities of extensive testing at speeds up to 5 MHz. Beam-lead technology was successfully integrated with SOS technology to make a simple device with beam leads. This device and the scribing are discussed.

  5. Preliminary Process Design of ITER ELM Coil Bracket Brazing

    NASA Astrophysics Data System (ADS)

    LI, Xiangbin; SHI, Yi

    2015-03-01

    To meet the technical requirements of the International Thermonuclear Experimental Reactor (ITER) project, the manufacture and assembly technology of the mid Edge Localized Modes (ELM) coil was developed by the Institute of Plasma Physics, Chinese Academy of Sciences (ASIPP). As the gap between the bracket and the Stainless Steel jacketed and Mineral Insulated Conductor (SSMIC) can be larger than 0.5 mm, instead of the 0.01 mm to 0.1 mm typical of normal industrial cases, the process of brazing the mid ELM coil bracket to the SSMIC becomes quite challenging from a technical viewpoint. This paper describes the preliminary design of the ELM coil bracket brazing process, the optimal bracket brazing curve, and the thermal simulation of the bracket furnace brazing method developed in ANSYS. BAg-6 foil (BAg50Cu34Zn16) plus BAg-1a paste (BAg45CuZnCd) solders were chosen as the brazing filler. By testing an SSMIC prototype, it is shown that the average gap between the bracket and the SSMIC could be controlled to 0.2-0.3 mm and that there were few voids in the brazing surface. The results also verified that the preliminary design had a favorable heat conducting performance in the bracket.

  6. Optimizing product life cycle processes in design phase

    NASA Astrophysics Data System (ADS)

    Faneye, Ola. B.; Anderl, Reiner

    2002-02-01

    Life cycle concepts do not only serve as a basis for helping product developers understand the dependencies between products and their life cycles; they also help in identifying potential opportunities for improvement in products. Common traditional concepts focus mainly on energy and material flow across life phases, necessitating the availability of metrics derived from a reference product. Knowledge of life cycle processes gained from an existing product is directly reused in its redesign. Depending on sales volume, nevertheless, the environmental impact before product optimization can be substantial. With modern information technologies today, computer-aided life cycle methodologies can be applied well before product use. On the basis of a virtual prototype, life cycle processes are analyzed and optimized using simulation techniques. This preventive approach not only helps in minimizing (or even eliminating) environmental burdens caused by the product; costs incurred due to changes in the real product can also be avoided. The paper highlights the relationship between product and life cycle and presents a computer-based methodology for optimizing the product life cycle during design, as presented by SFB 392: Design for Environment - Methods and Tools at Technical University, Darmstadt.

  7. Integrating optical fabrication and metrology into the optical design process.

    PubMed

    Harvey, James E

    2015-03-20

    The recent validation of a generalized linear systems formulation of surface scatter theory and an analysis of image degradation due to surface scatter in the presence of aberrations has provided credence to the development of a systems engineering analysis of image quality as degraded not only by diffraction effects and geometrical aberrations, but to scattering effects due to residual optical fabrication errors as well. This generalized surface scatter theory provides insight and understanding by characterizing surface scatter behavior with a surface transfer function closely related to the modulation transfer function of classical image formation theory. Incorporating the inherently band-limited relevant surface roughness into the surface scatter theory provides mathematical rigor into surface scatter analysis, and implementing a fast Fourier transform algorithm with logarithmically spaced data points facilitates the practical calculation of scatter behavior from surfaces with a large dynamic range of relevant spatial frequencies. These advances, combined with the continuing increase in computer speed, leave the optical design community in a position to routinely derive the optical fabrication tolerances necessary to satisfy specific image quality requirements during the design phase of a project; i.e., to integrate optical metrology and fabrication into the optical design process.
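
    One concrete link between residual fabrication roughness and image quality is the classic smooth-surface total integrated scatter estimate; this standard formula is illustrative background, not the generalized transfer-function theory described above.

```python
import math

def total_integrated_scatter(sigma_nm, wavelength_nm, aoi_deg=0.0):
    """Fraction of reflected light scattered by residual roughness sigma
    (RMS, band-limited to the relevant spatial frequencies):
    TIS = 1 - exp(-(4*pi*sigma*cos(theta)/lambda)^2)."""
    phase = 4.0 * math.pi * sigma_nm * math.cos(math.radians(aoi_deg)) / wavelength_nm
    return 1.0 - math.exp(-phase * phase)
```

    Inverting such a relation against an image quality budget is how a roughness (fabrication) tolerance can be derived during the design phase.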

  8. Design process of an area-efficient photobioreactor.

    PubMed

    Zijffers, Jan-Willem F; Janssen, Marcel; Tramper, Johannes; Wijffels, René H

    2008-01-01

    This article describes the design process of the Green Solar Collector (GSC), an area-efficient photobioreactor for the outdoor cultivation of microalgae. The overall goal has been to design a system in which all sunlight incident on the area covered by the reactor is delivered to the algae at such intensities that the light energy can be efficiently used for biomass formation. A statement of goals is formulated and constraints are specified with which the GSC needs to comply. Specifications are generated for a prototype whose form and function achieve the stated goals and satisfy the specified constraints. This results in a design in which sunlight is captured into vertical plastic light guides. Sunlight reflects internally in the guide and eventually scatters out of the light guide into flat-panel photobioreactor compartments. Sunlight is focused on top of the light guides by dual-axis positioning of linear Fresnel lenses. The shape and material of the light guide are such that light is maintained in the guides when they are surrounded by air. The bottom part of a light guide is sandblasted to obtain a more uniform distribution of light inside the bioreactor compartment and is triangular in shape to ensure the efflux of all light out of the guide. The dimensions of the guide are such that light enters the flat-panel photobioreactor compartment at intensities that can be efficiently used by the biomass present. The integration of light capturing, transportation, distribution and usage is such that high biomass productivities per area can be achieved.
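
    The core idea of delivering sunlight at intensities the algae can use efficiently amounts to a light dilution calculation: light collected by a lens is spread over the larger emitting surface of a guide. A minimal sketch, with hypothetical areas and an assumed optical efficiency (not figures from the article):

```python
def delivered_intensity(i_sun, a_lens, a_guide_exit, optical_eff=0.9):
    """Intensity at the panel after redistribution through the light guide.
    The guide 'dilutes' focused sunlight over a larger emitting area."""
    return i_sun * optical_eff * a_lens / a_guide_exit

# Example: 900 W/m2 sunlight, 0.25 m2 lens aperture,
# light spread over 1.5 m2 of guide emitting surface (all values illustrative)
i_panel = delivered_intensity(900.0, 0.25, 1.5)  # 135 W/m2 at the panel
```

    The design lever is the area ratio: a larger guide surface lowers the local intensity toward the photosynthetically efficient range.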

  9. Simulative design and process optimization of the two-stage stretch-blow molding process

    SciTech Connect

    Hopmann, Ch.; Rasche, S.; Windeck, C.

    2015-05-22

    The total production costs of PET bottles are significantly affected by the cost of raw material: approximately 70 % of the total costs are spent on the raw material. Therefore, the stretch-blow molding industry intends to reduce total production costs through optimized material efficiency. However, there is often a trade-off between optimized material efficiency and required product properties. Due to a multitude of complex boundary conditions, the design process for new stretch-blow molded products is still a challenging task and is often based on empirical knowledge. Application of current CAE tools supports the design process by reducing development time and costs. This paper describes an approach to iteratively determine an optimized preform geometry and the corresponding process parameters. The wall thickness distribution and the local stretch ratios of the blown bottle are calculated in a three-dimensional process simulation. The wall thickness distribution is then evaluated against an objective function, and the preform geometry as well as the process parameters are varied by an optimization algorithm. Taking into account the correlation between material usage, process history and resulting product properties, integrative coupled simulation steps, e.g. structural analyses or barrier simulations, are performed. The approach is applied to a 0.5 liter PET bottle of Krones AG, Neutraubling, Germany. The investigations show that the design process can be supported by applying this simulative optimization approach. In an optimization study the total bottle weight was reduced from 18.5 g to 15.5 g. Validation of the computed results is in progress.
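
    The iterative loop described above -- simulate the wall thickness, evaluate an objective function, let an optimization algorithm vary the preform -- can be sketched in a few lines. The one-line "simulation" and the bisection search are toy stand-ins for the paper's 3-D process simulation and optimizer, and all numbers are illustrative:

```python
def simulate_wall_thickness(preform_thickness_mm, stretch_ratio):
    """Toy stand-in for the 3-D process simulation: assumes the wall thins
    in proportion to the overall stretch ratio."""
    return preform_thickness_mm / stretch_ratio

def optimize_preform(target_mm=0.25, stretch_ratio=12.0, iterations=30):
    """Vary the preform wall thickness until the simulated bottle wall
    hits the target; bisection plays the role of the optimization algorithm."""
    lo, hi = 1.0, 6.0  # preform wall thickness search interval, mm
    for _ in range(iterations):
        mid = 0.5 * (lo + hi)
        if simulate_wall_thickness(mid, stretch_ratio) < target_mm:
            lo = mid  # bottle wall too thin -> thicker preform
        else:
            hi = mid
    return 0.5 * (lo + hi)

best = optimize_preform()  # converges to ~3.0 mm for this toy model
```

    The real workflow replaces the toy model with a finite-element blow simulation and a multi-parameter optimizer, but the simulate/evaluate/vary structure is the same.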

  10. Simulative design and process optimization of the two-stage stretch-blow molding process

    NASA Astrophysics Data System (ADS)

    Hopmann, Ch.; Rasche, S.; Windeck, C.

    2015-05-01

    The total production costs of PET bottles are significantly affected by the cost of raw material: approximately 70 % of the total costs are spent on the raw material. Therefore, the stretch-blow molding industry intends to reduce total production costs through optimized material efficiency. However, there is often a trade-off between optimized material efficiency and required product properties. Due to a multitude of complex boundary conditions, the design process for new stretch-blow molded products is still a challenging task and is often based on empirical knowledge. Application of current CAE tools supports the design process by reducing development time and costs. This paper describes an approach to iteratively determine an optimized preform geometry and the corresponding process parameters. The wall thickness distribution and the local stretch ratios of the blown bottle are calculated in a three-dimensional process simulation. The wall thickness distribution is then evaluated against an objective function, and the preform geometry as well as the process parameters are varied by an optimization algorithm. Taking into account the correlation between material usage, process history and resulting product properties, integrative coupled simulation steps, e.g. structural analyses or barrier simulations, are performed. The approach is applied to a 0.5 liter PET bottle of Krones AG, Neutraubling, Germany. The investigations show that the design process can be supported by applying this simulative optimization approach. In an optimization study the total bottle weight was reduced from 18.5 g to 15.5 g. Validation of the computed results is in progress.

  11. Sampling design for spatially distributed hydrogeologic and environmental processes

    USGS Publications Warehouse

    Christakos, G.; Olea, R.A.

    1992-01-01

    A methodology for the design of sampling networks over space is proposed. The methodology is based on spatial random field representations of nonhomogeneous natural processes, and on optimal spatial estimation techniques. One of the most important results of random field theory for the physical sciences is its rationalization of correlations in the spatial variability of natural processes. This correlation is extremely important both for interpreting spatially distributed observations and for predictive performance. The extent of site sampling and the types of data to be collected will depend on the relationship of subsurface variability to predictive uncertainty. While hypothesis formulation and initial identification of spatial variability characteristics are based on scientific understanding (such as knowledge of the physics of the underlying phenomena, geological interpretations, intuition and experience), the support offered by field data is statistically modelled. This model is not limited by the geometric nature of sampling and covers a wide range of subsurface uncertainties. A factorization scheme of the sampling error variance is derived, which possesses certain attractive properties allowing significant savings in computations. By means of this scheme, a practical sampling design procedure providing suitable indices of the sampling error variance is established. These indices can be used by way of multiobjective decision criteria to obtain the best sampling strategy. Neither the actual implementation of the in-situ sampling nor the solution of the large spatial estimation systems of equations is necessary. The required values of the accuracy parameters involved in the network design are derived using reference charts (readily available for various combinations of data configurations and spatial variability parameters) and certain simple yet accurate analytical formulas.
Insight is gained by applying the proposed sampling procedure to realistic examples related
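
    A sampling-network design in this spirit can be sketched by greedily choosing sample locations that minimize a simple estimation-variance index built from a spatial covariance model. The exponential covariance, the 1-D transect, and the greedy rule are illustrative assumptions, not the paper's factorization scheme:

```python
import math

def cov(d, range_=3.0):
    """Exponential covariance model of spatial correlation at lag distance d."""
    return math.exp(-d / range_)

def mean_estimation_variance(grid, samples):
    """Crude proxy for the sampling error variance over the grid:
    1 minus the best correlation of each location with any sample."""
    if not samples:
        return 1.0
    return sum(1.0 - max(cov(abs(x - s)) for s in samples) for x in grid) / len(grid)

def greedy_network(grid, n_samples):
    """Add, one at a time, the candidate location that most reduces the index."""
    samples = []
    for _ in range(n_samples):
        best = min(grid, key=lambda c: mean_estimation_variance(grid, samples + [c]))
        samples.append(best)
    return samples

grid = [i * 0.5 for i in range(21)]  # 1-D transect from 0 to 10
net = greedy_network(grid, 3)
```

    The paper's contribution is precisely to avoid this brute-force evaluation (via the variance factorization and reference charts), but the design objective being minimized is of this kind.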

  12. Climate Monitoring Satellite Designed in a Concurrent Engineering Process

    NASA Astrophysics Data System (ADS)

    Bauer, Waldemar; Braukhane, A.; Quantius, D.; Dumont, E.; Grundmann, J. T.; Romberg, O.

    An effective method of detecting greenhouse gases (GHGs; CO2 and CH4) is using satellites operating in Low Earth Orbit (LEO). Satellite-based greenhouse gas emissions monitoring is challenging and imposes an ambitious level of requirements. Until now it has been common for the corresponding scientific payload to use a purpose-built satellite bus, or to install the payload on board a larger conventional satellite. These approaches fulfil all customer requirements but can be critical from a financial point of view. Between 2014 and 2020, no space-based CH4 detection and, if at all, only limited CO2 detection capabilities are planned internationally. In order to fill this gap the Institute for Environmental Physics (IUP) of the University of Bremen plans a GHG satellite mission with near-surface sensitivity called "CarbonSat". It shall perform synchronous global atmospheric CO2 and CH4 observations with the accuracy, precision and coverage needed to significantly advance our knowledge about the sources and sinks of greenhouse gases. In order to verify the technical and financial feasibility of a small satellite, a Concurrent Engineering Study (CE study) has been performed at DLR Bremen, Germany. To reuse knowledge in compact satellite design, the Compact/SSB (Standard Satellite Bus) was chosen as the baseline design. The SSB has been developed by DLR and was already used for the BIRD (Bispectral Infra-Red Detection) mission, and adapted to ongoing missions such as TET (Technologie-Erprobungs-Träger) and AsteroidFinder. This paper deals with the highly effective design process within the DLR CE Facility and with the outcomes of the CE study. It gives an overview of the design status as well as an outlook for comparable missions.

  13. Space Shuttle Ascent Flight Design Process: Evolution and Lessons Learned

    NASA Technical Reports Server (NTRS)

    Picka, Bret A.; Glenn, Christopher B.

    2011-01-01

    The Space Shuttle Ascent Flight Design team is responsible for defining a launch-to-orbit trajectory profile that satisfies all programmatic mission objectives and defines the ground and onboard reconfiguration requirements for this high-speed and demanding flight phase. This design, verification and reconfiguration process ensures that all applicable mission scenarios are enveloped within integrated vehicle and spacecraft certification constraints and criteria, and includes the design of the nominal ascent profile and trajectory profiles for both uphill and ground-to-ground aborts. The team also develops a wide array of associated training, avionics flight software verification, onboard crew and operations facility products. These key ground and onboard products provide the ultimate users and operators the necessary insight and situational awareness for trajectory dynamics, performance and event sequences, abort mode boundaries and moding, flight performance and impact predictions for launch vehicle stages for use in range safety, and flight software performance. These products also provide the necessary insight into, or reconfiguration of, communications and tracking systems, launch collision avoidance requirements, and day-of-launch crew targeting and onboard guidance, navigation and flight control updates that incorporate the final vehicle configuration and environment conditions for the mission. Over the course of the Space Shuttle Program, ascent trajectory design and mission planning have evolved in order to improve program flexibility and reduce cost, while maintaining outstanding data quality. Along the way, the team has implemented innovative solutions and technologies in order to overcome significant challenges. A number of these solutions may have applicability to future human spaceflight programs.

  14. Preconceptual design of a salt splitting process using ceramic membranes

    SciTech Connect

    Kurath, D.E.; Brooks, K.P.; Hollenberg, G.W.; Clemmer, R.; Balagopal, S.; Landro, T.; Sutija, D.P.

    1997-01-01

    Inorganic ceramic membranes for salt splitting of radioactively contaminated sodium salt solutions are being developed for treating U.S. Department of Energy tank wastes. The process consists of electrochemical separation of sodium ions from the salt solution using sodium (Na) Super Ion Conductor (NaSICON) membranes. The primary NaSICON compositions being investigated are based on rare-earth ions (RE-NaSICON). Potential applications include: caustic recycling for sludge leaching, regenerating ion exchange resins, inhibiting corrosion in carbon-steel tanks, or retrieving tank wastes; reducing the volume of low-level waste to be disposed of; adjusting pH and reducing competing cations to enhance cesium ion exchange processes; reducing sodium in high-level-waste sludges; and removing sodium from acidic wastes to facilitate calcining. These applications encompass wastes stored at the Hanford, Savannah River, and Idaho National Engineering Laboratory sites. The overall project objective is to supply a salt splitting process unit that impacts the waste treatment and disposal flowsheets and meets user requirements. The potential flowsheet impacts include improving the efficiency of the waste pretreatment processes, reducing volume, and increasing the quality of the final waste disposal forms. Meeting user requirements implies developing the technology to the point where it is available as standard equipment with predictable and reliable performance. This report presents two preconceptual designs for a full-scale salt splitting process based on the RE-NaSICON membranes, to distinguish critical items for testing and to provide a vision that site users can evaluate.
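
    Sizing an electrochemical salt-splitting unit of this kind ultimately rests on Faraday's law: the required membrane area follows from the sodium removal rate, the operating current density, and the current efficiency. A minimal sketch with hypothetical design values (not figures from the report):

```python
F = 96485.0       # Faraday constant, C/mol
M_NA = 0.022990   # molar mass of sodium, kg/mol

def membrane_area_m2(na_rate_kg_per_h, current_density_a_m2, current_eff=0.9):
    """Membrane area from Faraday's law, assuming a 1-electron Na+ transfer."""
    mol_per_s = na_rate_kg_per_h / 3600.0 / M_NA
    current_a = mol_per_s * F / current_eff  # extra current covers inefficiency
    return current_a / current_density_a_m2

# Example: remove 10 kg/h of sodium at 1000 A/m2 and 90 % current efficiency
area = membrane_area_m2(10.0, 1000.0)  # on the order of 13 m2
```

    Trade-offs of this kind (current density versus membrane area and cell voltage) are what a preconceptual design pins down before equipment selection.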

  15. Software Design Improvements. Part 2; Software Quality and the Design and Inspection Process

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R.; Packard, Michael H.; Ziemianski, Tom

    1997-01-01

    The application of assurance engineering techniques improves the duration of failure-free performance of software. The totality of features and characteristics of a software product determines its ability to satisfy customer needs. Software in safety-critical systems is very important to NASA. We follow the System Safety Working Group's definition of system safety software: 'The optimization of system safety in the design, development, use and maintenance of software and its integration with safety-critical systems in an operational environment.' 'If it is not safe, say so' has become our motto. This paper goes over methods that have been used by NASA to make software design improvements by focusing on software quality and the design and inspection process.

  16. Heat and power networks in process design, part II, design procedure for equipment selection and process matching

    SciTech Connect

    Townsend, D.W.; Linnhoff, B.

    1983-09-01

    In Part I, criteria for heat engine and heat pump placement in chemical process networks were derived, based on the ''temperature interval'' (T.I.) analysis of the heat exchanger network problem. Using these criteria, this paper gives a method for identifying the best outline design for any combined system of chemical process, heat engines, and heat pumps. The method eliminates inferior alternatives early, and positively leads on to the most appropriate solution. A graphical procedure based on the T.I. analysis forms the heart of the approach, and the calculations involved are simple enough to be carried out on, say, a programmable calculator. Application to a case study is demonstrated. Optimization methods based on this procedure are currently under research.
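
    The ''temperature interval'' analysis referred to above is commonly implemented as the problem-table cascade: shift stream temperatures by ΔTmin/2, balance heat in each interval, and cascade the surpluses to find the minimum utilities and the pinch. A compact sketch, using a standard textbook-style four-stream example rather than data from this paper:

```python
def pinch_analysis(streams, dt_min=10.0):
    """Problem-table ('temperature interval') cascade.
    streams: list of (t_supply, t_target, cp); hot if t_supply > t_target.
    Returns (min hot utility, min cold utility, shifted pinch temperature)."""
    shifted = []
    for ts, tt, cp in streams:
        if ts > tt:   # hot stream: shift down by dt_min/2, sign +1 = heat surplus
            shifted.append((ts - dt_min / 2, tt - dt_min / 2, cp, +1))
        else:         # cold stream: shift up by dt_min/2, sign -1 = heat deficit
            shifted.append((ts + dt_min / 2, tt + dt_min / 2, cp, -1))
    bounds = sorted({t for a, b, _, _ in shifted for t in (a, b)}, reverse=True)
    cascade = [0.0]
    for hi, lo in zip(bounds, bounds[1:]):
        net = 0.0
        for a, b, cp, sign in shifted:
            if max(a, b) >= hi and min(a, b) <= lo:  # stream spans this interval
                net += sign * cp * (hi - lo)
        cascade.append(cascade[-1] + net)
    q_hot_min = max(0.0, -min(cascade))       # lift cascade so no flow is negative
    cascade = [c + q_hot_min for c in cascade]
    pinch = bounds[cascade.index(min(cascade))]
    return q_hot_min, cascade[-1], pinch

# Classic example: two hot and two cold streams, dt_min = 10
streams = [(250.0, 40.0, 0.15), (200.0, 80.0, 0.25),
           (20.0, 180.0, 0.20), (140.0, 230.0, 0.30)]
qh, qc, pinch = pinch_analysis(streams)
```

    The placement criteria of Part I then say heat engines belong entirely above or below the pinch so identified, never across it.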

  17. Materials, design and processing of air encapsulated MEMS packaging

    NASA Astrophysics Data System (ADS)

    Fritz, Nathan T.

    This work uses a three-dimensional air cavity technology to improve the fabrication and functionality of microelectronic devices, the performance of on-board transmission lines, and the packaging of micro-electromechanical systems (MEMS). The air cavity process makes use of the decomposition of a patterned sacrificial polymer followed by the diffusion of its by-products through a curing polymer overcoat to obtain the embedded air structure. Applications and research of air cavities have focused on simple designs that concentrate on the size and functionality of the particular device. However, a lack of guidelines for fabrication, materials used, and structural design has led to mechanical stability issues and processing refinements. This work investigates improved air gap cavities for use in MEMS packaging processes, resulting in fewer fabrication flaws and lower cost. The identification of new materials, such as novel photo-definable organic/inorganic hybrid polymers, was studied for increased strength and rigidity due to their glass-like structure. A novel epoxy polyhedral oligomeric silsesquioxane (POSS) material was investigated and characterized for use as a photodefinable permanent dielectric with improved mechanical properties. The POSS material improved the air gap fabrication because it served as a high-selectivity etch mask for patterning sacrificial materials as well as a cavity overcoat material with improved rigidity. An investigation of overcoat thickness and decomposition kinetics provided a fundamental understanding of the properties that impart mechanical stability to cavities of different shape and volume. Metallization of the cavities was investigated so as to provide hermetic sealing and improved cavity strength. The improved air-cavity, wafer-level packages were tested using resonator-type devices and chip-level lead frame packaging. The air cavity package was molded under traditional lead frame molding pressures and tested for mechanical

  18. From Safe Nanomanufacturing to Nanosafe-by-Design processes

    NASA Astrophysics Data System (ADS)

    Schuster, F.; Lomello, F.

    2013-04-01

    Industrial needs in terms of multifunctional components are increasing. Many sectors are concerned, from integrated direct nanoparticle production to emerging combinations that include metal matrix composites (MMC), ductile ceramics and ceramic matrix composites, polymer matrix composites (PMC) for bulk applications, and advanced surface coatings in the fields of automotive, aerospace, energy production and building applications. Moreover, domains with a planetary impact such as environmental issues, as well as aspects such as health (toxicity) and hazard assessment (ignition and explosion severity), were also taken into account. Nanotechnologies play an important role in promoting innovation in the design and realization of multifunctional products for the future, either by improving usual products or by creating new functions and/or new products. Nevertheless, this huge evolution in terms of materials can only be promoted by increasing social acceptance, by acting on the main technological and economic challenges, and by developing safety-oriented processes. Nowadays, a large number of nanoparticle developments are potentially up-scalable to industrial volumes. However, doubts exist about the safety of handling with the current technologies. For these reasons, the main purpose was to develop self-monitored automation of the production line, coupling different techniques in order to simplify processes such as in-situ growth of nanoparticles in a nanostructured matrix over different substrates, and/or nanopowder synthesis, functionalization, dry or wet safe recovery, granulation, and consolidation in a single step, monitoring in real time processing parameters such as powder stoichiometry. With the aim of assuring traceability of the product during its whole life, starting from conception, the R&D, distribution and use phases were also considered. 
The optimization in terms of processing, recovery and conditioning

  19. On the optimal design of the disassembly and recovery processes.

    PubMed

    Xanthopoulos, A; Iakovou, E

    2009-05-01

    This paper tackles the problem of the optimal design of the recovery processes for end-of-life (EOL) electric and electronic products, with a special focus on disassembly issues. The objective is to recover as much ecological and economic value as possible, and to reduce the overall produced quantities of waste. In this context, a medium-range tactical problem is defined and a novel two-phased algorithm is presented for a remanufacturing-driven reverse supply chain. In the first phase, we propose a multicriteria/goal-programming analysis for the identification and optimal selection of the most 'desirable' subassemblies and components to be disassembled for recovery, from a set of different types of EOL products. In the second phase, a multi-product, multi-period mixed-integer linear programming (MILP) model is presented, which addresses the optimization of the recovery processes, while explicitly taking into account the lead times of the disassembly and recovery processes. Moreover, a simulation-based solution approach is proposed for capturing the uncertainties in reverse logistics. The overall approach leads to an easy-to-use methodology that could effectively support middle-level management decisions. Finally, the applicability of the developed methodology is illustrated by its application to a specific case study.
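
    The two-phase structure can be caricatured in a few lines: a weighted multicriteria score screens components for recovery, and a greedy, capacity-constrained plan stands in for the multi-period MILP of the second phase. Weights, part names, and values are invented for illustration:

```python
# Phase 1: multicriteria screening of EOL components (weights are illustrative)
def screen_components(components, w_econ=0.6, w_eco=0.4):
    """components: dict name -> (recovery_value, eco_score, disassembly_cost).
    Keep a component only if its weighted benefit exceeds its cost."""
    selected = {}
    for name, (value, eco, cost) in components.items():
        score = w_econ * value + w_eco * eco - cost
        if score > 0:
            selected[name] = score
    return selected

# Phase 2: toy capacity-constrained plan (the paper uses a multi-period MILP)
def plan(selected, demand_per_period, capacity_per_period):
    """Greedily fill each period's disassembly capacity, best-scored parts first."""
    order = sorted(selected, key=selected.get, reverse=True)
    schedule = []
    for period_demand in demand_per_period:
        used, batch = 0, {}
        for name in order:
            take = min(period_demand.get(name, 0), capacity_per_period - used)
            if take > 0:
                batch[name] = take
                used += take
        schedule.append(batch)
    return schedule

parts = {"pcb": (8.0, 6.0, 4.0), "casing": (1.0, 2.0, 3.0), "motor": (5.0, 3.0, 2.0)}
chosen = screen_components(parts)  # casing screens out; pcb and motor are kept
```

    An exact MILP would jointly optimize quantities, lead times, and inventory across periods; the greedy plan only conveys the shape of the decision.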

  20. Superior metallic alloys through rapid solidification processing (RSP) by design

    SciTech Connect

    Flinn, J.E.

    1995-05-01

    Rapid solidification processing using powder atomization methods and the control of minor elements such as oxygen, nitrogen, and carbon can provide metallic alloys with superior properties and performance compared to conventionally processed alloys. Previous studies on nickel- and iron-base superalloys have provided the baseline information to properly couple RSP with alloy composition, and, therefore, enable alloys to be designed for performance improvements. The RSP approach produces powders, which need to be consolidated into suitable monolithic forms. This normally involves canning, consolidation, and decanning of the powders. Canning/decanning is expensive and raises the fabrication cost significantly above that of conventional, ingot metallurgy production methods. The cost differential can be offset by the superior performance of the RSP metallic alloys. However, without the performance database, it is difficult to convince potential users to adopt the RSP approach. Spray casting of the atomized molten droplets into suitable preforms for subsequent fabrication can be cost competitive with conventional processing. If the fine and stable microstructural features observed for the RSP approach are preserved during spray casting, a cost-competitive product can be obtained that has superior properties and performance that cannot be obtained by conventional methods.

  1. Data Quality Objectives Process for Designation of K Basins Debris

    SciTech Connect

    WESTCOTT, J.L.

    2000-05-22

    The U.S. Department of Energy has developed a schedule and approach for the removal of spent fuels, sludge, and debris from the K East (KE) and K West (KW) Basins, located in the 100 Area at the Hanford Site. The project that is the subject of this data quality objective (DQO) process is focused on the removal of debris from the K Basins and onsite disposal of the debris at the Environmental Restoration Disposal Facility (ERDF). This material previously has been dispositioned at the Hanford Low-Level Burial Grounds (LLBGs) or Central Waste Complex (CWC). The goal of this DQO process and the resulting Sampling and Analysis Plan (SAP) is to provide the strategy for characterizing and designating the K-Basin debris to determine if it meets the Environmental Restoration Disposal Facility Waste Acceptance Criteria (WAC), Revision 3 (BHI 1998). A critical part of the DQO process is to agree on regulatory and WAC interpretation, to support preparation of the DQO workbook and SAP.

  2. Designing quantum information processing via structural physical approximation

    NASA Astrophysics Data System (ADS)

    Bae, Joonwoo

    2017-10-01

    In quantum information processing it may be possible to have efficient computation and secure communication beyond the limitations of classical systems. From a fundamental point of view, however, the evolution of quantum systems under the laws of quantum mechanics is more restrictive than for classical systems, being identified with a specific form of dynamics, that is, unitary transformations and, consequently, positive and completely positive maps on subsystems. This also characterizes classes of disallowed transformations on quantum systems, among which positive but not completely positive maps are of particular interest as they characterize entangled states, a general resource in quantum information processing. Structural physical approximation offers a systematic way of approximating those non-physical maps, positive but not completely positive maps, with quantum channels. Since it was proposed as a method of detecting entangled states, it has stimulated fundamental problems on the classification of positive maps and the structure of Hermitian operators and quantum states, as well as on quantum measurement, such as quantum design, in quantum information theory. It has led to efficient and feasible methods of directly detecting entangled states in practice, for which proof-of-principle experimental demonstrations have also been performed with photonic qubit states. Here, we present a comprehensive review of quantum information processing with structural physical approximations and the related progress. The review mainly focuses on the properties of structural physical approximations and their applications toward practical information tasks.
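
    The entanglement detection that motivates structural physical approximation rests on positive-but-not-completely-positive maps such as partial transposition: a two-qubit state is certified entangled when its partial transpose is not positive semidefinite. Below is a minimal sketch of that underlying PPT test, not of the SPA measurement scheme itself; for a 4x4 Hermitian matrix, a negative determinant (an odd number of negative eigenvalues) suffices to certify non-positivity:

```python
def partial_transpose_b(rho):
    """Partial transpose over the second qubit of a 4x4 two-qubit density matrix.
    Index convention: row = 2*i_A + i_B, column = 2*j_A + j_B."""
    pt = [[0.0] * 4 for _ in range(4)]
    for ia in range(2):
        for ib in range(2):
            for ja in range(2):
                for jb in range(2):
                    pt[2 * ia + ib][2 * ja + jb] = rho[2 * ia + jb][2 * ja + ib]
    return pt

def det(m):
    """Determinant by Laplace expansion (fine for a 4x4 matrix)."""
    n = len(m)
    if n == 1:
        return m[0][0]
    total = 0.0
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in m[1:]]
        total += ((-1) ** j) * m[0][j] * det(minor)
    return total

# Bell state (|00> + |11>)/sqrt(2): maximally entangled
rho_bell = [[0.5, 0, 0, 0.5],
            [0, 0, 0, 0],
            [0, 0, 0, 0],
            [0.5, 0, 0, 0.5]]
is_entangled = det(partial_transpose_b(rho_bell)) < 0  # det = -1/16 here
```

    SPA replaces the non-physical map with the nearest quantum channel, so that the same witness information can be extracted by measurements on real devices.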

  3. On the optimal design of the disassembly and recovery processes

    SciTech Connect

    Xanthopoulos, A.; Iakovou, E.

    2009-05-15

    This paper tackles the problem of the optimal design of the recovery processes of the end-of-life (EOL) electric and electronic products, with a special focus on the disassembly issues. The objective is to recover as much ecological and economic value as possible, and to reduce the overall produced quantities of waste. In this context, a medium-range tactical problem is defined and a novel two-phased algorithm is presented for a remanufacturing-driven reverse supply chain. In the first phase, we propose a multicriteria/goal-programming analysis for the identification and the optimal selection of the most 'desirable' subassemblies and components to be disassembled for recovery, from a set of different types of EOL products. In the second phase, a multi-product, multi-period mixed-integer linear programming (MILP) model is presented, which addresses the optimization of the recovery processes, while taking into account explicitly the lead times of the disassembly and recovery processes. Moreover, a simulation-based solution approach is proposed for capturing the uncertainties in reverse logistics. The overall approach leads to an easy-to-use methodology that could support effectively middle level management decisions. Finally, the applicability of the developed methodology is illustrated by its application on a specific case study.

  4. Designing quantum information processing via structural physical approximation.

    PubMed

    Bae, Joonwoo

    2017-10-01

    In quantum information processing it may be possible to have efficient computation and secure communication beyond the limitations of classical systems. From a fundamental point of view, however, the evolution of quantum systems under the laws of quantum mechanics is more restrictive than for classical systems, being identified with a specific form of dynamics, that is, unitary transformations and, consequently, positive and completely positive maps on subsystems. This also characterizes classes of disallowed transformations on quantum systems, among which positive but not completely positive maps are of particular interest as they characterize entangled states, a general resource in quantum information processing. Structural physical approximation offers a systematic way of approximating those non-physical maps, positive but not completely positive maps, with quantum channels. Since it was proposed as a method of detecting entangled states, it has stimulated fundamental problems on the classification of positive maps and the structure of Hermitian operators and quantum states, as well as on quantum measurement, such as quantum design, in quantum information theory. It has led to efficient and feasible methods of directly detecting entangled states in practice, for which proof-of-principle experimental demonstrations have also been performed with photonic qubit states. Here, we present a comprehensive review of quantum information processing with structural physical approximations and the related progress. The review mainly focuses on the properties of structural physical approximations and their applications toward practical information tasks.

  5. High Throughput Atomic Layer Deposition Processes: High Pressure Operations, New Reactor Designs, and Novel Metal Processing

    NASA Astrophysics Data System (ADS)

    Mousa, MoatazBellah Mahmoud

    Atomic Layer Deposition (ALD) is a vapor phase nano-coating process that deposits very uniform and conformal thin film materials with sub-angstrom level thickness control on various substrates. These unique properties have made ALD a platform technology for numerous products and applications. However, most of these applications are limited to the lab scale due to the low process throughput relative to other deposition techniques, which hinders industrial adoption. In addition to the low throughput, process development for certain applications usually faces other obstacles, such as a required new processing mode (e.g., batch vs continuous) or process conditions (e.g., low temperature), the absence of an appropriate reactor design for a specific substrate, and sometimes the lack of a suitable chemistry. This dissertation studies different aspects of ALD process development for prospective applications in the semiconductor, textiles, and battery industries, as well as novel organic-inorganic hybrid materials. The investigation of a high pressure, low temperature ALD process for metal oxide deposition using multiple process chemistries revealed the vital importance of the gas velocity over the substrate for achieving fast depositions at these challenging processing conditions. Also in this work, two unique high throughput ALD reactor designs are reported. The first is a continuous roll-to-roll ALD reactor for ultra-fast coatings on porous, flexible substrates with very high surface area. The second is an ALD delivery head that allows for in loco ALD coatings that can be executed under ambient conditions (even outdoors) on large surfaces while still maintaining very high deposition rates. As a proof of concept, part of a parked automobile window was coated using the ALD delivery head. 
Another process development shown herein is the improvement achieved in the selective synthesis of organic-inorganic materials using an ALD based process called sequential vapor
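
    The throughput gap that motivates such reactor designs is easy to quantify: at a fixed growth per cycle (GPC), the target thickness dictates the cycle count, so throughput is set almost entirely by the seconds per cycle. A back-of-the-envelope sketch with typical, assumed numbers (a GPC of roughly 0.1 nm/cycle for an Al2O3-like process; the cycle times are illustrative):

```python
import math

def ald_cycles(target_nm, gpc_nm_per_cycle):
    """Cycles needed to reach a target thickness at a given growth-per-cycle."""
    return math.ceil(target_nm / gpc_nm_per_cycle)

def coating_time_min(target_nm, gpc_nm_per_cycle, seconds_per_cycle):
    """Total coating time: cycle count times time per cycle."""
    return ald_cycles(target_nm, gpc_nm_per_cycle) * seconds_per_cycle / 60.0

# Example: a 20 nm film at ~0.1 nm/cycle
cycles = ald_cycles(20.0, 0.1)            # 200 cycles either way
slow = coating_time_min(20.0, 0.1, 6.0)   # conventional ~6 s cycles -> 20 min
fast = coating_time_min(20.0, 0.1, 0.1)   # fast spatial-style cycles -> ~20 s
```

    Spatial and roll-to-roll designs attack the seconds-per-cycle term by replacing temporal purge steps with physical separation of the precursor zones.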

  6. Tools for efficient design of multicomponent separation processes

    NASA Astrophysics Data System (ADS)

    Huff, Joshua Lee

    formulation and the relative effect of capital and operating cost is weighed for an example feed. Previous methods based on Underwood's equations do not account for the temperature at which utilities are required. To account for this, a thermodynamic efficiency function is developed which allows the complete search space to be rank-listed in order of the exergy loss occurring within the configuration. Examining these results shows that this objective function favors configurations which move their reboiler and condenser duties to milder-temperature exchangers. A graphical interface is presented which allows interpretation of any of the above results in a quick and intuitive fashion, complete with system flow and composition data and the ability to filter the complete search space based on numerical and structural criteria. This provides a unique way to compare and contrast configurations as well as allowing considerations like column retrofit and maximum controllability to be taken into account. Using all five of these screening techniques, the traditional intuition-based methods of separation process design can be augmented with analytical and algorithmic tools which enable selection of a process design with low cost and high efficiency.
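
    The exergy-based rank-listing described above can be sketched directly: the work equivalent of a heat duty Q delivered at temperature T is Q(1 - T0/T), and a configuration's loss index is its reboiler exergy input minus its condenser exergy recovery. The two configurations and all duties and temperatures below are invented for illustration:

```python
T0 = 298.15  # ambient (dead-state) temperature, K

def heat_exergy(q_kw, t_k):
    """Exergy (work equivalent) of a heat duty q delivered at temperature t."""
    return q_kw * (1.0 - T0 / t_k)

def exergy_loss(columns):
    """Net exergy supplied to a column train: reboiler input minus condenser recovery.
    columns: list of (q_reboiler_kw, t_reboiler_k, q_condenser_kw, t_condenser_k)."""
    return sum(heat_exergy(qr, tr) - heat_exergy(qc, tc)
               for qr, tr, qc, tc in columns)

def rank_configurations(configs):
    """Rank-list (name, columns) pairs by ascending exergy loss."""
    return sorted(configs, key=lambda item: exergy_loss(item[1]))

# Two hypothetical configurations of a two-column sequence
configs = [
    ("direct", [(1000, 420, 950, 350), (800, 390, 780, 330)]),
    ("indirect", [(1200, 450, 1150, 340), (700, 380, 690, 325)]),
]
best_name = rank_configurations(configs)[0][0]
```

    This makes concrete why the objective favors milder-temperature exchangers: the (1 - T0/T) factor penalizes hot reboilers and rewards condensers that reject heat near ambient.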

  7. Conceptual Design for the Pilot-Scale Plutonium Oxide Processing Unit in the Radiochemical Processing Laboratory

    SciTech Connect

    Lumetta, Gregg J.; Meier, David E.; Tingey, Joel M.; Casella, Amanda J.; Delegard, Calvin H.; Edwards, Matthew K.; Jones, Susan A.; Rapko, Brian M.

    2014-08-05

    This report describes a conceptual design for a pilot-scale capability to produce plutonium oxide for use as exercise and reference materials, and for use in identifying and validating nuclear forensics signatures associated with plutonium production. This capability is referred to as the Pilot-scale Plutonium oxide Processing Unit (P3U), and it will be located in the Radiochemical Processing Laboratory at the Pacific Northwest National Laboratory. The key unit operations are described, including plutonium dioxide (PuO2) dissolution, purification of the Pu by ion exchange, precipitation, and conversion to oxide by calcination.

  8. Process and Prospects for the Designed Hydrograph, Lower Missouri River

    NASA Astrophysics Data System (ADS)

    Jacobson, R. B.; Galat, D. L.; Hay, C. H.

    2005-05-01

    The flow regime of the Lower Missouri River (LMOR, Gavins Point, SD to St. Louis, MO) is being redesigned to restore elements of natural variability while maintaining project purposes such as power production, flood control, water supply, and navigation. Presently, an experimental hydrograph alteration is planned for Spring 2006. As with many large, multi-purpose rivers, the ongoing design process involves negotiation among many management and stakeholder groups. The negotiated process has simplified the hydrograph into two key elements -- the spring rise and the summer low -- with emphasis on the influence of these elements on three threatened or endangered species. The spring rise has been hypothesized to perform three functions: build sandbars for nesting of the interior least tern and piping plover, provide episodic connectivity with low-lying flood plain, and provide a behavioral spawning cue for the pallid sturgeon. Among these, most emphasis has been placed on the spawning cue because concerns about downstream flood hazards have limited flow magnitudes to those that are thought to be geomorphically ineffective, and channelization and incision provide little opportunity for moderate flows to connect to the flood plain. Our analysis of the natural hydrologic regime provides some insight into possible spring rise design elements, including timing, rate of rise and fall, and length of spring flow pulses. The summer low has been hypothesized to expose sandbars for nesting and to maximize the area of shallow, slow water for rearing of larval and juvenile fish. Re-engineering of the navigation channel to provide greater diversity of habitat during navigation flows has been offered as an alternative to the summer low. Our analysis indicates that re-engineering has potential to increase habitat availability substantially, but the ecological results are so-far unknown. The designed hydrograph that emerges from the multi-objective process will likely represent a

  9. Process design and evaluation of production of bioethanol and β-lactam antibiotic from lignocellulosic biomass.

    PubMed

    Kim, Sung Bong; Park, Chulhwan; Kim, Seung Wook

    2014-11-01

    To design biorefinery processes producing bioethanol from lignocellulosic biomass with dilute acid pretreatment, the processes were simulated using the SuperPro Designer program. To improve the efficiency of biomass use and the economics of the biorefinery, additional pretreatment processes were designed and evaluated: a combined process of dilute acid and aqueous ammonia pretreatments, and a process using waste media containing xylose for the production of 7-aminocephalosporanic acid. Finally, the productivity and economics of the designed processes were compared.

  10. Lignocellulosic ethanol: Technology design and its impact on process efficiency.

    PubMed

    Paulova, Leona; Patakova, Petra; Branska, Barbora; Rychtera, Mojmir; Melzoch, Karel

    2015-11-01

    This review provides current information on the production of ethanol from lignocellulosic biomass, with the main focus on relationships between process design and efficiency, expressed as ethanol concentration, yield and productivity. In spite of unquestionable advantages of lignocellulosic biomass as a feedstock for ethanol production (availability, price, non-competitiveness with food, waste material), many technological bottlenecks hinder its wide industrial application and competitiveness with 1st generation ethanol production. Among the main technological challenges are the recalcitrant structure of the material, and thus the need for extensive pretreatment (usually physico-chemical followed by enzymatic hydrolysis) to yield fermentable sugars, and a relatively low concentration of monosaccharides in the medium that hinders the achievement of ethanol concentrations comparable with those obtained using 1st generation feedstocks (e.g. corn or molasses). The presence of both pentose and hexose sugars in the fermentation broth, the price of cellulolytic enzymes, and the presence of toxic compounds that can inhibit cellulolytic enzymes and microbial producers of ethanol are major issues. In this review, different process configurations of the main technological steps (enzymatic hydrolysis, fermentation of hexose and/or pentose sugars) are discussed and their efficiencies are compared. The main features, benefits and drawbacks of simultaneous saccharification and fermentation (SSF), simultaneous saccharification and fermentation with delayed inoculation (dSSF), consolidated bioprocesses (CBP) combining production of cellulolytic enzymes, hydrolysis of biomass and fermentation into one step, together with an approach combining utilization of both pentose and hexose sugars are discussed and compared with separate hydrolysis and fermentation (SHF) processes.
The impact of individual technological steps on final process efficiency is emphasized and the potential for use

  11. fMRI paradigm designing and post-processing tools

    PubMed Central

    James, Jija S; Rajesh, PG; Chandran, Anuvitha VS; Kesavadas, Chandrasekharan

    2014-01-01

    In this article, we first review some aspects of functional magnetic resonance imaging (fMRI) paradigm designing for major cognitive functions by using stimulus delivery systems like Cogent, E-Prime, Presentation, etc., along with their technical aspects. We also review the stimulus presentation possibilities (block, event-related) for visual or auditory paradigms and their advantages in both clinical and research settings. The second part mainly focuses on various fMRI data post-processing tools such as Statistical Parametric Mapping (SPM) and Brain Voyager, and discusses the particulars of the various preprocessing steps involved (realignment, co-registration, normalization, smoothing) in these software packages, as well as the statistical analysis principles of General Linear Modeling for final interpretation of a functional activation result. PMID:24851001
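
At a single voxel, the General Linear Model analysis mentioned above reduces to ordinary least squares on a design matrix. A minimal sketch, assuming NumPy and a simulated block paradigm (the regressor, amplitudes, and noise level are invented for illustration; real pipelines such as SPM additionally convolve the regressor with a hemodynamic response function):

```python
import numpy as np

def glm_fit(bold, design):
    """Ordinary least-squares fit of the General Linear Model Y = X·beta + e."""
    X = np.column_stack([design, np.ones(len(design))])  # regressor + intercept
    beta, *_ = np.linalg.lstsq(X, bold, rcond=None)
    return beta

# Hypothetical block paradigm: 20 s task / 20 s rest, TR = 2 s, 100 volumes.
n, tr = 100, 2.0
boxcar = (np.arange(n) * tr % 40 < 20).astype(float)
rng = np.random.default_rng(1)
bold = 3.0 * boxcar + 10.0 + rng.normal(0, 0.5, n)  # simulated voxel signal
beta = glm_fit(bold, boxcar)
print(round(beta[0], 1))  # task effect, close to the simulated amplitude 3.0
```

The estimated beta for the task regressor is what a statistical map thresholds voxel by voxel.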

  12. Identifying User Needs and the Participative Design Process

    NASA Astrophysics Data System (ADS)

    Meiland, Franka; Dröes, Rose-Marie; Sävenstedt, Stefan; Bergvall-Kåreborn, Birgitta; Andersson, Anna-Lena

    As the number of persons with dementia increases, and with it the demand for care and support at home, additional solutions to support persons with dementia are needed. The COGKNOW project aims to develop an integrated, user-driven cognitive prosthetic device to help persons with dementia. The project focuses on support in the areas of memory, social contact, daily living activities and feelings of safety. The design process is user-participatory and consists of iterative cycles at three test sites across Europe. In the first cycle, persons with dementia and their carers (n = 17) actively participated in the development process. Based on their prioritized needs and solutions, on their disabilities, and after discussion within the team, a top-four list of Information and Communication Technology (ICT) solutions was made and now serves as the basis for development: in the area of remembering - day and time orientation support, a find-mobile service and a reminding service; in the area of social contact - telephone support by picture dialling; in the area of daily activities - media control support through a music playback and radio function; and finally, in the area of safety - a warning service to indicate when the front door is open and an emergency contact service to enhance feelings of safety. The results of this first project phase show that, in general, the people with mild dementia as well as their carers were able to express and prioritize their (unmet) needs and the kind of technological assistance they preferred in the selected areas. In the next phases it will be tested whether the user-participatory design and multidisciplinary approach employed in the COGKNOW project result in a user-friendly, useful device that positively impacts the autonomy and quality of life of persons with dementia and their carers.

  13. Working on the Boundaries: Philosophies and Practices of the Design Process

    NASA Technical Reports Server (NTRS)

    Ryan, R.; Blair, J.; Townsend, J.; Verderaime, V.

    1996-01-01

    While the systems engineering process is a formal, contractually binding program management technique, the design process is the informal practice of achieving the design project requirements throughout all design phases of the systems engineering process. The design process and organization are system- and component-dependent. Informal reviews include technical information meetings and concurrent engineering sessions, while formal technical discipline reviews are conducted through the systems engineering process. This paper discusses and references major philosophical principles of the design process, identifies its role in interacting systems and discipline analyses and integration, and illustrates the process's application in experienced aerostructural designs.

  14. Design Process of an Area-Efficient Photobioreactor

    PubMed Central

    Janssen, Marcel; Tramper, Johannes; Wijffels, René H.

    2008-01-01

    This article describes the design process of the Green Solar Collector (GSC), an area-efficient photobioreactor for the outdoor cultivation of microalgae. The overall goal has been to design a system in which all sunlight incident on the area covered by the reactor is delivered to the algae at such intensities that the light energy can be efficiently used for biomass formation. A statement of goals is formulated and constraints are specified to which the GSC needs to comply. Specifications are generated for a prototype whose form and function achieve the stated goals and satisfy the specified constraints. This results in a design in which sunlight is captured into vertical plastic light guides. Sunlight reflects internally in the guide and eventually scatters out of the light guide into flat-panel photobioreactor compartments. Sunlight is focused on top of the light guides by dual-axis positioning of linear Fresnel lenses. The shape and material of the light guide are such that light is maintained in the guides when surrounded by air. The bottom part of a light guide is sandblasted to obtain a more uniform distribution of light inside the bioreactor compartment and is triangular in shape to ensure the efflux of all light out of the guide. Dimensions of the guide are such that light enters the flat-panel photobioreactor compartment at intensities that can be efficiently used by the biomass present. The integration of light capturing, transportation, distribution and usage is such that high biomass productivities per area can be achieved. PMID:18266033

  15. Experimental design for dynamics identification of cellular processes.

    PubMed

    Dinh, Vu; Rundell, Ann E; Buzzard, Gregery T

    2014-03-01

    We address the problem of using nonlinear models to design experiments to characterize the dynamics of cellular processes, using the approach of the Maximally Informative Next Experiment (MINE), which was introduced in W. Dong et al. (PLoS ONE 3(8):e3105, 2008) and independently in M.M. Donahue et al. (IET Syst. Biol. 4:249-262, 2010). In this approach, existing data are used to define a probability distribution on the parameters; the next measurement point is the one that yields the largest model output variance under this distribution. Building upon this approach, we introduce the Expected Dynamics Estimator (EDE), which is the expected value, under this distribution, of the output as a function of time. We prove the consistency of this estimator (uniform convergence to the true dynamics) even when the chosen experiments cluster in a finite set of points. We extend this proof of consistency to various practical assumptions on noisy data and moderate levels of model mismatch. Through the derivation and proof, we develop a relaxed version of MINE that is more computationally tractable and robust than the original formulation. The results are illustrated with numerical examples on two nonlinear ordinary differential equation models of biomolecular and cellular processes.
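
The MINE selection rule described above -- choose the next measurement point where the model output variance over the current parameter distribution is largest -- can be sketched as follows, assuming a NumPy parameter ensemble in place of a full posterior and an invented exponential-decay model:

```python
import numpy as np

def next_experiment(model, param_samples, candidate_times):
    """MINE-style selection: return the candidate time at which the model
    output variance across the parameter ensemble is largest."""
    outputs = np.array([[model(t, p) for t in candidate_times]
                        for p in param_samples])   # shape (n_params, n_times)
    variances = outputs.var(axis=0)
    return candidate_times[int(np.argmax(variances))]

# Hypothetical model y(t) = exp(-k t) with uncertain decay rate k.
model = lambda t, k: np.exp(-k * t)
rng = np.random.default_rng(0)
ks = rng.uniform(0.5, 1.5, size=200)        # ensemble consistent with prior data
times = np.linspace(0.1, 5.0, 50)           # candidate measurement times
print(next_experiment(model, ks, times))
```

For this toy model the variance peaks at intermediate times: very early all trajectories coincide near 1, and very late they all decay to 0, so the most informative measurement lies in between.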

  16. Process Design of Aluminum Tailor Heat Treated Blanks.

    PubMed

    Kahrimanidis, Alexander; Lechner, Michael; Degner, Julia; Wortberg, Daniel; Merklein, Marion

    2015-12-09

    In many industrial fields, especially the automotive sector, there is a trend toward lightweight construction in order to reduce the weight and thereby the CO₂ and NOx emissions of products. An auspicious approach in this context is the substitution of conventional deep-drawing steel with precipitation-hardenable aluminum alloys. However, owing to their low formability, the application to complex stamping parts is challenging. Therefore, at the Institute of Manufacturing Technology, an innovative technology to enhance the forming limit of these lightweight materials was invented. The key idea of the so-called Tailor Heat Treated Blanks (THTB) is optimization of the mechanical properties by local heat treatment before the forming operation. An accurate description of material properties is crucial to predict the forming behavior of tailor heat treated blanks by simulation. Therefore, within this research project, a holistic approach for the design of the THTB process in dependence on the main influencing parameters is presented and discussed in detail. The capability of the approach for the process development of complex forming operations is demonstrated by a comparison of the local blank thickness of a tailgate with the corresponding results from simulation.

  17. Process Design of Aluminum Tailor Heat Treated Blanks

    PubMed Central

    Kahrimanidis, Alexander; Lechner, Michael; Degner, Julia; Wortberg, Daniel; Merklein, Marion

    2015-01-01

    In many industrial fields, especially the automotive sector, there is a trend toward lightweight construction in order to reduce the weight and thereby the CO2 and NOx emissions of products. An auspicious approach in this context is the substitution of conventional deep-drawing steel with precipitation-hardenable aluminum alloys. However, owing to their low formability, the application to complex stamping parts is challenging. Therefore, at the Institute of Manufacturing Technology, an innovative technology to enhance the forming limit of these lightweight materials was invented. The key idea of the so-called Tailor Heat Treated Blanks (THTB) is optimization of the mechanical properties by local heat treatment before the forming operation. An accurate description of material properties is crucial to predict the forming behavior of tailor heat treated blanks by simulation. Therefore, within this research project, a holistic approach for the design of the THTB process in dependence on the main influencing parameters is presented and discussed in detail. The capability of the approach for the process development of complex forming operations is demonstrated by a comparison of the local blank thickness of a tailgate with the corresponding results from simulation. PMID:28793727

  18. Using experimental design modules for process characterization in manufacturing/materials processes laboratories

    NASA Technical Reports Server (NTRS)

    Ankenman, Bruce; Ermer, Donald; Clum, James A.

    1994-01-01

    Modules dealing with statistical experimental design (SED), process modeling and improvement, and response surface methods have been developed and tested in two laboratory courses. One course was a manufacturing processes course in Mechanical Engineering and the other was a materials processing course in Materials Science and Engineering. Each module is used as an 'experiment' in the course with the intent that subsequent course experiments will use SED methods for analysis and interpretation of data. Evaluation of the modules' effectiveness has been done both by survey questionnaires and by inclusion of the module methodology in course examination questions. Results of the evaluation have been very positive. Those evaluation results and details of the modules' content and implementation are presented. The modules represent an important component for updating laboratory instruction and for providing training in quality for improved engineering practice.
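
A core computation in such SED modules is estimating main effects from a full two-level factorial experiment. A minimal sketch, in which the 2^2 machining example, its factors, and its response values are all hypothetical:

```python
from itertools import product

def factorial_effects(responses):
    """Main-effect estimates for a full 2^k factorial design.
    `responses` maps level tuples in {-1, +1}^k to measured outputs."""
    k = len(next(iter(responses)))
    runs = list(product((-1, 1), repeat=k))
    effects = []
    for factor in range(k):
        hi = sum(responses[r] for r in runs if r[factor] == 1)
        lo = sum(responses[r] for r in runs if r[factor] == -1)
        effects.append((hi - lo) / (len(runs) / 2))  # average hi minus average lo
    return effects

# Hypothetical 2^2 experiment: factors = (cutting speed, feed rate).
y = {(-1, -1): 60, (1, -1): 72, (-1, 1): 52, (1, 1): 83}
print(factorial_effects(y))  # → [21.5, 1.5]
```

Here the first factor's effect (21.5) dominates the second's (1.5), which is exactly the kind of screening conclusion the modules ask students to draw before running response-surface follow-ups.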

  19. HYBRID SULFUR PROCESS REFERENCE DESIGN AND COST ANALYSIS

    SciTech Connect

    Gorensek, M.; Summers, W.; Boltrunis, C.; Lahoda, E.; Allen, D.; Greyvenstein, R.

    2009-05-12

    This report documents a detailed study to determine the expected efficiency and product costs for producing hydrogen via water-splitting using energy from an advanced nuclear reactor. It was determined that the overall efficiency from nuclear heat to hydrogen is high, and the cost of hydrogen is competitive under a high energy cost scenario. It would require over 40% more nuclear energy to generate an equivalent amount of hydrogen using conventional water-cooled nuclear reactors combined with water electrolysis compared to the proposed plant design described herein. There is a great deal of interest worldwide in reducing dependence on fossil fuels, while also minimizing the impact of the energy sector on global climate change. One potential opportunity to contribute to this effort is to replace the use of fossil fuels for hydrogen production by the use of water-splitting powered by nuclear energy. Hydrogen production is required for fertilizer (e.g. ammonia) production, oil refining, synfuels production, and other important industrial applications. It is typically produced by reacting natural gas, naphtha or coal with steam, which consumes significant amounts of energy and produces carbon dioxide as a byproduct. In the future, hydrogen could also be used as a transportation fuel, replacing petroleum. New processes are being developed that would permit hydrogen to be produced from water using only heat or a combination of heat and electricity produced by advanced, high temperature nuclear reactors. The U.S. Department of Energy (DOE) is developing these processes under a program known as the Nuclear Hydrogen Initiative (NHI). The Republic of South Africa (RSA) also is interested in developing advanced high temperature nuclear reactors and related chemical processes that could produce hydrogen fuel via water-splitting. 
This report focuses on the analysis of a nuclear hydrogen production system that combines the Pebble Bed Modular Reactor (PBMR), under development by

  20. 23 CFR 636.109 - How does the NEPA process relate to the design-build procurement process?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... design and construction for any projects, or portions thereof, for which the NEPA process has been... 23 Highways 1 2014-04-01 2014-04-01 false How does the NEPA process relate to the design-build... TRANSPORTATION ENGINEERING AND TRAFFIC OPERATIONS DESIGN-BUILD CONTRACTING General § 636.109 How does the NEPA...

  1. 23 CFR 636.109 - How does the NEPA process relate to the design-build procurement process?

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... design and construction for any projects, or portions thereof, for which the NEPA process has been... 23 Highways 1 2012-04-01 2012-04-01 false How does the NEPA process relate to the design-build... TRANSPORTATION ENGINEERING AND TRAFFIC OPERATIONS DESIGN-BUILD CONTRACTING General § 636.109 How does the NEPA...

  2. 23 CFR 636.109 - How does the NEPA process relate to the design-build procurement process?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... design and construction for any projects, or portions thereof, for which the NEPA process has been... contracting agency; (8) The design-builder may be requested to provide information about the project and... 23 Highways 1 2013-04-01 2013-04-01 false How does the NEPA process relate to the design-build...

  3. A Concept Transformation Learning Model for Architectural Design Learning Process

    ERIC Educational Resources Information Center

    Wu, Yun-Wu; Weng, Kuo-Hua; Young, Li-Ming

    2016-01-01

    Generally, in the foundation course of architectural design, much emphasis is placed on teaching of the basic design skills without focusing on teaching students to apply the basic design concepts in their architectural designs or promoting students' own creativity. Therefore, this study aims to propose a concept transformation learning model to…

  5. Universal Design in Postsecondary Education: Process, Principles, and Applications

    ERIC Educational Resources Information Center

    Burgstahler, Sheryl

    2009-01-01

    Designing any product or environment involves the consideration of many factors, including aesthetics, engineering options, environmental issues, safety concerns, industry standards, and cost. Typically, designers focus their attention on the average user. In contrast, universal design (UD), according to the Center for Universal Design, "is…

  6. 23 CFR 636.109 - How does the NEPA process relate to the design-build procurement process?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... TRANSPORTATION ENGINEERING AND TRAFFIC OPERATIONS DESIGN-BUILD CONTRACTING General § 636.109 How does the NEPA process relate to the design-build procurement process? The purpose of this section is to ensure that... 23 Highways 1 2011-04-01 2011-04-01 false How does the NEPA process relate to the...

  7. Singlet oxygen sensitizing materials based on porous silicone: photochemical characterization, effect of dye reloading and application to water disinfection with solar reactors.

    PubMed

    Manjón, Francisco; Santana-Magaña, Montserrat; García-Fresnadillo, David; Orellana, Guillermo

    2010-06-01

    Photogeneration of singlet molecular oxygen ((1)O(2)) is applied to organic synthesis (photooxidations), atmosphere/water treatment (disinfection), antibiofouling materials and photodynamic therapy of cancer. In this paper, (1)O(2) photosensitizing materials containing the dyes tris(4,4'-diphenyl-2,2'-bipyridine)ruthenium(II) (1, RDB(2+)) or tris(4,7-diphenyl-1,10-phenanthroline)ruthenium(II) (2, RDP(2+)), immobilized on porous silicone (abbreviated RDB/pSil and RDP/pSil), have been produced and tested for waterborne Enterococcus faecalis inactivation using a laboratory solar simulator and a compound parabolic collector (CPC)-based solar photoreactor. In order to investigate the feasibility of its reuse, the sunlight-exposed RDP/pSil sensitizing material (RDP/pSil-a) has been reloaded with RDP(2+) (RDP/pSil-r). Surprisingly, results for bacteria inactivation with the reloaded material have demonstrated a 4-fold higher efficiency compared to RDP/pSil-a, unused RDB/pSil, and the original RDP/pSil. Surface and bulk photochemical characterization of the new material (RDP/pSil-r) has shown that the bactericidal efficiency enhancement is due to aggregation of the silicone-supported photosensitizer on the surface of the polymer, as evidenced by confocal fluorescence lifetime imaging microscopy (FLIM). Photogenerated (1)O(2) lifetimes in the wet sensitizer-doped silicone have been determined to be ten times longer than in water. These facts, together with the water rheology in the solar reactor and the interfacial production of the biocidal species, account for the more effective disinfection observed with the reloaded photosensitizing material. These results extend and improve the operational lifetime of photocatalytic materials for point-of-use (1)O(2)-mediated solar water disinfection.

  8. Design of a tomato packing system by image processing and optimization processing

    NASA Astrophysics Data System (ADS)

    Li, K.; Kumazaki, T.; Saigusa, M.

    2016-02-01

    In recent years, with the development of environmental control systems in plant factories, tomato production has rapidly increased in Japan. However, with the decline in the availability of agricultural labor, there is a need to automate grading, sorting and packing operations. In this research, we designed an automatic packing program with which tomato weight can be estimated by image processing and the tomatoes packed in an optimized configuration. The weight was estimated using pixel-area properties after an L*a*b* color model conversion, noise rejection, hole filling and boundary preprocessing. The packing optimization program was based on a 0-1 knapsack algorithm for dynamic combinatorial optimization.
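
The 0-1 knapsack formulation mentioned above can be sketched with the standard dynamic program; the tomato weights and box capacity below are hypothetical, and maximizing the packed weight simply means each item's value equals its weight:

```python
def knapsack_01(weights, values, capacity):
    """0-1 knapsack via dynamic programming: best total value within capacity."""
    dp = [0] * (capacity + 1)
    for w, v in zip(weights, values):
        for c in range(capacity, w - 1, -1):  # reverse scan: each item used once
            dp[c] = max(dp[c], dp[c - w] + v)
    return dp[capacity]

# Hypothetical example: tomato weights in grams, box capacity 1000 g.
tomatoes = [180, 210, 150, 260, 200]
print(knapsack_01(tomatoes, tomatoes, 1000))  # → 1000
```

With estimated per-fruit weights from the image-processing stage, the same routine selects which tomatoes go into each box so that the packed weight comes as close as possible to the box's target capacity.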

  9. Are seat design processes of students similar to those of professionals?

    PubMed

    Kok, Barbara N E; Slegers, Karin; Vink, Peter

    2016-07-19

    Designers develop their basic competences during their design education, and these competences are later amplified and refined during their professional career. Therefore, one could expect that the design processes of professionals and of student designers are conducted differently, and that these processes consist of different components (steps, actions, methods, tools, etc. used in the design process). The differences and similarities between the design processes of design students and professionals were studied, as was the effect of the designers' experience on the design process. The design processes of seating products of 19 professional designers, 15 master students and 16 bachelor students were compared in order to understand the differences in the components they apply in their design process. The results showed significant differences between professional designers and design students for eight out of fifteen components. The components for which differences were found were applied more frequently by professionals than by students. For six of the components, significant positive correlations were found with the designer's experience. There are thus both significant differences and similarities between the design processes of design students and professionals, the differences being related, among other things, to the designers' experience.

  10. Critical Zone Experimental Design to Assess Soil Processes and Function

    NASA Astrophysics Data System (ADS)

    Banwart, Steve

    2010-05-01

    experimental design studies soil processes across the temporal evolution of the soil profile, from its formation on bare bedrock, through managed use as productive land to its degradation under longstanding pressures from intensive land use. To understand this conceptual life cycle of soil, we have selected 4 European field sites as Critical Zone Observatories. These are to provide data sets of soil parameters, processes and functions which will be incorporated into the mathematical models. The field sites are 1) the BigLink field station which is located in the chronosequence of the Damma Glacier forefield in alpine Switzerland and is established to study the initial stages of soil development on bedrock; 2) the Lysina Catchment in the Czech Republic which is representative of productive soils managed for intensive forestry, 3) the Fuchsenbigl Field Station in Austria which is an agricultural research site that is representative of productive soils managed as arable land and 4) the Koiliaris Catchment in Crete, Greece which represents degraded Mediterranean region soils, heavily impacted by centuries of intensive grazing and farming, under severe risk of desertification.

  11. Design of the Laboratory-Scale Plutonium Oxide Processing Unit in the Radiochemical Processing Laboratory

    SciTech Connect

    Lumetta, Gregg J.; Meier, David E.; Tingey, Joel M.; Casella, Amanda J.; Delegard, Calvin H.; Edwards, Matthew K.; Orton, Robert D.; Rapko, Brian M.; Smart, John E.

    2015-05-01

    This report describes a design for a laboratory-scale capability to produce plutonium oxide (PuO2) for use in identifying and validating nuclear forensics signatures associated with plutonium production, as well as for use as exercise and reference materials. This capability will be located in the Radiochemical Processing Laboratory at the Pacific Northwest National Laboratory. The key unit operations are described, including PuO2 dissolution, purification of the Pu by ion exchange, precipitation, and re-conversion to PuO2 by calcination.

  12. Multidisciplinary Design Optimization (MDO) Methods: Their Synergy with Computer Technology in Design Process

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw

    1998-01-01

    The paper identifies speed, agility, human interface, generation of sensitivity information, task decomposition, and data transmission (including storage) as important attributes for a computer environment to have in order to support engineering design effectively. It is argued that when examined in terms of these attributes the presently available environment can be shown to be inadequate. A radical improvement is needed, and it may be achieved by combining new methods that have recently emerged from multidisciplinary design optimization (MDO) with massively parallel processing computer technology. The caveat is that, for successful use of that technology in engineering computing, new paradigms for computing will have to be developed - specifically, innovative algorithms that are intrinsically parallel so that their performance scales up linearly with the number of processors. It may be speculated that the idea of simulating a complex behavior by interaction of a large number of very simple models may be an inspiration for the above algorithms; the cellular automata are an example. Because of the long lead time needed to develop and mature new paradigms, development should begin now, even though the widespread availability of massively parallel processing is still a few years away.

  13. Multidisciplinary Design Optimisation (MDO) Methods: Their Synergy with Computer Technology in the Design Process

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw

    1999-01-01

    The paper identifies speed, agility, human interface, generation of sensitivity information, task decomposition, and data transmission (including storage) as important attributes for a computer environment to have in order to support engineering design effectively. It is argued that when examined in terms of these attributes the presently available environment can be shown to be inadequate. A radical improvement is needed, and it may be achieved by combining new methods that have recently emerged from multidisciplinary design optimisation (MDO) with massively parallel processing computer technology. The caveat is that, for successful use of that technology in engineering computing, new paradigms for computing will have to be developed - specifically, innovative algorithms that are intrinsically parallel so that their performance scales up linearly with the number of processors. It may be speculated that the idea of simulating a complex behaviour by interaction of a large number of very simple models may be an inspiration for the above algorithms; the cellular automata are an example. Because of the long lead time needed to develop and mature new paradigms, development should begin now, even though the widespread availability of massively parallel processing is still a few years away.

  15. Multidisciplinary Design Optimization (MDO) Methods: Their Synergy with Computer Technology in Design Process

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw

    1998-01-01

    The paper identifies speed, agility, human interface, generation of sensitivity information, task decomposition, and data transmission (including storage) as important attributes for a computer environment to have in order to support engineering design effectively. It is argued that, when examined in terms of these attributes, the presently available environment can be shown to be inadequate. A radical improvement is needed, and it may be achieved by combining new methods that have recently emerged from multidisciplinary design optimization (MDO) with massively parallel processing computer technology. The caveat is that, for successful use of that technology in engineering computing, new paradigms for computing will have to be developed - specifically, innovative algorithms that are intrinsically parallel so that their performance scales up linearly with the number of processors. It may be speculated that the idea of simulating a complex behavior by interaction of a large number of very simple models may be an inspiration for the above algorithms; cellular automata are an example. Because of the long lead time needed to develop and mature new paradigms, development should begin now, even though the widespread availability of massively parallel processing is still a few years away.

  16. Optimal design of the satellite constellation arrangement reconfiguration process

    NASA Astrophysics Data System (ADS)

    Fakoor, Mahdi; Bakhtiari, Majid; Soleymani, Mahshid

    2016-08-01

    In this article, a novel approach is introduced for satellite constellation reconfiguration based on Lambert's theorem. Several critical problems arise in the reconfiguration phase, such as minimizing the overall fuel cost, avoiding collisions between the satellites on the final orbital pattern, and determining the maneuvers needed to deploy each satellite in its desired position on the target constellation. To implement the reconfiguration phase of the satellite constellation arrangement at minimal cost, the hybrid Invasive Weed Optimization/Particle Swarm Optimization (IWO/PSO) algorithm is used to design sub-optimal transfer orbits for the satellites existing in the constellation. Also, the dynamics of the problem are modeled in such a way that the optimal assignment of the satellites to the initial and target orbits and the optimal orbital transfer are combined in one step. Finally, we claim that our presented idea, i.e., coupled non-simultaneous flight of satellites from the initial orbital pattern, leads to minimal cost. The obtained results show that the presented method noticeably reduces the cost of the reconfiguration process.
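    The particle swarm component of the hybrid IWO/PSO search can be sketched as follows. This is a generic PSO, not the paper's hybrid algorithm, and `toy_cost` is a hypothetical stand-in: a real implementation would evaluate the Lambert-solution delta-v of each candidate transfer instead.

    ```python
    import random

    def toy_cost(x):
        # Hypothetical smooth "transfer cost" with its minimum at (1.0, -2.0);
        # a real version would compute delta-v from a Lambert solver.
        return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

    def pso(cost, dim=2, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
        """Minimise `cost` with a standard particle swarm (inertia w,
        cognitive weight c1, social weight c2)."""
        pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
        vel = [[0.0] * dim for _ in range(n_particles)]
        pbest = [p[:] for p in pos]                 # personal bests
        pbest_val = [cost(p) for p in pos]
        g = min(range(n_particles), key=lambda i: pbest_val[i])
        gbest, gbest_val = pbest[g][:], pbest_val[g]  # global best
        for _ in range(iters):
            for i in range(n_particles):
                for d in range(dim):
                    r1, r2 = random.random(), random.random()
                    vel[i][d] = (w * vel[i][d]
                                 + c1 * r1 * (pbest[i][d] - pos[i][d])
                                 + c2 * r2 * (gbest[d] - pos[i][d]))
                    pos[i][d] += vel[i][d]
                v = cost(pos[i])
                if v < pbest_val[i]:                # update personal best
                    pbest[i], pbest_val[i] = pos[i][:], v
                    if v < gbest_val:               # update global best
                        gbest, gbest_val = pos[i][:], v
        return gbest, gbest_val
    ```

    In the paper's hybrid scheme, the invasive-weed step would additionally spawn and cull candidate solutions around the survivors; the swarm update above supplies the velocity-driven exploitation half of that search.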

  17. Type-2 fuzzy model based controller design for neutralization processes.

    PubMed

    Kumbasar, Tufan; Eksin, Ibrahim; Guzelkaya, Mujde; Yesil, Engin

    2012-03-01

    In this study, an inverse controller based on a type-2 fuzzy model control design strategy is introduced, and this main controller is embedded within an internal model control structure. The overall proposed control structure is then implemented on a pH neutralization experimental setup. Generation of the inverse fuzzy control signal is handled as an optimization problem and solved online at each sampling time. Although inverse fuzzy model controllers may produce perfect control when the model matches perfectly and no disturbances are present, this open-loop control would not be sufficient in the case of modeling mismatches or disturbances. Therefore, an internal model control structure is proposed to compensate for these errors, with an inverse type-2 fuzzy model as the basic controller. This feature improves the closed-loop disturbance rejection performance, as shown through real-time control of the pH neutralization process. Experimental results demonstrate the superiority of the inverse type-2 fuzzy model controller structure over the inverse type-1 fuzzy model controller and conventional control structures. Copyright © 2011 ISA. Published by Elsevier Ltd. All rights reserved.
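    The inversion-by-optimization idea inside an IMC loop can be sketched with a toy scalar model (not a type-2 fuzzy model; `model` and the grid search are hypothetical simplifications): at each sampling step, search for the input u whose predicted model output best matches the mismatch-corrected reference.

    ```python
    # Sketch of inverse-model control inside an internal model control
    # (IMC) loop: the plant-model mismatch is fed back to correct the
    # reference before the model is numerically inverted.

    def model(u, y):
        # Hypothetical nonlinear process model: y_next = f(u, y).
        return 0.8 * y + 0.5 * (u ** 3) / (1.0 + u ** 2)

    def invert(target, y, u_grid):
        # Brute-force inversion: pick the u minimising |f(u, y) - target|.
        # (The paper solves this as an online optimization each sample.)
        return min(u_grid, key=lambda u: abs(model(u, y) - target))

    def imc_loop(plant, setpoint, steps=50):
        u_grid = [i / 100.0 for i in range(-300, 301)]  # candidate inputs
        y_plant = y_model = 0.0
        history = []
        for _ in range(steps):
            mismatch = y_plant - y_model      # IMC feedback signal
            ref = setpoint - mismatch         # corrected reference
            u = invert(ref, y_model, u_grid)  # inverse-model control move
            y_model = model(u, y_model)       # advance internal model
            y_plant = plant(u, y_plant)       # advance (mismatched) plant
            history.append(y_plant)
        return history
    ```

    Even when the plant's dynamics differ from the internal model, the mismatch feedback drives the steady-state output to the setpoint, which is the property the abstract exploits to reject modeling errors and disturbances.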

  18. Articulating the Resources for Business Process Analysis and Design

    ERIC Educational Resources Information Center

    Jin, Yulong

    2012-01-01

    Effective process analysis and modeling are important phases of the business process management lifecycle. When many activities and multiple resources are involved, it is very difficult to build a correct business process specification. This dissertation provides a resource perspective of business processes. It aims at a better process analysis…

  20. Predictive Modeling in Plasma Reactor and Process Design

    NASA Technical Reports Server (NTRS)

    Hash, D. B.; Bose, D.; Govindan, T. R.; Meyyappan, M.; Arnold, James O. (Technical Monitor)

    1997-01-01

    Research continues toward the improvement and increased understanding of high-density plasma tools. Such reactor systems are lauded for their independent control of ion flux and energy, enabling high etch rates with low ion damage, and for their improved ion velocity anisotropy resulting from thin collisionless sheaths and low neutral pressures. Still, with the transition to 300 mm processing, achieving etch uniformity and high etch rates concurrently may be a formidable task for such large diameter wafers, and computational modeling can play an important role in successful reactor and process design. The inductively coupled plasma (ICP) reactor is the focus of the present investigation. The present work attempts to understand the fundamental physical phenomena of such systems through computational modeling. Simulations will be presented using both computational fluid dynamics (CFD) techniques and the direct simulation Monte Carlo (DSMC) method for argon and chlorine discharges. ICP reactors generally operate at pressures on the order of 1 to 10 mTorr. At such low pressures, rarefaction can be significant to the degree that the constitutive relations used in typical CFD techniques become invalid and a particle simulation must be employed. This work will assess the extent to which CFD can be applied and evaluate the degree to which accuracy is lost in prediction of the phenomenon of interest, i.e., etch rate. If the CFD approach is found reasonably accurate when benchmarked against DSMC and experimental results, it has the potential to serve as a design tool due to its rapid run time relative to DSMC. The continuum CFD simulation solves the governing equations for plasma flow using a finite difference technique with an implicit Gauss-Seidel Line Relaxation method for time marching toward a converged solution.
The equation set consists of mass conservation for each species, separate energy equations for the electrons and heavy species, and momentum equations for the gas
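    The Gauss-Seidel relaxation idea underlying the CFD solver can be sketched on a deliberately simple problem (a 1-D Laplace equation, far simpler than the coupled plasma equation set above): sweep the grid, updating each point from its neighbours, until the sweeps stop changing the solution.

    ```python
    # Illustrative Gauss-Seidel relaxation for the 1-D Laplace equation
    # u'' = 0 with fixed boundary values; the exact solution is a linear
    # ramp between the two boundaries.

    def gauss_seidel_1d(n=50, iters=5000, tol=1e-10, left=0.0, right=1.0):
        """Relax an n-point grid toward the steady solution.

        Each interior point is set to the average of its neighbours
        (the discrete Laplace stencil); sweeps use freshly updated
        values immediately, which is what makes this Gauss-Seidel
        rather than Jacobi iteration.
        """
        u = [0.0] * n
        u[0], u[-1] = left, right
        for _ in range(iters):
            delta = 0.0
            for i in range(1, n - 1):
                new = 0.5 * (u[i - 1] + u[i + 1])
                delta = max(delta, abs(new - u[i]))
                u[i] = new
            if delta < tol:          # converged: sweep changed nothing
                break
        return u
    ```

    The production solver in the abstract applies the same relaxation principle line-by-line to the coupled, implicit plasma equations, but the convergence mechanism, repeated sweeps driving the residual toward zero, is the one shown here.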