Science.gov

Sample records for process modeling design

  1. Managing Analysis Models in the Design Process

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2006-01-01

    Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.

  2. Information-Processing Models and Curriculum Design

    ERIC Educational Resources Information Center

    Calfee, Robert C.

    1970-01-01

    "This paper consists of three sections--(a) the relation of theoretical analyses of learning to curriculum design, (b) the role of information-processing models in analyses of learning processes, and (c) selected examples of the application of information-processing models to curriculum design problems." (Author)

  3. Reinventing The Design Process: Teams and Models

    NASA Technical Reports Server (NTRS)

    Wall, Stephen D.

    1999-01-01

    The future of space mission design will be dramatically different from the past. Formerly, performance-driven paradigms emphasized data return, with cost and schedule as secondary issues. Now and in the future, costs are capped and schedules fixed; these two variables must be treated as independent in the design process. Accordingly, JPL has redesigned its design process. At the conceptual level, design times have been reduced by properly defining the required design depth, improving the linkages between tools, and managing team dynamics. In implementation-phase design, system requirements will be held in crosscutting models, linked to subsystem design tools through a central database that captures the design and supplies the needed configuration management and control. Mission goals will then be captured in timelining software that drives the models, testing their capability to execute the goals. Metrics are used to measure and control both processes and to ensure that design parameters converge within schedule constraints. This methodology manages margins controlled by acceptable risk levels, so teams can evolve risk tolerance (and cost) as they would any engineering parameter. This new approach allows more design freedom for a longer time, which tends to encourage revolutionary and unexpected improvements in design.

  4. Modeling Web-Based Educational Systems: Process Design Teaching Model

    ERIC Educational Resources Information Center

    Rokou, Franca Pantano; Rokou, Elena; Rokos, Yannis

    2004-01-01

    Using modeling languages is essential to the construction of educational systems based on software engineering principles and methods. Furthermore, the instructional design is undoubtedly the cornerstone of the design and development of educational systems. Although several methodologies and languages have been proposed for the specification of…

  5. Designing Multi-target Compound Libraries with Gaussian Process Models.

    PubMed

    Bieler, Michael; Reutlinger, Michael; Rodrigues, Tiago; Schneider, Petra; Kriegl, Jan M; Schneider, Gisbert

    2016-05-01

    We present the application of machine learning models to selecting G protein-coupled receptor (GPCR)-focused compound libraries. The library design process was realized by ant colony optimization. A proprietary Boehringer-Ingelheim reference set consisting of 3519 compounds tested in dose-response assays at 11 GPCR targets served as training data for machine learning and activity prediction. We compared the usability of the proprietary data with a public data set from ChEMBL. Gaussian process models were trained to prioritize compounds from a virtual combinatorial library. We obtained meaningful models for three of the targets (5-HT2c, MCH, A1), which were experimentally confirmed for 12 of 15 selected and synthesized or purchased compounds. Overall, the models trained on the public data predicted the observed assay results more accurately. The results of this study motivate the use of Gaussian process regression on public data for virtual screening and target-focused compound library design. PMID:27492085
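
    The workflow this abstract describes (train a Gaussian process regressor on assay data, then rank a virtual library by predicted activity) can be sketched in a few lines. Everything below is invented for illustration: random vectors stand in for molecular descriptors and a synthetic function stands in for measured activity; the study itself used proprietary and ChEMBL dose-response data.

```python
import numpy as np

# Invented sketch: GP regression ranks a "virtual library" by predicted
# activity. Random 3-D vectors stand in for molecular descriptors.

def rbf_kernel(A, B, length_scale=1.0):
    # Squared-exponential kernel between the rows of A and the rows of B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length_scale**2)

def gp_posterior_mean(X_train, y_train, X_query, noise=1e-2):
    # Closed-form Gaussian process posterior mean at the query points.
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    Ks = rbf_kernel(X_query, X_train)
    return Ks @ np.linalg.solve(K, y_train)

rng = np.random.default_rng(0)
X_train = rng.normal(size=(40, 3))                            # "training compounds"
y_train = np.sin(X_train[:, 0]) + 0.1 * rng.normal(size=40)   # synthetic "activity"
X_lib = rng.normal(size=(200, 3))                             # virtual library

scores = gp_posterior_mean(X_train, y_train, X_lib)
top10 = np.argsort(scores)[::-1][:10]   # indices of the prioritized compounds
```

    In a real campaign the prioritized compounds would then be synthesized or purchased and assayed, closing the loop the abstract describes.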

  7. Process Cost Modeling for Multi-Disciplinary Design Optimization

    NASA Technical Reports Server (NTRS)

    Bao, Han P.; Freeman, William (Technical Monitor)

    2002-01-01

    For early design concepts, the conventional approach to cost estimation is normally some kind of parametric, weight-based cost model. There is now ample evidence that this approach can be misleading and inaccurate. By the nature of its development, a parametric cost model requires historical data and is valid only if the new design is analogous to those from which the model was derived. Advanced aerospace vehicles have no historical production data and bear little resemblance to the vehicles of the past. Using an existing weight-based cost model would only lead to errors and distortions of the true production cost. This report outlines the development of a process-based cost model in which the physical elements of the vehicle are costed according to a first-order dynamics model. This theoretical cost model, first advocated by early work at MIT, has been expanded to cover the basic structures of an advanced aerospace vehicle. Elemental costs based on the geometry of the design can be summed to provide an overall estimate of the total production cost for a design configuration. This capability to directly link any design configuration to a realistic cost estimate is a key requirement for high-payoff MDO problems. Another important consideration in this report is the handling of part or product complexity. Here the concept of a cost modulus is introduced to account for variability due to different materials, sizes, shapes, precision of fabrication, and equipment requirements. The most important implication of the proposed process-based cost model is that different design configurations can now be quickly related to their cost estimates in a seamless calculation process easily implemented on any spreadsheet tool. In successive sections, the report addresses the issues of cost modeling as follows. First, an introduction is presented to provide the background for the research work. Next, a quick review of cost estimation techniques is made with the intention to
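
    The elemental cost roll-up described above (geometry-based elemental costs scaled by a cost modulus, then summed per configuration) is simple enough to sketch in a few lines. The element names, base rate, and modulus values below are invented placeholders, not figures from the report.

```python
# Hypothetical process-based cost roll-up with a per-element cost modulus.
# BASE_RATE and all element data are invented for illustration only.

BASE_RATE = 120.0  # cost per unit area at nominal complexity (invented)

def element_cost(area, modulus):
    # First-order model: cost scales with geometry (area) times a modulus
    # capturing material, size, shape, fabrication precision, and equipment.
    return BASE_RATE * area * modulus

elements = [
    {"name": "skin panel", "area": 12.0, "modulus": 1.0},
    {"name": "frame",      "area": 3.5,  "modulus": 1.8},  # curved, higher precision
    {"name": "longeron",   "area": 2.0,  "modulus": 1.3},
]

total = sum(element_cost(e["area"], e["modulus"]) for e in elements)
```

    Because each element's cost is a closed-form function of its geometry, the whole roll-up fits naturally in a spreadsheet, as the report notes.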

  8. Integrating Science into Design Technology Projects: Using a Standard Model in the Design Process.

    ERIC Educational Resources Information Center

    Zubrowski, Bernard

    2002-01-01

    Fourth graders built a model windmill using a three-step process: (1) open exploration of designs; (2) application of a standard model incorporating features of suggested designs; and (3) refinement of preliminary models. The approach required math, science, and technology teacher collaboration and adequate time. (Contains 21 references.) (SK)

  9. Type-2 fuzzy model based controller design for neutralization processes.

    PubMed

    Kumbasar, Tufan; Eksin, Ibrahim; Guzelkaya, Mujde; Yesil, Engin

    2012-03-01

    In this study, an inverse controller based on a type-2 fuzzy model is introduced, and this main controller is embedded within an internal model control structure. The overall control structure is then implemented in a pH neutralization experimental setup. Generation of the inverse fuzzy control signal is handled as an optimization problem solved online at each sampling time. Although inverse fuzzy model controllers may produce perfect control under a perfect model match and in the absence of disturbances, such open-loop control is not sufficient when modeling mismatches or disturbances occur. Therefore, an internal model control structure, with an inverse type-2 fuzzy model as the basic controller, is proposed to compensate for these errors. This improves closed-loop disturbance rejection, as shown through real-time control of the pH neutralization process. Experimental results demonstrate the superiority of the inverse type-2 fuzzy model controller over the inverse type-1 fuzzy model controller and conventional control structures. PMID:22036014
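
    A minimal sketch of the internal model control (IMC) idea this scheme relies on: an inverse-model controller plus feedback of the plant/model mismatch. A first-order linear plant stands in for the pH process and an exact model inverse stands in for the type-2 fuzzy inverse controller; the pole and gains below are invented simplifications.

```python
# IMC sketch: the controller inverts the *model*; feeding back the
# plant/model mismatch (y - ym) removes the steady-state error even
# though the plant gain differs from the model gain. All numbers invented.

a = 0.8          # shared discrete-time pole
b_model = 0.20   # model input gain
b_plant = 0.24   # true plant gain (20% mismatch)

r = 1.0          # setpoint
y = ym = 0.0
for _ in range(300):
    r_eff = r - (y - ym)            # correct the setpoint by the mismatch
    u = (r_eff - a * ym) / b_model  # exact inverse of the model dynamics
    y = a * y + b_plant * u         # true plant
    ym = a * ym + b_model * u       # internal model
```

    At steady state the mismatch feedback scales the effective setpoint by exactly the inverse of the gain error, so the plant output converges to the true setpoint despite the model being wrong.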

  10. A Concept Transformation Learning Model for Architectural Design Learning Process

    ERIC Educational Resources Information Center

    Wu, Yun-Wu; Weng, Kuo-Hua; Young, Li-Ming

    2016-01-01

    Generally, in the foundation course of architectural design, much emphasis is placed on teaching of the basic design skills without focusing on teaching students to apply the basic design concepts in their architectural designs or promoting students' own creativity. Therefore, this study aims to propose a concept transformation learning model to…

  11. Coal conversion systems design and process modeling. Volume 1: Application of MPPR and Aspen computer models

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The development of a coal gasification system design and mass and energy balance simulation program for the TVA and other similar facilities is described. The materials-process-product model (MPPM) and the advanced system for process engineering (ASPEN) computer program were selected from available steady-state and dynamic models. The MPPM was selected to serve as the basis for the system-level design model structure because it provided the capability for process-block material and energy balances and high-level system sizing and costing. The ASPEN simulation serves as the basis for assessing detailed component models for the system design modeling program. The ASPEN components were analyzed to identify particular process blocks and data packages (physical properties) which could be extracted and used in the system design modeling program. While ASPEN physical-property calculation routines are capable of generating the physical properties required for process simulation, not all required physical property data are available; missing data must be user-entered.
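
    As a toy illustration of the process-block material balance such a system design model performs (MPPM and ASPEN of course do this with full physical-property packages; the stream names and flow rates here are invented):

```python
# Invented sketch of a block-level mass balance check. A real simulator
# would balance every species and energy stream with property packages.

def mass_balance_residual(block):
    # Residual = total mass in - total mass out; ~0 for a closed block.
    return sum(block["in"].values()) - sum(block["out"].values())

gasifier = {
    "in":  {"coal": 100.0, "steam": 40.0, "oxygen": 60.0},  # kg/h (invented)
    "out": {"syngas": 185.0, "slag": 15.0},
}

residual = mass_balance_residual(gasifier)
```

    A system-level model chains such blocks, using each block's outlet streams as the next block's inlets, and flags any block whose residual is nonzero.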

  12. Predictive Modeling in Plasma Reactor and Process Design

    NASA Technical Reports Server (NTRS)

    Hash, D. B.; Bose, D.; Govindan, T. R.; Meyyappan, M.; Arnold, James O. (Technical Monitor)

    1997-01-01

    Research continues toward the improvement and increased understanding of high-density plasma tools. Such reactor systems are lauded for their independent control of ion flux and energy, enabling high etch rates with low ion damage, and for their improved ion velocity anisotropy resulting from thin collisionless sheaths and low neutral pressures. Still, with the transition to 300 mm processing, achieving etch uniformity and high etch rates concurrently may be a formidable task for such large-diameter wafers, and computational modeling can play an important role in successful reactor and process design. The inductively coupled plasma (ICP) reactor is the focus of the present investigation. The present work attempts to understand the fundamental physical phenomena of such systems through computational modeling. Simulations will be presented using both computational fluid dynamics (CFD) techniques and the direct simulation Monte Carlo (DSMC) method for argon and chlorine discharges. ICP reactors generally operate at pressures on the order of 1 to 10 mTorr. At such low pressures, rarefaction can be significant to the degree that the constitutive relations used in typical CFD techniques become invalid and a particle simulation must be employed. This work will assess the extent to which CFD can be applied and evaluate the degree to which accuracy is lost in predicting the phenomenon of interest, i.e., etch rate. If the CFD approach is found reasonably accurate when benchmarked against DSMC and experimental results, it has the potential to serve as a design tool owing to its rapid turnaround relative to DSMC. The continuum CFD simulation solves the governing equations for plasma flow using a finite difference technique with an implicit Gauss-Seidel line relaxation method for time marching toward a converged solution.
The equation set consists of mass conservation for each species, separate energy equations for the electrons and heavy species, and momentum equations for the gas
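
    As a much-reduced illustration of the Gauss-Seidel relaxation at the heart of the solver described above, the sketch below relaxes the 1-D Laplace equation u'' = 0 between fixed boundary values; the actual plasma equation set (species continuity, separate electron and heavy-species energy, momentum) is of course far richer.

```python
# Plain Gauss-Seidel relaxation of u'' = 0 on 11 nodes. Updating the
# array in place within each sweep is what makes this Gauss-Seidel
# rather than Jacobi iteration.

n = 11
u = [0.0] * n
u[0], u[-1] = 1.0, 0.0        # fixed boundary values

for sweep in range(500):
    for i in range(1, n - 1):  # in-place sweep over interior nodes
        u[i] = 0.5 * (u[i - 1] + u[i + 1])

# The converged solution is linear between the boundary values.
```

    Line relaxation, as used in the CFD solver, solves a whole grid line implicitly per update instead of one point, which accelerates convergence on stretched meshes.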

  13. Design, experimentation, and modeling of a novel continuous biodrying process

    NASA Astrophysics Data System (ADS)

    Navaee-Ardeh, Shahram

    Massive production of sludge in the pulp and paper industry has made effective sludge management an increasingly critical issue for the industry, owing to high landfill and transportation costs and complex regulatory frameworks for options such as sludge landspreading and composting. Sludge dewatering challenges are exacerbated at many mills by improved in-plant fiber recovery coupled with increased production of secondary sludge, which together yield a mixed sludge with a high proportion of biological matter that is difficult to dewater. In this thesis, a novel continuous biodrying reactor was designed and developed for drying pulp and paper mixed sludge to an economic dry-solids level, so that the dried sludge can be economically and safely combusted in a biomass boiler for energy recovery. In all experimental runs the economic dry-solids level was achieved, proving the process successful. In the biodrying process, in addition to the forced aeration, the drying rates are enhanced by biological heat generated through the microbial activity of mesophilic and thermophilic microorganisms naturally present in the porous matrix of mixed sludge. This makes the biodrying process more attractive than conventional drying techniques because the reactor is self-heating. The reactor is divided into four nominal compartments, and the mixed sludge dries as it moves downward in the reactor. The residence times were 4-8 days, which are 2-3 times shorter than the residence times achieved in a batch biodrying reactor previously studied by our research group for mixed sludge drying. A process variable analysis was performed to determine the key variable(s) in the continuous biodrying reactor. Several variables were investigated, namely: type of biomass feed, pH of biomass, nutrition level (C/N ratio), residence time, recycle ratio of biodried sludge, and outlet relative humidity profile along the reactor height.
The key variables that were identified in the continuous

  15. Aerospace structural design process improvement using systematic evolutionary structural modeling

    NASA Astrophysics Data System (ADS)

    Taylor, Robert Michael

    2000-10-01

    A multidisciplinary team tasked with an aircraft design problem must understand the problem requirements and metrics to produce a successful design. This understanding entails not only knowledge of what these requirements and metrics are, but also how they interact, which are most important (to the customer as well as to aircraft performance), and who in the organization can provide pertinent knowledge for each. In recent years, product development researchers and organizations have developed and successfully applied a variety of tools such as Quality Function Deployment (QFD) to coordinate multidisciplinary team members. The effectiveness of these methods, however, depends on the quality and fidelity of the information that team members can input. In conceptual aircraft design, structural information is of lower quality compared to aerodynamics or performance because it is based on experience rather than theory. This dissertation shows how advanced structural design tools can be used in a multidisciplinary team setting to improve structural information generation and communication through a systematic evolution of structural detail. When applied to conceptual design, finite element-based structural design tools elevate structural information to the same level as other computationally supported disciplines. This improved ability to generate and communicate structural information enables a design team to better identify and meet structural design requirements, consider producibility issues earlier, and evaluate structural concepts. A design process experiment of a wing structural layout in collaboration with an industrial partner illustrates and validates the approach.

  16. Rethinking Design Process: Using 3D Digital Models as an Interface in Collaborative Session

    ERIC Educational Resources Information Center

    Ding, Suining

    2008-01-01

    This paper describes a pilot study for an alternative design process by integrating a designer-user collaborative session with digital models. The collaborative session took place in a 3D AutoCAD class for a real world project. The 3D models served as an interface for designer-user collaboration during the design process. Students not only learned…

  17. Process Materialization Using Templates and Rules to Design Flexible Process Models

    NASA Astrophysics Data System (ADS)

    Kumar, Akhil; Yao, Wen

    The main idea in this paper is to show how flexible processes can be designed by combining generic process templates and business rules. We instantiate a process by applying rules to specific case data, and running a materialization algorithm. The customized process instance is then executed in an existing workflow engine. We present an architecture and also give an algorithm for process materialization. The rules are written in a logic-based language like Prolog. Our focus is on capturing deeper process knowledge and achieving a holistic approach to robust process design that encompasses control flow, resources and data, as well as makes it easier to accommodate changes to business policy.
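
    The materialization step can be sketched roughly as follows. The paper encodes its rules in a logic-based language such as Prolog; the plain Python predicates and the order-process case data below are invented stand-ins for that machinery.

```python
# Invented sketch of process materialization: a generic template plus
# business rules, applied to case data, yields a customized instance.

template = ["receive_order", "credit_check", "manual_review",
            "ship", "invoice"]

# Each rule maps a task to a predicate over the case data; the task is
# kept in the materialized instance only if the predicate holds.
rules = {
    "credit_check":  lambda case: case["amount"] > 500,
    "manual_review": lambda case: case["customer"] == "new",
}

def materialize(template, rules, case):
    # Tasks without a rule are unconditional; ruled tasks are filtered.
    return [t for t in template if rules.get(t, lambda c: True)(case)]

instance = materialize(template, rules,
                       {"amount": 250, "customer": "returning"})
```

    The materialized instance would then be handed to a workflow engine for execution, as the paper's architecture describes.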

  18. A Digital Tool Set for Systematic Model Design in Process-Engineering Education

    ERIC Educational Resources Information Center

    van der Schaaf, Hylke; Tramper, Johannes; Hartog, Rob J.M.; Vermue, Marian

    2006-01-01

    One of the objectives of the process technology curriculum at Wageningen University is that students learn how to design mathematical models in the context of process engineering, using a systematic problem analysis approach. Students find it difficult to learn to design a model and little material exists to meet this learning objective. For these…

  19. Cost model relationships between textile manufacturing processes and design details for transport fuselage elements

    NASA Technical Reports Server (NTRS)

    Metschan, Stephen L.; Wilden, Kurtis S.; Sharpless, Garrett C.; Andelman, Rich M.

    1993-01-01

    Textile manufacturing processes offer potential cost and weight advantages over traditional composite materials and processes for transport fuselage elements. In the current study, design cost modeling relationships between textile processes and element design details were developed. Such relationships are expected to help future aircraft designers make timely decisions on the effect of design details and overall configurations on textile fabrication costs. The fundamental advantage of a design cost model is to ensure that the element design is cost effective for the intended process. Trade studies on the effects of processing parameters also help to optimize the manufacturing steps for a particular structural element. Two methods of analyzing design detail/process cost relationships developed for the design cost model were pursued in the current study. The first makes use of existing databases and alternative cost modeling methods (e.g., detailed estimating). The second compares design cost model predictions with data collected during the fabrication of seven-foot circumferential frames for ATCAS crown test panels. The process used in this case involves 2D dry braiding and resin transfer molding of curved 'J' cross-section frame members having design details characteristic of the baseline ATCAS crown design.

  20. High School Student Modeling in the Engineering Design Process

    ERIC Educational Resources Information Center

    Mentzer, Nathan; Huffman, Tanner; Thayer, Hilde

    2014-01-01

    A diverse group of 20 high school students from four states in the US were individually provided with an engineering design challenge. Students chosen were in capstone engineering courses and had taken multiple engineering courses. As students considered the problem and developed a solution, observational data were recorded and artifacts…

  1. SmartWeld/SmartProcess - intelligent model based system for the design and validation of welding processes

    SciTech Connect

    Mitchner, J.

    1996-04-01

    Diagrams are presented on an intelligent model based system for the design and validation of welding processes. Key capabilities identified include "right the first time" manufacturing, continuous improvement, and on-line quality assurance.

  2. Elementary teachers' mental models of engineering design processes: A comparison of two communities of practice

    NASA Astrophysics Data System (ADS)

    McMahon, Ann P.

    Educating K-12 students in the processes of design engineering is gaining popularity in public schools. Several states have adopted standards for engineering design despite the fact that no common agreement exists on what should be included in the K-12 engineering design process. Furthermore, little pre-service and in-service professional development exists that will prepare teachers to teach a design process that is fundamentally different from the science teaching process found in typical public schools. This study provides a glimpse into what teachers think happens in engineering design compared to articulated best practices in engineering design. Wenger's communities of practice work and van Dijk's multidisciplinary theory of mental models provide the theoretical bases for comparing the mental models of two groups of elementary teachers (one group that teaches engineering and one that does not) to the mental models of design engineers (including this engineer/researcher/educator and professionals described elsewhere). The elementary school teachers and this engineer/researcher/educator observed the design engineering process enacted by professionals, then answered questions designed to elicit their mental models of the process they saw in terms of how they would teach it to elementary students. The key finding is this: Both groups of teachers embedded the cognitive steps of the design process into the matrix of the social and emotional roles and skills of students. Conversely, the engineers embedded the social and emotional aspects of the design process into the matrix of the cognitive steps of the design process. In other words, teachers' mental models show that they perceive that students' social and emotional communicative roles and skills in the classroom drive their cognitive understandings of the engineering process, while the mental models of this engineer/researcher/educator and the engineers in the video show that we perceive that cognitive

  3. Modeling and Analysis of Power Processing Systems. [use of a digital computer for designing power plants

    NASA Technical Reports Server (NTRS)

    Fegley, K. A.; Hayden, J. H.; Rehmann, D. W.

    1974-01-01

    The feasibility of formulating a methodology for the modeling and analysis of aerospace electrical power processing systems is investigated. It is shown that a digital computer may be used in an interactive mode for the design, modeling, analysis, and comparison of power processing systems.

  4. Improving the Aircraft Design Process Using Web-Based Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.; Follen, Gregory J. (Technical Monitor)

    2000-01-01

    Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.

  5. Improving the Aircraft Design Process Using Web-based Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.

    2003-01-01

    Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.

  6. Process modelling and die design concepts for forming aircraft sheet parts

    NASA Astrophysics Data System (ADS)

    Hatipoğlu, H. A.; Alkaş, C. O.

    2016-08-01

    This study concerns typical sheet metal forming processes applied in the aerospace industry, including flexforming, stretch forming, and stretch-draw forming. Each process is modelled using the finite element method for optimization. Tensile, bulge, forming-limit, and friction tests of commonly used materials are conducted to define the hardening curves, yield loci, anisotropy constants, forming limit curves, and friction coefficients between die and sheet. Process-specific loadings and boundary conditions are applied to each model. The models are then validated by purpose-designed experiments that characterize the related forming processes. Lastly, several examples are given in which those models are used to predict forming defects before physical forming, and the necessary die design and process parameter changes are applied accordingly for successful forming operations.

  7. The Impact of Building Information Modeling on the Architectural Design Process

    NASA Astrophysics Data System (ADS)

    Moreira, P. F.; Silva, Neander F.; Lima, Ecilamar M.

    Many benefits of Building Information Modeling (BIM) have been suggested by several authors and by software vendors. In this paper we describe an experiment in which two groups of designers were observed developing an assigned design task. One of the groups used a BIM system, while the other used a standard computer-aided drafting system. The results show that some of the promises of BIM hold true, such as consistency maintenance and error avoidance in the design documentation process. Others, such as changing the design process itself, appear plausible, but further research is needed to determine the depth of such changes.

  8. A novel double loop control model design for chemical unstable processes.

    PubMed

    Cong, Er-Ding; Hu, Ming-Hui; Tu, Shan-Tung; Xuan, Fu-Zhen; Shao, Hui-He

    2014-03-01

    In this manuscript, based on the Smith predictor control scheme for unstable industrial processes, an improved double-loop control model is proposed for unstable chemical processes. The inner loop stabilizes the unstable process, transforming it into a stable first-order plus pure dead-time process. The outer loop enhances the set-point response, and a disturbance controller is designed to enhance the disturbance response. The improved control system is simple, has a clear physical meaning, and yields a characteristic equation that is easy to stabilize. The three controllers in the improved scheme are designed separately, so each is easy to design and good control performance can be obtained for each closed-loop transfer function individually. The robust stability of the proposed control scheme is analyzed. Finally, case studies illustrate that the improved method gives better system performance than existing design methods. PMID:24309506
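
    The inner-loop idea (wrap the unstable process in stabilizing feedback so the outer loop sees a stable plant) can be illustrated with an invented first-order process; the pole, gain, and step sizes below are placeholders, not values from the paper, and dead time is omitted for brevity.

```python
# Invented sketch: an unstable first-order process dy/dt = p*y + u
# (p > 0) is stabilized by an inner proportional loop u = -k*y, which
# moves the closed-loop pole to (p - k); stable whenever k > p.

def simulate(p, k, y0=0.1, dt=0.01, steps=2000):
    # Forward-Euler simulation of the inner loop around the process.
    y = y0
    for _ in range(steps):
        u = -k * y
        y += dt * (p * y + u)
    return y

open_loop  = simulate(p=1.0, k=0.0)  # no inner loop: pole at +1, diverges
inner_loop = simulate(p=1.0, k=3.0)  # inner loop: effective pole at -2, decays
```

    The outer loop of the paper's scheme then shapes the set-point response of this stabilized plant, with a separate controller handling disturbances.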

  9. Studies in process modeling, design, monitoring, and control, with applications to polymer composites manufacturing

    NASA Astrophysics Data System (ADS)

    Srinivasagupta, Deepak

    2002-01-01

High material and manufacturing costs have hindered the introduction of advanced polymer composite materials into mainstream civilian applications such as automotive. Even though high-fidelity models for several polymer composite manufacturing processes have become available over the past several years and offer significant benefits in manufacturing cost reduction, concerns about their inflexibility and maintenance have adversely affected their widespread usage. This research seeks to advance process modeling and design in polymer composites manufacturing to address these concerns. Other, more general issues in measurement validation and distributed control are also addressed. Using a rigorous 3-D model of the injected pultrusion (IP) process validated recently, an algorithm was developed for process and equipment design with integrated economic, operability, and environmental considerations. The optimum design promised enhanced throughput as well as reduction in the time and expense of the current purely experimental approaches. Scale-up issues in IP were analyzed, and refinements to overcome some drawbacks in the model were suggested. The process model was then extended to simulate the co-injection resin transfer molding (CIRTM) process used for manufacture of foam-core sandwich composites. A 1-D isothermal model for real-time control was also developed. Process optimization using these models and experimental parametric studies increased the debond fracture toughness of sandwiches by 78% over current technology. To ensure the availability of validated measurements from process instrumentation, a novel in-situ sensor modeling approach to sensor validation was proposed. Both active and passive, time and frequency domain techniques were developed and experimentally verified using temperature and flow sensors. A model-based dynamic estimator to predict the true measurement online was also validated. The effect of network communication delay on stability and control

  10. Model of Values-Based Management Process in Schools: A Mixed Design Study

    ERIC Educational Resources Information Center

    Dogan, Soner

    2016-01-01

    The aim of this paper is to evaluate the school administrators' values-based management behaviours according to the teachers' perceptions and opinions and, accordingly, to build a model of values-based management process in schools. The study was conducted using explanatory design which is inclusive of both quantitative and qualitative methods.…

  11. Letter Report. Defense Waste Processing Facility Pour Spout Heaters - Conceptual Designs and Modeling

    SciTech Connect

    SK Sundaram; JM Perez, Jr.

    2000-09-06

The Tanks Focus Area (TFA) identified a major task to address performance limitations and deficiencies of the Defense Waste Processing Facility (DWPF), now in its sixth year of operation. Design, installation, testing, monitoring, operability, and a number of other characteristics were studied collaboratively by research personnel at a number of facilities: Savannah River Technology Center (SRTC), Clemson Environmental Technologies Laboratory (CETL), Pacific Northwest National Laboratory (PNNL), and the Idaho National Engineering and Environmental Laboratory (INEEL). Because the potential limiting feature of the DWPF was identified as the pour spout/riser heater, research on alternative design concepts originally proposed in the past was revisited. In the original work, finite element modeling was performed to evaluate the temperature distribution and stress of the design currently used at the DWPF. Studies were also made to define the requirements of the design and to consider approaches for remote removal/replacement. The five proposed alternative designs were characterized by their heater type and location, their remotely replaceable thermocouples, and their capabilities for remote handling. Review comments on the alternative designs indicated a relatively wide range of advantages and disadvantages. The present report provides an overview of the design criteria, modeling results, and alternative designs. Based on a review of the past design optimization activities and an assessment of recent experience, recommendations are proposed for future consideration and improvement.

  12. A Conceptual Aerospace Vehicle Structural System Modeling, Analysis and Design Process

    NASA Technical Reports Server (NTRS)

    Mukhopadhyay, Vivek

    2007-01-01

A process for aerospace structural concept analysis and design is presented, with examples of a blended-wing-body fuselage, a multi-bubble fuselage concept, a notional crew exploration vehicle, and a high altitude long endurance aircraft. Aerospace vehicle structures must withstand all anticipated mission loads, yet must be designed to have optimal structural weight with the required safety margins. For a viable systems study of advanced concepts, these conflicting requirements must be imposed and analyzed early in the conceptual design cycle, preferably with a high degree of fidelity. In this design process, integrated multidisciplinary analysis tools are used in a collaborative engineering environment. First, parametric solid and surface models, including the internal structural layout, are developed for detailed finite element analyses. Multiple design scenarios are generated for analyzing several structural configurations and material alternatives. The structural stress, deflection, strain, and margin-of-safety distributions are visualized and the design is improved. Over several design cycles, the refined vehicle part and assembly models are generated. The accumulated design data are used for structural mass comparison and concept ranking. The present application focuses on the blended-wing-body vehicle structure; advanced composite materials are also discussed.
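The safety margins mentioned above follow the generic margin-of-safety bookkeeping of structural analysis (the textbook definition, not the output of NASA's specific tool chain), which is easy to sketch:

```python
# Generic structural margin of safety: MS = allowable / (applied * FS) - 1.
# MS >= 0 means the member passes with the required factor of safety.
def margin_of_safety(allowable, applied, factor_of_safety=1.5):
    return allowable / (applied * factor_of_safety) - 1.0

# Illustrative numbers: 450 MPa allowable vs. 200 MPa applied at FS = 1.5.
ms = margin_of_safety(450.0, 200.0)   # MS = 0.5
```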

  13. Adapting Rational Unified Process (RUP) approach in designing a secure e-Tendering model

    NASA Astrophysics Data System (ADS)

    Mohd, Haslina; Robie, Muhammad Afdhal Muhammad; Baharom, Fauziah; Darus, Norida Muhd; Saip, Mohamed Ali; Yasin, Azman

    2016-08-01

e-Tendering is the electronic processing of tender documents via the Internet; it allows tenderers to publish, communicate, access, receive, and submit all tender-related information and documentation online. This study aims to design an e-Tendering system using the Rational Unified Process (RUP) approach. RUP provides a disciplined approach to assigning tasks and responsibilities within the software development process. RUP has four phases that can assist researchers in adjusting the requirements of projects with different scopes, problems, and sizes. RUP is characterized as a use-case driven, architecture-centered, iterative and incremental process model. However, the scope of this study focuses only on the Inception and Elaboration phases as steps to develop the model, and performs only three of the nine workflows (business modeling, requirements, and analysis and design). RUP has a strong focus on documents, and the activities in the Inception and Elaboration phases mainly concern the creation of diagrams and the writing of textual descriptions. The UML notation and the software program Star UML are used to support the design of e-Tendering. The e-Tendering design based on the RUP approach can contribute to e-Tendering developers and researchers in the e-Tendering domain. In addition, this study also shows that RUP is one of the best system development methodologies that can be used as a research methodology in the Software Engineering domain related to the secure design of any observed application. This methodology has been tested in various studies in certain domains, such as Simulation-based Decision Support, Security Requirement Engineering, Business Modeling and Secure System Requirement, and so forth. In conclusion, these studies showed that RUP is a good research methodology that can be adapted in any Software Engineering (SE) research domain that requires a few artifacts to be generated, such as use case modeling, misuse case modeling, activity

  14. Ares Upper Stage Processes to Implement Model Based Design - Going Paperless

    NASA Technical Reports Server (NTRS)

    Gregory, Melanie

    2012-01-01

Computer-Aided Design (CAD) has all but replaced the drafting board for design work. Increased productivity and accuracy should be natural outcomes of using CAD. Going from paper drawings only, to paper drawings based on CAD models, to CAD models and no drawings, or Model Based Design (MBD), is a natural progression in today's world. There are many advantages to MBD over traditional design methods. To make the most of those advantages, standards should be in place and the proper foundation should be laid prior to transitioning to MBD. However, without a full understanding of the implications of MBD and proper control of the data, the advantages are greatly diminished. Transitioning from a paper design world to an electronic design world means re-thinking how information gets controlled at its origin and distributed from one point to another. It means design methodology is critical, especially for large projects. It means preparing standardized parts and processes, as well as maintaining strong communication between all parties, in order to maximize the benefits of MBD.

  15. Future integrated design process

    NASA Technical Reports Server (NTRS)

    Meyer, D. D.

    1980-01-01

    The design process is one of the sources used to produce requirements for a computer system to integrate and manage product design data, program management information, and technical computation and engineering data management activities of the aerospace design process. Design activities were grouped chronologically and explored for activity type, activity interface, data quantity, and data flow. The work was based on analysis of the design process of several typical aerospace products, including both conventional and supersonic airplanes and a hydrofoil design. Activities examined included research, preliminary design, detail design, manufacturing interface, product verification, and product support. The design process was then described in an IPAD environment--the future.

  16. New process modeling [sic], design, and control strategies for energy efficiency, high product quality, and improved productivity in the process industries. Final project report

    SciTech Connect

    Ray, W. Harmon

    2002-06-05

    This project was concerned with the development of process design and control strategies for improving energy efficiency, product quality, and productivity in the process industries. In particular, (i) the resilient design and control of chemical reactors, and (ii) the operation of complex processing systems, was investigated. Specific topics studied included new process modeling procedures, nonlinear controller designs, and control strategies for multiunit integrated processes. Both fundamental and immediately applicable results were obtained. The new design and operation results from this project were incorporated into computer-aided design software and disseminated to industry. The principles and design procedures have found their way into industrial practice.

  17. [New process modeling, design and control strategies for energy efficiency, high product quality and improved productivity in the process industries

    SciTech Connect

    Not Available

    1991-12-31

    Highlights are reported of work to date on: resilient design and control of chemical reactors (polymerization, packed bed), operation of complex processing systems (compensators for multivariable systems with delays and Right Half Plane zeroes, process identification and controller design for multivariable systems, nonlinear systems control, distributed parameter systems), and computer-aided design software (CONSYD, POLYRED, expert systems). 15 figs, 54 refs. (DLC)

  19. Application of Design of Experiments and Surrogate Modeling within the NASA Advanced Concepts Office, Earth-to-Orbit Design Process

    NASA Technical Reports Server (NTRS)

    Zwack, Mathew R.; Dees, Patrick D.; Holt, James B.

    2016-01-01

Decisions made during early conceptual design have a large impact upon the expected life-cycle cost (LCC) of a new program. It is widely accepted that up to 80% of such cost is committed during these early design phases [1]. Therefore, to help minimize LCC, decisions made during conceptual design must be based upon as much information as possible. To aid in the decision making for new launch vehicle programs, the Advanced Concepts Office (ACO) at NASA Marshall Space Flight Center (MSFC) provides rapid turnaround pre-phase A and phase A concept definition studies. The ACO team utilizes a proven set of tools to provide customers with a full vehicle mass breakdown to tertiary subsystems, preliminary structural sizing based upon worst-case flight loads, and trajectory optimization to quantify integrated vehicle performance for a given mission [2]. Although the team provides rapid turnaround for single vehicle concepts, the scope of the trade space can be limited due to analyst availability and the manpower requirements for manual execution of the analysis tools. In order to enable exploration of a broader design space, the ACO team has implemented an advanced design methods (ADM) based approach. This approach applies the concepts of design of experiments (DOE) and surrogate modeling to more exhaustively explore the trade space and provide the customer with additional design information to inform decision making. This paper will first discuss the automation of the ACO tool set, which represents a majority of the development effort. In order to fit a surrogate model within tolerable error bounds a number of DOE cases are needed. This number will scale with the number of variable parameters desired and the complexity of the system's response to those variables. For all but the smallest design spaces, the number of cases required cannot be produced within an acceptable timeframe using a manual process. Therefore, automation of the tools was a key enabler for the successful
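The DOE-plus-surrogate workflow described above can be sketched in miniature. Here a 2^3 full factorial stands in for the paper's larger DOE, and a placeholder function stands in for the expensive analysis tool; with coded ±1 levels the regression coefficients reduce to orthogonal averages. All names and numbers are illustrative, not from the paper.

```python
from itertools import product

def run_tool(x1, x2, x3):
    # Placeholder for an expensive analysis tool (e.g. a trajectory run).
    return 3.0 + 2.0 * x1 - 1.0 * x2 + 0.5 * x3

design = list(product((-1, 1), repeat=3))   # 8 DOE cases at coded levels
y = [run_tool(*pt) for pt in design]

# Least-squares fit of a linear surrogate; orthogonality of the coded
# factorial makes each coefficient a simple average.
b0 = sum(y) / len(y)
b = [sum(yi * pt[j] for yi, pt in zip(y, design)) / len(y) for j in range(3)]

def surrogate(x1, x2, x3):
    return b0 + b[0] * x1 + b[1] * x2 + b[2] * x3
```

Once fitted, the cheap surrogate can be evaluated thousands of times to explore the trade space without re-running the tool.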

  20. Bioreactor process parameter screening utilizing a Plackett-Burman design for a model monoclonal antibody.

    PubMed

    Agarabi, Cyrus D; Schiel, John E; Lute, Scott C; Chavez, Brittany K; Boyne, Michael T; Brorson, Kurt A; Khan, Mansoor A; Read, Erik K

    2015-06-01

    Consistent high-quality antibody yield is a key goal for cell culture bioprocessing. This endpoint is typically achieved in commercial settings through product and process engineering of bioreactor parameters during development. When the process is complex and not optimized, small changes in composition and control may yield a finished product of less desirable quality. Therefore, changes proposed to currently validated processes usually require justification and are reported to the US FDA for approval. Recently, design-of-experiments-based approaches have been explored to rapidly and efficiently achieve this goal of optimized yield with a better understanding of product and process variables that affect a product's critical quality attributes. Here, we present a laboratory-scale model culture where we apply a Plackett-Burman screening design to parallel cultures to study the main effects of 11 process variables. This exercise allowed us to determine the relative importance of these variables and identify the most important factors to be further optimized in order to control both desirable and undesirable glycan profiles. We found engineering changes relating to culture temperature and nonessential amino acid supplementation significantly impacted glycan profiles associated with fucosylation, β-galactosylation, and sialylation. All of these are important for monoclonal antibody product quality.
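The screening design named above can be constructed directly: the standard 12-run Plackett-Burman matrix handles up to 11 two-level factors, matching the 11 process variables in the study. The generator row is the classic published one; the mapping of columns to specific bioreactor parameters is not specified here.

```python
def plackett_burman_12():
    # Standard published generator row for the 12-run design.
    gen = [+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1]
    rows = [gen[i:] + gen[:i] for i in range(11)]   # 11 cyclic shifts
    rows.append([-1] * 11)                          # closing all-low run
    return rows

design = plackett_burman_12()

def main_effect(y, j):
    # Main effect of factor j: mean response at +1 minus mean at -1
    # (each level appears in exactly 6 of the 12 runs).
    return sum(yi * row[j] for yi, row in zip(y, design)) / 6.0
```

Every column is balanced and mutually orthogonal, which is what lets 11 main effects be screened in only 12 bioreactor runs.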

  2. Process modeling and supply chain design for advanced biofuel production based on bio-oil gasification

    NASA Astrophysics Data System (ADS)

    Li, Qi

As a potential substitute for petroleum-based fuel, second generation biofuels are playing an increasingly important role due to their economic, environmental, and social benefits. With the rapid development of the biofuel industry, there has been a growing literature on techno-economic analysis and supply chain design for biofuel production based on a variety of production pathways. A recently proposed production pathway for advanced biofuel is to convert biomass to bio-oil at widely distributed small-scale fast pyrolysis plants, then gasify the bio-oil to syngas and upgrade the syngas to transportation fuels in a centralized biorefinery. This thesis investigates two types of assessments of this bio-oil gasification pathway: techno-economic analysis based on process modeling and literature data, and supply chain design focused on optimal decisions for the number of facilities to build, facility capacities, and logistics under uncertainty. A detailed process model with corn stover as feedstock and liquid fuels as the final products is presented. Techno-economic analysis of the bio-oil gasification pathway is also discussed to assess economic feasibility. Preliminary results show a capital investment of $438 million and a minimum fuel selling price (MSP) of $5.6 per gallon of gasoline equivalent. The sensitivity analysis finds that MSP is most sensitive to the internal rate of return (IRR), biomass feedstock cost, and fixed capital cost. A two-stage stochastic program is formulated to solve the supply chain design problem considering uncertainties in biomass availability, technology advancement, and biofuel price. The first stage makes the capital investment decisions, including the locations and capacities of the decentralized fast pyrolysis plants and the centralized biorefinery, while the second stage determines the biomass and biofuel flows. The numerical results and case study illustrate that considering uncertainties can be
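The two-stage structure described above can be illustrated with a toy model (all numbers invented, not from the thesis): the first stage fixes a biorefinery capacity before uncertainty is resolved; the second stage (recourse) sets production once the biomass-availability scenario is known; expected profit is maximized by enumeration.

```python
CAPACITIES = [50, 100, 150]                      # candidate capacities (first stage)
CAPEX_PER_UNIT = 3.0                             # annualized capital cost per unit
SCENARIOS = [(0.3, 40), (0.5, 90), (0.2, 160)]   # (probability, available biomass)
MARGIN = 6.0                                     # operating margin per unit of fuel

def expected_profit(capacity):
    profit = 0.0
    for prob, supply in SCENARIOS:
        produced = min(capacity, supply)         # second-stage recourse decision
        profit += prob * MARGIN * produced
    return profit - CAPEX_PER_UNIT * capacity    # net of first-stage investment

best = max(CAPACITIES, key=expected_profit)
```

Note the hedging the formulation buys: the mid-size plant beats both the small plant (which forfeits upside) and the large one (whose capital cost is stranded in low-biomass scenarios).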

  3. Process Design of Wastewater Treatment for the NREL Cellulosic Ethanol Model

    SciTech Connect

    Steinwinder, T.; Gill, E.; Gerhardt, M.

    2011-09-01

    This report describes a preliminary process design for treating the wastewater from NREL's cellulosic ethanol production process to quality levels required for recycle. In this report Brown and Caldwell report on three main tasks: 1) characterization of the effluent from NREL's ammonia-conditioned hydrolyzate fermentation process; 2) development of the wastewater treatment process design; and 3) development of a capital and operational cost estimate for the treatment concept option. This wastewater treatment design was incorporated into NREL's cellulosic ethanol process design update published in May 2011 (NREL/TP-5100-47764).

  4. The Sulfur-Iodine Cycle: Process Analysis and Design Using Comprehensive Phase Equilibrium Measurements and Modeling

    SciTech Connect

    Thies, Mark C.; O'Connell, J. P.; Gorensek, Maximilian B.

    2010-01-10

    Of the 100+ thermochemical hydrogen cycles that have been proposed, the Sulfur-Iodine (S-I) Cycle is a primary target of international interest for the centralized production of hydrogen from nuclear power. However, the cycle involves complex and highly nonideal phase behavior at extreme conditions that is only beginning to be understood and modeled for process simulation. The consequence is that current designs and efficiency projections have large uncertainties, as they are based on incomplete data that must be extrapolated from property models. This situation prevents reliable assessment of the potential viability of the system and, even more, a basis for efficient process design. The goal of this NERI award (05-006) was to generate phase-equilibrium data, property models, and comprehensive process simulations so that an accurate evaluation of the S-I Cycle could be made. Our focus was on Section III of the Cycle, where the hydrogen is produced by decomposition of hydroiodic acid (HI) in the presence of water and iodine (I2) in a reactive distillation (RD) column. The results of this project were to be transferred to the nuclear hydrogen community in the form of reliable flowsheet models for the S-I process. Many of the project objectives were achieved. At Clemson University, a unique, tantalum-based, phase-equilibrium apparatus incorporating a view cell was designed and constructed for measuring fluid-phase equilibria for mixtures of iodine, HI, and water (known as HIx) at temperatures to 350 °C and pressures to 100 bar. Such measurements were of particular interest for developing a working understanding of the expected operation of the RD column in Section III. The view cell allowed for the IR observation and discernment of vapor-liquid (VL), liquid-liquid, and liquid-liquid-vapor (LLVE) equilibria for HIx systems. For the I2-H2O system, liquid-liquid equilibrium (LLE) was discovered to exist at temperatures up to 310-315 °C, in contrast to the models and

  5. Statistics-enhanced multistage process models for integrated design &manufacturing of poly (vinyl alcohol) treated buckypaper

    NASA Astrophysics Data System (ADS)

    Wang, Kan

Carbon nanotube (CNT) is considered a promising engineering material because of its exceptional mechanical, electrical, and thermal properties. Buckypaper (BP), a thin sheet of assembled CNTs, is an effective way to handle CNTs at the macro scale. Pristine BP is a fragile material which is held together by weak van der Waals attractions among CNTs. This dissertation introduces a modified filtration-based manufacturing process which uses poly (vinyl alcohol) (PVA) to treat BP. This treatment greatly improves the handleability of BP, reduces spoilage during transfer, and shortens the production time. The multistage manufacturing process of PVA-treated BP is discussed in this dissertation, and process models are developed to predict the nanostructure of final products from the process parameters. Based on the nanostructure, a finite element based physical model for prediction of Young's modulus is also developed. The accuracy of this physical model is further improved by statistical methods. The aim of this study is to investigate and improve the scalability of the manufacturing process of PVA-treated BP. To achieve this goal, various statistical tools are employed. The unique issues in nanomanufacturing also motivate the development of new statistical tools and the modification of existing ones. Those issues include uncertainties in nanostructure characterization due to the scale, the limited amount of experimental data due to the high cost of raw materials, large variation in the final product due to the random nature of the structure, and the high complexity of physical models due to the small scale of the structural building blocks. This dissertation addresses those issues by combining engineering field knowledge and statistical methods. The resulting statistics-enhanced physical model provides an approach to design the manufacturing process of PVA-treated BP for a targeted property and to tailor the robustness of the final product by manipulating the process parameters. In addition

  6. Optimal design of protein production plants with time and size factor process models.

    PubMed

    Montagna, J M; Vecchietti, A R; Iribarren, O A; Pinto, J M; Asenjo, J A

    2000-01-01

In this work we propose an optimization model for the design of a biotechnological multiproduct batch plant. A first-level-of-detail posynomial model is constructed for each unit, together with decisions regarding the structural optimization of the plant. A particular feature of this model is that it contains composite units in which semicontinuous items operate on the material contained in batch items. This occurs in the purification steps, in particular with the microfilters operating between retentate and permeate vessels, and with the homogenizer and ultrafilters operating on the material contained in a batch holding vessel. Also, the unit models rely on batch operating time expressions that depend on both the batch size and the size of the semicontinuous items. The model takes into account all of the available options to increase the efficiency of the batch plant design: unit duplication in-phase and out-of-phase, and intermediate storage tanks. The resulting mathematical model for the minimization of the plant capital cost is a mixed integer nonlinear program (MINLP), which is solved to global optimality with an implementation of the outer approximation/equality relaxation/augmented penalty (OA/ER/AP) method. A plant that produces four recombinant proteins in eight processing stages is used to illustrate the proposed approach. An interesting feature of this example is that it represents an attempt to standardize a plant for the production of both therapeutic and nontherapeutic proteins; the model applied is generic and can thus be applied to any such modular plant. Results indicate that the best solution in terms of minimal capital cost contains no units in parallel, with intermediate storage tank allocation.
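The "time and size factor" idea in the title can be sketched with a stripped-down, one-product version (all coefficients invented, and simple enumeration in place of the paper's MINLP): each unit's size follows from the batch size via a size factor, capital cost is a posynomial sum, and the production horizon bounds how small the batch size can be.

```python
S = [1.2, 0.8, 1.5]        # size factors: unit size per kg of batch
a = [500.0, 400.0, 600.0]  # cost coefficients per stage
beta = [0.6, 0.6, 0.6]     # economies-of-scale exponents
T_CYCLE = 8.0              # limiting cycle time (h per batch)
DEMAND, HORIZON = 10_000.0, 6_000.0  # kg/yr to produce, h/yr available

def capital_cost(B):
    # Posynomial cost of units sized V_j = S_j * B for batch size B.
    return sum(ai * (Si * B) ** bi for ai, Si, bi in zip(a, S, beta))

def feasible(B):
    # Enough horizon time for DEMAND / B batches of T_CYCLE hours each.
    return (DEMAND / B) * T_CYCLE <= HORIZON

# Cost increases with B, so the smallest feasible batch size is optimal here.
B_min = DEMAND * T_CYCLE / HORIZON
```

The full model adds what this sketch omits: multiple products, in-phase/out-of-phase duplication, storage tanks, and operating times that also depend on semicontinuous item sizes.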

  7. Designing reservoirs for 1/t decoherence process in Jaynes-Cummings model

    NASA Astrophysics Data System (ADS)

    Giraldi, F.; Petruccione, F.

    2012-09-01

Decoherence indicates the process that a quantum system undergoes through the interaction with its external environment. A two-level system (qubit) interacting with Lorentzian-type continuous distributions of field modes, according to the Jaynes-Cummings model, exhibits exponential-like relaxations of the reduced density matrix. In this scenario, a special class of reservoirs is designed in order to control or delay the destructive effect of the environment on qubit coherence. In this way, decoherence processes slower than the usual exponential ones are obtained: over long time scales, inverse power law relaxations emerge, with powers decreasing continuously to unity according to the choice of the particular reservoir. The designed reservoirs exhibit a photonic band gap coinciding with the qubit transition frequency and are piecewise similar to those usually adopted in Quantum Optics, i.e., sub-ohmic at low frequencies and inverse power laws at high frequencies. Initially, the reservoir is assumed to be in the vacuum state and unentangled from the qubit, which is in a generic state. The exact dynamics turns out to be described by series of Fox H-functions. The simple form of the designed reservoirs can be accessible experimentally.
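For background, the multimode Jaynes-Cummings Hamiltonian in the rotating-wave approximation can be written in its standard textbook form (not necessarily the paper's exact notation); the "designed reservoir" amounts to a choice of the coupling spectral density J(ω):

```latex
H = \frac{\hbar\omega_0}{2}\,\sigma_z
  + \sum_k \hbar\omega_k\, a_k^{\dagger} a_k
  + \hbar \sum_k \left( g_k\, \sigma_+ a_k + g_k^{*}\, \sigma_- a_k^{\dagger} \right),
\qquad
J(\omega) \equiv \sum_k |g_k|^2\, \delta(\omega - \omega_k).
```

Shaping J(ω) to be sub-ohmic below the band gap and an inverse power law above it is what produces the slower-than-exponential coherence decay described in the abstract.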

  8. Modern meta-heuristics based on nonlinear physics processes: A review of models and design procedures

    NASA Astrophysics Data System (ADS)

    Salcedo-Sanz, S.

    2016-10-01

    Meta-heuristic algorithms are problem-solving methods which try to find good-enough solutions to very hard optimization problems, in a reasonable computation time, where classical approaches fail or cannot even be applied. Many existing meta-heuristic approaches are nature-inspired techniques, which work by simulating or modeling different natural processes in a computer. Historically, many of the most successful meta-heuristic approaches have had a biological inspiration, such as the evolutionary computation or swarm intelligence paradigms, but in the last few years new approaches based on the modeling of nonlinear physics processes have been proposed and applied with success. Nonlinear physics processes, modeled as optimization algorithms, are able to produce completely new search procedures, in many cases with extremely effective exploration capabilities, which are able to outperform existing optimization approaches. In this paper we review the most important optimization algorithms based on nonlinear physics, how they have been constructed from the specific modeling of real phenomena, and their novelty in comparison with alternative existing algorithms for optimization. We first review important concepts on optimization problems, search spaces and problem difficulty. Then, the usefulness of heuristic and meta-heuristic approaches for tackling hard optimization problems is introduced, and some of the main existing classical versions of these algorithms are reviewed. The mathematical framework of different nonlinear physics processes is then introduced as a preparatory step to reviewing in detail the most important meta-heuristics based on them. A discussion of the novelty of these approaches, their main computational implementation and design issues, and the evaluation of a novel meta-heuristic based on Strange Attractors mutation completes the review of these techniques. We also describe some of the most important application areas, in
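As a concrete illustration of the physics-inspired meta-heuristics this review surveys, the sketch below implements simulated annealing, the classic algorithm modeled on the annealing of solids. The test function, step size, temperature schedule and all other parameter values are arbitrary choices for the example, not taken from the paper:

```python
import math
import random

def simulated_annealing(f, x0, step=0.5, t0=10.0, cooling=0.99, iters=5000, seed=1):
    """Minimize f over the reals with the classic physics-inspired meta-heuristic."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)      # random neighbour of current point
        fc = f(cand)
        # always accept downhill moves; accept uphill with Boltzmann probability
        if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling                             # geometric cooling schedule
    return best, fbest

# minimize a non-convex test function with many local minima
xmin, fmin = simulated_annealing(lambda x: x**2 + 10 * math.sin(3 * x), x0=4.0)
```

The Boltzmann acceptance rule is what distinguishes the method from greedy local search: early on (high temperature) the walk explores freely; as the system cools it settles into a low-energy basin.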

  9. Robust Brain-Machine Interface Design Using Optimal Feedback Control Modeling and Adaptive Point Process Filtering

    PubMed Central

    Carmena, Jose M.

    2016-01-01

    Much progress has been made in brain-machine interfaces (BMI) using decoders such as Kalman filters and finding their parameters with closed-loop decoder adaptation (CLDA). However, current decoders do not model the spikes directly, and hence may limit the processing time-scale of BMI control and adaptation. Moreover, while specialized CLDA techniques for intention estimation and assisted training exist, a unified and systematic CLDA framework that generalizes across different setups is lacking. Here we develop a novel closed-loop BMI training architecture that allows for processing, control, and adaptation using spike events, enables robust control and extends to various tasks. Moreover, we develop a unified control-theoretic CLDA framework within which intention estimation, assisted training, and adaptation are performed. The architecture incorporates an infinite-horizon optimal feedback-control (OFC) model of the brain’s behavior in closed-loop BMI control, and a point process model of spikes. The OFC model infers the user’s motor intention during CLDA—a process termed intention estimation. OFC is also used to design an autonomous and dynamic assisted training technique. The point process model allows for neural processing, control and decoder adaptation with every spike event and at a faster time-scale than current decoders; it also enables dynamic spike-event-based parameter adaptation unlike current CLDA methods that use batch-based adaptation on much slower adaptation time-scales. We conducted closed-loop experiments in a non-human primate over tens of days to dissociate the effects of these novel CLDA components. The OFC intention estimation improved BMI performance compared with current intention estimation techniques. OFC assisted training allowed the subject to consistently achieve proficient control. Spike-event-based adaptation resulted in faster and more consistent performance convergence compared with batch-based methods, and was robust to

  10. Robust Brain-Machine Interface Design Using Optimal Feedback Control Modeling and Adaptive Point Process Filtering.

    PubMed

    Shanechi, Maryam M; Orsborn, Amy L; Carmena, Jose M

    2016-04-01

    Much progress has been made in brain-machine interfaces (BMI) using decoders such as Kalman filters and finding their parameters with closed-loop decoder adaptation (CLDA). However, current decoders do not model the spikes directly, and hence may limit the processing time-scale of BMI control and adaptation. Moreover, while specialized CLDA techniques for intention estimation and assisted training exist, a unified and systematic CLDA framework that generalizes across different setups is lacking. Here we develop a novel closed-loop BMI training architecture that allows for processing, control, and adaptation using spike events, enables robust control and extends to various tasks. Moreover, we develop a unified control-theoretic CLDA framework within which intention estimation, assisted training, and adaptation are performed. The architecture incorporates an infinite-horizon optimal feedback-control (OFC) model of the brain's behavior in closed-loop BMI control, and a point process model of spikes. The OFC model infers the user's motor intention during CLDA-a process termed intention estimation. OFC is also used to design an autonomous and dynamic assisted training technique. The point process model allows for neural processing, control and decoder adaptation with every spike event and at a faster time-scale than current decoders; it also enables dynamic spike-event-based parameter adaptation unlike current CLDA methods that use batch-based adaptation on much slower adaptation time-scales. We conducted closed-loop experiments in a non-human primate over tens of days to dissociate the effects of these novel CLDA components. The OFC intention estimation improved BMI performance compared with current intention estimation techniques. OFC assisted training allowed the subject to consistently achieve proficient control. 
Spike-event-based adaptation resulted in faster and more consistent performance convergence compared with batch-based methods, and was robust to parameter
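The contrast the authors draw between spike-event-based and batch-based adaptation can be illustrated with a much simpler toy model: estimating a single neuron's firing rate from a Bernoulli-approximated Poisson spike train, updating either at every bin or only once per batch. This is an illustrative sketch only; the rate, bin width and learning rate are invented, and the paper's actual OFC/point-process CLDA algorithm is far richer:

```python
import random

def simulate_spikes(rate_hz, dt, n_bins, rng):
    """Bernoulli approximation of a Poisson point process: P(spike) ~= rate*dt."""
    return [1 if rng.random() < rate_hz * dt else 0 for _ in range(n_bins)]

def event_based_estimate(spikes, dt, eta=0.005):
    """Update the rate estimate at every bin, i.e. at every potential spike event."""
    lam = 0.0
    for s in spikes:
        lam += eta * (s / dt - lam)      # stochastic-approximation step per event
    return lam

def batch_estimate(spikes, dt, batch=500):
    """Update only after each full batch, as slower batch-based adaptation would."""
    lam, buf = 0.0, []
    for s in spikes:
        buf.append(s)
        if len(buf) == batch:
            lam = sum(buf) / (batch * dt)   # replace estimate with batch mean rate
            buf = []
    return lam

rng = random.Random(0)
dt = 0.001                                        # 1 ms bins
spikes = simulate_spikes(40.0, dt, 20000, rng)    # 20 s of a 40 Hz neuron
lam_event = event_based_estimate(spikes, dt)
lam_batch = batch_estimate(spikes, dt)
```

The per-event estimator adjusts after every bin and so can track changes at the spike time-scale, whereas the batch estimator only moves at the batch boundaries.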

  11. A Process Model for Developing Learning Design Patterns with International Scope

    ERIC Educational Resources Information Center

    Lotz, Nicole; Law, Effie Lai-Chong; Nguyen-Ngoc, Anh Vu

    2014-01-01

    This paper investigates the process of identifying design patterns in international collaborative learning environments. In this context, design patterns are referred to as structured descriptions of best practice with pre-defined sections such as problem, solution and consequences. We pay special attention to how the scope of a design pattern is…

  12. A practical approach for exploration and modeling of the design space of a bacterial vaccine cultivation process.

    PubMed

    Streefland, M; Van Herpen, P F G; Van de Waterbeemd, B; Van der Pol, L A; Beuvery, E C; Tramper, J; Martens, D E; Toft, M

    2009-10-15

    A licensed pharmaceutical process must be executed within its validated ranges throughout the lifetime of product manufacturing. Changes to the process, especially for biological products, usually require the manufacturer to demonstrate through new or additional clinical testing that the safety and efficacy of the product remain unchanged. Recent changes in pharmaceutical regulations allow broader ranges of process settings, the so-called process design space, to be submitted for regulatory approval; a manufacturer can then optimize the process within the submitted ranges after the product has entered the market, which allows for flexible processes. In this article, the applicability of the process design space concept is investigated for the cultivation step of a vaccine against whooping cough. A design of experiments (DoE) is applied to investigate the ranges of critical process parameters that still result in a product that meets specifications. The on-line process data, including near-infrared spectroscopy, are used to build a descriptive model of the processes in the experimental design. Finally, the data from all processes are integrated into a multivariate batch-monitoring model that represents the investigated process design space. This article demonstrates how the general principles of PAT and the process design space can be applied to an undefined biological product such as a whole-cell vaccine. The model-development approach described here allows on-line monitoring and control of cultivation batches, in order to assure in real time that a process is running within the process design space.
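A two-level full factorial design of the kind used to probe a design space can be sketched as follows; the factors, ranges, response model and specification limit are all hypothetical stand-ins, not the vaccine cultivation process studied in the article:

```python
from itertools import product

# hypothetical critical process parameters with (low, high) candidate ranges
factors = {
    "temperature_C": (33.0, 37.0),
    "pH":            (6.8, 7.6),
    "stir_rpm":      (100, 300),
}

def quality(temperature_C, pH, stir_rpm):
    """Toy response standing in for a measured product-quality attribute."""
    return (100
            - 1.0 * abs(temperature_C - 35.0)
            - 10.0 * abs(pH - 7.1)
            - 0.01 * abs(stir_rpm - 200))

# two-level full factorial: evaluate every corner of the candidate space and
# keep the settings whose predicted quality still meets the specification
spec_limit = 93.0
design_space = [dict(zip(factors, levels))
                for levels in product(*factors.values())
                if quality(**dict(zip(factors, levels))) >= spec_limit]
```

In a real PAT workflow the toy `quality` function would be replaced by the multivariate model fitted to the DoE batches, and the retained settings would delimit the submitted design space.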

  13. Extracting insights from electronic health records: case studies, a visual analytics process model, and design recommendations.

    PubMed

    Wang, Taowei David; Wongsuphasawat, Krist; Plaisant, Catherine; Shneiderman, Ben

    2011-10-01

    Current electronic health record (EHR) systems facilitate the storage, retrieval, persistence, and sharing of patient data. However, the way physicians interact with EHRs has not changed much. More specifically, support for temporal analysis of a large number of EHRs has been lacking. A number of information visualization techniques have been proposed to alleviate this problem. Unfortunately, due to their limited application to a single case study, the results are often difficult to generalize across medical scenarios. We present the usage data of Lifelines2 (Wang et al. 2008), our information visualization system, and user comments, both collected over eight different medical case studies. We generalize our experience into a visual analytics process model for multiple EHRs. Based on our analysis, we make seven design recommendations to information visualization tools to explore EHR systems.

  14. Mathematical modeling and analysis of EDM process parameters based on Taguchi design of experiments

    NASA Astrophysics Data System (ADS)

    Laxman, J.; Raj, K. Guru

    2015-12-01

    Electro discharge machining (EDM) is a process used for machining very hard metals and deep, complex shapes by metal erosion in all types of electro-conductive materials. The metal is removed through the action of an electric discharge of short duration and high current density between the tool and the workpiece. The eroded metal on the surfaces of both the workpiece and the tool is flushed away by the dielectric fluid. The objective of this work is to develop a mathematical model for an EDM process which provides the equations needed to predict the metal removal rate, electrode wear rate and surface roughness. Regression analysis is used to investigate the relationship between the various process parameters. The input parameters are peak current, pulse-on time, pulse-off time and tool lift time; metal removal rate, electrode wear rate and surface roughness are the responses. Experiments are conducted on a titanium superalloy based on the Taguchi design of experiments, i.e. the L27 orthogonal array.
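The regression step described above can be illustrated in miniature with a simple least-squares fit of metal removal rate against peak current; the data points are invented for the example and do not come from the L27 experiments:

```python
# hypothetical (peak current in A, metal removal rate in mm^3/min) observations
data = [(6, 4.1), (9, 6.0), (12, 8.2), (15, 9.9), (18, 12.1)]

# ordinary least squares for a single predictor, via the closed-form normal equations
n = len(data)
sx = sum(i for i, _ in data)
sy = sum(m for _, m in data)
sxx = sum(i * i for i, _ in data)
sxy = sum(i * m for i, m in data)

slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
intercept = (sy - slope * sx) / n

def predict_mrr(current):
    """Predicted metal removal rate for a given peak current."""
    return intercept + slope * current
```

A full EDM model would regress each response on all four input parameters (and their interactions); the one-predictor fit above just shows the mechanics.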

  15. Developing a User-process Model for Designing Menu-based Interfaces: An Exploratory Study.

    ERIC Educational Resources Information Center

    Ju, Boryung; Gluck, Myke

    2003-01-01

    The purpose of this study was to organize menu items based on a user-process model and implement a new version of current software for enhancing usability of interfaces. A user-process model was developed, drawn from actual users' understanding of their goals and strategies to solve their information needs by using Dervin's Sense-Making Theory…

  16. Standardization Process for Space Radiation Models Used for Space System Design

    NASA Technical Reports Server (NTRS)

    Barth, Janet; Daly, Eamonn; Brautigam, Donald

    2005-01-01

    The space system design community has three concerns related to models of the radiation belts and plasma: 1) AP-8 and AE-8 models are not adequate for modern applications; 2) Data that have become available since the creation of AP-8 and AE-8 are not being fully exploited for modeling purposes; 3) When new models are produced, there is no authorizing organization identified to evaluate the models or their datasets for accuracy and robustness. This viewgraph presentation provided an overview of the roadmap adopted by the Working Group Meeting on New Standard Radiation Belt and Space Plasma Models.

  17. Design and control of energy efficient food drying processes with specific reference to quality; Model development and experimental studies: Moisture movement and dryer design

    SciTech Connect

    Kim, M.; Litchfield, B.; Singh, R.; Liang, H.; Narsimhan, G.; Waananen, K.

    1989-08-01

    The ultimate goal of the project is to develop procedures, techniques, data and other information that will aid in the design of cost-effective and energy-efficient drying processes that produce high-quality foods. This objective has been pursued by performing studies to determine the pertinent properties of food products, by developing models to describe the fundamental phenomena of food drying, and by testing the models at laboratory scale. Finally, this information is used to develop recommendations and strategies for improved dryer design and control. This volume, Model Development and Experimental Studies, emphasizes the direct and indirect drying processes. An extensive literature review identifies key characteristics of drying models, including controlling process resistances, internal mechanisms of moisture movement, structural and thermodynamic assumptions, and methods for measuring or determining model coefficients and material properties, solving the models, and validating them. Similarities and differences between previous works are noted, and strategies for future drying-model development are suggested.

  18. Optimized aerodynamic design process for subsonic transport wing fitted with winglets. [wind tunnel model

    NASA Technical Reports Server (NTRS)

    Kuhlman, J. M.

    1979-01-01

    The aerodynamic design of a wind-tunnel model of a wing representative of that of a subsonic jet transport aircraft, fitted with winglets, was performed using two recently developed optimal wing-design computer programs. Both potential flow codes use a vortex lattice representation of the near-field of the aerodynamic surfaces for determination of the required mean camber surfaces for minimum induced drag, and both codes use far-field induced drag minimization procedures to obtain the required spanloads. One code uses a discrete vortex wake model for this far-field drag computation, while the second uses a 2-D advanced panel wake model. Wing camber shapes for the two codes are very similar, but the resulting winglet camber shapes differ widely. Design techniques and considerations for these two wind-tunnel models are detailed, including a description of the necessary modifications of the design geometry to format it for use by a numerically controlled machine for the actual model construction.

  19. Rapid Processing of Turner Designs Model 10-Au-005 Internally Logged Fluorescence Data

    EPA Science Inventory

    Continuous recording of dye fluorescence using field fluorometers at selected sampling sites facilitates acquisition of real-time dye tracing data. The Turner Designs Model 10-AU-005 field fluorometer allows for frequent fluorescence readings, data logging, and easy downloading t...

  20. Fracture design modelling

    SciTech Connect

    Crichlow, H.B.

    1980-02-07

    A design tool is discussed whereby the various components that enter the design process of a hydraulic fracturing job are combined to provide a realistic appraisal of a stimulation job in the field. An interactive computer model is used to solve the problem numerically to obtain the effects of various parameters on the overall behavior of the system.

  1. Strategy for design NIR calibration sets based on process spectrum and model space: An innovative approach for process analytical technology.

    PubMed

    Cárdenas, V; Cordobés, M; Blanco, M; Alcalà, M

    2015-10-10

    The pharmaceutical industry is under stringent quality-control regulations because product quality is critical for both the production process and consumer safety. Within the framework of process analytical technology (PAT), a complete understanding of the process and stepwise monitoring of manufacturing are required. Near-infrared spectroscopy (NIRS) combined with chemometrics has lately proved efficient, useful and robust for pharmaceutical analysis. One crucial step in developing effective NIRS-based methodologies is selecting an appropriate calibration set for constructing models that afford accurate predictions. In this work, we developed calibration models for a pharmaceutical formulation during its three manufacturing stages: blending, compaction and coating. A novel methodology, the "process spectrum", is proposed for selecting the calibration set, into which physical changes in the samples at each stage are algebraically incorporated. We also established a "model space", defined by Hotelling's T(2) and Q-residuals statistics, for identifying outliers (inside/outside the defined space) in order to select objectively the factors to be used in constructing the calibration set. The results obtained confirm the efficacy of the proposed methodology for stepwise pharmaceutical quality control, and the relevance of the study as a guideline for implementing this easy and fast methodology in the pharmaceutical industry.
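The "model space" idea, flagging samples as inside or outside a PCA model using Hotelling's T(2) and Q-residual statistics, can be sketched on a toy two-variable data set; the numbers and the single-component model are illustrative assumptions, not the authors' NIR data:

```python
import math

# hypothetical mean-centred calibration samples compressed to 2 variables each
X = [(2.0, 1.9), (1.0, 1.1), (-1.0, -0.9), (-2.0, -2.1), (0.0, 0.0)]
n = len(X)

# 2x2 sample covariance matrix
cxx = sum(a * a for a, _ in X) / (n - 1)
cyy = sum(b * b for _, b in X) / (n - 1)
cxy = sum(a * b for a, b in X) / (n - 1)

# leading eigenvector by power iteration -> first principal component
v = (1.0, 0.0)
for _ in range(100):
    w = (cxx * v[0] + cxy * v[1], cxy * v[0] + cyy * v[1])
    norm = math.hypot(*w)
    v = (w[0] / norm, w[1] / norm)
eig = v[0] * (cxx * v[0] + cxy * v[1]) + v[1] * (cxy * v[0] + cyy * v[1])

def t2_and_q(sample):
    """Hotelling's T^2 (distance within the model plane) and Q (residual off it)."""
    score = sample[0] * v[0] + sample[1] * v[1]
    recon = (score * v[0], score * v[1])
    q = (sample[0] - recon[0]) ** 2 + (sample[1] - recon[1]) ** 2
    return score * score / eig, q

t2_in, q_in = t2_and_q((1.0, 1.0))      # lies along the calibration cloud
t2_out, q_out = t2_and_q((1.0, -1.0))   # off-model direction -> large Q residual
```

Samples whose T(2) or Q exceed chosen confidence limits would fall outside the model space and be excluded from (or flagged during) calibration.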

  2. Incorporating Eco-Evolutionary Processes into Population Models: Design and Applications

    EPA Science Inventory

    Eco-evolutionary population models are powerful new tools for exploring how evolutionary processes influence plant and animal population dynamics and vice-versa. The need to manage for climate change and other dynamic disturbance regimes is creating a demand for the incorporation of...

  3. Design of RTDA controller for industrial process using SOPDT model with minimum or non-minimum zero.

    PubMed

    Anbarasan, K; Srinivasan, K

    2015-07-01

    This research paper focuses on the design and development of simplified RTDA control-law computation formulae for a SOPDT process with a minimum or non-minimum zero. The RTDA control scheme consists of three main components, namely process output prediction, model prediction update and control action computation. A systematic approach for computing these three components for a SOPDT process with a minimum or non-minimum zero is developed in this paper. The design, implementation and performance evaluation of the developed controller are demonstrated via simulation examples. The closed-loop equation, block-diagram representation and theoretical stability derivation for the RTDA controller are developed. The performance of the proposed controller is compared with IMC, SPC, MPC and PID controllers and demonstrated on an industrial non-linear CSTR process.

  4. Design of RTDA controller for industrial process using SOPDT model with minimum or non-minimum zero.

    PubMed

    Anbarasan, K; Srinivasan, K

    2015-07-01

    This research paper focuses on the design and development of simplified RTDA control-law computation formulae for a SOPDT process with a minimum or non-minimum zero. The RTDA control scheme consists of three main components, namely process output prediction, model prediction update and control action computation. A systematic approach for computing these three components for a SOPDT process with a minimum or non-minimum zero is developed in this paper. The design, implementation and performance evaluation of the developed controller are demonstrated via simulation examples. The closed-loop equation, block-diagram representation and theoretical stability derivation for the RTDA controller are developed. The performance of the proposed controller is compared with IMC, SPC, MPC and PID controllers and demonstrated on an industrial non-linear CSTR process. PMID:25820089

  5. Testing the Theoretical Design of a Health Risk Message: Reexamining the Major Tenets of the Extended Parallel Process Model

    ERIC Educational Resources Information Center

    Gore, Thomas D.; Bracken, Cheryl Campanella

    2005-01-01

    This study examined the fear control/danger control responses that are predicted by the Extended Parallel Process Model (EPPM). In a campaign designed to inform college students about the symptoms and dangers of meningitis, participants were given either a high-threat/no-efficacy or high-efficacy/no-threat health risk message, thus testing the…

  6. Enhancing the Design Process for Complex Space Systems through Early Integration of Risk and Variable-Fidelity Modeling

    NASA Technical Reports Server (NTRS)

    Mavris, Dimitri; Osburg, Jan

    2005-01-01

    An important enabler of the new national Vision for Space Exploration is the ability to rapidly and efficiently develop optimized concepts for the manifold future space missions that this effort calls for. The design of such complex systems requires a tight integration of all the engineering disciplines involved, in an environment that fosters interaction and collaboration. The research performed under this grant explored areas where the space systems design process can be enhanced: by integrating risk models into the early stages of the design process, and by including rapid-turnaround variable-fidelity tools for key disciplines. Enabling early assessment of mission risk will allow designers to perform trades between risk and design performance during the initial design space exploration. Entry into planetary atmospheres will require an increased emphasis of the critical disciplines of aero- and thermodynamics. This necessitates the pulling forward of EDL disciplinary expertise into the early stage of the design process. Radiation can have a large potential impact on overall mission designs, in particular for the planned nuclear-powered robotic missions under Project Prometheus and for long-duration manned missions to the Moon, Mars and beyond under Project Constellation. This requires that radiation and associated risk and hazards be assessed and mitigated at the earliest stages of the design process. Hence, RPS is another discipline needed to enhance the engineering competencies of conceptual design teams. Researchers collaborated closely with NASA experts in those disciplines, and in overall space systems design, at Langley Research Center and at the Jet Propulsion Laboratory. This report documents the results of this initial effort.

  7. Computer modeling of high-pressure leaching process of nickel laterite by design of experiments and neural networks

    NASA Astrophysics Data System (ADS)

    Milivojevic, Milovan; Stopic, Srecko; Friedrich, Bernd; Stojanovic, Boban; Drndarevic, Dragoljub

    2012-07-01

    Due to the complex chemical composition of nickel ores, the demand for lower production costs, and the need to increase nickel extraction given the ongoing depletion of high-grade sulfide ores around the world, computer modeling of the nickel ore leaching process has become both a need and a challenge. In this paper, design of experiments (DOE) theory was used to determine the optimal experimental design plan matrix based on the D-optimality criterion. In the high-pressure sulfuric acid leaching (HPSAL) process for nickel laterite from the "Rudjinci" ore in Serbia, the temperature, the sulfuric acid to ore ratio, the stirring speed and the leaching time were considered as predictor variables, and the degree of nickel extraction as the response. To model the process, multiple linear regression (MLR) and the response surface method (RSM), together with a two-level, four-factor full factorial central composite design (CCD) plan, were used. The proposed regression models were not proven adequate. Therefore, an artificial neural network (ANN) approach with the same experimental plan was used in order to reduce operational costs, give better modeling accuracy, and provide more successful process optimization. The model is based on multi-layer neural networks with the back-propagation (BP) learning algorithm and the bipolar sigmoid activation function.
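The kind of network the authors describe, a multi-layer perceptron trained by back-propagation with a bipolar sigmoid (tanh) activation, can be sketched on a toy one-dimensional regression task; the data set, layer size and learning rate are invented for illustration and have nothing to do with the leaching experiments:

```python
import math
import random

rng = random.Random(42)

# toy training set standing in for (process setting -> nickel extraction degree)
data = [(x / 10.0, math.tanh(2.0 * x / 10.0 - 1.0)) for x in range(11)]

H = 4  # hidden units
w1 = [[rng.uniform(-0.5, 0.5), rng.uniform(-0.5, 0.5)] for _ in range(H)]  # [weight, bias]
w2 = [rng.uniform(-0.5, 0.5) for _ in range(H)]
b2 = 0.0

def forward(x):
    h = [math.tanh(w * x + b) for w, b in w1]            # bipolar sigmoid activation
    return sum(v * hi for v, hi in zip(w2, h)) + b2, h

def mse():
    return sum((forward(x)[0] - y) ** 2 for x, y in data) / len(data)

loss_start = mse()
eta = 0.05
for _ in range(2000):                                    # back-propagation, plain SGD
    for x, y in data:
        out, h = forward(x)
        err = out - y
        b2 -= eta * err
        for i in range(H):
            grad_h = err * w2[i] * (1.0 - h[i] ** 2)     # tanh derivative (1 - h^2)
            w2[i] -= eta * err * h[i]
            w1[i][0] -= eta * grad_h * x
            w1[i][1] -= eta * grad_h
loss_end = mse()
```

The gradient of the bipolar sigmoid, 1 - tanh(z)^2, is what propagates the output error back through the hidden layer, which is the defining step of the BP algorithm named in the abstract.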

  8. Designers' unified cost model

    NASA Technical Reports Server (NTRS)

    Freeman, W.; Ilcewicz, L.; Swanson, G.; Gutowski, T.

    1992-01-01

    The Structures Technology Program Office (STPO) at NASA LaRC has initiated development of a conceptual and preliminary designers' cost prediction model. The model will provide a technically sound method for evaluating the relative cost of different composite structural designs, fabrication processes, and assembly methods that can be compared to equivalent metallic parts or assemblies. The feasibility of developing cost prediction software in a modular form for interfacing with state-of-the-art preliminary design tools and computer aided design programs is being evaluated. The goal of this task is to establish theoretical cost functions that relate geometric design features to summed material cost and labor content in terms of process mechanics and physics. The output of the designers' present analytical tools will be input for the designers' cost prediction model to provide the designer with a database and deterministic cost methodology that allows one to trade and synthesize designs with both cost and weight as objective functions for optimization. This paper presents the team members, approach, goals, plans, and progress to date for development of COSTADE (Cost Optimization Software for Transport Aircraft Design Evaluation).

  9. Signal Processing Suite Design

    NASA Technical Reports Server (NTRS)

    Sahr, John D.; Mir, Hasan; Morabito, Andrew; Grossman, Matthew

    2003-01-01

    Our role in this project was to participate in the design of the signal processing suite to analyze plasma density measurements on board a small constellation (3 or 4) of satellites in low Earth orbit. As we are new to spacecraft experiments, one of the challenges was simply to gain an understanding of the quantity of data that would flow from the satellites, and possibly to interact with the design teams in generating optimal sampling patterns. For example, as the fleet of satellites was intended to fly through the same volume of space (displaced slightly in time and space), the bulk plasma structure should be common among the spacecraft. Therefore, an optimal, limited-bandwidth data downlink would take advantage of this commonality. Also, motivated by techniques in ionospheric radar, we hoped to investigate the possibility of employing aperiodic sampling in order to gain access to a wider spatial spectrum without suffering aliasing in k-space.

  10. Prior Design for Dependent Dirichlet Processes: An Application to Marathon Modeling

    PubMed Central

    F. Pradier, Melanie; J. R. Ruiz, Francisco; Perez-Cruz, Fernando

    2016-01-01

    This paper presents a novel application of Bayesian nonparametrics (BNP) for marathon data modeling. We make use of two well-known BNP priors, the single-p dependent Dirichlet process and the hierarchical Dirichlet process, in order to address two different problems. First, we study the impact of age, gender and environment on the runners’ performance. We derive a fair grading method that allows direct comparison of runners regardless of their age and gender. Unlike current grading systems, our approach is based not only on top world records, but on the performances of all runners. The presented methodology for comparison of densities can be adopted in many other applications straightforwardly, providing an interesting perspective to build dependent Dirichlet processes. Second, we analyze the running patterns of the marathoners in time, obtaining information that can be valuable for training purposes. We also show that these running patterns can be used to predict finishing time given intermediate interval measurements. We apply our models to New York City, Boston and London marathons. PMID:26821155
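A Dirichlet process prior of the kind used in this work is often realized through the stick-breaking construction, in which a unit stick is broken into infinitely many (here, truncated) random weights. The sketch below is a generic illustration with an arbitrary concentration parameter, not the single-p DDP or hierarchical DP models of the paper:

```python
import random

def stick_breaking(alpha, n_atoms, rng):
    """Truncated stick-breaking construction of Dirichlet process weights."""
    weights, remaining = [], 1.0
    for _ in range(n_atoms):
        beta = rng.betavariate(1.0, alpha)   # fraction broken off the remaining stick
        weights.append(remaining * beta)
        remaining *= (1.0 - beta)
    return weights

rng = random.Random(7)
w = stick_breaking(alpha=2.0, n_atoms=50, rng=rng)
total = sum(w)   # approaches 1 as the truncation level grows
```

Smaller values of `alpha` concentrate mass on a few atoms (few clusters of runners), while larger values spread it across many; each weight pairs with an atom drawn from a base measure to form the discrete random distribution.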

  11. Prior Design for Dependent Dirichlet Processes: An Application to Marathon Modeling.

    PubMed

    F Pradier, Melanie; J R Ruiz, Francisco; Perez-Cruz, Fernando

    2016-01-01

    This paper presents a novel application of Bayesian nonparametrics (BNP) for marathon data modeling. We make use of two well-known BNP priors, the single-p dependent Dirichlet process and the hierarchical Dirichlet process, in order to address two different problems. First, we study the impact of age, gender and environment on the runners' performance. We derive a fair grading method that allows direct comparison of runners regardless of their age and gender. Unlike current grading systems, our approach is based not only on top world records, but on the performances of all runners. The presented methodology for comparison of densities can be adopted in many other applications straightforwardly, providing an interesting perspective to build dependent Dirichlet processes. Second, we analyze the running patterns of the marathoners in time, obtaining information that can be valuable for training purposes. We also show that these running patterns can be used to predict finishing time given intermediate interval measurements. We apply our models to New York City, Boston and London marathons. PMID:26821155

  12. Estimation of design space for an extrusion-spheronization process using response surface methodology and artificial neural network modelling.

    PubMed

    Sovány, Tamás; Tislér, Zsófia; Kristó, Katalin; Kelemen, András; Regdon, Géza

    2016-09-01

    The application of Quality by Design principles is one of the key issues in recent pharmaceutical development. In the past decade a lot of knowledge has been collected about the practical realization of the concept, but there are still many unanswered questions. The key requirement of the concept is the mathematical description of the effect of the critical factors and their interactions on the critical quality attributes (CQAs) of the product. The process design space (PDS) is usually determined by design of experiment (DoE) based response surface methodologies (RSM), but inaccuracies in the applied polynomial models often result in over- or underestimation of the real trends and changes, making the calculations uncertain, especially in the edge regions of the PDS. Complementing RSM with artificial neural network (ANN) based models is therefore a commonly used method to reduce these uncertainties. Nevertheless, since different studies focus on the use of a given DoE, there is a lack of comparative studies on different experimental layouts. The aim of the present study was therefore to investigate the effect of different DoE layouts (2-level full factorial, central composite, Box-Behnken, 3-level fractional and 3-level full factorial designs) on model predictability and to compare model sensitivities according to the organization of the experimental data set. It was revealed that the size of the design space could differ by more than 40% when calculated with different polynomial models, which was associated with a considerable shift in its position when higher-level layouts were applied. The shift was more considerable when the calculation was based on RSM. Model predictability was also better with ANN-based models. Nevertheless, both modelling methods exhibit considerable sensitivity to the organization of the experimental data set, and design layouts in which the extreme factor levels are better represented are recommended.
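Of the layouts compared in the study, the central composite design can be generated in coded units as follows; the construction shown is the standard 2^k factorial corners plus 2k axial points plus center points, with the rotatable axial distance as a conventional default rather than anything specific to this paper:

```python
from itertools import product

def central_composite(k, alpha=None, n_center=1):
    """Coded-unit CCD: 2^k factorial corners + 2k axial (star) points + centers."""
    if alpha is None:
        alpha = (2 ** k) ** 0.25            # rotatable axial distance
    corners = [list(p) for p in product((-1.0, 1.0), repeat=k)]
    axial = []
    for i in range(k):
        for s in (-alpha, alpha):
            pt = [0.0] * k
            pt[i] = s                       # one factor at +/- alpha, others centred
            axial.append(pt)
    centers = [[0.0] * k for _ in range(n_center)]
    return corners + axial + centers

runs = central_composite(k=3, n_center=2)   # 8 corners + 6 axial + 2 centers = 16 runs
```

The axial points beyond the +/-1 cube are what let a CCD estimate pure quadratic terms, which is why its fitted design space can shift relative to a plain 2-level factorial.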

  13. Estimation of design space for an extrusion-spheronization process using response surface methodology and artificial neural network modelling.

    PubMed

    Sovány, Tamás; Tislér, Zsófia; Kristó, Katalin; Kelemen, András; Regdon, Géza

    2016-09-01

    The application of the Quality by Design principles is one of the key issues in recent pharmaceutical development. In the past decade a great deal of knowledge has been collected about the practical realization of the concept, but many questions remain unanswered. The key requirement of the concept is the mathematical description of the effect of the critical factors and their interactions on the critical quality attributes (CQAs) of the product. The process design space (PDS) is usually determined with design of experiment (DoE) based response surface methodologies (RSM), but inaccuracies in the applied polynomial models often result in over- or underestimation of the real trends and changes, making the calculations uncertain, especially in the edge regions of the PDS. Complementing RSM with artificial neural network (ANN) based models is therefore a commonly used method to reduce these uncertainties. Nevertheless, since different studies focus on the use of a given DoE, there is a lack of comparative studies on different experimental layouts. The aim of the present study was therefore to investigate the effect of different DoE layouts (2-level full factorial, central composite, Box-Behnken, 3-level fractional and 3-level full factorial designs) on model predictability and to compare model sensitivities according to the organization of the experimental data set. It was revealed that the size of the design space calculated with different polynomial models could differ by more than 40%, which was associated with a considerable shift in its position when higher-level layouts were applied. The shift was more considerable when the calculation was based on RSM, and model predictability was better with ANN-based models. Nevertheless, both modelling methods exhibit considerable sensitivity to the organization of the experimental data set, and design layouts in which the extreme values of the factors are more strongly represented are recommended.
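    The polynomial side of the RSM-versus-ANN comparison above can be sketched in a few lines: a second-order response surface is just a least-squares fit of a quadratic model matrix to the factorial runs. The 3-level full factorial layout and the responses below are synthetic, for illustration only; they are not data from the study.

```python
import numpy as np

# Hypothetical 3-level full factorial layout for two critical factors
# (coded levels -1, 0, +1); responses are synthetic, for illustration only.
levels = [-1.0, 0.0, 1.0]
X1, X2 = np.meshgrid(levels, levels)
x1, x2 = X1.ravel(), X2.ravel()

# Synthetic CQA response with main effects, an interaction, and curvature
y = 5.0 + 1.2 * x1 - 0.8 * x2 + 0.5 * x1 * x2 - 0.6 * x1**2 + 0.3 * x2**2

# Second-order polynomial model matrix used in classical RSM
A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# With 9 runs and 6 terms the recovery is exact here; real data would
# leave residuals, and sparser layouts would alias some of these terms
print(np.round(coef, 3))
```

    Refitting the same model matrix restricted to the runs of a sparser layout (e.g. dropping the centre points) is one way to see how the estimated surface, and hence the computed design space, shifts with the layout.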

  14. Networks in cognitive systems and biomedicine: cerebral processes, models and mathematical tools design.

    PubMed

    Cosmelli, Diego; Palacios, Adrián G

    2007-01-01

    Convergence of clinical, empirical, methodological and theoretical approaches aimed at understanding the relation between brain function and cognition is by now standard in most, if not all, academic programs in the area of Cognitive Science. This confederation of disciplines is one of the liveliest domains of inquiry and discussion into some of the most fundamental--and historically resilient--questions human beings have posed themselves. The contributions gathered in this special issue of Biological Research, directly inspired by the ongoing work at the Instituto de Sistemas Complejos de Valparaiso and the December 2006 CONICYT-INSERM-SFI workshop "Networks in Cognitive Systems/Trends and Challenge in Biomedicine: From Cerebral Process to Mathematical Tools Design", Chile, represent an explicit invitation to the reader to dive deeper into this fascinating terrain.

  15. Statistical experimental design for bioprocess modeling and optimization analysis: repeated-measures method for dynamic biotechnology process.

    PubMed

    Lee, Kwang-Min; Gilmore, David F

    2006-11-01

    The statistical design of experiments (DOE) is a collection of predetermined settings of the process variables of interest, which provides an efficient procedure for planning experiments. Experiments on biological processes typically produce long sequences of successive observations on each experimental unit (plant, animal, bioreactor, fermenter, or flask) in response to several treatments (combinations of factors). Cell culture and other biotech-related experiments are often performed with a repeated-measures experimental design coupled with different levels of several process factors to investigate dynamic biological processes. Data collected from this design can be analyzed by several kinds of general linear model (GLM) statistical methods, such as multivariate analysis of variance (MANOVA), univariate ANOVA (time split-plot analysis with randomization restriction), and analysis of orthogonal polynomial contrasts of the repeated factor (linear coefficient analysis). Finally, a regression model was introduced to describe responses over time to the different treatments, along with model residual analysis. Statistical analysis of bioprocesses with repeated measurements can help investigate environmental factors and effects affecting physiological processes when analyzing and optimizing biotechnology production. PMID:17159235
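    The orthogonal polynomial contrasts mentioned above reduce each unit's time course to trend scores that can then be tested between treatments. A minimal sketch for four equally spaced time points follows; the flask data are hypothetical, not taken from the paper.

```python
import numpy as np

# Hypothetical repeated measures: 4 flasks (rows) observed at 4 equally
# spaced time points (columns); values are synthetic, for illustration only.
data = np.array([
    [1.0, 2.1, 2.9, 4.2],
    [0.8, 1.9, 3.1, 3.8],
    [1.1, 2.2, 3.0, 4.1],
    [0.9, 2.0, 2.8, 4.0],
])

# Standard orthogonal polynomial contrast coefficients for 4 equally
# spaced levels (linear and quadratic trend)
linear = np.array([-3.0, -1.0, 1.0, 3.0])
quadratic = np.array([1.0, -1.0, -1.0, 1.0])
assert np.dot(linear, quadratic) == 0.0  # contrasts are orthogonal

# One trend score per experimental unit; between-treatment tests (t-test,
# ANOVA) are then run on these scores rather than on the raw time series
lin_scores = data @ linear
quad_scores = data @ quadratic
print(np.round(lin_scores.mean(), 3), np.round(quad_scores.mean(), 3))
```

    A strong positive mean linear score with a near-zero quadratic score, as here, indicates an essentially straight-line response over time.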

  16. Model-based control structure design of a full-scale WWTP under the retrofitting process.

    PubMed

    Machado, V C; Lafuente, J; Baeza, J A

    2015-01-01

    The anoxic-oxic (A/O) municipal wastewater treatment plant (WWTP) of Manresa (Catalonia, Spain) was studied for a possible conversion to an anaerobic/anoxic/oxic (A2/O) configuration to promote enhanced biological phosphorus removal. The control structure had to be redesigned to satisfy the new necessity to control phosphorus concentration, besides ammonium and nitrate concentrations (the main pollutant concentrations). Accordingly, decentralized control structures with proportional-integral-derivative (PID) controllers and centralized control structures with model-predictive controllers (MPC) were designed and tested. All the designed control structures had their performance systematically tested regarding effluent quality and operating costs. The centralized control structure, A2/O-3-MPC, achieved the lowest operating costs with the best effluent quality using the A2/O plant configuration for the Manresa WWTP. The controlled variables used in this control structure were ammonium in the effluent, nitrate at the end of the anoxic zone and phosphate at the end of the anaerobic zone, while the manipulated variables were the internal and external recycle flow rates and the dissolved oxygen setpoint in the aerobic reactors. PMID:26038931

  17. Model-based design and integration of a two-step biopharmaceutical production process.

    PubMed

    Otero, Bruno; Degerman, Marcus; Hansen, Thomas Budde; Hansen, Ernst Broberg; Nilsson, Bernt

    2014-10-01

    This paper presents the design of a two-step process in which the first step is PEGylation of a protein, and the second step is chromatographic purification of the target mono-PEGylated protein from the unreacted and the di-PEGylated impurities. The difference between optimizing each process step separately and optimizing them simultaneously is studied. It was found that by optimizing the steps simultaneously, an increase of up to 100% in productivity could be obtained without reduction in yield. Optimizing both steps at the same time allows the optimization method to take into account that the di-PEGylated protein is much easier to separate than the non-PEGylated protein; this easier separation makes it possible to achieve higher yield and productivity at the same time. The effect of recycling was also studied, and the yield could be increased by 30% by recycling the unreacted protein. However, if maximum productivity is required, batch mode is preferable.

  18. Application of Design of Experiments and Surrogate Modeling within the NASA Advanced Concepts Office, Earth-to-Orbit Design Process

    NASA Technical Reports Server (NTRS)

    Zwack, Matthew R.; Dees, Patrick D.; Holt, James B.

    2016-01-01

    Decisions made during early conceptual design can have a profound impact on life-cycle cost (LCC): it is widely accepted that nearly 80% of LCC is committed by these decisions, so they must be well informed. The Advanced Concepts Office (ACO) at Marshall Space Flight Center aids decision making for launch vehicles by providing rapid-turnaround pre-phase A and phase A studies, supplying the customer with preliminary vehicle sizing information, vehicle feasibility, and expected performance.

  19. Model Based Structural Evaluation & Design of Overpack Container for Bag-Buster Processing of TRU Waste Drums

    SciTech Connect

    D. T. Clark; A. S. Siahpush; G. L. Anderson

    2004-07-01

    This paper describes a materials and computational model based analysis utilized to design an engineered “overpack” container capable of maintaining structural integrity for confinement of transuranic wastes undergoing the cryo-vacuum stress based “Bag-Buster” process and satisfying DOT 7A waste package requirements. The engineered overpack is a key component of the “Ultra-BagBuster” process/system being commercially developed by UltraTech International for potential DOE applications to non-intrusively breach inner confinement layers (poly bags/packaging) within transuranic (TRU) waste drums. This system provides a lower cost/risk approach to mitigate hydrogen gas concentration buildup limitations on transport of high alpha activity organic transuranic wastes. Four evolving overpack design configurations and two materials (low carbon steel and 300 series stainless) were considered and evaluated using non-linear finite element model analyses of structural response. Properties comparisons show that 300-series stainless is required to provide assurance of ductility and structural integrity at both room and cryogenic temperatures. The overpack designs were analyzed for five accidental drop impact orientations onto an unyielding surface (dropped flat on bottom, bottom corner, side, top corner, and top). The first three design configurations failed the bottom and top corner drop orientations (flat bottom, top, and side plates breached or underwent material failure). The fourth design utilized a protruding rim-ring (skirt) below the overpack’s bottom plate and above the overpack’s lid plate to absorb much of the impact energy and maintained structural integrity under all accidental drop loads at both room and cryogenic temperature conditions. Selected drop testing of the final design will be required to confirm design performance.

  20. Exploring business process modelling paradigms and design-time to run-time transitions

    NASA Astrophysics Data System (ADS)

    Caron, Filip; Vanthienen, Jan

    2016-09-01

    The business process management literature describes a multitude of approaches (e.g. imperative, declarative or event-driven) that each result in a different mix of process flexibility, compliance, effectiveness and efficiency. Although the use of a single approach over the process lifecycle is often assumed, transitions between approaches at different phases in the process lifecycle may also be considered. This article explores several business process strategies by analysing the approaches at different phases in the process lifecycle as well as the various transitions.

  1. Understanding the Process Model of Leadership: Follower Attribute Design and Assessment

    ERIC Educational Resources Information Center

    Antelo, Absael; Henderson, Richard L.; St. Clair, Norman

    2010-01-01

    Early leadership studies produced significant research findings that have helped differentiate between leader and follower personal attributes and their consequent behaviors (SEDL, 1992), but little attention was given to the follower's contribution to the leadership process. This study represents a continuation of research by Henderson, Antelo, &…

  2. ATOMIC-LEVEL MODELING OF CO2 DISPOSAL AS A CARBONATE MINERAL: A SYNERGETIC APPROACH TO OPTIMIZING REACTION PROCESS DESIGN

    SciTech Connect

    A.V.G. Chizmeshya; M.J. McKelvy; J.B. Adams

    2001-11-01

    Fossil fuels, especially coal, can support the energy demands of the world for centuries to come, if the environmental problems associated with CO2 emissions can be overcome. Permanent and safe methods for CO2 capture and disposal/storage need to be developed. Mineralization of stationary-source CO2 emissions as carbonates can provide such safe capture and long-term sequestration. Mg-rich lamellar hydroxide mineral carbonation is a leading process candidate, which generates the stable naturally occurring mineral magnesite (MgCO3) and water. Key to process cost and viability are the carbonation reaction rate and its degree of completion. This process, which involves simultaneous dehydroxylation and carbonation, is very promising but far from optimized. In order to optimize the dehydroxylation/carbonation process, an atomic-level understanding of the mechanisms involved is needed. In this investigation Mg(OH)2 was selected as a model Mg-rich lamellar hydroxide carbonation feedstock material due to its chemical and structural simplicity. Since Mg(OH)2 dehydroxylation is intimately associated with the carbonation process, its mechanisms are also of direct interest in understanding and optimizing the process. The aim of the current innovative concepts project is to develop a specialized advanced computational methodology to complement the ongoing experimental inquiry into the atomic-level processes involved in CO2 mineral sequestration. The ultimate goal is to integrate the insights provided by detailed predictive simulations with the data obtained from optical microscopy, FESEM, ion beam analysis, SIMS, TGA, Raman, XRD, and C and H elemental analysis. The modeling studies are specifically designed to enhance the synergism with, and complement the analysis of, existing mineral-CO2 reaction process studies being carried out under DOE UCR Grant DE-FG2698-FT40112. Direct contact between the simulations and the experimental

  3. Model parameters conditioning on regional hydrologic signatures for process-based design flood estimation in ungauged basins.

    NASA Astrophysics Data System (ADS)

    Biondi, Daniela; De Luca, Davide Luciano

    2015-04-01

    The use of rainfall-runoff models represents an alternative to statistical approaches (such as at-site or regional flood frequency analysis) for design flood estimation, and constitutes an answer to the increasing need for synthetic design hydrographs (SDHs) associated to a specific return period. However, the lack of streamflow observations and the consequent high uncertainty associated with parameter estimation, usually pose serious limitations to the use of process-based approaches in ungauged catchments, which in contrast represent the majority in practical applications. This work presents the application of a Bayesian procedure that, for a predefined rainfall-runoff model, allows for the assessment of posterior parameters distribution, using the limited and uncertain information available for the response of an ungauged catchment (Bulygina et al. 2009; 2011). The use of regional estimates of river flow statistics, interpreted as hydrological signatures that measure theoretically relevant system process behaviours (Gupta et al. 2008), within this framework represents a valuable option and has shown significant developments in recent literature to constrain the plausible model response and to reduce the uncertainty in ungauged basins. In this study we rely on the first three L-moments of annual streamflow maxima, for which regressions are available from previous studies (Biondi et al. 2012; Laio et al. 2011). The methodology was carried out for a catchment located in southern Italy, and used within a Monte Carlo scheme (MCs) considering both event-based and continuous simulation approaches for design flood estimation. The applied procedure offers promising perspectives to perform model calibration and uncertainty analysis in ungauged basins; moreover, in the context of design flood estimation, process-based methods coupled with MCs approach have the advantage of providing simulated floods uncertainty analysis that represents an asset in risk-based decision
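    The regional signatures used to condition the model parameters are the first three L-moments of annual streamflow maxima. As a reference for how such signatures are computed from a gauged record, here is a minimal sketch of unbiased sample L-moments via probability-weighted moments; the streamflow values are hypothetical, not from the study catchment.

```python
import numpy as np

def sample_l_moments(x):
    """First three unbiased sample L-moments via probability-weighted moments.

    Uses b_r = n^-1 * sum_j x_(j) * (j-1)...(j-r) / ((n-1)...(n-r)) on the
    ordered sample, then l1 = b0, l2 = 2*b1 - b0, l3 = 6*b2 - 6*b1 + b0.
    """
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    j = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((j - 1) * x) / (n * (n - 1))
    b2 = np.sum((j - 1) * (j - 2) * x) / (n * (n - 1) * (n - 2))
    return b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0

# Hypothetical annual streamflow maxima (m3/s); illustrative values only
maxima = [112.0, 145.0, 96.0, 230.0, 170.0, 88.0, 132.0, 205.0]
l1, l2, l3 = sample_l_moments(maxima)
print(round(l1, 2), round(l2, 2), round(l3 / l2, 3))  # mean, L-scale, L-skewness
```

    In the ungauged case these quantities come from the regional regressions instead, and the posterior parameter distribution is conditioned on them. Note that for a perfectly symmetric sample the third L-moment is zero.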

  4. DESIGNING PROCESSES FOR ENVIRONMENTAL PROBLEMS

    EPA Science Inventory

    Designing for the environment requires consideration of environmental impacts. The Generalized WAR Algorithm is the methodology that allows the user to evaluate the potential environmental impact of the design of a chemical process. In this methodology, chemicals are assigned val...

  5. A Team-Based Process for Designing Comprehensive, Integrated, Three-Tiered (CI3T) Models of Prevention: How Does My School-Site Leadership Team Design a CI3T Model?

    ERIC Educational Resources Information Center

    Lane, Kathleen Lynne; Oakes, Wendy Peia; Jenkins, Abbie; Menzies, Holly Mariah; Kalberg, Jemma Robertson

    2014-01-01

    Comprehensive, integrated, three-tiered models are context specific and developed by school-site teams according to the core values held by the school community. In this article, the authors provide a step-by-step, team-based process for designing comprehensive, integrated, three-tiered models of prevention that integrate academic, behavioral, and…

  6. Teaching Process Design through Integrated Process Synthesis

    ERIC Educational Resources Information Center

    Metzger, Matthew J.; Glasser, Benjamin J.; Patel, Bilal; Hildebrandt, Diane; Glasser, David

    2012-01-01

    The design course is an integral part of chemical engineering education. A novel approach to the design course was recently introduced at the University of the Witwatersrand, Johannesburg, South Africa. The course aimed to introduce students to systematic tools and techniques for setting and evaluating performance targets for processes, as well as…

  7. Computational design of the basic dynamical processes of the UCLA general circulation model

    NASA Technical Reports Server (NTRS)

    Arakawa, A.; Lamb, V. R.

    1977-01-01

    The 12-layer UCLA general circulation model encompassing troposphere and stratosphere (and superjacent 'sponge layer') is described. Prognostic variables are: surface pressure, horizontal velocity, temperature, water vapor and ozone in each layer, planetary boundary layer (PBL) depth, temperature, moisture and momentum discontinuities at PBL top, ground temperature and water storage, and mass of snow on ground. Selection of space finite-difference schemes for homogeneous incompressible flow, with/without a free surface, nonlinear two-dimensional nondivergent flow, enstrophy conserving schemes, momentum advection schemes, vertical and horizontal difference schemes, and time differencing schemes are discussed.

  8. Reengineering the Project Design Process

    NASA Technical Reports Server (NTRS)

    Casani, E.; Metzger, R.

    1994-01-01

    In response to NASA's goal of working faster, better and cheaper, JPL has developed extensive plans to minimize cost, maximize customer and employee satisfaction, and implement small- and moderate-size missions. These plans include improved management structures and processes, enhanced technical design processes, the incorporation of new technology, and the development of more economical space- and ground-system designs. The Laboratory's new Flight Projects Implementation Office has been chartered to oversee these innovations and the reengineering of JPL's project design process, including establishment of the Project Design Center and the Flight System Testbed. Reengineering at JPL implies a cultural change whereby the character of its design process will change from sequential to concurrent and from hierarchical to parallel. The Project Design Center will support missions offering high science return, design to cost, demonstrations of new technology, and rapid development. Its computer-supported environment will foster high-fidelity project life-cycle development and cost estimating.

  9. Process of inorganic nitrogen transformation and design of kinetics model in the biological aerated filter reactor.

    PubMed

    Yan, Gang; Xu, Xia; Yao, Lirong; Lu, Liqiao; Zhao, Tingting; Zhang, Wenyi

    2011-04-01

    As one of the plug-flow reactors, the biological aerated filter (BAF) reactor was divided into four sampling sections to understand the characteristics of elemental nitrogen transformation during the reaction process, and the different characteristics of elemental nitrogen transformation caused by different NH3-N loadings, biomass quantities and activities in each section were obtained. The results showed that the total transformation ratio in the nitrifying reactor was more than 90% in the absence of any organic carbon source; at the same time, more than 65% of the influent NH3-N was nitrified below a filter height of 70 cm under the conditions of an influent flow of 9-19 L/h, a gas-water ratio of 4-5:1, dissolved oxygen of 3.0-5.8 mg/L and an NH3-N load of 0.28-0.48 kg NH3-N/(m3·d). On the basis of the Eckenfelder model, the kinetics equation of NH3-N transformation along the reactor was Se = S0·exp(-0.0134·D/L^1.2612).
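    The fitted Eckenfelder-type equation reported in the abstract, read here as Se = S0·exp(-0.0134·D/L^1.2612), can be evaluated directly. Interpreting D as filter depth (cm) and L as hydraulic load is our assumption; the abstract does not define the symbols or their units, and the input values below are illustrative only.

```python
import math

def nh3n_effluent(s0, depth, hydraulic_load):
    """NH3-N concentration remaining at a given filter depth.

    Evaluates Se = S0 * exp(-0.0134 * D / L**1.2612), the Eckenfelder-type
    fit reported in the abstract. Reading D as depth (cm) and L as hydraulic
    load is an assumption, since the abstract does not define the symbols.
    """
    return s0 * math.exp(-0.0134 * depth / hydraulic_load ** 1.2612)

# Illustrative numbers only: 50 mg/L influent NH3-N, 70 cm depth, unit load
print(round(nh3n_effluent(50.0, 70.0, 1.0), 2))
```

    Because the model is exponential in depth, each additional increment of filter height removes a fixed fraction of the remaining NH3-N, which is consistent with most of the nitrification occurring in the lower sections of the filter.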

  10. Development of a Model for Measuring Scientific Processing Skills Based on Brain-Imaging Technology: Focused on the Experimental Design Process

    ERIC Educational Resources Information Center

    Lee, Il-Sun; Byeon, Jung-Ho; Kim, Young-shin; Kwon, Yong-Ju

    2014-01-01

    The purpose of this study was to develop a model for measuring experimental design ability based on functional magnetic resonance imaging (fMRI) during biological inquiry. More specifically, the researchers developed an experimental design task that measures experimental design ability. Using the developed experimental design task, they measured…

  11. Biological neural networks as model systems for designing future parallel processing computers

    NASA Technical Reports Server (NTRS)

    Ross, Muriel D.

    1991-01-01

    One of the more interesting debates of the present day centers on whether human intelligence can be simulated by computer. The author works under the premise that neurons individually are not smart at all. Rather, they are physical units which are impinged upon continuously by other matter that influences the direction of voltage shifts across the units' membranes. It is only through the action of a great many neurons, billions in the case of the human nervous system, that intelligent behavior emerges. What is required to understand even the simplest neural system is painstaking analysis, bit by bit, of the architecture and the physiological functioning of its various parts. The biological neural networks studied, the vestibular utricular and saccular maculas of the inner ear, are among the simplest of the mammalian neural networks to understand and model. While there is still a long way to go to understand even this simplest of neural networks in sufficient detail for extrapolation to computers and robots, a start has been made. Moreover, the insights obtained and the technologies developed help advance the understanding of the more complex neural networks that underlie human intelligence.

  12. Fully Integrating the Design Process

    SciTech Connect

    T.A. Bjornard; R.S. Bean

    2008-03-01

    The basic approach to designing nuclear facilities in the United States does not currently reflect the routine consideration of proliferation resistance and international safeguards. The fully integrated design process is an approach for bringing consideration of international safeguards and proliferation resistance, together with state safeguards and security, fully into the design process from the very beginning, while integrating them sensibly and synergistically with the other project functions. In view of the recently established GNEP principles agreed to by the United States and at least eighteen other countries, this paper explores such an integrated approach, and its potential to help fulfill the new internationally driven design requirements with improved efficiencies and reduced costs.

  13. Ligand modeling and design

    SciTech Connect

    Hay, B.P.

    1997-10-01

    The purpose of this work is to develop and implement a molecular design basis for selecting organic ligands that would be used in the cost-effective removal of specific radionuclides from nuclear waste streams. Organic ligands with metal ion specificity are critical components in the development of solvent extraction and ion exchange processes that are highly selective for targeted radionuclides. The traditional approach to the development of such ligands involves lengthy programs of organic synthesis and testing, which in the absence of reliable methods for screening compounds before synthesis, results in wasted research effort. The author's approach breaks down and simplifies this costly process with the aid of computer-based molecular modeling techniques. Commercial software for organic molecular modeling is being configured to examine the interactions between organic ligands and metal ions, yielding an inexpensive, commercially or readily available computational tool that can be used to predict the structures and energies of ligand-metal complexes. Users will be able to correlate the large body of existing experimental data on structure, solution binding affinity, and metal ion selectivity to develop structural design criteria. These criteria will provide a basis for selecting ligands that can be implemented in separations technologies through collaboration with other DOE national laboratories and private industry. The initial focus will be to select ether-based ligands that can be applied to the recovery and concentration of the alkali and alkaline earth metal ions including cesium, strontium, and radium.

  14. Reengineering the project design process

    NASA Astrophysics Data System (ADS)

    Kane Casani, E.; Metzger, Robert M.

    1995-01-01

    In response to the National Aeronautics and Space Administration's goal of working faster, better, and cheaper, the Jet Propulsion Laboratory (JPL) has developed extensive plans to minimize cost, maximize customer and employee satisfaction, and implement small- and moderate-size missions. These plans include improved management structures and processes, enhanced technical design processes, the incorporation of new technology, and the development of more economical space- and ground-system designs. The Laboratory's new Flight Projects Implementation Development Office has been chartered to oversee these innovations and the reengineering of JPL's project design process, including establishment of the Project Design Center (PDC) and the Flight System Testbed (FST). Reengineering at JPL implies a cultural change whereby the character of the Laboratory's design process will change from sequential to concurrent and from hierarchical to parallel. The Project Design Center will support missions offering high science return, design to cost, demonstrations of new technology, and rapid development. Its computer-supported environment will foster high-fidelity project life-cycle development and more accurate cost estimating. These improvements signal JPL's commitment to meeting the challenges of space exploration in the next century.

  15. Process simulation and design '94

    SciTech Connect

    Not Available

    1994-06-01

    This first-of-a-kind report describes today's process simulation and design technology for specific applications. It includes process names, diagrams, applications, descriptions, objectives, economics, installations, licensors, and a complete list of process submissions. Processes include: alkylation, aromatics extraction, catalytic reforming, cogeneration, dehydration, delayed coking, distillation, energy integration, catalytic cracking, gas sweetening, glycol/methanol injection, hydrocracking, NGL recovery and stabilization, solvent dewaxing, visbreaking. Equipment simulations include: amine plant, ammonia plant, heat exchangers, cooling water network, crude preheat train, crude unit, ethylene furnace, nitrogen rejection unit, refinery, sulfur plant, and VCM furnace. By-product processes include: olefins, polyethylene terephthalate, and styrene.

  16. Review on Biomass Torrefaction Process and Product Properties and Design of Moving Bed Torrefaction System Model Development

    SciTech Connect

    Jaya Shankar Tumuluru; Christopher T. Wright; Shahab Sokhansanj

    2011-08-01

    Torrefaction is currently developing as an important preprocessing step to improve the quality of biomass in terms of physical properties, and proximate and ultimate composition. Torrefaction is the slow heating of biomass in an inert or reduced environment to a maximum temperature of 300 °C. Torrefaction can also be defined as a group of products resulting from the partially controlled and isothermal pyrolysis of biomass occurring in the temperature ranges of 200-230 °C and 270-280 °C. Thus, the process can also be called mild pyrolysis, as it occurs in the lower temperature range of the pyrolysis process. At the end of the torrefaction process, a solid uniform product with lower moisture content and higher energy content than raw biomass is produced. Most of the smoke-producing compounds and other volatiles are removed during torrefaction, which produces a final product with a lower mass but a higher heating value. There is a lack of literature on the design aspects of torrefaction reactors and of a design sheet for estimating the dimensions of a torrefier based on capacity. This study includes (a) conducting a detailed review of the torrefaction of biomass in terms of understanding the process, product properties, off-gas compositions, and methods used, and (b) designing a moving bed torrefier, taking into account the basic fundamental heat and mass transfer calculations. Specific objectives include calculating dimensions such as the diameter and height of the moving packed bed torrefier for capacities ranging from 25-1000 kg/hr, designing the heat loads and gas flow rates, and
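    The capacity-to-dimensions step the review describes can be sketched as a simple volumetric residence-time calculation. The residence time, bulk density, and aspect ratio below are illustrative assumptions, not values taken from the review, and a real design would add the heat-load and gas-flow balances.

```python
import math

def moving_bed_dimensions(capacity_kg_hr, residence_time_min=30.0,
                          bulk_density_kg_m3=250.0, aspect_ratio=2.0):
    """Rough moving-bed torrefier sizing from throughput.

    Bed volume = mass held up in the bed / bulk density, with the bed
    holdup set by throughput and residence time; height = aspect_ratio
    * diameter. All default parameter values are illustrative assumptions.
    """
    holdup_kg = capacity_kg_hr * residence_time_min / 60.0
    volume_m3 = holdup_kg / bulk_density_kg_m3
    # V = (pi/4) * d^2 * h with h = aspect_ratio * d, solved for d
    diameter = (4.0 * volume_m3 / (math.pi * aspect_ratio)) ** (1.0 / 3.0)
    return diameter, aspect_ratio * diameter

for cap in (25, 250, 1000):  # kg/hr, spanning the capacity range in the review
    d, h = moving_bed_dimensions(cap)
    print(f"{cap:5d} kg/hr -> d = {d:.2f} m, h = {h:.2f} m")
```

    Because volume scales linearly with throughput while diameter scales with its cube root, a 40-fold increase in capacity only grows the vessel diameter by a factor of about 3.4 in this sketch.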

  17. The Process Design Courses at Pennsylvania: Impact of Process Simulators.

    ERIC Educational Resources Information Center

    Seider, Warren D.

    1984-01-01

    Describes the use and impact of process design simulators in process design courses. Discusses topics covered, texts used, computer design simulations, and how they are integrated into the process survey course as well as in plant design projects. (JM)

  18. GREENSCOPE: Sustainable Process Modeling

    EPA Science Inventory

    EPA researchers are responding to environmental problems by incorporating sustainability into process design and evaluation. EPA researchers are also developing a tool that allows users to assess modifications to existing and new chemical processes to determine whether changes in...

  19. Design Process Guide Method for Minimizing Loops and Conflicts

    NASA Astrophysics Data System (ADS)

    Koga, Tsuyoshi; Aoyama, Kazuhiro

    We propose a new guide method for developing an easy-to-execute design process for product development, one that ensures fewer wasteful iterations and fewer simultaneous conflicts. The design process is modeled as a sequence of design decisions, where a design decision is defined as the determination of product attributes. A design task is represented as a calculation flow that depends on the product constraints between the product attributes. We also propose an automatic planning algorithm for the execution of the design task that minimizes design loops and design conflicts. Further, we validate the effectiveness of the proposed guide method by developing a prototype design system and a design example of piping for a power steering system, and we find that the proposed method successfully minimizes design loops and design conflicts. This paper addresses (1) a design loop model, (2) a design conflict model, and (3) how to minimize design loops and design conflicts.
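    The idea of sequencing design decisions so that each attribute is fixed after the attributes it depends on, with any leftover attributes marking a design loop, can be sketched with a topological sort. This is a generic illustration using Kahn's algorithm, not the paper's own planning algorithm, and the attribute names below are made up for the example.

```python
from collections import deque

# Toy dependency graph of design decisions: attribute -> attributes it
# needs first. Names are illustrative, not taken from the paper.
deps = {
    "pipe_route":    {"pump_position", "rack_layout"},
    "pipe_diameter": {"flow_rate"},
    "pump_position": {"rack_layout"},
    "rack_layout":   set(),
    "flow_rate":     {"pump_position"},
}

def plan(deps):
    """Order design decisions so each attribute is fixed after its inputs.

    Kahn's algorithm; any attributes left unordered sit on a dependency
    cycle, i.e. a design loop needing simultaneous (iterative) resolution.
    """
    indeg = {a: len(d) for a, d in deps.items()}
    users = {a: [] for a in deps}
    for a, d in deps.items():
        for b in d:
            users[b].append(a)
    ready = deque(sorted(a for a, k in indeg.items() if k == 0))
    order = []
    while ready:
        a = ready.popleft()
        order.append(a)
        for u in sorted(users[a]):
            indeg[u] -= 1
            if indeg[u] == 0:
                ready.append(u)
    loops = sorted(set(deps) - set(order))
    return order, loops

order, loops = plan(deps)
print(order, loops)
```

    Adding a reverse constraint (e.g. making "rack_layout" depend on "pipe_route") would put those attributes into `loops`, flagging a design loop before any detailed calculation is run.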

  20. DESIGNING ENVIRONMENTALLY FRIENDLY CHEMICAL PROCESSES

    EPA Science Inventory

    The design of a chemical process involves many aspects: from profitability, flexibility and reliability to safety to the environment. While each of these is important, in this work, the focus will be on profitability and the environment. Key to the study of these aspects is the ...

  1. Interface design in the process industries

    NASA Technical Reports Server (NTRS)

    Beaverstock, M. C.; Stassen, H. G.; Williamson, R. A.

    1977-01-01

    Every operator runs his plant in accord with his own mental model of the process. In this sense, one characteristic of an ideal man-machine interface is that it be in harmony with that model. With this theme in mind, the paper first reviews the functions of the process operator and compares them with human operators involved in control situations previously studied outside the industrial environment (pilots, air traffic controllers, helmsmen, etc.). A brief history of the operator interface in the process industry and the traditional methodology employed in its design is then presented. Finally, a much more fundamental approach utilizing a model definition of the human operator's behavior is presented.

  2. Feasibility study of an Integrated Program for Aerospace vehicle Design (IPAD). Volume 2: The design process

    NASA Technical Reports Server (NTRS)

    Gillette, W. B.; Turner, M. J.; Southall, J. W.; Whitener, P. C.; Kowalik, J. S.

    1973-01-01

    The extent to which IPAD is to support the design process is identified. Case studies of representative aerospace products were developed as models to characterize the design process and to provide design requirements for the IPAD computing system.

  3. ESS Cryogenic System Process Design

    NASA Astrophysics Data System (ADS)

    Arnold, P.; Hees, W.; Jurns, J.; Su, X. T.; Wang, X. L.; Weisend, J. G., II

    2015-12-01

    The European Spallation Source (ESS) is a neutron-scattering facility funded and supported in collaboration with 17 European countries in Lund, Sweden. Cryogenic cooling at ESS is vital, particularly for the linear accelerator, the hydrogen target moderators, a test stand for cryomodules, the neutron instruments and their sample environments. The paper will focus on specific process design criteria, design decisions and their motivations for the helium cryoplants and auxiliary equipment. Key issues for all plants and their process concepts are energy efficiency, reliability, smooth turn-down behaviour and flexibility. The accelerator cryoplant (ACCP) and the target moderator cryoplant (TMCP) in particular need to be prepared for a range of refrigeration capacities due to the intrinsic uncertainties regarding heat load definitions. Furthermore, the paper addresses questions regarding process arrangement, 2 K cooling methodology, LN2 precooling, helium storage, helium purification and heat recovery.

  4. Molecular modeling of directed self-assembly of block copolymers: Fundamental studies of processing conditions and evolutionary pattern design

    NASA Astrophysics Data System (ADS)

    Khaira, Gurdaman Singh

    Rapid progress in the semi-conductor industry has pushed for smaller feature sizes on integrated electronic circuits. Current photo-lithographic techniques for nanofabrication have reached their technical limit and are problematic when printing features small enough to meet future industrial requirements. "Bottom-up" techniques, such as the directed self-assembly (DSA) of block copolymers (BCP), are the primary contenders to complement current "top-down" photo-lithography ones. For industrial requirements, the defect density from DSA needs to be less than 1 defect per 10 cm by 10 cm. Knowledge of both material synthesis and the thermodynamics of the self-assembly process is required before optimal operating conditions can be found to produce results adequate for industry. The work presented in this thesis is divided into three chapters, each discussing various aspects of DSA as studied via a molecular model that contains the essential physics of BCP self-assembly. Though there are various types of guiding fields that can be used to direct BCPs over large wafer areas with minimum defects, this study focuses only on chemically patterned substrates. The first chapter addresses optimal pattern design by describing a framework where molecular simulations of various complexities are coupled with an advanced optimization technique to find a pattern that directs a target morphology. It demonstrates the first-ever study where BCP self-assembly on a patterned substrate is optimized using a three-dimensional description of the block copolymers. For problems pertaining to DSA, the methodology is shown to converge much faster than the traditional random search approach. The second chapter discusses the metrology of BCP thin films using TEM tomography and X-ray scattering techniques, such as CDSAXS and GISAXS. 
X-ray scattering has the advantage of being able to quickly probe the average structure of BCP morphologies over large wafer areas; however, deducing the BCP morphology

  5. Numerical simulations supporting the process design of ring rolling processes

    NASA Astrophysics Data System (ADS)

    Jenkouk, V.; Hirt, G.; Seitz, J.

    2013-05-01

    In conventional Finite Element Analysis (FEA) of radial-axial ring rolling (RAR), the motions of all tools are usually defined prior to simulation in the preprocessing step. However, the real process holds up to 8 degrees of freedom (DOF) that are controlled by industrial control systems according to actual sensor values and preselected control strategies. Since the histories of the motions are unknown before the experiment and are dependent on sensor data, conventional FEA cannot represent the process before the experiment. In order to enable the use of FEA in the process design stage, this approach integrates the industrially applied control algorithms of the real process, including all relevant sensors and actuators, into the FE model of ring rolling. Additionally, the process design of a novel process, 'axial profiling', in which a profiled roll is used to roll axially profiled rings, is supported by FEA. Using this approach, suitable control strategies can be tested in a virtual environment before processing.
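    The coupling described above, control algorithms reacting to simulated sensor values inside the process model, can be illustrated with a toy closed loop. The PI gains, process gain, and growth-rate setpoint below are invented for the sketch and are unrelated to the actual RAR control strategies.

```python
# Loose illustration (not the paper's model): a PI controller driving a toy
# ring-growth process, mimicking how sensor feedback steers tool motion when
# control algorithms are embedded in the simulation loop.
dt = 0.1                      # s, controller/process update step (assumed)
gain = 0.8                    # toy process: growth rate per unit feed (assumed)
kp, ki = 0.5, 0.2             # PI gains (assumed)
target_rate = 1.5             # desired diameter growth rate, mm/s (assumed)

diameter, rate, integral = 300.0, 0.0, 0.0
for _ in range(1000):
    error = target_rate - rate            # simulated sensor reading vs. setpoint
    integral += error * dt
    feed = kp * error + ki * integral     # PI actuator command
    rate = gain * feed                    # toy process responds to the command
    diameter += rate * dt

print(f"final growth rate: {rate:.3f} mm/s (target {target_rate})")
```

In a real coupled simulation the one-line "toy process" would be replaced by an FE solve per time step; the control logic around it stays the same.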

  6. Functionally graded materials: Design, processing and applications

    SciTech Connect

    Miyamoto, Y.; Kaysser, W.A.; Rabin, B.H.; Kawasaki, A.; Ford, R.G.

    1999-09-01

    In a Functionally Graded Material (FGM), the composition and structure gradually change over volume, resulting in corresponding changes in the properties of the material. By applying the many possibilities inherent in the FGM concept, it is anticipated that materials will be improved and new functions for them created. A comprehensive description of design, modeling, processing, and evaluation of FGMs as well as their applications is covered in this book. The contents include: lessons from nature; graded microstructures; modeling and design; characterization of properties; processing and fabrication; applications; and summary and outlook.

  7. Planar Inlet Design and Analysis Process (PINDAP)

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Gruber, Christopher R.

    2005-01-01

    The Planar Inlet Design and Analysis Process (PINDAP) is a collection of software tools that allow the efficient aerodynamic design and analysis of planar (two-dimensional and axisymmetric) inlets. The aerodynamic analysis is performed using the Wind-US computational fluid dynamics (CFD) program. A major element in PINDAP is a Fortran 90 code named PINDAP that can establish the parametric design of the inlet and efficiently model the geometry and generate the grid for CFD analysis with design changes to those parameters. The use of PINDAP is demonstrated for subsonic, supersonic, and hypersonic inlets.

  8. Engineering design: A cognitive process approach

    NASA Astrophysics Data System (ADS)

    Strimel, Greg Joseph

    The intent of this dissertation was to identify the cognitive processes used by advanced pre-engineering students to solve complex engineering design problems. Students in technology and engineering education classrooms are often taught to use an ideal engineering design process that has been generated mostly by educators and curriculum developers. However, the review of literature showed that it is unclear as to how advanced pre-engineering students cognitively navigate solving a complex and multifaceted problem from beginning to end. Additionally, it was unclear how a student thinks and acts throughout their design process and how this affects the viability of their solution. Therefore, Research Objective 1 was to identify the fundamental cognitive processes students use to design, construct, and evaluate operational solutions to engineering design problems. Research Objective 2 was to determine identifiers within student cognitive processes for monitoring aptitude to successfully design, construct, and evaluate technological solutions. Lastly, Research Objective 3 was to create a conceptual technological and engineering problem-solving model integrating student cognitive processes for the improved development of problem-solving abilities. The methodology of this study included multiple forms of data collection. The participants were first given a survey to determine their prior experience with engineering and to provide a description of the subjects being studied. The participants were then presented an engineering design challenge to solve individually. While they completed the challenge, the participants verbalized their thoughts using an established "think aloud" method. These verbalizations were captured along with participant observational recordings using point-of-view camera technology. Additionally, the participant design journals, design artifacts, solution effectiveness data, and teacher evaluations were collected for analysis to help achieve the

  9. Automation of Design Engineering Processes

    NASA Technical Reports Server (NTRS)

    Torrey, Glenn; Sawasky, Gerald; Courey, Karim

    2004-01-01

    A method, and a computer program that helps to implement the method, have been developed to automate and systematize the retention and retrieval of all the written records generated during the process of designing a complex engineering system. It cannot be emphasized strongly enough that all the written records as used here is meant to be taken literally: it signifies not only final drawings and final engineering calculations but also such ancillary documents as minutes of meetings, memoranda, requests for design changes, approval and review documents, and reports of tests. One important purpose served by the method is to make the records readily available to all involved users via their computer workstations from one computer archive while eliminating the need for voluminous paper files stored in different places. Another important purpose served by the method is to facilitate the work of engineers who are charged with sustaining the system and were not involved in the original design decisions. The method helps the sustaining engineers to retrieve information that enables them to retrace the reasoning that led to the original design decisions, thereby helping them to understand the system better and to make informed engineering choices pertaining to maintenance and/or modifications of the system. The software used to implement the method is written in Microsoft Access. All of the documents pertaining to the design of a given system are stored in one relational database in such a manner that they can be related to each other via a single tracking number.
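    A minimal relational sketch of the single-tracking-number idea (here in SQLite rather than Microsoft Access; the table layout, tracking numbers, and documents are invented for illustration):

```python
# Illustrative only: every design record (drawing, memo, test report, ...)
# is keyed to one tracking number, so the full paper trail behind a design
# decision can be retrieved with a single query.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE document (
    id INTEGER PRIMARY KEY,
    tracking_no TEXT NOT NULL,
    doc_type TEXT NOT NULL,      -- 'drawing', 'memo', 'test_report', ...
    title TEXT,
    filed DATE
)""")
docs = [
    ("TRK-0042", "drawing", "Valve assembly, rev B", "2004-03-01"),
    ("TRK-0042", "memo", "Minutes: valve design review", "2004-03-05"),
    ("TRK-0042", "test_report", "Proof-pressure test", "2004-04-10"),
]
con.executemany(
    "INSERT INTO document (tracking_no, doc_type, title, filed) VALUES (?, ?, ?, ?)",
    docs,
)
# Retrieve the full record trail behind one design decision, oldest first:
trail = con.execute(
    "SELECT doc_type, title FROM document WHERE tracking_no = ? ORDER BY filed",
    ("TRK-0042",),
).fetchall()
for doc_type, title in trail:
    print(doc_type, "|", title)
```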

  10. Modeling nuclear processes by Simulink

    SciTech Connect

    Rashid, Nahrul Khair Alang Md

    2015-04-29

    Modelling and simulation are essential parts of the study of dynamic system behaviour. In nuclear engineering, modelling and simulation are important to assess the expected results of an experiment before the actual experiment is conducted, or in the design of nuclear facilities. In education, modelling can give insight into the dynamics of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve the equations using analytical or numerical solutions consume time and distract attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox that is widely used in control engineering, as a modelling platform for the study of nuclear processes, including nuclear reactor behaviour. Starting from the describing equations, Simulink models for heat transfer, the radionuclide decay process, the delayed neutron effect, the reactor point kinetics equations with delayed neutron groups, and the effect of temperature feedback are used as examples.
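    As a plain-Python counterpart to the Simulink decay example, the decay law dN/dt = -λN can be integrated numerically and checked against its analytic solution N(t) = N₀e^(-λt); the nuclide and step size below are illustrative choices, not from the paper.

```python
# Explicit Euler integration of radionuclide decay, compared with the
# closed-form solution (half-life roughly that of I-131, chosen as an example).
import math

half_life = 8.02 * 24 * 3600      # s, roughly I-131 (illustrative choice)
lam = math.log(2) / half_life     # decay constant, 1/s
N0, dt, t_end = 1.0e20, 60.0, 7 * 24 * 3600

N, t = N0, 0.0
while t < t_end:
    N += -lam * N * dt            # explicit Euler step for dN/dt = -lam*N
    t += dt

analytic = N0 * math.exp(-lam * t_end)
print(f"numerical: {N:.4e}   analytic: {analytic:.4e}")
```

With a 60 s step the Euler result stays within a small fraction of a percent of the analytic curve over one week, which is the kind of sanity check a block-diagram model invites.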

  11. Modeling nuclear processes by Simulink

    NASA Astrophysics Data System (ADS)

    Rashid, Nahrul Khair Alang Md

    2015-04-01

    Modelling and simulation are essential parts of the study of dynamic system behaviour. In nuclear engineering, modelling and simulation are important to assess the expected results of an experiment before the actual experiment is conducted, or in the design of nuclear facilities. In education, modelling can give insight into the dynamics of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve the equations using analytical or numerical solutions consume time and distract attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox that is widely used in control engineering, as a modelling platform for the study of nuclear processes, including nuclear reactor behaviour. Starting from the describing equations, Simulink models for heat transfer, the radionuclide decay process, the delayed neutron effect, the reactor point kinetics equations with delayed neutron groups, and the effect of temperature feedback are used as examples.

  12. Conceptual models of information processing

    NASA Technical Reports Server (NTRS)

    Stewart, L. J.

    1983-01-01

    The conceptual information processing issues are examined. Human information processing is defined as an active cognitive process that is analogous to a system. It is the flow and transformation of information within a human. The human is viewed as an active information seeker who is constantly receiving, processing, and acting upon the surrounding environmental stimuli. Human information processing models are conceptual representations of cognitive behaviors. Models of information processing are useful in representing the different theoretical positions and in attempting to define the limits and capabilities of human memory. It is concluded that an understanding of conceptual human information processing models and their applications to systems design leads to a better human factors approach.

  13. Parametric Design within an Atomic Design Process (ADP) applied to Spacecraft Design

    NASA Astrophysics Data System (ADS)

    Ramos Alarcon, Rafael

    This thesis describes research investigating the development of a model for the initial design of complex systems, with application to spacecraft design. The design model is called an atomic design process (ADP) and contains four fundamental stages (specifications, configurations, trade studies and drivers) that constitute the minimum steps of an iterative process that helps designers find a feasible solution. Representative design models from the aerospace industry are reviewed and compared with the proposed model. The design model's relevance, adaptability and scalability features are evaluated through a focused design task exercise with two undergraduate teams and a long-term design exercise performed by a spacecraft payload team. The implementation of the design model is explained in the context in which the model has been researched. This context includes the organization (a student-run research laboratory at the University of Michigan), its culture (academically oriented), members that have used the design model and the description of the information technology elements meant to provide support while using the model. This support includes a custom-built information management system that consolidates relevant information that is currently being used in the organization. The information is divided in three domains: personnel development history, technical knowledge base and laboratory operations. The focused study with teams making use of the design model to complete an engineering design exercise consists of the conceptual design of an autonomous system, including a carrier and a deployable lander that form the payload of a rocket with an altitude range of over 1000 meters. Detailed results from each of the stages of the design process while implementing the model are presented, and an increase in awareness of good design practices in the teams while using the model is explained. A long-term investigation using the design model consisting of the

  14. Application of central composite design and artificial neural network in modeling of reactive blue 21 dye removal by photo-ozonation process.

    PubMed

    Mehrizad, Ali; Gharbani, Parvin

    2016-01-01

    The present study deals with the use of central composite design (CCD) and artificial neural network (ANN) modeling in the optimization of reactive blue 21 (RB21) removal from aqueous media by a photo-ozonation process. Four effective operational parameters (initial concentration of RB21, O₃ concentration, UV light intensity and reaction time) were chosen, and the experiments were designed by CCD based on response surface methodology (RSM). The results obtained from the CCD model were used in modeling the process by ANN. Under optimum conditions (O₃ concentration of 3.95 mg L⁻¹, UV intensity of 20.5 W m⁻², reaction time of 7.77 min and initial dye concentration of 40.21 mg L⁻¹), RB21 removal efficiency reached up to 98.88%. An ANN topology of three layers, consisting of four input neurons, 14 hidden neurons and one output neuron, was designed. The relative significance of each major factor was calculated based on the connection weights of the ANN model. Dye and ozone concentrations were the most important variables in the photo-ozonation of RB21, followed by reaction time and UV light intensity. The comparison of values predicted by CCD and ANN with experimental results showed that both methods were highly efficient in modeling the process. PMID:27386996
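    The "relative significance from connection weights" step can be sketched with Garson's weight-partitioning method for a 4-14-1 network like the one described. The weights below are random placeholders, not the paper's trained values, so the printed ranking demonstrates only the computation, not the paper's result.

```python
# Garson's method (a sketch): partition each hidden neuron's output weight
# among the inputs in proportion to |input->hidden| weights, then normalize.
import random

random.seed(1)
n_in, n_hid = 4, 14
W = [[random.uniform(-1, 1) for _ in range(n_hid)] for _ in range(n_in)]  # input->hidden
v = [random.uniform(-1, 1) for _ in range(n_hid)]                         # hidden->output

contrib = [0.0] * n_in
for j in range(n_hid):
    col_sum = sum(abs(W[i][j]) for i in range(n_in))
    for i in range(n_in):
        contrib[i] += (abs(W[i][j]) / col_sum) * abs(v[j])

total = sum(contrib)
importance = [c / total for c in contrib]
for name, imp in zip(["dye conc.", "O3 conc.", "UV intensity", "time"], importance):
    print(f"{name}: {imp:.1%}")
```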

  15. Application of central composite design and artificial neural network in modeling of reactive blue 21 dye removal by photo-ozonation process.

    PubMed

    Mehrizad, Ali; Gharbani, Parvin

    2016-01-01

    The present study deals with the use of central composite design (CCD) and artificial neural network (ANN) modeling in the optimization of reactive blue 21 (RB21) removal from aqueous media by a photo-ozonation process. Four effective operational parameters (initial concentration of RB21, O₃ concentration, UV light intensity and reaction time) were chosen, and the experiments were designed by CCD based on response surface methodology (RSM). The results obtained from the CCD model were used in modeling the process by ANN. Under optimum conditions (O₃ concentration of 3.95 mg L⁻¹, UV intensity of 20.5 W m⁻², reaction time of 7.77 min and initial dye concentration of 40.21 mg L⁻¹), RB21 removal efficiency reached up to 98.88%. An ANN topology of three layers, consisting of four input neurons, 14 hidden neurons and one output neuron, was designed. The relative significance of each major factor was calculated based on the connection weights of the ANN model. Dye and ozone concentrations were the most important variables in the photo-ozonation of RB21, followed by reaction time and UV light intensity. The comparison of values predicted by CCD and ANN with experimental results showed that both methods were highly efficient in modeling the process.

  16. Kinetic Modeling of Microbiological Processes

    SciTech Connect

    Liu, Chongxuan; Fang, Yilin

    2012-08-26

    Kinetic description of microbiological processes is vital for the design and control of microbe-based biotechnologies such as wastewater treatment, petroleum oil recovery, and contaminant attenuation and remediation. Various models have been proposed to describe microbiological processes. This editorial article discusses the advantages and limitations of these modeling approaches, including traditional Monod-type models and their derivatives, and recently developed constraint-based approaches. The article also outlines future directions for modeling research best suited to petroleum and environmental biotechnologies.

  17. Modeling of pulverized coal combustion processes in a vortex furnace of improved design. Part 1: Flow aerodynamics in a vortex furnace

    NASA Astrophysics Data System (ADS)

    Krasinsky, D. V.; Salomatov, V. V.; Anufriev, I. S.; Sharypov, O. V.; Shadrin, E. Yu.; Anikin, Yu. A.

    2015-02-01

    Some results of the complex experimental and numerical study of aerodynamics and transfer processes in a vortex furnace, whose design was improved via the distributed tangential injection of fuel-air flows through the upper and lower burners, were presented. The experimental study of the aerodynamic characteristics of a spatial turbulent flow was performed on the isothermal laboratory model (at a scale of 1 : 20) of an improved vortex furnace using a laser Doppler measurement system. The comparison of experimental data with the results of the numerical modeling of an isothermal flow for the same laboratory furnace model demonstrated their agreement to be acceptable for engineering practice.

  18. Response surface modeling of Carbamazepine (CBZ) removal by Graphene-P25 nanocomposites/UVA process using central composite design.

    PubMed

    Amalraj Appavoo, Initha; Hu, Jiangyong; Huang, Yan; Li, Sam Fong Yau; Ong, Say Leong

    2014-06-15

    Graphene-P25 (Gr-P25) nanocomposites were synthesized by a simple microwave hydrothermal method. The nanocomposites with different graphene loading were evaluated for the degradation of an important pharmaceutical water pollutant, Carbamazepine (CBZ), under UVA irradiation in a batch reactor. Response surface methodology (RSM) was used to optimize three key independent operating parameters, namely Gr-P25 nanocomposites dose (X1), CBZ initial concentration (X2) and UV light intensity (X3), for photocatalytic degradation of CBZ. The central composite design (CCD), consisting of 20 experiments determined by a 2³ full factorial design with six axial points and six center points, was used to conduct experiments. The results showed that CBZ removal was significantly affected by the synergistic effect of the linear terms of Gr-P25 dose (X1) and UV intensity (X3). However, the quadratic terms of Gr-P25 dose (X1²) and UV intensity (X3²) had an antagonistic effect on CBZ removal. The obtained RSM model (R² = 0.9206) showed a satisfactory correlation between experimental and predicted values of CBZ removal. The optimized conditions for achieving 100% CBZ removal with 5 min UVA irradiation were 25.14 mg/L, 167.68 ppb and 1.35 mW/cm² for Gr-P25 dose, initial concentration of CBZ and UV intensity, respectively.
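    The 20-run coded design described above (a 2³ factorial core plus six axial points and six center replicates) can be generated mechanically. The axial distance alpha is taken as the standard rotatable value for k = 3; that is an assumption, since the paper does not state it.

```python
# Sketch: coded run matrix for a 3-factor central composite design
# (X1 = Gr-P25 dose, X2 = initial CBZ, X3 = UV intensity, all in coded units).
from itertools import product

alpha = (2 ** 3) ** 0.25                                        # rotatable alpha ~ 1.682
corners = list(product([-1.0, 1.0], repeat=3))                  # 8 factorial runs
axial = [tuple(a if i == j else 0.0 for j in range(3))
         for i in range(3) for a in (-alpha, alpha)]            # 6 star runs
center = [(0.0, 0.0, 0.0)] * 6                                  # 6 center replicates
design = corners + axial + center
print(f"{len(design)} runs: {len(corners)} factorial + {len(axial)} axial + {len(center)} center")
```

Each coded point is then mapped to physical levels (mg/L, ppb, mW/cm²) before running the experiments and fitting the quadratic RSM model.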

  19. The watershed-scale optimized and rearranged landscape design (WORLD) model and local biomass processing depots for sustainable biofuel production: Integrated life cycle assessments

    SciTech Connect

    Eranki, Pragnya L.; Manowitz, David H.; Bals, Bryan D.; Izaurralde, Roberto C.; Kim, Seungdo; Dale, Bruce E.

    2013-07-23

    An array of feedstock is being evaluated as potential raw material for cellulosic biofuel production. Thorough assessments are required in regional landscape settings before these feedstocks can be cultivated and sustainable management practices can be implemented. On the processing side, a potential solution to the logistical challenges of large biorefineries is provided by a network of distributed processing facilities called local biomass processing depots. A large-scale cellulosic ethanol industry is likely to emerge soon in the United States. We have the opportunity to influence the sustainability of this emerging industry. The watershed-scale optimized and rearranged landscape design (WORLD) model estimates land allocations for different cellulosic feedstocks at biorefinery scale without displacing current animal nutrition requirements. This model also incorporates a network of the aforementioned depots. An integrated life cycle assessment is then conducted over the unified system of optimized feedstock production, processing, and associated transport operations to evaluate net energy yields (NEYs) and environmental impacts.

  20. Creativity Processes of Students in the Design Studio

    ERIC Educational Resources Information Center

    Huber, Amy Mattingly; Leigh, Katharine E.; Tremblay, Kenneth R., Jr.

    2012-01-01

    The creative process is a multifaceted and dynamic path of thinking required to execute a project in design-based disciplines. The goal of this research was to test a model outlining the creative design process by investigating student experiences in a design project assignment. The study used an exploratory design to collect data from student…

  1. The use of exploratory experimental designs combined with thermal numerical modelling to obtain a predictive tool for hybrid laser/MIG welding and coating processes

    NASA Astrophysics Data System (ADS)

    Bidi, Lyes; Mattei, Simone; Cicala, Eugen; Andrzejewski, Henri; Le Masson, Philippe; Schroeder, Jeanne

    2011-04-01

    While hybrid laser welding and coating processes involve a large number of physical phenomena, it is currently impossible to predict, for a given set of influencing factors, the shape of the molten zone and the history of temperature fields inside the parts. This remains true for complex processes, such as the hybrid laser/MIG welding process, which consists of combining a laser beam with a MIG torch. The gains obtained result essentially from the synergy of the associated processes: the stability of the process, the quality of the seam realized, and the productivity are increased. This article shows how, by means of a reduced number of experiments (8), it is possible to predict the shape of the molten zone and the temperature field inside parts, for a given window of influencing factors. The method consists of combining exploratory experimental designs with numerical modelling of the thermal phenomena that occur during the process, using the 'heat equivalent source' approach [1-4]. Two validations of this method have been carried out: the first for a set of parameters inside the experimental design, and the other for a set of parameters that lies outside the experimental design, but inside the domain investigated.

  2. Models of the Reading Process

    PubMed Central

    Rayner, Keith; Reichle, Erik D.

    2010-01-01

    Reading is a complex skill involving the orchestration of a number of components. Researchers often talk about a “model of reading” when talking about only one aspect of the reading process (for example, models of word identification are often referred to as “models of reading”). Here, we review prominent models that are designed to account for (1) word identification, (2) syntactic parsing, (3) discourse representations, and (4) how certain aspects of language processing (e.g., word identification), in conjunction with other constraints (e.g., limited visual acuity, saccadic error, etc.), guide readers’ eyes. Unfortunately, it is the case that these various models addressing specific aspects of the reading process seldom make contact with models dealing with other aspects of reading. Thus, for example, the models of word identification seldom make contact with models of eye movement control, and vice versa. While this may be unfortunate in some ways, it is quite understandable in other ways because reading itself is a very complex process. We discuss prototypical models of aspects of the reading process in the order mentioned above. We do not review all possible models, but rather focus on those we view as being representative and most highly recognized. PMID:21170142

  3. Models of the Reading Process.

    PubMed

    Rayner, Keith; Reichle, Erik D

    2010-11-01

    Reading is a complex skill involving the orchestration of a number of components. Researchers often talk about a "model of reading" when talking about only one aspect of the reading process (for example, models of word identification are often referred to as "models of reading"). Here, we review prominent models that are designed to account for (1) word identification, (2) syntactic parsing, (3) discourse representations, and (4) how certain aspects of language processing (e.g., word identification), in conjunction with other constraints (e.g., limited visual acuity, saccadic error, etc.), guide readers' eyes. Unfortunately, it is the case that these various models addressing specific aspects of the reading process seldom make contact with models dealing with other aspects of reading. Thus, for example, the models of word identification seldom make contact with models of eye movement control, and vice versa. While this may be unfortunate in some ways, it is quite understandable in other ways because reading itself is a very complex process. We discuss prototypical models of aspects of the reading process in the order mentioned above. We do not review all possible models, but rather focus on those we view as being representative and most highly recognized.

  4. Biosphere Process Model Report

    SciTech Connect

    J. Schmitt

    2000-05-25

    To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor

  5. Using an Analogical Thinking Model as an Instructional Tool to Improve Student Cognitive Ability in Architecture Design Learning Process

    ERIC Educational Resources Information Center

    Wu, Yun-Wu; Weng, Kuo-Hua

    2013-01-01

    Lack of creativity is a problem often plaguing students from design-related departments. Therefore, this study is intended to incorporate analogical thinking in the education of architecture design to enhance students' learning and their future career performance. First, this study explores the three aspects of architecture design curricula,…

  6. Design of intelligent controllers for exothermal processes

    NASA Astrophysics Data System (ADS)

    Nagarajan, Ramachandran; Yaacob, Sazali

    2001-10-01

    Chemical industries such as resin or soap manufacturing have reaction systems that work with at least two chemicals. Mixing of chemicals even at room temperature can initiate an exothermic reaction, which produces a sudden increase of heat energy within the mixture. The quantity of heat and the dynamics of heat generation are unknown, unpredictable and time varying. Proper control of heat has to be accomplished in order to achieve a high-quality product. Uncontrolled or poorly controlled heat yields an unusable product, the process may damage materials and systems, and even human beings may be harmed. Heat due to an exothermic reaction cannot be controlled using conventional control methods such as PID control or identification-based control, since all conventional methods require at least an approximate mathematical model of the exothermic process, and modelling an exothermal process is yet to be properly conceived. This paper discusses a design methodology for controlling such a process. A pilot plant of a reaction system has been constructed and utilized for designing and incorporating the proposed fuzzy-logic-based intelligent controller. Both the conventional and then an adaptive form of fuzzy logic control were used in testing the performance. The test results confirm the effectiveness of the controllers in controlling exothermic heat.
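    A Mamdani-style fuzzy controller of the kind described can be sketched in a few lines: fuzzify the temperature error with triangular membership functions, fire the rules, and defuzzify to a coolant-valve command. All membership functions, rules, and numbers below are invented for illustration and are not from the paper.

```python
# Minimal fuzzy controller sketch for exothermic-heat control:
# temperature error (deg C above setpoint) -> coolant valve opening (%).

def tri(x, a, b, c):
    """Triangular membership with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def valve_opening(error):
    # Rule strengths: IF error is small/medium/large THEN valve low/mid/high
    mu = {
        "low":  tri(error, -5.0, 0.0, 10.0),   # small error -> valve low
        "mid":  tri(error,  0.0, 10.0, 20.0),  # medium error -> valve mid
        "high": tri(error, 10.0, 25.0, 40.0),  # large error -> valve high
    }
    centroids = {"low": 10.0, "mid": 50.0, "high": 90.0}  # % open
    num = sum(mu[k] * centroids[k] for k in mu)
    den = sum(mu.values())
    return num / den if den else 0.0          # weighted-centroid defuzzification

for e in (2.0, 10.0, 22.0):
    print(f"error {e:+.1f} C -> valve {valve_opening(e):.1f}% open")
```

An adaptive variant, as tested in the paper, would additionally tune the membership parameters or rule weights online from the observed plant response.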

  7. Design Thinking in Elementary Students' Collaborative Lamp Designing Process

    ERIC Educational Resources Information Center

    Kangas, Kaiju; Seitamaa-Hakkarainen, Pirita; Hakkarainen, Kai

    2013-01-01

    Design and Technology education is potentially a rich environment for successful learning, if the management of the whole design process is emphasised, and students' design thinking is promoted. The aim of the present study was to unfold the collaborative design process of one team of elementary students, in order to understand their multimodal…

  8. Foam process models.

    SciTech Connect

    Moffat, Harry K.; Noble, David R.; Baer, Thomas A.; Adolf, Douglas Brian; Rao, Rekha Ranjana; Mondy, Lisa Ann

    2008-09-01

    In this report, we summarize our work on developing a production-level foam processing computational model suitable for predicting the self-expansion of foam in complex geometries. The model is based on a finite element representation of the equations of motion, with the movement of the free surface represented using the level set method, and has been implemented in SIERRA/ARIA. An empirically based time- and temperature-dependent density model is used to encapsulate the complex physics of foam nucleation and growth in a numerically tractable model. The change in density with time is at the heart of the foam self-expansion, as it creates the motion of the foam. This continuum-level model uses a homogenized description of the foam, which does not include the gas explicitly. Results from the model are compared to temperature-instrumented flow visualization experiments giving the location of the foam front as a function of time for our EFAR model system.
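    The core idea that a falling density itself drives the free-surface motion can be illustrated with a toy 1-D mass balance: if a column of foam conserves mass, its front height must rise as the homogenized density drops. The exponential density law and every parameter value below are illustrative assumptions, not the report's empirical fit.

    ```python
    import math

    def density(t, T, rho0=1000.0, rho_f=100.0, k0=0.05, Ea_over_R=2000.0):
        """Foam density (kg/m^3) decaying from rho0 to rho_f; rate grows with T (K)."""
        k = k0 * math.exp(-Ea_over_R / T)  # Arrhenius-like temperature dependence
        return rho_f + (rho0 - rho_f) * math.exp(-k * t)

    def front_height(t, T, h0=0.1):
        """1-D front position from mass conservation: rho(t) * h(t) = rho0 * h0."""
        return density(0.0, T) * h0 / density(t, T)

    h_start = front_height(0.0, 300.0)
    h_late = front_height(500.0, 300.0)
    print(h_start, h_late)  # the front rises as the density falls
    ```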

  9. ESS Accelerator Cryoplant Process Design

    NASA Astrophysics Data System (ADS)

    Wang, X. L.; Arnold, P.; Hees, W.; Hildenbeutel, J.; Weisend, J. G., II

    2015-12-01

    The European Spallation Source (ESS) is a neutron-scattering facility being built with extensive international collaboration in Lund, Sweden. The ESS accelerator will deliver protons with 5 MW of power to the target at 2.0 GeV, with a nominal current of 62.5 mA. The superconducting part of the accelerator is about 300 meters long and contains 43 cryomodules. The ESS accelerator cryoplant (ACCP) will provide the cooling for the cryomodules and the cryogenic distribution system that delivers the helium to the cryomodules. The ACCP will cover three cryogenic circuits: Bath cooling for the cavities at 2 K, the thermal shields at around 40 K and the power couplers thermalisation with 4.5 K forced helium cooling. The open competitive bid for the ACCP took place in 2014 with Linde Kryotechnik AG being selected as the vendor. This paper summarizes the progress in the ACCP development and engineering. Current status including final cooling requirements, preliminary process design, system configuration, machine concept and layout, main parameters and features, solution for the acceptance tests, exergy analysis and efficiency is presented.

  10. Optimal design of solidification processes

    NASA Technical Reports Server (NTRS)

    Dantzig, Jonathan A.; Tortorelli, Daniel A.

    1991-01-01

    An optimal design algorithm is presented for the analysis of general solidification processes, and is demonstrated for the growth of GaAs crystals in a Bridgman furnace. The system is optimal in the sense that the prespecified temperature distribution in the solidifying materials is obtained to maximize product quality. The optimization uses traditional numerical programming techniques which require the evaluation of cost and constraint functions and their sensitivities. The finite element method is incorporated to analyze the crystal solidification problem, evaluate the cost and constraint functions, and compute the sensitivities. These techniques are demonstrated in the crystal growth application by determining an optimal furnace wall temperature distribution to obtain the desired temperature profile in the crystal, and hence to maximize the crystal's quality. Several numerical optimization algorithms are studied to determine the proper convergence criteria, effective 1-D search strategies, appropriate forms of the cost and constraint functions, etc. In particular, we incorporate the conjugate gradient and quasi-Newton methods for unconstrained problems. The efficiency and effectiveness of each algorithm is presented in the example problem.
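    The cost-function-plus-sensitivity pattern described above can be sketched with a toy problem: choose furnace wall temperatures w to minimize the mismatch between a predicted crystal temperature profile and a prespecified target. In the real workflow a finite element solver supplies the prediction and sensitivities; the 2x2 linear "model" and the plain gradient-descent step below are illustrative assumptions standing in for the conjugate gradient and quasi-Newton methods studied in the paper.

    ```python
    def optimize(A, target, steps=500, lr=0.1):
        """Minimize 0.5 * ||A w - target||^2 by gradient descent."""
        n = len(A[0])
        w = [0.0] * n
        for _ in range(steps):
            # Predicted profile and residual r = A w - target.
            pred = [sum(A[i][j] * w[j] for j in range(n)) for i in range(len(A))]
            r = [p - t for p, t in zip(pred, target)]
            # Gradient of the cost is A^T r; take a descent step.
            grad = [sum(A[i][j] * r[i] for i in range(len(A))) for j in range(n)]
            w = [wj - lr * g for wj, g in zip(w, grad)]
        return w

    A = [[1.0, 0.2], [0.2, 1.0]]   # toy sensitivity of profile to wall temperatures
    target = [1.0, 2.0]            # desired temperature profile in the crystal
    w = optimize(A, target)
    print(w)  # wall settings that reproduce the target profile
    ```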

  11. Process modeling and control in foundry operations

    NASA Astrophysics Data System (ADS)

    Piwonka, T. S.

    1989-02-01

    Initial uses of process modeling were limited to phenomenological descriptions of the physical processes in foundry operations, with the aim of decreasing scrap and rework. It is now clear that process modeling can be used to select, design and optimize foundry processes so that on-line process control can be achieved. Computational, analogue and empirical process models have been developed for sand casting operations, and they are being applied in the foundry with beneficial effects.

  12. Computer-aided software development process design

    NASA Technical Reports Server (NTRS)

    Lin, Chi Y.; Levary, Reuven R.

    1989-01-01

    The authors describe an intelligent tool designed to aid managers of software development projects in planning, managing, and controlling the development process of medium- to large-scale software projects. Its purpose is to reduce uncertainties in the budget, personnel, and schedule planning of software development projects. It is based on a dynamic model of the software development and maintenance life-cycle process. This dynamic process is composed of a number of time-varying, interacting developmental phases, each characterized by its intended functions and requirements. System dynamics is used as the modeling methodology. The resulting Software LIfe-Cycle Simulator (SLICS) and the hybrid expert simulation system of which it is a subsystem are described.

  13. "From the Formal to the Innovative": The Use of Case Studies and Sustainable Projects in Developing a Design Process Model for Educating Product/Industrial Designers

    ERIC Educational Resources Information Center

    Oakes, G. L.; Felton, A. J.; Garner, K. B.

    2006-01-01

    The BSc in computer aided product design (CAPD) course at the University of Wolverhampton was conceived as a collaborative venture in 1989 between the School of Engineering and the School of Art and Design. The award was at the forefront of forging interdisciplinary collaboration at undergraduate level in the field of product design. It has…

  14. Hafnium transistor process design for neural interfacing.

    PubMed

    Parent, David W; Basham, Eric J

    2009-01-01

    A design methodology is presented that uses 1-D process simulations of Metal Insulator Semiconductor (MIS) structures to design the threshold voltage of hafnium oxide based transistors used for neural recording. The methodology comprises 1-D analytical equations for specifying the threshold voltage and doping profiles, together with 1-D MIS Technology Computer Aided Design (TCAD) simulations used to design a process that implements a specific threshold voltage while minimizing simulation time. The process was then verified with a 2-D process/electrical TCAD simulation. Hafnium oxide films (HfO) were grown and characterized for dielectric constant and fixed oxide charge at various annealing temperatures, two important design variables in threshold voltage design.
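    The 1-D analytical step can be sketched with the textbook MIS threshold relation V_T = V_FB + 2*phi_F + sqrt(2*q*eps_si*N_A*2*phi_F)/C_ox, where C_ox uses the high-k (hafnium oxide) permittivity and the flat-band voltage V_FB absorbs the fixed oxide charge. All numeric values below (doping, thickness, k_ox, V_FB, ni) are illustrative assumptions, not the paper's process parameters.

    ```python
    import math

    q = 1.602e-19        # elementary charge, C
    k_T = 0.0259         # thermal voltage at room temperature, V
    eps0 = 8.854e-12     # vacuum permittivity, F/m
    eps_si = 11.7 * eps0
    ni = 1.0e16          # approximate Si intrinsic concentration, /m^3

    def threshold_voltage(N_A, t_ox, k_ox=20.0, V_fb=-0.8):
        """V_T for acceptor doping N_A (/m^3) and dielectric thickness t_ox (m)."""
        phi_f = k_T * math.log(N_A / ni)           # Fermi potential
        c_ox = k_ox * eps0 / t_ox                  # high-k oxide capacitance per area
        q_dep = math.sqrt(2.0 * q * eps_si * N_A * 2.0 * phi_f)  # depletion charge
        return V_fb + 2.0 * phi_f + q_dep / c_ox

    vt_low = threshold_voltage(1e23, 5e-9)
    vt_high = threshold_voltage(1e24, 5e-9)
    print(vt_low, vt_high)  # heavier channel doping raises V_T
    ```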

  15. Design Expert's Participation in Elementary Students' Collaborative Design Process

    ERIC Educational Resources Information Center

    Kangas, Kaiju; Seitamaa-Hakkarainen, Pirita; Hakkarainen, Kai

    2013-01-01

    The main goal of the present study was to provide insights into how disciplinary expertise might be infused into Design and Technology classrooms and how authentic processes based on professional design practices might be constructed. We describe elementary students' collaborative lamp designing process, where the leadership was provided by a…

  16. Photonic IC design software and process design kits

    NASA Astrophysics Data System (ADS)

    Korthorst, Twan; Stoffer, Remco; Bakker, Arjen

    2015-04-01

    This review discusses photonic IC design software tools, examines existing design flows for photonics design and how these fit different design styles and describes the activities in collaboration and standardization within the silicon photonics group from Si2 and by members of the PDAFlow Foundation to improve design flows. Moreover, it will address the lowering of access barriers to the technology by providing qualified process design kits (PDKs) and improved integration of photonic integrated circuit simulations, physical simulations, mask layout, and verification.

  17. Reengineering the JPL Spacecraft Design Process

    NASA Technical Reports Server (NTRS)

    Briggs, C.

    1995-01-01

    This presentation describes the factors that have emerged in the evolved process of reengineering the unmanned spacecraft design process at the Jet Propulsion Laboratory in Pasadena, California. Topics discussed include: New facilities, new design factors, new system-level tools, complex performance objectives, changing behaviors, design integration, leadership styles, and optimization.

  18. Instructional Design Processes and Traditional Colleges

    ERIC Educational Resources Information Center

    Vasser, Nichole

    2010-01-01

    Traditional colleges that have implemented distance education programs would benefit from using instructional design processes to develop their courses. Instructional design processes provide the framework for designing and delivering quality online learning programs in a highly competitive educational market. Traditional college leaders play a…

  19. Graphic Design in Libraries: A Conceptual Process

    ERIC Educational Resources Information Center

    Ruiz, Miguel

    2014-01-01

    Providing successful library services requires efficient and effective communication with users; therefore, it is important that content creators who develop visual materials understand key components of design and, specifically, develop a holistic graphic design process. Graphic design, as a form of visual communication, is the process of…

  20. Estimation of environment-related properties of chemicals for design of sustainable processes: development of group-contribution+ (GC+) property models and uncertainty analysis.

    PubMed

    Hukkerikar, Amol Shivajirao; Kalakul, Sawitree; Sarup, Bent; Young, Douglas M; Sin, Gürkan; Gani, Rafiqul

    2012-11-26

    of the developed property models for the estimation of environment-related properties and uncertainties of the estimated property values is highlighted through an illustrative example. The developed property models provide reliable estimates of environment-related properties needed to perform process synthesis, design, and analysis of sustainable chemical processes and allow one to evaluate the effect of uncertainties of estimated property values on the calculated performance of processes giving useful insights into quality and reliability of the design of sustainable processes. PMID:23039255
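    The group-contribution idea behind such property models can be sketched simply: a property is estimated as a sum of group contributions, and the uncertainty of the estimate follows from the contributions' standard errors (here assumed independent). The group values, their errors, and the purely additive form below are illustrative assumptions, not the GC+ parameters of the paper.

    ```python
    import math

    # (contribution, standard error) per group -- hypothetical values.
    GROUPS = {"CH3": (1.50, 0.10), "CH2": (0.90, 0.05), "OH": (2.10, 0.20)}

    def estimate(counts):
        """Return (property estimate, standard error) for the given group counts."""
        value = sum(n * GROUPS[g][0] for g, n in counts.items())
        # Independent-error propagation: variances of n-weighted contributions add.
        var = sum((n * GROUPS[g][1]) ** 2 for g, n in counts.items())
        return value, math.sqrt(var)

    # e.g. a simple alcohol fragment: 1 CH3, 2 CH2, 1 OH
    val, err = estimate({"CH3": 1, "CH2": 2, "OH": 1})
    print(val, err)
    ```

    Reporting the error alongside the value is what lets a process designer judge, as the abstract emphasizes, how property uncertainty propagates into the calculated performance of a design.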

  2. Launch Vehicle Design Process: Characterization, Technical Integration, and Lessons Learned

    NASA Technical Reports Server (NTRS)

    Blair, J. C.; Ryan, R. S.; Schutzenhofer, L. A.; Humphries, W. R.

    2001-01-01

    Engineering design is a challenging activity for any product. Since launch vehicles are highly complex and interconnected and have extreme energy densities, their design represents a challenge of the highest order. The purpose of this document is to delineate and clarify the design process associated with the launch vehicle for space flight transportation. The goal is to define and characterize a baseline for the space transportation design process. This baseline can be used as a basis for improving effectiveness and efficiency of the design process. The baseline characterization is achieved via compartmentalization and technical integration of subsystems, design functions, and discipline functions. First, a global design process overview is provided in order to show responsibility, interactions, and connectivity of overall aspects of the design process. Then design essentials are delineated in order to emphasize necessary features of the design process that are sometimes overlooked. Finally the design process characterization is presented. This is accomplished by considering project technical framework, technical integration, process description (technical integration model, subsystem tree, design/discipline planes, decision gates, and tasks), and the design sequence. Also included in the document are a snapshot relating to process improvements, illustrations of the process, a survey of recommendations from experienced practitioners in aerospace, lessons learned, references, and a bibliography.

  3. NASA System Engineering Design Process

    NASA Technical Reports Server (NTRS)

    Roman, Jose

    2011-01-01

    This slide presentation reviews NASA's use of systems engineering for the complete life cycle of a project. Systems engineering is a methodical, disciplined approach for the design, realization, technical management, operations, and retirement of a system. Each phase of a NASA project is terminated with a Key decision point (KDP), which is supported by major reviews.

  4. A Model of Continuous Improvement in High Schools: A Process for Research, Innovation Design, Implementation, and Scale

    ERIC Educational Resources Information Center

    Cohen-Vogel, Lora; Cannata, Marisa; Rutledge, Stacey A.; Socol, Allison Rose

    2016-01-01

    This chapter describes a model for continuous improvement that guides the work of the National Center on Scaling Up Effective Schools, or NCSU. NCSU is a research and development center funded by the Institute for Education Sciences, the research arm of the United States Department of Education. At the core of the Center's work is an innovative…

  5. The Architectural and Interior Design Planning Process.

    ERIC Educational Resources Information Center

    Cohen, Elaine

    1994-01-01

    Explains the planning process in designing effective library facilities and discusses library building requirements that result from electronic information technologies. Highlights include historical structures; Americans with Disabilities Act; resource allocation; electrical power; interior spaces; lighting; design development; the roles of…

  6. Biomimetic design processes in architecture: morphogenetic and evolutionary computational design.

    PubMed

    Menges, Achim

    2012-03-01

    Design computation has profound impact on architectural design methods. This paper explains how computational design enables the development of biomimetic design processes specific to architecture, and how they need to be significantly different from established biomimetic processes in engineering disciplines. The paper first explains the fundamental difference between computer-aided and computational design in architecture, as the understanding of this distinction is of critical importance for the research presented. Thereafter, the conceptual relation and possible transfer of principles from natural morphogenesis to design computation are introduced and the related developments of generative, feature-based, constraint-based, process-based and feedback-based computational design methods are presented. This morphogenetic design research is then related to exploratory evolutionary computation, followed by the presentation of two case studies focusing on the exemplary development of spatial envelope morphologies and urban block morphologies.

  7. On the use of mathematical models to build the design space for the primary drying phase of a pharmaceutical lyophilization process.

    PubMed

    Giordano, Anna; Barresi, Antonello A; Fissore, Davide

    2011-01-01

    The aim of this article is to show a procedure to build the design space for the primary drying phase of a pharmaceutical lyophilization process. Mathematical simulation of the process is used to identify the operating conditions that preserve product quality and meet the operating constraints posed by the equipment: product temperature has to be maintained below a limit value throughout the operation, and the sublimation flux has to remain lower than the maximum value allowed by the capacity of the condenser, while also avoiding choking flow in the duct connecting the drying chamber to the condenser. Few experimental runs are required to obtain the values of the model parameters: the dynamic parameters estimation algorithm, an advanced tool based on the pressure rise test, is used for this purpose. A simple procedure is proposed to take parameter uncertainty into account, making it possible to find recipes that fulfill the process constraints within the required uncertainty range. The same approach can be effective for taking the heterogeneity of the batch into account when designing the freeze-drying recipe. PMID:20575053
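    The design-space construction can be sketched as a constraint scan: evaluate candidate shelf-temperature/chamber-pressure pairs with a process model and keep those whose predicted product temperature and sublimation flux satisfy the limits. The toy linear predictors and all numeric values below are illustrative assumptions standing in for the paper's dynamic model.

    ```python
    T_PRODUCT_MAX = -30.0   # degC, product temperature limit
    FLUX_MAX = 1.0          # kg/h/m^2, assumed condenser capacity

    def predict(shelf_T, pressure):
        """Hypothetical linearized predictors of product temperature and flux."""
        product_T = -45.0 + 0.5 * shelf_T + 50.0 * pressure
        flux = 0.02 * (shelf_T + 40.0) + 2.0 * pressure
        return product_T, flux

    def design_space(shelf_temps, pressures):
        ok = []
        for Ts in shelf_temps:
            for p in pressures:
                Tp, f = predict(Ts, p)
                if Tp <= T_PRODUCT_MAX and f <= FLUX_MAX:
                    ok.append((Ts, p))
        return ok

    space = design_space([-20.0, 0.0, 20.0], [0.05, 0.08, 0.20])
    print(space)  # admissible (shelf temperature, chamber pressure) recipes
    ```

    Parameter uncertainty, as the abstract suggests, can then be handled by re-running the scan at the extremes of each parameter's uncertainty range and intersecting the resulting admissible sets.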

  8. Forging process design for risk reduction

    NASA Astrophysics Data System (ADS)

    Mao, Yongning

    In this dissertation, forging process design has been investigated with the primary concern of risk reduction. Different forged components have been studied, especially those that could cause catastrophic loss if failure occurs. As an effective modeling methodology, finite element analysis is applied extensively in this work. Three examples, a titanium compressor disk, a superalloy turbine disk, and a titanium hip prosthesis, are discussed to demonstrate this approach. Discrete defects such as hard alpha anomalies are known to cause disastrous failure if they are present in stress-critical components. In this research, hard-alpha inclusion movement during forging of a titanium compressor disk is studied by finite element analysis. By combining the results from the Finite Element Method (FEM), regression modeling, and Monte Carlo simulation, it is shown that changing the forging path can mitigate the failure risk of the component during service. The second example concerns a turbine disk made of the superalloy IN 718, where the effect of forging on microstructure is the main consideration. Microstructure defines the as-forged disk properties, and for specific forging conditions the preform has its own effect on the microstructure. A sensitivity study found that forging temperature and die speed have significant influence on the microstructure. To choose processing parameters that optimize the microstructure, the dependence of microstructure on die speed and temperature is thoroughly studied using design of numerical experiments, and optimal solutions are determined for various desired goals. The narrow processing window of titanium alloys makes isothermal forging a preferred way to produce forged parts without forging defects. However, the cost of isothermal forging (dies at the same temperature as the workpiece) limits its wide application. 
In this research, it has been demonstrated that with proper process design, the die
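    The FEM-regression-Monte-Carlo chain described above can be sketched with a toy surrogate: a cheap regression (here a made-up linear response) stands in for the finite element model, and Monte Carlo sampling of the inclusion position estimates the probability that local service stress exceeds a limit. The surrogate, the uniform depth distribution, and the stress limit are illustrative assumptions.

    ```python
    import random

    def surrogate_stress(depth):
        """Hypothetical regression of peak service stress vs. inclusion depth."""
        return 900.0 - 40.0 * depth  # MPa; deeper inclusions see lower stress

    def failure_risk(n=100_000, stress_limit=800.0, seed=42):
        """Monte Carlo estimate of P(stress at inclusion > stress_limit)."""
        rng = random.Random(seed)
        failures = 0
        for _ in range(n):
            depth = rng.uniform(0.0, 5.0)  # inclusion depth (mm), assumed uniform
            if surrogate_stress(depth) > stress_limit:
                failures += 1
        return failures / n

    print(failure_risk())  # fraction of sampled inclusions predicted to fail
    ```

    Comparing this risk estimate across candidate forging paths, each with its own fitted surrogate, is the decision step the dissertation automates.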

  9. Radiolysis Process Model

    SciTech Connect

    Buck, Edgar C.; Wittman, Richard S.; Skomurski, Frances N.; Cantrell, Kirk J.; McNamara, Bruce K.; Soderquist, Chuck Z.

    2012-07-17

    Assessing the performance of spent (used) nuclear fuel in a geological repository requires quantification of time-dependent phenomena that may influence its behavior on a time scale of up to millions of years. A high-level waste repository environment will be a dynamic redox system because of the time-dependent generation of radiolytic oxidants and reductants and the corrosion of Fe-bearing canister materials. One major difference between used fuel and natural analogues, including unirradiated UO2, is the intense radiolytic field. The radiation emitted by used fuel can produce radiolysis products in the presence of water vapor or a thin film of water (including OH• and H• radicals, O2-, eaq, H2O2, H2, and O2) that may increase the waste form degradation rate and change radionuclide behavior. H2O2 is the dominant oxidant for spent nuclear fuel in an O2-depleted water environment; the parameters to which predictions of a radiolysis model are most sensitive under typical conditions have been identified. Compared with the full model of about 100 reactions, it was found that only 30-40 of the reactions are required to determine [H2O2] to one part in 10^-5 and to preserve most of the predictions for major species. This allows a systematic approach to model simplification and offers guidance in designing experiments for validation.
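    The role of [H2O2] as the key model output can be illustrated with a one-reaction caricature: a radiolytic generation term G and a first-order loss k drive the concentration toward the steady state G/k. The rate values and the single-reaction form are illustrative assumptions, not the report's 30-40 reaction mechanism.

    ```python
    def h2o2_history(G=1e-9, k=1e-4, dt=10.0, steps=100_000):
        """Explicit Euler integration of d[H2O2]/dt = G - k*[H2O2], from zero."""
        c = 0.0
        for _ in range(steps):
            c += dt * (G - k * c)
        return c

    c_final = h2o2_history()
    print(c_final)  # relaxes toward the steady state G/k = 1e-5
    ```

    Model reduction, in this caricature, amounts to checking which of the many loss pathways actually move G or k enough to shift this steady state beyond the stated 1-part-in-10^5 tolerance.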

  10. XML-based product information processing method for product design

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen Yu

    2011-12-01

    Design knowledge for modern mechatronic products centers on information processing in knowledge-intensive engineering; product design innovation is therefore essentially an innovation in knowledge and information processing. Based on an analysis of the role of mechatronic product design knowledge and its information management features, a unified XML-based model for product information processing is proposed. The information processing model of product design covers functional knowledge, structural knowledge, and their relationships: XML-based representations are proposed for product function elements, product structure elements, and the mapping relationships between function and structure. The information processing of a parallel friction roller is given as an example, demonstrating that this method clearly supports knowledge-based design systems and product innovation.
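    The function/structure/mapping organization can be sketched as one XML document built and queried with Python's standard library. The element and attribute names below are illustrative assumptions, not the paper's schema.

    ```python
    import xml.etree.ElementTree as ET

    # Build a minimal product information document: functions, structures,
    # and an explicit function-to-structure mapping.
    product = ET.Element("product", name="friction_roller")
    functions = ET.SubElement(product, "functions")
    ET.SubElement(functions, "function", id="F1", desc="transmit torque")
    structures = ET.SubElement(product, "structures")
    ET.SubElement(structures, "structure", id="S1", desc="roller shaft")
    mappings = ET.SubElement(product, "mappings")
    ET.SubElement(mappings, "map", function="F1", structure="S1")

    xml_text = ET.tostring(product, encoding="unicode")
    print(xml_text)

    # Reading the function-structure relationships back out:
    root = ET.fromstring(xml_text)
    links = [(m.get("function"), m.get("structure")) for m in root.iter("map")]
    print(links)
    ```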

  12. Model for Vaccine Design by Prediction of B-Epitopes of IEDB Given Perturbations in Peptide Sequence, In Vivo Process, Experimental Techniques, and Source or Host Organisms

    PubMed Central

    González-Díaz, Humberto; Pérez-Montoto, Lázaro G.; Ubeira, Florencio M.

    2014-01-01

    Perturbation methods add variation terms to a known experimental solution of one problem to approach a solution for a related problem without a known exact solution. One problem of this type in immunology is the prediction of the possible action of an epitope of one peptide after a perturbation or variation in the structure of a known peptide and/or other boundary conditions (host organism, biological process, and experimental assay). However, to the best of our knowledge, there are no reports of general-purpose perturbation models to solve this problem. In a recent work, we introduced a new quantitative structure-property relationship theory for the study of perturbations in complex biomolecular systems. In this work, we developed the first model able to classify more than 200,000 cases of perturbations with accuracy, sensitivity, and specificity >90% both in training and validation series. The perturbations include structural changes in >50,000 peptides determined in experimental assays with boundary conditions involving >500 source organisms, >50 host organisms, >10 biological processes, and >30 experimental techniques. The model may be useful for the prediction of new epitopes or the optimization of known peptides towards computational vaccine design. PMID:24741624

  13. Practicing universal design to actual hand tool design process.

    PubMed

    Lin, Kai-Chieh; Wu, Chih-Fu

    2015-09-01

    Universal design (UD) evaluation principles are difficult to implement in product design. This study proposes a methodology for implementing UD in the design process through user participation. The original UD principles and user experience are used to develop the evaluation items, with differences among product types taken into account. Factor analysis and Quantification Theory Type I were used to eliminate evaluation items considered inappropriate and to examine the relationship between evaluation items and product design factors. Product design specifications were established for verification. The results showed that converting user evaluations into crucial design verification factors, via a generalized evaluation scale based on product attributes, and applying the design factors in product design can improve users' UD evaluations. The design process of this study is expected to contribute to user-centered UD application. PMID:25959313

  15. Student Models of Instructional Design

    ERIC Educational Resources Information Center

    Magliaro, Susan G.; Shambaugh, Neal

    2006-01-01

    Mental models are one way that humans represent knowledge (Markman, 1999). Instructional design (ID) is a conceptual model for developing instruction and typically includes analysis, design, development, implementation, and evaluation (i.e., ADDIE model). ID, however, has been viewed differently by practicing teachers and instructional designers…

  16. A design optimization process for Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Chamberlain, Robert G.; Fox, George; Duquette, William H.

    1990-01-01

    The Space Station Freedom Program is used to develop and implement a process for design optimization. Because the relative worth of arbitrary design concepts cannot be assessed directly, comparisons must be based on designs that provide the same performance from the point of view of station users; such designs can be compared in terms of life cycle cost. Since the technology required to produce a space station is widely dispersed, a decentralized optimization process is essential. A formulation of the optimization process is provided and the mathematical models designed to facilitate its implementation are described.

  17. Transonic empirical configuration design process

    NASA Technical Reports Server (NTRS)

    Whitcomb, R. T.

    1983-01-01

    This lecture describes some of the experimental research pertaining to transonic configuration development conducted by the Transonic Aerodynamics Branch of the NASA Langley Research Center. Discussions are presented of the following: the use of fluorescent oil films for the study of surface boundary-layer flows; the severe effect of wind tunnel wall interference on the measured configuration drag rise near the speed of sound, as determined by a comparison between wind tunnel and free-air results; the development of a near-sonic transport configuration incorporating a supercritical wing and an indented fuselage, designed on the basis of the area rule with a modification to account for the presence of local supersonic flow above the wing; a device for improving the transonic pitch-up of swept wings with very little added drag at the cruise condition; a means for reducing the large transonic aerodynamic interference among the wing, fuselage, nacelle, and pylon for a fuselage-mounted nacelle having the inlet above the wing; and methods for reducing the transonic interference between flows over a winglet and the wing.

  18. Ligand modeling and design

    SciTech Connect

    Hay, B.

    1996-10-01

    The purpose of this work is to develop and implement a molecular design basis for selecting organic ligands to be used in applications for the cost-effective removal of specific radionuclides from nuclear waste streams.

  19. Process Design Manual for Nitrogen Control.

    ERIC Educational Resources Information Center

    Parker, Denny S.; And Others

    This manual presents theoretical and process design criteria for the implementation of nitrogen control technology in municipal wastewater treatment facilities. Design concepts are emphasized through examination of data from full-scale and pilot installations. Design data are included on biological nitrification and denitrification, breakpoint…

  20. Hydrocarbon Processing's process design and optimization '96

    SciTech Connect

    1996-06-01

    This paper compiles information on hydrocarbon processes, describing the application, objective, economics, commercial installations, and licensor. Processes include: alkylation, ammonia, catalytic reformer, crude fractionator, crude unit, vacuum unit, dehydration, delayed coker, distillation, ethylene furnace, FCCU, polymerization, gas sweetening, hydrocracking, hydrogen, hydrotreating (naphtha, distillate, and resid desulfurization), natural gas processing, olefins, polyethylene terephthalate, refinery, styrene, sulfur recovery, and VCM furnace.

  1. New Vistas in Chemical Product and Process Design.

    PubMed

    Zhang, Lei; Babi, Deenesh K; Gani, Rafiqul

    2016-06-01

    Design of chemicals-based products is broadly classified into those that are process centered and those that are product centered. In this article, the designs of both classes of products are reviewed from a process systems point of view; developments related to the design of the chemical product, its corresponding process, and its integration are highlighted. Although significant advances have been made in the development of systematic model-based techniques for process design (also for optimization, operation, and control), much work is needed to reach the same level for product design. Timeline diagrams illustrating key contributions in product design, process design, and integrated product-process design are presented. The search for novel, innovative, and sustainable solutions must be matched by consideration of issues related to the multidisciplinary nature of problems, the lack of data needed for model development, solution strategies that incorporate multiscale options, and reliability versus predictive power. The need for an integrated model-experiment-based design approach is discussed together with benefits of employing a systematic computer-aided framework with built-in design templates. PMID:27088667

  3. The MiRa/THESIS3D-code package for resonator design and modeling of millimeter-wave material processing

    SciTech Connect

    Feher, L.; Link, G.; Thumm, M.

    1996-12-31

    Precise knowledge of millimeter-wave oven properties and design studies must be obtained through 3D numerical field calculations. A simulation code solving the electromagnetic field problem based on a covariant raytracing scheme (MiRa-Code) has been developed. Time-dependent electromagnetic field-material interactions during sintering, as well as the heat transfer processes within the samples, have been investigated. A numerical code solving the nonlinear heat transfer problem due to millimeter-wave heating has been developed (THESIS3D-Code). For a self-consistent sintering simulation, a zip interface between both codes, exchanging the time-advancing fields and material parameters, is implemented. Recent results and progress on calculations of field distributions in large overmoded resonators, as well as results on modeling the heating of materials with millimeter waves, are presented in this paper. The calculations are compared to experiments.

  4. WORKSHOP ON ENVIRONMENTALLY CONSCIOUS CHEMICAL PROCESS DESIGN

    EPA Science Inventory

    To encourage the consideration of environmental issues during chemical process design, the USEPA has developed techniques and software tools to evaluate the relative environmental impact of a chemical process. These techniques and tools aid in the risk management process by focus...

  5. Modeling User Interactions with Instructional Design Software.

    ERIC Educational Resources Information Center

    Spector, J. Michael; And Others

    As one of a series of studies being conducted to develop a useful (predictive) model of the instructional design process that is appropriate to military technical training settings, this study performed initial evaluations on two pieces of instructional design software developed by M. David Merrill and colleagues at Utah State University i.e.,…

  6. Model-based software design

    NASA Technical Reports Server (NTRS)

    Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui; Yenne, Britt; Vansickle, Larry; Ballantyne, Michael

    1992-01-01

    Domain-specific knowledge is required to create specifications, generate code, and understand existing systems. Our approach to automating software design is based on instantiating an application domain model with industry-specific knowledge and then using that model to achieve the operational goals of specification elicitation and verification, reverse engineering, and code generation. Although many different specification models can be created from any particular domain model, each specification model is consistent and correct with respect to the domain model.

  7. Launch Vehicle Design Process Description and Training Formulation

    NASA Technical Reports Server (NTRS)

    Atherton, James; Morris, Charles; Settle, Gray; Teal, Marion; Schuerer, Paul; Blair, James; Ryan, Robert; Schutzenhofer, Luke

    1999-01-01

    A primary NASA priority is to reduce the cost and improve the effectiveness of launching payloads into space. As a consequence, significant improvements are being sought in the effectiveness, cost, and schedule of the launch vehicle design process. In order to provide a basis for understanding and improving the current design process, a model has been developed for this complex, interactive process, as reported in the references. This model requires further expansion in some specific design functions. Also, a training course for less-experienced engineers is needed to provide understanding of the process, to provide guidance for its effective implementation, and to provide a basis for major improvements in launch vehicle design process technology. The objective of this activity is to expand the description of the design process to include all pertinent design functions, and to develop a detailed outline of a training course on the design process for launch vehicles for use in educating engineers whose experience with the process has been minimal. Building on a previously-developed partial design process description, parallel sections have been written for the Avionics Design Function, the Materials Design Function, and the Manufacturing Design Function. Upon inclusion of these results, the total process description will be released as a NASA TP. The design function sections herein include descriptions of the design function responsibilities, interfaces, interactive processes, decisions (gates), and tasks. Associated figures include design function planes, gates, and tasks, along with other pertinent graphics. Also included is an expanded discussion of how the design process is divided, or compartmentalized, into manageable parts to achieve efficient and effective design. A detailed outline for an intensive two-day course on the launch vehicle design process has been developed herein, and is available for further expansion. The course is in an interactive lecture

  8. INTEGRATED FISCHER TROPSCH MODULAR PROCESS MODEL

    SciTech Connect

    Donna Post Guillen; Richard Boardman; Anastasia M. Gribik; Rick A. Wood; Robert A. Carrington

    2007-12-01

    With declining petroleum reserves, increased world demand, and unstable politics in some of the world's richest oil-producing regions, the capability for the U.S. to produce synthetic liquid fuels from domestic resources is critical to national security and economic stability. Coal, biomass, and other carbonaceous materials can be converted to liquid fuels using several conversion processes. The leading candidate for large-scale conversion of coal to liquid fuels is the Fischer-Tropsch (FT) process. Process configuration, component selection, and performance are interrelated and dependent on feed characteristics. This paper outlines a flexible modular approach to modeling an integrated FT process that utilizes a library of key component models, supporting kinetic data, and materials and transport properties, allowing rapid development of custom integrated plant models. The modular construction will permit rapid assessment of alternative designs and feedstocks. The modeling approach consists of three thrust areas, or “strands”: model/module development, integration of the model elements into an end-to-end integrated system model, and utilization of the model for plant design. Strand 1, model/module development, entails identifying, developing, and assembling a library of codes, user blocks, and data for FT process unit operations for a custom feedstock and plant description. Strand 2, integration development, provides the framework for linking these component and subsystem models to form an integrated FT plant simulation. Strand 3, plant design, includes testing and validation of the comprehensive model and performing design evaluation analyses.

  9. Chemical Process Design: An Integrated Teaching Approach.

    ERIC Educational Resources Information Center

    Debelak, Kenneth A.; Roth, John A.

    1982-01-01

    Reviews a one-semester senior plant design/laboratory course, focusing on course structure, student projects, laboratory assignments, and course evaluation. Includes discussion of laboratory exercises related to process waste water and sludge. (SK)

  10. Optimal designs for copula models

    PubMed Central

    Perrone, E.; Müller, W.G.

    2016-01-01

    Copula modelling has in the past decade become a standard tool in many areas of applied statistics. However, a largely neglected aspect concerns the design of related experiments. Particular issues are whether the estimation of copula parameters can be enhanced by optimizing experimental conditions and how robust the parameter estimates for the model are with respect to the type of copula employed. In this paper an equivalence theorem for (bivariate) copula models is provided that allows formulation of efficient design algorithms and quick checks of whether designs are optimal or at least efficient. Some examples illustrate that in practical situations considerable gains in design efficiency can be achieved. A natural comparison between different copula models with respect to design efficiency is provided as well. PMID:27453616

  11. The Engineering Process in Construction & Design

    ERIC Educational Resources Information Center

    Stoner, Melissa A.; Stuby, Kristin T.; Szczepanski, Susan

    2013-01-01

    Recent research suggests that high-impact activities in science and math classes promote positive attitudinal shifts in students. By implementing high-impact activities, such as designing a school and a skate park, mathematical thinking can be linked to the engineering design process. This hands-on approach, when possible, to demonstrate or…

  12. 77 FR 41248 - Disaster Designation Process

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-13

    ... designation regulations to provide for changes in the designation process (76 FR 70368-70374). In general... the comments. Definitions Comment: Removing the list of examples of unusual and adverse weather... disaster as an unusual or severe weather condition or other natural phenomena that causes severe...

  13. Design of Nanomaterial Synthesis by Aerosol Processes

    PubMed Central

    Buesser, Beat; Pratsinis, Sotiris E.

    2013-01-01

    Aerosol synthesis of materials is a vibrant field of particle technology and chemical reaction engineering. Examples include the manufacture of carbon blacks, fumed SiO2, pigmentary TiO2, ZnO vulcanizing catalysts, filamentary Ni, and optical fibers, materials that impact transportation, construction, pharmaceuticals, energy, and communications. Parallel to this, development of novel, scalable aerosol processes has enabled synthesis of new functional nanomaterials (e.g., catalysts, biomaterials, electroceramics) and devices (e.g., gas sensors). This review provides an access point for engineers to the multiscale design of aerosol reactors for the synthesis of nanomaterials using continuum, mesoscale, molecular dynamics, and quantum mechanics models spanning 10 and 15 orders of magnitude in length and time, respectively. Key design features are the rapid chemistry; the high particle concentrations but low volume fractions; the attainment of a self-preserving particle size distribution by coagulation; the ratio of the characteristic times of coagulation and sintering, which controls the extent of particle aggregation; and the narrowing of the aggregate primary particle size distribution by sintering. PMID:22468598

  14. The Ecosystem Model: Designing Campus Environments.

    ERIC Educational Resources Information Center

    Western Interstate Commission for Higher Education, Boulder, CO.

    This document stresses the increasing awareness in higher education of the impact student/environment transactions have upon the quality of educational life and details a model and design process for creating a better fit between educational environments and students. The ecosystem model uses an interdisciplinary approach for the make-up of its…

  15. Process characterization and Design Space definition.

    PubMed

    Hakemeyer, Christian; McKnight, Nathan; St John, Rick; Meier, Steven; Trexler-Schmidt, Melody; Kelley, Brian; Zettl, Frank; Puskeiler, Robert; Kleinjans, Annika; Lim, Fred; Wurth, Christine

    2016-09-01

    Quality by design (QbD) is a global regulatory initiative with the goal of enhancing pharmaceutical development through the proactive design of pharmaceutical manufacturing processes and controls to consistently deliver the intended performance of the product. The principles of pharmaceutical development relevant to QbD are described in the ICH guidance documents (ICH Q8-11). An integrated set of risk assessments and their related elements developed at Roche/Genentech were designed to provide an overview of product and process knowledge for the production of a recombinant monoclonal antibody (MAb). This chapter describes the tools used for the characterization and validation of the MAb manufacturing process under the QbD paradigm. This comprises risk assessments for the identification of potential Critical Process Parameters (pCPPs), statistically designed experimental studies, as well as studies assessing the linkage of the unit operations. The outcome of the studies is the classification of process parameters according to their criticality and the definition of appropriate acceptable ranges of operation. The process and product knowledge gained in these studies can lead to the approval of a Design Space. Additionally, the information gained in these studies is used to define the 'impact' that the manufacturing process can have on the variability of the CQAs, which in turn is used to define the testing and monitoring strategy.

  16. HYNOL PROCESS ENGINEERING: PROCESS CONFIGURATION, SITE PLAN, AND EQUIPMENT DESIGN

    EPA Science Inventory

    The report describes the design of the hydropyrolysis reactor system of the Hynol process. (NOTE: A bench scale methanol production facility is being constructed to demonstrate the technical feasibility of producing methanol from biomass using the Hynol process. The plant is bein...

  17. Computer Applications in the Design Process.

    ERIC Educational Resources Information Center

    Winchip, Susan

    Computer Assisted Design (CAD) and Computer Assisted Manufacturing (CAM) are emerging technologies now being used in home economics and interior design applications. A microcomputer in a computer network system is capable of executing computer graphic functions such as three-dimensional modeling, as well as utilizing office automation packages to…

  18. Chemical kinetics and oil shale process design

    SciTech Connect

    Burnham, A.K.

    1993-07-01

    Oil shale processes are reviewed with the goal of showing how chemical kinetics influences the design and operation of different processes for different types of oil shale. Reaction kinetics are presented for organic pyrolysis, carbon combustion, carbonate decomposition, and sulfur and nitrogen reactions.

  19. Process-based design of dynamical biological systems

    NASA Astrophysics Data System (ADS)

    Tanevski, Jovan; Todorovski, Ljupčo; Džeroski, Sašo

    2016-09-01

    The computational design of dynamical systems is an important emerging task in synthetic biology. Given desired properties of the behaviour of a dynamical system, the task of design is to build an in-silico model of a system whose simulated behaviour meets these properties. We introduce a new, process-based, design methodology for addressing this task. The new methodology combines a flexible process-based formalism for specifying the space of candidate designs with multi-objective optimization approaches for selecting the most appropriate among these candidates. We demonstrate that the methodology is general enough to both formulate and solve tasks of designing deterministic and stochastic systems, successfully reproducing plausible designs reported in previous studies and proposing new designs that meet the design criteria, but have not been previously considered.
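
    The abstract above describes selecting among candidate designs by multi-objective optimization. As a purely illustrative aside (not the authors' code; the function name, the minimize-all-objectives convention, and the sample scores are assumptions), the non-dominated (Pareto) filtering step at the heart of such selection can be sketched as:

```python
# Hypothetical sketch of multi-objective candidate selection:
# keep only the Pareto-optimal (non-dominated) designs, where each
# candidate is a tuple of objective values and all objectives are minimized.

def pareto_front(scores):
    """Return the candidates not dominated by any other candidate."""
    def dominates(a, b):
        # a dominates b if a is no worse in every objective and
        # strictly better in at least one.
        return all(x <= y for x, y in zip(a, b)) and \
               any(x < y for x, y in zip(a, b))

    return [s for s in scores
            if not any(dominates(o, s) for o in scores if o != s)]

# Example: hypothetical (error, cost) scores for four candidate designs.
candidates = [(0.2, 5.0), (0.1, 7.0), (0.3, 4.0), (0.25, 6.0)]
print(pareto_front(candidates))  # (0.25, 6.0) is dominated by (0.2, 5.0)
```

    In a real design workflow the scores would come from simulating each candidate model against the desired behaviour; this fragment shows only the final selection step.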

  20. Process-based design of dynamical biological systems

    PubMed Central

    Tanevski, Jovan; Todorovski, Ljupčo; Džeroski, Sašo

    2016-01-01

    The computational design of dynamical systems is an important emerging task in synthetic biology. Given desired properties of the behaviour of a dynamical system, the task of design is to build an in-silico model of a system whose simulated behaviour meets these properties. We introduce a new, process-based, design methodology for addressing this task. The new methodology combines a flexible process-based formalism for specifying the space of candidate designs with multi-objective optimization approaches for selecting the most appropriate among these candidates. We demonstrate that the methodology is general enough to both formulate and solve tasks of designing deterministic and stochastic systems, successfully reproducing plausible designs reported in previous studies and proposing new designs that meet the design criteria, but have not been previously considered. PMID:27686219

  1. Instructional Design and Directed Cognitive Processing.

    ERIC Educational Resources Information Center

    Bovy, Ruth Colvin

    This paper argues that the information processing model provides a promising basis on which to build a comprehensive theory of instruction. Characteristics of the major information processing constructs are outlined including attention, encoding and rehearsal, working memory, long term memory, retrieval, and metacognitive processes, and a unifying…

  2. Electromagnetic modeling in accelerator designs

    SciTech Connect

    Cooper, R.K.; Chan, K.C.D.

    1990-01-01

    Through the years, electromagnetic modeling using computers has proved to be a cost-effective tool for accelerator design. Traditionally, electromagnetic modeling of accelerators has been limited to resonator and magnet designs in two dimensions. In recent years, with the availability of powerful computers, electromagnetic modeling of accelerators has advanced significantly. From recent conferences, it is apparent that breakthroughs have been made during the last decade in two important areas: three-dimensional modeling and time-domain simulation. Success in both these areas has been made possible by the increasing size and speed of computers. In this paper, the advances in these two areas will be described.

  3. Rates of reaction and process design data for the Hydrocarb Process

    SciTech Connect

    Steinberg, M.; Kobayashi, Atsushi; Tung, Yuanki

    1992-08-01

    In support of studies for developing the coprocessing of fossil fuels with biomass by the Hydrocarb Process, experimental and process design data are reported. The experimental work includes the hydropyrolysis of biomass and the thermal decomposition of methane in a tubular reactor. The rates of reaction and conversion were obtained at temperature and pressure conditions pertaining to a Hydrocarb Process design. A Process Simulation Computer Model was used to design the process and obtain complete energy and mass balances. Multiple feedstocks, including biomass with natural gas and biomass with coal, were evaluated. Additional feedstocks, including green waste, sewage sludge, and digester gas, were also evaluated for a pilot plant unit.

  4. The concepts of energy, environment, and cost for process design

    SciTech Connect

    Abu-Khader, M.M.; Speight, J.G.

    2004-05-01

    The process industries (specifically, energy and chemicals) are characterized by a variety of reactors and reactions to bring about successful process operations. The design of energy-related and chemical processes and their evolution is a complex process that determines the competitiveness of these industries, as well as their environmental impact. Thus, we have developed an Enviro-Energy Concept designed to facilitate sustainable industrial development. The Complete Onion Model represents a complete methodology for chemical process design and illustrates all of the requirements to achieve the best possible design within the accepted environmental standards. Currently, NOx emissions from industrial processes continue to receive close attention; therefore, the problem of NOx emissions from industrial sources such as power stations and nitric acid plants is considered. Selective Catalytic Reduction (SCR) is one of the most promising and effective commercial technologies and is considered the Best Available Control Technology (BACT) for NOx reduction. The solution of the NOx emissions problem lies in modifying the chemical process design and/or installing an end-of-pipe technology. The degree of integration between the process design and the installed technology plays a critical role in the capital cost evaluation. Therefore, integrating process units and then optimizing the design has a vital effect on the total cost. Both the environmental regulations and the cost evaluation are the boundary constraints of the optimum solution.

  5. Modeling Languages Refine Vehicle Design

    NASA Technical Reports Server (NTRS)

    2009-01-01

    Cincinnati, Ohio's TechnoSoft Inc. is a leading provider of object-oriented modeling and simulation technology used for commercial and defense applications. With funding from Small Business Innovation Research (SBIR) contracts issued by Langley Research Center, the company continued development on its adaptive modeling language, or AML, originally created for the U.S. Air Force. TechnoSoft then created what is now known as its Integrated Design and Engineering Analysis Environment, or IDEA, which can be used to design a variety of vehicles and machinery. IDEA's customers include clients in green industries, such as designers for power plant exhaust filtration systems and wind turbines.

  6. Fusion Process Model Implementation Case Studies

    NASA Astrophysics Data System (ADS)

    Kaur, Rupinder; Sengupta, Jyotsna

    2012-07-01

    In this paper we discuss three case studies: the first applied at Web Shrub Solutions, a software development organization; the second at a web-based job portal (stepintojob.com) for a leading Indian firm; and the third at web design and development for SCL limited, to observe the results of the Fusion Process Model. The Fusion Process Model follows a component-driven approach; it applies the 3C Model to generalize the process of solving the problem in each phase, which provides firm control over the software development process.

  7. An investigation of radiometer design using digital processing techniques

    NASA Technical Reports Server (NTRS)

    Lawrence, R. W.

    1981-01-01

    The use of digital signal processing techniques in Dicke-switching radiometer design was investigated. The general approach was to develop an analytical model of the existing analog radiometer and identify factors which adversely affect its performance. A digital processor was then proposed to verify the feasibility of using digital techniques to minimize these adverse effects and improve the radiometer performance. Analysis and preliminary test results comparing the digital and analog processing approaches in radiometer design are presented.

  8. Object Oriented Modeling and Design

    NASA Technical Reports Server (NTRS)

    Shaykhian, Gholam Ali

    2007-01-01

    The Object Oriented Modeling and Design seminar is intended for software professionals and students; it covers the concepts and a language-independent graphical notation that can be used to analyze problem requirements and design a solution to the problem. The seminar discusses the three kinds of object-oriented models: class, state, and interaction. The class model represents the static structure of a system; the state model describes the aspects of a system that change over time, as well as control behavior; and the interaction model describes how objects collaborate to achieve overall results. Existing knowledge of object-oriented programming may benefit the learning of modeling and good design. Specific expectations are: create a class model; read, recognize, and describe a class model; describe association and link; show abstract classes used with multiple inheritance; explain metadata, reification, and constraints; group classes into a package; read, recognize, and describe a state model; explain states and transitions; read, recognize, and describe an interaction model; explain use cases and use case relationships; show concurrency in an activity diagram; and show object interactions in a sequence diagram.

  9. Design of penicillin fermentation process simulation system

    NASA Astrophysics Data System (ADS)

    Qi, Xiaoyu; Yuan, Zhonghu; Qi, Xiaoxuan; Zhang, Wenqi

    2011-10-01

    Real-time monitoring for batch processes attracts increasing attention. It can ensure safety and provide products with consistent quality. The design of a simulation system for batch-process fault diagnosis is therefore of great significance. In this paper, penicillin fermentation, a typical nonlinear, dynamic, multi-stage batch production process, is taken as the research object. A visual human-machine interactive simulation software system based on the Windows operating system is developed. The simulation system can provide an effective platform for research on batch-process fault diagnosis.

  10. Modeling of fluidized bed silicon deposition process

    NASA Technical Reports Server (NTRS)

    Kim, K.; Hsu, G.; Lutwack, R.; Praturi, A. K.

    1977-01-01

    The model is intended for use as a means of improving fluidized bed reactor design and for the formulation of the research program in support of the contracts of Silicon Material Task for the development of the fluidized bed silicon deposition process. A computer program derived from the simple modeling is also described. Results of some sample calculations using the computer program are shown.

  11. Teaching sustainable design: A collaborative process

    SciTech Connect

    Theis, C.C.

    1997-12-31

    This paper describes a collaborative educational experience in the Schools of Architecture and Landscape Architecture at Louisiana State University. During the Fall Semester of 1996 an upper-level architectural design studio worked with a peer group of landscape architecture students on the design of a master plan for an environmentally sensitive residential development on Cat Island, a barrier island located approximately eight miles south of Gulfport, Mississippi. This paper presents the methodology and results of the project, describes the collaborative process, and assesses both the viability of the design solutions and the value of the educational experience.

  12. Protein designs in HP models

    NASA Astrophysics Data System (ADS)

    Gupta, Arvind; Khodabakhshi, Alireza Hadj; Maňuch, Ján; Rafiey, Arash; Stacho, Ladislav

    2009-07-01

    The inverse protein folding problem is that of designing an amino acid sequence which folds into a prescribed shape. This problem arises in drug design, where a particular structure is necessary to ensure proper protein-protein interactions, and could have applications in nanotechnology. A major challenge in designing proteins with native folds that attain a specific shape is to avoid proteins that have multiple native folds (unstable proteins). In this technical note we present our results on protein designs in the variant of the Hydrophobic-Polar (HP) model introduced by Dill [6] on the 2D square lattice. The HP model distinguishes only polar and hydrophobic monomers and counts only the number of hydrophobic contacts in the energy function. To achieve better stability of our designs we use the Hydrophobic-Polar-Cysteine (HPC) model, which distinguishes a third type of monomer, called "cysteine", and also incorporates disulfide bridges (SS-bridges) into the energy function. We present stable designs on the 2D square lattice and the 3D hexagonal prism lattice in the HPC model.
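
    The energy function the abstract describes, counting hydrophobic contacts on the 2D square lattice, can be sketched in a few lines. This is an illustrative fragment only, not the authors' code; the function name and the convention that each H-H contact contributes -1 are assumptions:

```python
# Hypothetical sketch of the HP-model energy on the 2D square lattice:
# a fold is a self-avoiding walk over lattice sites, and the energy is
# minus the number of non-consecutive hydrophobic (H) monomer pairs
# sitting on adjacent sites.

def hp_energy(sequence, fold):
    """sequence: string over 'H'/'P'; fold: list of (x, y) lattice sites."""
    assert len(sequence) == len(fold)
    assert len(set(fold)) == len(fold), "fold must be self-avoiding"
    pos = {site: i for i, site in enumerate(fold)}
    contacts = 0
    for i, (x, y) in enumerate(fold):
        if sequence[i] != 'H':
            continue
        # Check only the right and upper neighbours so each pair counts once.
        for nb in ((x + 1, y), (x, y + 1)):
            j = pos.get(nb)
            if j is not None and abs(i - j) > 1 and sequence[j] == 'H':
                contacts += 1
    return -contacts  # each H-H contact contributes -1

square = [(0, 0), (1, 0), (1, 1), (0, 1)]
print(hp_energy("HHHH", square))  # -1: one non-consecutive H-H contact
```

    The HPC variant mentioned in the abstract would extend this by adding a cysteine monomer type and an SS-bridge term to the energy; that extension is not shown here.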

  13. Process of system design and analysis

    SciTech Connect

    Gardner, B.

    1995-09-01

    The design of an effective physical protection system includes the determination of the physical protection system objectives, the initial design of a physical protection system, the evaluation of the design, and, probably, a redesign or refinement of the system. To develop the objectives, the designer must begin by gathering information about facility operations and conditions, such as a comprehensive description of the facility, operating states, and the physical protection requirements. The designer then needs to define the threat. This involves considering factors about potential adversaries: class of adversary, adversary's capabilities, and range of adversary's tactics. Next, the designer should identify targets. Determination of whether or not nuclear materials are attractive targets is based mainly on the ease or difficulty of acquisition and desirability of the material. The designer now knows the objectives of the physical protection system, that is, "what to protect against whom." The next step is to design the system by determining how best to combine such elements as fences, vaults, sensors, procedures, communication devices, and protective force personnel to meet the objectives of the system. Once a physical protection system is designed, it must be analyzed and evaluated to ensure it meets the physical protection objectives. Evaluation must allow for features working together to assure protection rather than regarding each feature separately. Due to the complexity of protection systems, an evaluation usually requires modeling techniques. If any vulnerabilities are found, the initial system must be redesigned to correct the vulnerabilities and a reevaluation conducted.

  14. Global optimization of bilinear engineering design models

    SciTech Connect

    Grossmann, I.; Quesada, I.

    1994-12-31

    Recently Quesada and Grossmann have proposed a global optimization algorithm for solving NLP problems involving linear fractional and bilinear terms. This model has been motivated by a number of applications in process design. The proposed method relies on the derivation of a convex NLP underestimator problem that is used within a spatial branch and bound search. This paper explores the use of alternative bounding approximations for constructing the underestimator problem. These are applied in the global optimization of problems arising in different engineering areas and for which different relaxations are proposed depending on the mathematical structure of the models. These relaxations include linear and nonlinear underestimator problems. Reformulations that generate additional estimator functions are also employed. Examples from process design, structural design, portfolio investment and layout design are presented.

  15. Process design for Al backside contacts

    SciTech Connect

    Chalfoun, L.L.; Kimerling, L.C.

    1995-08-01

    It is known that properly alloyed aluminum backside contacts can improve silicon solar cell efficiency. To use this knowledge to fullest advantage, we have studied the gettering process that occurs during contact formation and the microstructure of the contact and backside junction region. With an understanding of the alloying step, optimized fabrication processes can be designed. To study gettering, single crystal silicon wafers were coated with aluminum on both sides and subjected to heat treatments. Results are described.

  16. From Business Value Model to Coordination Process Model

    NASA Astrophysics Data System (ADS)

    Fatemi, Hassan; van Sinderen, Marten; Wieringa, Roel

The increased complexity of business webs calls for modeling the collaboration of enterprises from different perspectives, in particular the business and process perspectives, and for mutually aligning these perspectives. Business value modeling and coordination process modeling are both necessary for a good e-business design, but these activities have different goals and use different concepts. Nevertheless, the resulting models should be consistent with each other because they refer to the same system from different perspectives. Hence, checking the consistency between these models, or producing one based on the other, would be of high value. In this paper we discuss the issue of achieving consistency in multi-level e-business design and give guidelines for producing consistent coordination process models from business value models in a stepwise manner.

  17. Fuel ethanol production: process design trends and integration opportunities.

    PubMed

    Cardona, Carlos A; Sánchez, Oscar J

    2007-09-01

Current fuel ethanol research and development deals with process engineering trends for improving biotechnological production of ethanol. In this work, the key role that process design plays during the development of cost-effective technologies is recognized through the analysis of major trends in process synthesis, modeling, simulation and optimization related to ethanol production. Main directions in techno-economical evaluation of fuel ethanol processes are described as well as some prospective configurations. The most promising alternatives for compensating ethanol production costs by the generation of valuable co-products are analyzed. Opportunities for integration of fuel ethanol production processes and their implications are underlined. Main ways of process intensification through reaction-reaction, reaction-separation and separation-separation processes are analyzed in the case of bioethanol production. Some examples of energy integration during ethanol production are also highlighted. Finally, some concluding considerations on current and future research tendencies in fuel ethanol production regarding process design and integration are presented.

  18. Dynamic Process Simulation for Analysis and Design.

    ERIC Educational Resources Information Center

    Nuttall, Herbert E., Jr.; Himmelblau, David M.

A computer program for the simulation of complex continuous processes in real time in an interactive mode is described. The program is user oriented, flexible, and provides both numerical and graphic output. The program has been used in classroom teaching and computer-aided design. Typical input and output are illustrated for a sample problem to…

  19. Flexible Processing and the Design of Grammar

    ERIC Educational Resources Information Center

    Sag, Ivan A.; Wasow, Thomas

    2015-01-01

    We explore the consequences of letting the incremental and integrative nature of language processing inform the design of competence grammar. What emerges is a view of grammar as a system of local monotonic constraints that provide a direct characterization of the signs (the form-meaning correspondences) of a given language. This…

  20. Optimal Experimental Design for Model Discrimination

    PubMed Central

    Myung, Jay I.; Pitt, Mark A.

    2009-01-01

    Models of a psychological process can be difficult to discriminate experimentally because it is not easy to determine the values of the critical design variables (e.g., presentation schedule, stimulus structure) that will be most informative in differentiating them. Recent developments in sampling-based search methods in statistics make it possible to determine these values, and thereby identify an optimal experimental design. After describing the method, it is demonstrated in two content areas in cognitive psychology in which models are highly competitive: retention (i.e., forgetting) and categorization. The optimal design is compared with the quality of designs used in the literature. The findings demonstrate that design optimization has the potential to increase the informativeness of the experimental method. PMID:19618983
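The idea of choosing design-variable values that maximally discriminate competing models can be illustrated with a toy retention example. The two forgetting curves and all parameter values below are assumptions for illustration, not the authors' sampling-based method:

```python
# Illustrative only: exponential vs. power-law forgetting curves with
# assumed parameters. A real design search would optimize over richer
# designs and integrate over parameter uncertainty.
import math

def exponential(t, a=0.2):
    return math.exp(-a * t)

def power(t, b=0.5):
    return (1.0 + t) ** (-b)

# Pick the retention interval at which the two models' predicted recall
# probabilities differ most -- the most informative single test point.
candidate_times = [1, 2, 5, 10, 20, 50]
best_t = max(candidate_times, key=lambda t: abs(exponential(t) - power(t)))
```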

  1. Generic Model Host System Design

    SciTech Connect

Chu, Chungming; Wu, Juhao; Qiang, Ji; Shen, Guobao

    2012-06-22

There are many simulation codes for accelerator modelling; each has particular strengths, but none covers all needs. A platform which can host multiple modelling tools would be ideal for various purposes. The model platform, along with infrastructure support, can be used not only for online applications but also for offline purposes. A collaboration has been formed to provide such a platform. In order to achieve it, a common set of physics data structures has to be defined. An Application Programming Interface (API) for physics applications should also be defined within a model data provider. A preliminary platform design and prototype are discussed.
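The common data structure and provider API the abstract calls for might look, in spirit, like the following sketch. Every name here is hypothetical, invented purely to illustrate the idea of a code-neutral interface behind which different simulation codes can be hosted:

```python
# Hypothetical sketch of a common physics data structure plus a provider
# API; all class, field, and method names are invented for illustration
# and are not the platform's actual design.
from dataclasses import dataclass
from typing import List, Protocol

@dataclass
class LatticeElement:
    """Shared, code-neutral description of one accelerator element."""
    name: str
    kind: str      # e.g. "quadrupole", "drift"
    length: float  # meters

class ModelProvider(Protocol):
    """Interface each hosted simulation code would implement."""
    def elements(self) -> List[LatticeElement]: ...
    def twiss(self, element_name: str) -> dict: ...

# Any simulation code wrapped to expose this interface can be swapped in
# behind the platform without changing client applications.
```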

  2. Software-Engineering Process Simulation (SEPS) model

    NASA Technical Reports Server (NTRS)

    Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.

    1992-01-01

The Software Engineering Process Simulation (SEPS) model, developed at JPL, is described. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among various software life-cycle development activities and management decision-making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and perform postmortem assessments.
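The system-dynamics feedback loops that such a model builds on can be sketched in a few lines. The stock-and-flow fragment below (backlog drives hiring, staff level drives completion) uses invented rates and is not the actual SEPS formulation:

```python
# Illustrative system-dynamics fragment (not the SEPS equations): two
# stocks (remaining tasks, staff) coupled by feedback and integrated with
# simple Euler steps. All rates are assumed for illustration.
def simulate(tasks=1000.0, staff=5.0, productivity=2.0,
             hire_fraction=0.01, dt=1.0, steps=200):
    for _ in range(steps):
        completion = min(staff * productivity * dt, tasks)
        tasks -= completion                   # staff burns down the backlog
        staff += hire_fraction * tasks * dt   # backlog pressure drives hiring
    return tasks, staff

remaining, final_staff = simulate()
# The backlog is eventually exhausted while staffing has grown in response.
```

Changing a managerial policy (here, `hire_fraction`) and rerunning is exactly the kind of tradeoff experiment the abstract describes.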

  3. Non-Linear Instructional Design Model: Eternal, Synergistic Design and Development

    ERIC Educational Resources Information Center

    Crawford, Caroline

    2004-01-01

    Instructional design is at the heart of each educational endeavour. This process revolves around the steps through which the thoughtful productions of superior products are created. The ADDIE generic instructional design model emphasises five basic steps within the instructional design process: analyse, design, develop, implement and evaluate. The…

  4. Product and Process Improvement Using Mixture-Process Variable Designs and Robust Optimization Techniques

    SciTech Connect

    Sahni, Narinder S.; Piepel, Gregory F.; Naes, Tormod

    2009-04-01

    The quality of an industrial product depends on the raw material proportions and the process variable levels, both of which need to be taken into account in designing a product. This article presents a case study from the food industry in which both kinds of variables were studied by combining a constrained mixture experiment design and a central composite process variable design. Based on the natural structure of the situation, a split-plot experiment was designed and models involving the raw material proportions and process variable levels (separately and combined) were fitted. Combined models were used to study: (i) the robustness of the process to variations in raw material proportions, and (ii) the robustness of the raw material recipes with respect to fluctuations in the process variable levels. Further, the expected variability in the robust settings was studied using the bootstrap.

  5. Modeling approach for business process reengineering

    NASA Astrophysics Data System (ADS)

    Tseng, Mitchell M.; Chen, Yuliu

    1995-08-01

The purpose of this paper is to introduce a modeling approach to define, simulate, animate, and control business processes. The intent is to introduce the underlying methodology for building tools for designing and managing business processes. Similar to computer-aided design (CAD) for mechanical parts, CAD tools are needed for designing business processes, with emphasis on the dynamic behavior of the business process. The proposed modeling technique consists of a definition of each individual activity, the network of activities, a control mechanism that describes coordination of these activities, and the events that flow through these activities. Based on the formalism introduced in this modeling technique, users will be able to define a business process with minimum ambiguity, take snapshots of particular events in the process, describe the accountability of participants, and view a replay of event streams in the process flow. This modeling approach, mapped on top of commercial software, has been tested using examples from real-life business processes. The examples and testing helped us identify some of the strengths and weaknesses of the proposed approach.

  6. Chemical Process Modeling and Control.

    ERIC Educational Resources Information Center

    Bartusiak, R. Donald; Price, Randel M.

    1987-01-01

    Describes some of the features of Lehigh University's (Pennsylvania) process modeling and control program. Highlights the creation and operation of the Chemical Process Modeling and Control Center (PMC). Outlines the program's philosophy, faculty, technical program, current research projects, and facilities. (TW)

  7. Optimal Experimental Design for Model Discrimination

    ERIC Educational Resources Information Center

    Myung, Jay I.; Pitt, Mark A.

    2009-01-01

    Models of a psychological process can be difficult to discriminate experimentally because it is not easy to determine the values of the critical design variables (e.g., presentation schedule, stimulus structure) that will be most informative in differentiating them. Recent developments in sampling-based search methods in statistics make it…

  8. Designing Instruction That Supports Cognitive Learning Processes

    PubMed Central

    Clark, Ruth; Harrelson, Gary L.

    2002-01-01

    Objective: To provide an overview of current cognitive learning processes, including a summary of research that supports the use of specific instructional methods to foster those processes. We have developed examples in athletic training education to help illustrate these methods where appropriate. Data Sources: Sources used to compile this information included knowledge base and oral and didactic presentations. Data Synthesis: Research in educational psychology within the past 15 years has provided many principles for designing instruction that mediates the cognitive processes of learning. These include attention, management of cognitive load, rehearsal in working memory, and retrieval of new knowledge from long-term memory. By organizing instruction in the context of tasks performed by athletic trainers, transfer of learning and learner motivation are enhanced. Conclusions/Recommendations: Scientific evidence supports instructional methods that can be incorporated into lesson design and improve learning by managing cognitive load in working memory, stimulating encoding into long-term memory, and supporting transfer of learning. PMID:12937537

  9. Case study: Lockheed-Georgia Company integrated design process

    NASA Technical Reports Server (NTRS)

    Waldrop, C. T.

    1980-01-01

    A case study of the development of an Integrated Design Process is presented. The approach taken in preparing for the development of an integrated design process includes some of the IPAD approaches such as developing a Design Process Model, cataloging Technical Program Elements (TPE's), and examining data characteristics and interfaces between contiguous TPE's. The implementation plan is based on an incremental development of capabilities over a period of time with each step directed toward, and consistent with, the final architecture of a total integrated system. Because of time schedules and different computer hardware, this system will not be the same as the final IPAD release; however, many IPAD concepts will no doubt prove applicable as the best approach. Full advantage will be taken of the IPAD development experience. A scenario that could be typical for many companies, even outside the aerospace industry, in developing an integrated design process for an IPAD-type environment is represented.

  10. Business process modeling in healthcare.

    PubMed

    Ruiz, Francisco; Garcia, Felix; Calahorra, Luis; Llorente, César; Gonçalves, Luis; Daniel, Christel; Blobel, Bernd

    2012-01-01

    The importance of the process point of view is not restricted to a specific enterprise sector. In the field of health, as a result of the nature of the service offered, health institutions' processes are also the basis for decision making which is focused on achieving their objective of providing quality medical assistance. In this chapter the application of business process modelling - using the Business Process Modelling Notation (BPMN) standard is described. Main challenges of business process modelling in healthcare are the definition of healthcare processes, the multi-disciplinary nature of healthcare, the flexibility and variability of the activities involved in health care processes, the need of interoperability between multiple information systems, and the continuous updating of scientific knowledge in healthcare. PMID:22925789

  11. Business process modeling in healthcare.

    PubMed

    Ruiz, Francisco; Garcia, Felix; Calahorra, Luis; Llorente, César; Gonçalves, Luis; Daniel, Christel; Blobel, Bernd

    2012-01-01

    The importance of the process point of view is not restricted to a specific enterprise sector. In the field of health, as a result of the nature of the service offered, health institutions' processes are also the basis for decision making which is focused on achieving their objective of providing quality medical assistance. In this chapter the application of business process modelling - using the Business Process Modelling Notation (BPMN) standard is described. Main challenges of business process modelling in healthcare are the definition of healthcare processes, the multi-disciplinary nature of healthcare, the flexibility and variability of the activities involved in health care processes, the need of interoperability between multiple information systems, and the continuous updating of scientific knowledge in healthcare.

  12. DESIGNING ENVIRONMENTAL, ECONOMIC AND ENERGY EFFICIENT CHEMICAL PROCESSES

    EPA Science Inventory

    The design and improvement of chemical processes can be very challenging. The earlier energy conservation, process economics and environmental aspects are incorporated into the process development, the easier and less expensive it is to alter the process design. Process emissio...

  13. Liberating Expression: A Freehand Approach to Business Process Modeling

    NASA Astrophysics Data System (ADS)

    Mangano, Nicolas; Sukaviriya, Noi

Tools that support business process modeling are designed for experienced users to draw a process with precision and professional appearance. These tools are not conducive to sketching quick business design ideas. This demo proposal presents Inkus, a non-intrusive business process sketching tool which allows freehand sketches of process ideas and gradually brings the users to the required common business vocabulary. Our goal is to help unleash creativity in business designers and enrich the design process with values beyond drawing.

  14. Information Flow in the Launch Vehicle Design/Analysis Process

    NASA Technical Reports Server (NTRS)

    Humphries, W. R., Sr.; Holland, W.; Bishop, R.

    1999-01-01

    This paper describes the results of a team effort aimed at defining the information flow between disciplines at the Marshall Space Flight Center (MSFC) engaged in the design of space launch vehicles. The information flow is modeled at a first level and is described using three types of templates: an N x N diagram, discipline flow diagrams, and discipline task descriptions. It is intended to provide engineers with an understanding of the connections between what they do and where it fits in the overall design process of the project. It is also intended to provide design managers with a better understanding of information flow in the launch vehicle design cycle.

  15. Electrophysiological models of neural processing.

    PubMed

    Nelson, Mark E

    2011-01-01

The brain is an amazing information processing system that allows organisms to adaptively monitor and control complex dynamic interactions with their environment across multiple spatial and temporal scales. Mathematical modeling and computer simulation techniques have become essential tools in understanding diverse aspects of neural processing, ranging from sub-millisecond temporal coding in the sound localization circuitry of barn owls to long-term memory storage and retrieval in humans that can span decades. The processing capabilities of individual neurons lie at the core of these models, with the emphasis shifting upward and downward across different levels of biological organization depending on the nature of the questions being addressed. This review provides an introduction to the techniques for constructing biophysically based models of individual neurons and local networks. Topics include Hodgkin-Huxley-type models of macroscopic membrane currents, Markov models of individual ion-channel currents, compartmental models of neuronal morphology, and network models involving synaptic interactions among multiple neurons. PMID:21064164
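As a minimal stand-in for the single-neuron models the review introduces, a leaky integrate-and-fire neuron can be simulated with simple Euler steps. The parameters below are illustrative, and the model is deliberately far simpler than the Hodgkin-Huxley formalism discussed:

```python
# A leaky integrate-and-fire sketch (illustrative parameters, assumed
# units): the membrane potential decays toward rest, is driven by a
# constant input current, and resets after each threshold crossing.
def lif_spikes(i_ext=1.5, v_rest=0.0, v_th=1.0, tau=10.0, dt=0.1, steps=1000):
    v, spikes = v_rest, 0
    for _ in range(steps):
        v += dt * (-(v - v_rest) + i_ext) / tau  # leaky integration
        if v >= v_th:
            spikes += 1
            v = v_rest                           # reset after a spike
    return spikes

n = lif_spikes()  # suprathreshold drive, so the neuron fires repeatedly
```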

  16. Incorporating manufacturability constraints into the design process of heterogeneous objects

    NASA Astrophysics Data System (ADS)

    Hu, Yuna; Blouin, Vincent Y.; Fadel, Georges M.

    2004-11-01

Rapid prototyping (RP) technology, such as Laser Engineered Net Shaping (LENS™), can be used to fabricate heterogeneous objects with gradient variations in material composition. These objects are generally characterized by enhanced functional performance. Past research on the design of such objects has focused on representation, modeling, and functional performance. However, the inherent constraints of RP processes, such as system capability and processing time, lead to heterogeneous objects that may not meet the designer's original intent. To overcome this, the research presented in this paper focuses on the identification and implementation of manufacturing constraints in the design process. A node-based finite element modeling technique is used for representation and analysis, and the multicriteria design problem consists of finding the nodal material compositions that minimize structural weight and maximize thermal performance. The optimizer used in this research is a real-valued Evolution Strategy (ES), which is well suited for this type of multi-modal problem. Two limitations of the LENS manufacturing process that have an impact on the design process are identified and implemented. One is related to manufacturing time, which is treated as an additional criterion to be minimized in the design problem for a preselected tool path. A brake disc rotor made of two materials, aluminum for low weight and steel for superior thermal characteristics, is used to illustrate the tradeoff between manufacturability and functionality.

  17. Composting process design criteria. II. Detention time

    SciTech Connect

    Haug, R.T.

    1986-09-01

Attention has always been directed to detention time as a criterion for the design and operation of composting systems. Perhaps this is a logical outgrowth of work on liquid-phase systems, where detention time is a fundamental design parameter. Unlike liquid-phase systems, however, the interpretation of detention time and the actual values required for design have not been universally accepted in the case of composting. As a case in point, most compost systems incorporate facilities for curing the compost product; however, curing is often considered after the fact, or as an add-on with little relationship to the first-stage, high-rate phase, whether reactor (in-vessel), static pile, or windrow. Design criteria for curing, and the relationships between the first-stage, high-rate and second-stage, curing phases of a composting system, have been unclear. In Part 2 of this paper, the concepts of hydraulic retention time (HRT) and solids residence time (SRT) are applied to the composting process. Definitions and design criteria for each are proposed. Based on these criteria, the first and second stages can be designed and integrated into a complete composting system.
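The HRT concept carried over from liquid-phase design reduces to a simple ratio of working volume to feed rate; the sketch below applies that textbook definition with made-up reactor numbers, not values from the paper:

```python
# A minimal sketch of the hydraulic retention time definition borrowed
# from liquid-phase design: HRT = working volume / volumetric feed rate.
# The example numbers are assumed for illustration.
def hrt_days(reactor_volume_m3, feed_rate_m3_per_day):
    """Hydraulic retention time in days."""
    return reactor_volume_m3 / feed_rate_m3_per_day

# e.g. a hypothetical 500 m3 in-vessel first stage fed at 25 m3/day:
first_stage_hrt = hrt_days(500.0, 25.0)  # 20 days
```

SRT differs from HRT whenever solids are recycled or held back, which is exactly the distinction the paper develops for two-stage composting systems.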

  18. Penetrator reliability investigation and design exploration : from conventional design processes to innovative uncertainty-capturing algorithms.

    SciTech Connect

    Martinez-Canales, Monica L.; Heaphy, Robert; Gramacy, Robert B.; Taddy, Matt; Chiesa, Michael L.; Thomas, Stephen W.; Swiler, Laura Painton; Hough, Patricia Diane; Lee, Herbert K. H.; Trucano, Timothy Guy; Gray, Genetha Anne

    2006-11-01

This project focused on research and algorithmic development in optimization under uncertainty (OUU) problems driven by earth penetrator (EP) designs. While taking into account uncertainty, we addressed three challenges in current simulation-based engineering design and analysis processes. The first challenge required leveraging small local samples, already constructed by optimization algorithms, to build effective surrogate models. We used Gaussian Process (GP) models to construct these surrogates. We developed two OUU algorithms using 'local' GPs (OUU-LGP) and one OUU algorithm using 'global' GPs (OUU-GGP) that appear competitive with or better than current methods. The second challenge was to develop a methodical design process based on multi-resolution, multi-fidelity models. We developed a Multi-Fidelity Bayesian Auto-regressive process (MF-BAP). The third challenge involved the development of tools that are computationally feasible and accessible. We created MATLAB® and initial DAKOTA implementations of our algorithms.
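The Gaussian Process surrogates underlying such OUU algorithms can be sketched generically. The RBF-kernel predictor below is textbook GP regression, not the report's OUU-LGP code, and all sample values are invented:

```python
# A generic Gaussian Process surrogate sketch: posterior mean and variance
# from an RBF kernel over a handful of 1-D samples, the kind of local
# surrogate an optimizer's iterates could supply. Values are illustrative.
import numpy as np

def rbf(a, b, length=1.0):
    """Squared-exponential kernel matrix between 1-D point sets a and b."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length**2)

def gp_predict(x_train, y_train, x_test, noise=1e-8):
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf(x_train, x_test)
    Kss = rbf(x_test, x_test)
    alpha = np.linalg.solve(K, y_train)
    mean = Ks.T @ alpha
    cov = Kss - Ks.T @ np.linalg.solve(K, Ks)
    return mean, np.diag(cov)

x = np.array([0.0, 1.0, 2.0])
y = np.sin(x)
mean, var = gp_predict(x, y, np.array([1.0]))
# At a training point the surrogate interpolates and the variance collapses.
```

The predictive variance is what makes GPs attractive for OUU: it tells the optimizer where the surrogate can be trusted and where new samples are needed.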

  19. Design of Training Systems Utility Assessment: The Training Process Flow and System Capabilities/Requirements and Resources Models Operating in the TRAPAC Environment.

    ERIC Educational Resources Information Center

    Duffy, Larry R.

    The report summarizes the results of a field test conducted for the purpose of determining the utility to Naval training of the Systems Capabilities/Requirements and Resources (SCRR) and the Training Process Flow (TPF) computer-based mathematical models. Basic descriptions of the SCRR and the TPF and their development are given. Training…

  20. Mimicry of natural material designs and processes

    SciTech Connect

    Bond, G.M.; Richman, R.H.; McNaughton, W.P.

    1995-06-01

    Biological structural materials, although composed of unremarkable substances synthesized at low temperatures, often exhibit superior mechanical properties. In particular, the quality in which nearly all biologically derived materials excel is toughness. The advantageous mechanical properties are attributable to the hierarchical, composite, structural arrangements common to biological systems. Materials scientists and engineers have increasingly recognized that biological designs or processing approaches applied to man-made materials (biomimesis) may offer improvements in performance over conventional designs and fabrication methods. In this survey, the structures and processing routes of marine shells, avian eggshells, wood, bone, and insect cuticle are briefly reviewed, and biomimesis research inspired by these materials is discussed. In addition, this paper describes and summarizes the applications of biomineralization, self-assembly, and templating with proteins to the fabrication of thin ceramic films and nanostructure devices.

  1. Modelling of CWS combustion process

    NASA Astrophysics Data System (ADS)

    Rybenko, I. A.; Ermakova, L. A.

    2016-10-01

The paper considers the combustion process of coal-water slurry (CWS) drops. A physico-chemical process scheme consisting of several independent parallel-sequential stages is offered. This scheme of the drop combustion process is supported by particle size distribution tests and stereomicroscopic analysis of the combustion products. The results of mathematical modelling and optimization of stationary regimes of CWS combustion are provided. During modelling, the problem of defining the possible equilibrium compositions of products obtainable from CWS combustion at different temperatures is solved.

  2. Designing and encoding models for synthetic biology

    PubMed Central

    Endler, Lukas; Rodriguez, Nicolas; Juty, Nick; Chelliah, Vijayalakshmi; Laibe, Camille; Li, Chen; Le Novère, Nicolas

    2009-01-01

    A key component of any synthetic biology effort is the use of quantitative models. These models and their corresponding simulations allow optimization of a system design, as well as guiding their subsequent analysis. Once a domain mostly reserved for experts, dynamical modelling of gene regulatory and reaction networks has been an area of growth over the last decade. There has been a concomitant increase in the number of software tools and standards, thereby facilitating model exchange and reuse. We give here an overview of the model creation and analysis processes as well as some software tools in common use. Using markup language to encode the model and associated annotation, we describe the mining of components, their integration in relational models, formularization and parametrization. Evaluation of simulation results and validation of the model close the systems biology ‘loop’. PMID:19364720

  3. Chip Design Process Optimization Based on Design Quality Assessment

    NASA Astrophysics Data System (ADS)

    Häusler, Stefan; Blaschke, Jana; Sebeke, Christian; Rosenstiel, Wolfgang; Hahn, Axel

    2010-06-01

Nowadays, managing product development projects is increasingly challenging. In particular, the IC design of ASICs with both analog and digital components (mixed-signal design) is becoming more and more complex, while the time-to-market window narrows at the same time. Still, high quality standards must be fulfilled. Projects and their status are becoming less transparent due to this complexity, which makes the planning and execution of projects rather difficult. Therefore, there is a need for efficient project control. A main challenge is the objective evaluation of the current development status: Are all requirements successfully verified? Are all intermediate goals achieved? Companies often develop special solutions that are not reusable in other projects, which makes the quality measurement process itself less efficient and produces too much overhead. The method proposed in this paper is a contribution to solving these issues. It is applied at a German design house for analog mixed-signal IC design. This paper presents the results of a case study and introduces an optimized project scheduling on the basis of quality assessment results.

  4. DECISION SUPPORT SYSTEM TO ENHANCE AND ENCOURAGE SUSTAINABLE CHEMICAL PROCESS DESIGN

    EPA Science Inventory

There is an opportunity to minimize the potential environmental impacts (PEIs) of industrial chemical processes by providing process designers with timely data and models elucidating environmentally favorable design options. The second generation of the Waste Reduction (WAR) algo...

  5. Air stripping VOCs from groundwater: Process design considerations

    SciTech Connect

Ball, B.R.; Edwards, M.D.

    1992-02-01

    Considerations for evaluating and designing the air stripping process are presented by case study. The case study involves the design of an air stripping process to remediate groundwater contaminated with volatile organic compounds (VOCs) at a National Priorities List site in Tacoma, WA. Design objectives included developing a tower with minimum volume and energy requirements while complying with discharge air and water quality standards. A two-phase resistance model using Onda Correlations to determine liquid- and gas-phase mass transfer coefficients was used to assist in the evaluation and design. Considerations for applying the two-phase resistance model to air stripping tower design are presented. The ability of the model to simulate process performance is demonstrated by comparison with actual data for 11 priority pollutant list VOCs evaluated during an onsite pilot study. Design procedures with which to develop a tower with minimum volume and energy requirements are described. Other considerations involving the evaluation of VOC emissions and the precipitation and buildup of inorganic constituents within the internal packing media are described.
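The tower-sizing side of such a design can be illustrated with the standard countercurrent stripping relation between the stripping factor and the number of transfer units (NTU). The concentrations and stripping factor below are illustrative values, not the Tacoma case data:

```python
# A hedged sketch of the standard packed-tower design relation: the
# number of transfer units (NTU) required to reach a target effluent
# concentration at a given stripping factor R = H * (G/L), where H is
# the dimensionless Henry's constant and G/L the air-to-water ratio.
# Example values are assumed for illustration.
import math

def ntu(c_in, c_out, R):
    """Transfer units for a countercurrent stripper (valid for R != 1)."""
    return (R / (R - 1.0)) * math.log(((c_in / c_out) * (R - 1.0) + 1.0) / R)

# e.g. 99% VOC removal (100 -> 1 ug/L) at an assumed stripping factor of 3:
units = ntu(100.0, 1.0, 3.0)
```

Packed height then follows as NTU times the height of a transfer unit (HTU), which is where mass transfer coefficients such as those from the Onda correlations enter.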

  6. Social Models: Blueprints or Processes?

    ERIC Educational Resources Information Center

    Little, Graham R.

    1981-01-01

    Discusses the nature and implications of two different models for societal planning: (1) the problem-solving process approach based on Karl Popper; and (2) the goal-setting "blueprint" approach based on Karl Marx. (DC)

  7. A FRAMEWORK TO DESIGN AND OPTIMIZE CHEMICAL FLOODING PROCESSES

    SciTech Connect

    Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori

    2005-07-01

The goal of this proposed research is to provide an efficient and user-friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user-friendly interface to identify the variables that have the most impact on oil recovery, using the concepts of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulated production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objective of Task 1 is to develop three primary modules representing reservoir, chemical, and well data; the modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables and to develop the response surface maps that identify the significant variables from each module. The objective of Task 3 is to develop the economic model designed specifically for the chemical processes targeted in this proposal and to interface it with the UTCHEM production output. Task 4 covers validation of the framework and simulations of oil reservoirs to screen, design, and optimize the chemical processes.

  8. A Framework to Design and Optimize Chemical Flooding Processes

    SciTech Connect

Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori

    2006-08-31

The goal of this proposed research is to provide an efficient and user-friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user-friendly interface to identify the variables that have the most impact on oil recovery, using the concepts of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulated production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objective of Task 1 is to develop three primary modules representing reservoir, chemical, and well data; the modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables and to develop the response surface maps that identify the significant variables from each module. The objective of Task 3 is to develop the economic model designed specifically for the chemical processes targeted in this proposal and to interface it with the UTCHEM production output. Task 4 covers validation of the framework and simulations of oil reservoirs to screen, design, and optimize the chemical processes.

  9. A formulation of metamodel implementation processes for complex systems design

    NASA Astrophysics Data System (ADS)

    Daberkow, Debora Daniela

    Complex systems design poses an interesting as well as demanding information management problem for system level integration and design. The high interconnectivity of disciplines combined with the specific knowledge and expertise in each of these calls for a system level view that is broad, as in spanning across all disciplines, while at the same time detailed enough to do the disciplinary knowledge justice. The treatment of this requires highly evolved information management and decision approaches, which result in design methodologies that can handle this high degree of complexity. The solution is to create models within the design process, which predict meaningful metrics representative of the various disciplinary analyses that can be quickly evaluated and thus serve in system level decision making and optimization. Such models approximate the physics-based analysis codes used in each of the disciplines and are called metamodels since effectively, they model the (physics-based) models on which the disciplinary analysis codes are based. The thesis formulates a new metamodel implementation process to be used in complex systems design, utilizing a Gaussian Process prediction method. It is based on a Bayesian probability and inference approach and as such returns a variance prediction along with the most likely value, thus giving an estimate also for the confidence in the prediction. Within this thesis, the applicability and appropriateness at the theoretical as well as practical level are investigated, and proof-of-concept implementations at the disciplinary and system levels are provided.
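
    The Gaussian Process prediction at the core of this metamodel approach returns a variance alongside the most likely value. The sketch below is generic GP regression with an assumed squared-exponential kernel and a sin(x) stand-in for an expensive disciplinary analysis code; it is not the thesis implementation.

```python
import numpy as np

def rbf_kernel(a, b, length_scale=1.0, variance=1.0):
    """Squared-exponential covariance between two sets of 1-D points."""
    d = a[:, None] - b[None, :]
    return variance * np.exp(-0.5 * (d / length_scale) ** 2)

def gp_predict(x_train, y_train, x_test, noise=1e-6):
    """Posterior mean and variance of a GP regressor at x_test."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_s = rbf_kernel(x_test, x_train)
    K_ss = rbf_kernel(x_test, x_test)
    alpha = np.linalg.solve(K, y_train)
    mean = K_s @ alpha
    v = np.linalg.solve(K, K_s.T)
    var = np.diag(K_ss - K_s @ v)
    return mean, var

# Metamodel of an "expensive" analysis code, here stood in for by sin(x)
x_train = np.linspace(0.0, 5.0, 8)
y_train = np.sin(x_train)
mean, var = gp_predict(x_train, y_train, np.array([2.5]))
```

    The variance term is what gives the system-level decision maker a confidence estimate for each metamodel prediction.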

  10. Process Paradigms in Design and Composition: Affinities and Directions.

    ERIC Educational Resources Information Center

    Kostelnick, Charles

    1989-01-01

    Argues that comparing developments in the process approach to writing and the design methods movement sheds light on the evolution and future direction of the writing paradigm. Argues that sensitivity to the variety of writing tasks and social contexts is more effective than a single amorphous model. (RS)

  11. Rapid Modeling, Assembly and Simulation in Design Optimization

    NASA Technical Reports Server (NTRS)

    Housner, Jerry

    1997-01-01

    A new capability for design is reviewed. This capability provides for rapid assembly of detail finite element models early in the design process where costs are most effectively impacted. This creates an engineering environment which enables comprehensive analysis and design optimization early in the design process. Graphical interactive computing makes it possible for the engineer to interact with the design while performing comprehensive design studies. This rapid assembly capability is enabled by the use of Interface Technology, to couple independently created models which can be archived and made accessible to the designer. Results are presented to demonstrate the capability.

  12. Reliability Methods for Shield Design Process

    NASA Technical Reports Server (NTRS)

    Tripathi, R. K.; Wilson, J. W.

    2002-01-01

    Providing protection against the hazards of space radiation is a major challenge to the exploration and development of space. The great cost of added radiation shielding is a potential limiting factor in deep space operations. In this enabling technology, we have developed methods for optimized shield design over multi-segmented missions involving multiple work and living areas in the transport and duty phases of space missions. The total shield mass over all pieces of equipment and habitats is optimized subject to career dose and dose rate constraints. An important component of this technology is the estimation of the two most commonly identified uncertainties in radiation shield design: the shielding properties of the materials used, and the biological response of the astronaut to the radiation leaking through the materials into the living space. The largest uncertainty, of course, is in the biological response, especially to high charge and energy (HZE) ions of the galactic cosmic rays. These uncertainties are blended with the optimization design procedure to formulate reliability-based methods for shield design processes. The details of the methods will be discussed.
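
    The shield-mass optimization subject to dose constraints can be illustrated with a deliberately simplified model. The exponential-attenuation dose law, the attenuation coefficient, and all numbers below are illustrative assumptions standing in for the actual radiation transport physics and mission data.

```python
import math

def min_shield_thickness(d0, d_limit, mu):
    """Smallest areal thickness t (g/cm^2) satisfying d0*exp(-mu*t) <= d_limit,
    under a simple exponential attenuation assumption."""
    if d0 <= d_limit:
        return 0.0
    return math.log(d0 / d_limit) / mu

def total_shield_mass(segments, mu, area):
    """Sum the minimal shield mass over mission segments.
    segments: list of (unshielded_dose, dose_limit) per habitat or
    piece of equipment; area is the shielded surface area."""
    return sum(area * min_shield_thickness(d0, dl, mu) for d0, dl in segments)

# Two mission segments with illustrative doses (cSv) and a shared dose limit
segments = [(50.0, 25.0), (80.0, 25.0)]
mass = total_shield_mass(segments, mu=0.05, area=10.0)  # mu in cm^2/g
```

    In the real method each segment's dose response comes from transport-code calculations rather than a single exponential, but the structure of the constrained mass minimization is the same.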

  13. Command Process Modeling & Risk Analysis

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila

    2011-01-01

    Commanding errors may have a variety of root causes, and it is important to understand the relative significance of each of these causes when making institutional investment decisions. One such cause is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables of the command process and models corresponding to key functions within it. These models include simulation analysis and probabilistic risk assessment models.

  14. Saving Material with Systematic Process Designs

    NASA Astrophysics Data System (ADS)

    Kerausch, M.

    2011-08-01

    Global competition is forcing the stamping industry to further increase quality, to shorten time-to-market and to reduce total cost. Continuous balancing between these classical time-cost-quality targets throughout the product development cycle is required to ensure future economic success. In today's industrial practice, die layout standards are typically assumed to implicitly ensure the balancing of company-specific time-cost-quality targets. Although die layout standards are a very successful approach, they have two methodical disadvantages. First, the capabilities for tool design have to be continuously adapted to technological innovations, e.g. to take advantage of the full forming capability of new materials. Secondly, the great variety of die design aspects has to be reduced to generic rules or guidelines, e.g. for binder shape, draw-in conditions or the use of drawbeads. It is therefore important not to overlook cost or quality opportunities when applying die design standards. This paper describes a systematic workflow with a focus on minimizing material consumption. The starting point of the investigation is a full process plan for a typical structural part, for which all requirements defined by a set of die design standards with industrial relevance are fulfilled. In a first step, binder and addendum geometry is systematically checked for material saving potentials. In a second step, blank shape and draw-in are adjusted to meet thinning, wrinkling and springback targets for a minimum blank solution. Finally, the identified die layout is validated with respect to production robustness against splits, wrinkles and springback. For all three steps the applied methodology is based on finite element simulation combined with stochastic variation of input variables. With the proposed workflow, a well-balanced (time-cost-quality) production process assuring minimal material consumption can be achieved.
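
    The robustness step, finite element simulation combined with stochastic variation of input variables, amounts to a Monte Carlo loop over the process parameters. In the sketch below the springback surrogate, the parameter names, distributions, and the 2.5-degree target are all hypothetical placeholders for a real FE run and real tolerances.

```python
import random

def springback_sim(blank_width, draw_bead_force):
    """Hypothetical closed-form stand-in for a finite element stamping run:
    springback angle (deg) as a function of two process inputs."""
    return 2.0 + 0.01 * (blank_width - 500.0) - 0.002 * (draw_bead_force - 100.0)

def robustness(n=1000, seed=42):
    """Fraction of stochastically varied runs meeting the springback target,
    with inputs drawn around their nominal values."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(n):
        w = rng.gauss(500.0, 10.0)   # blank width, mm
        f = rng.gauss(100.0, 5.0)    # draw bead force, kN
        if springback_sim(w, f) < 2.5:
            ok += 1
    return ok / n

rate = robustness()
```

    A production process would be accepted only if this pass rate stays high across splits, wrinkles and springback criteria simultaneously.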

  15. Bates solar industrial process-steam application: preliminary design review

    SciTech Connect

    Not Available

    1980-01-07

    The design is analyzed for a parabolic trough solar process heat system for a cardboard corrugation fabrication facility in Texas. The program is briefly reviewed, including an analysis of the plant and process. The performance modeling for the system is discussed, and the solar system structural design, collector subsystem, heat transport and distribution subsystem are analyzed. The selection of the heat transfer fluid, and ullage and fluid maintenance are discussed, and the master control system and data acquisition system are described. Testing of environmental degradation of materials is briefly discussed. A brief preliminary cost analysis is included. (LEW)

  16. Process Improvement Through Tool Integration in Aero-Mechanical Design

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2010-01-01

    Emerging capabilities in commercial design tools promise to significantly improve the multi-disciplinary and inter-disciplinary design and analysis coverage for aerospace mechanical engineers. This paper explores the analysis process for two example problems of a wing and flap mechanical drive system and an aircraft landing gear door panel. The examples begin with the design solid models and include various analysis disciplines such as structural stress and aerodynamic loads. Analytical methods include CFD, multi-body dynamics with flexible bodies and structural analysis. Elements of analysis data management, data visualization and collaboration are also included.

  17. Flexible processing and the design of grammar.

    PubMed

    Sag, Ivan A; Wasow, Thomas

    2015-02-01

    We explore the consequences of letting the incremental and integrative nature of language processing inform the design of competence grammar. What emerges is a view of grammar as a system of local monotonic constraints that provide a direct characterization of the signs (the form-meaning correspondences) of a given language. This "sign-based" conception of grammar has provided precise solutions to the key problems long thought to motivate movement-based analyses, has supported three decades of computational research developing large-scale grammar implementations, and is now beginning to play a role in computational psycholinguistics research that explores the use of underspecification in the incremental computation of partial meanings.

  18. Power processing methodology. [computerized design of spacecraft electric power systems

    NASA Technical Reports Server (NTRS)

    Fegley, K. A.; Hansen, I. G.; Hayden, J. H.

    1974-01-01

    Discussion of the interim results of a program to investigate the feasibility of formulating a methodology for the modeling and analysis of aerospace electrical power processing systems. The object of the total program is to develop a flexible engineering tool which will allow the power processor designer to effectively and rapidly assess and analyze the tradeoffs available by providing, in one comprehensive program, a mathematical model, an analysis of expected performance, simulation, and a comparative evaluation with alternative designs. This requires an understanding of electrical power source characteristics and the effects of load control, protection, and total system interaction.

  19. Modeling Tool Advances Rotorcraft Design

    NASA Technical Reports Server (NTRS)

    2007-01-01

    Continuum Dynamics Inc. (CDI), founded in 1979, specializes in advanced engineering services, including fluid dynamic modeling and analysis for aeronautics research. The company has completed a number of SBIR research projects with NASA, including early rotorcraft work done through Langley Research Center, but more recently, out of Ames Research Center. NASA Small Business Innovation Research (SBIR) grants on helicopter wake modeling resulted in the Comprehensive Hierarchical Aeromechanics Rotorcraft Model (CHARM), a tool for studying helicopter and tiltrotor unsteady free wake modeling, including distributed and integrated loads, and performance prediction. Application of the software code in a blade redesign program for Carson Helicopters, of Perkasie, Pennsylvania, increased the payload and cruise speeds of its S-61 helicopter. Follow-on development resulted in a $24 million revenue increase for Sikorsky Aircraft Corporation, of Stratford, Connecticut, as part of the company's rotor design efforts. Now under continuous development for more than 25 years, CHARM models the complete aerodynamics and dynamics of rotorcraft in general flight conditions. CHARM has been used to model a broad spectrum of rotorcraft attributes, including performance, blade loading, blade-vortex interaction noise, air flow fields, and hub loads. The highly accurate software is currently in use by all major rotorcraft manufacturers, NASA, the U.S. Army, and the U.S. Navy.

  20. Cost Models for MMC Manufacturing Processes

    NASA Technical Reports Server (NTRS)

    Elzey, Dana M.; Wadley, Haydn N. G.

    1996-01-01

    The quality cost modeling (QCM) tool is intended to be a relatively simple-to-use device for obtaining a first-order assessment of the quality-cost relationship for a given process-material combination. The QCM curve is a plot of cost versus quality (an index indicating microstructural quality), which is unique for a given process-material combination. The QCM curve indicates the tradeoff between cost and performance, thus enabling one to evaluate affordability. Additionally, the effect of changes in process design, raw materials, and process conditions on the cost-quality relationship can be evaluated. Such results might indicate the most efficient means to obtain improved quality at reduced cost by process design refinements, the implementation of sensors and models for closed loop process control, or improvement in the properties of raw materials being fed into the process. QCM also allows alternative processes for producing the same or similar material to be compared in terms of their potential for producing competitively priced, high quality material. Aside from demonstrating the usefulness of the QCM concept, this is one of the main foci of the present research program, namely to compare processes for making continuous fiber reinforced, metal matrix composites (MMC's). Two processes, low pressure plasma spray deposition and tape casting are considered for QCM development. This document consists of a detailed look at the design of the QCM approach, followed by discussion of the application of QCM to each of the selected MMC manufacturing processes along with results, comparison of processes, and finally, a summary of findings and recommendations.

  1. Designer cell signal processing circuits for biotechnology.

    PubMed

    Bradley, Robert W; Wang, Baojun

    2015-12-25

    Microorganisms are able to respond effectively to diverse signals from their environment and internal metabolism owing to their inherent sophisticated information processing capacity. A central aim of synthetic biology is to control and reprogramme the signal processing pathways within living cells so as to realise repurposed, beneficial applications ranging from disease diagnosis and environmental sensing to chemical bioproduction. To date most examples of synthetic biological signal processing have been built based on digital information flow, though analogue computing is being developed to cope with more complex operations and larger sets of variables. Great progress has been made in expanding the categories of characterised biological components that can be used for cellular signal manipulation, thereby allowing synthetic biologists to more rationally programme increasingly complex behaviours into living cells. Here we present a current overview of the components and strategies that exist for designer cell signal processing and decision making, discuss how these have been implemented in prototype systems for therapeutic, environmental, and industrial biotechnological applications, and examine emerging challenges in this promising field.

  2. Designer cell signal processing circuits for biotechnology

    PubMed Central

    Bradley, Robert W.; Wang, Baojun

    2015-01-01

    Microorganisms are able to respond effectively to diverse signals from their environment and internal metabolism owing to their inherent sophisticated information processing capacity. A central aim of synthetic biology is to control and reprogramme the signal processing pathways within living cells so as to realise repurposed, beneficial applications ranging from disease diagnosis and environmental sensing to chemical bioproduction. To date most examples of synthetic biological signal processing have been built based on digital information flow, though analogue computing is being developed to cope with more complex operations and larger sets of variables. Great progress has been made in expanding the categories of characterised biological components that can be used for cellular signal manipulation, thereby allowing synthetic biologists to more rationally programme increasingly complex behaviours into living cells. Here we present a current overview of the components and strategies that exist for designer cell signal processing and decision making, discuss how these have been implemented in prototype systems for therapeutic, environmental, and industrial biotechnological applications, and examine emerging challenges in this promising field. PMID:25579192

  3. Implementation of the Business Process Modelling Notation (BPMN) in the modelling of anatomic pathology processes

    PubMed Central

    Rojo, Marcial García; Rolón, Elvira; Calahorra, Luis; García, Felix Óscar; Sánchez, Rosario Paloma; Ruiz, Francisco; Ballester, Nieves; Armenteros, María; Rodríguez, Teresa; Espartero, Rafael Martín

    2008-01-01

    Background Process orientation is one of the essential elements of quality management systems, including those in use in healthcare. Business processes in hospitals are very complex and variable. BPMN (Business Process Modelling Notation) is a user-oriented language specifically designed for the modelling of business (organizational) processes. We are not aware of previous experience using this notation to model Pathology processes, in Spain or elsewhere. We present our experience in the elaboration of conceptual models of Pathology processes, as part of a global programmed surgical patient process, using BPMN. Methods With the objective of analyzing the use of BPMN notation in real cases, a multidisciplinary work group was created, including software engineers from the Department of Technologies and Information Systems of the University of Castilla-La Mancha and health professionals and administrative staff from the Hospital General de Ciudad Real. The collaborative work was carried out in six phases: informative meetings, intensive training, process selection, definition of the work method, process description by hospital experts, and process modelling. Results The modelling of the processes of Anatomic Pathology is presented using BPMN. The subprocesses presented are those corresponding to the surgical pathology examination of samples coming from the operating theatre, including the planning and performance of frozen studies. Conclusion The modelling of Anatomic Pathology subprocesses has allowed the creation of an understandable graphical model in which management and improvements are more easily implemented by health professionals. PMID:18673511

  4. Understanding backward design to strengthen curricular models.

    PubMed

    Emory, Jan

    2014-01-01

    Nurse educators have responded to the call for transformation in education. Challenges remain in planning curricular implementation to facilitate understanding of essential content for student success on licensure examinations and in professional practice. The conceptual framework Backward Design (BD) can support and guide curriculum decisions. Using BD principles in conjunction with educational models can strengthen and improve curricula. This article defines and describes the BD process, and identifies reported benefits for nursing education. PMID:24743175

  5. Neuroscientific Model of Motivational Process

    PubMed Central

    Kim, Sung-il

    2013-01-01

    Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three sub processes, a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous sub processes, namely reward-driven approach, value-based decision-making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area) in which basic stimulus-action association is formed, and is classified as an automatic motivation to which relatively less attention is assigned. By contrast, value-based decision-making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating the value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, the goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring the performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to regulation of motivation. These three sub processes interact with each other by sending reward prediction error signals through dopaminergic pathway from the striatum and to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment. PMID:23459598

  6. Neuroscientific model of motivational process.

    PubMed

    Kim, Sung-Il

    2013-01-01

    Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three sub processes, a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous sub processes, namely reward-driven approach, value-based decision-making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area) in which basic stimulus-action association is formed, and is classified as an automatic motivation to which relatively less attention is assigned. By contrast, value-based decision-making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating the value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, the goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring the performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to regulation of motivation. These three sub processes interact with each other by sending reward prediction error signals through dopaminergic pathway from the striatum and to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment.

  7. A Digital Methodology for the Design Process of Aerospace Assemblies with Sustainable Composite Processes & Manufacture

    NASA Astrophysics Data System (ADS)

    McEwan, W.; Butterfield, J.

    2011-05-01

    The well-established benefits of composite materials are driving a significant shift in design and manufacture strategies for original equipment manufacturers (OEMs). Thermoplastic composites have advantages over the traditional thermosetting materials with regard to sustainability and environmental impact, features which are becoming increasingly pertinent in the aerospace arena. However, when sustainability and environmental impact are considered as design drivers, integrated methods for part design and product development must be developed so that any benefits of sustainable composite material systems can be assessed during the design process. These methods must include mechanisms to account for process-induced part variation and techniques related to re-forming, recycling and decommissioning, which are in their infancy. It is proposed in this paper that predictive techniques related to material specification, part processing and product cost of thermoplastic composite components be integrated within a Through Life Management (TLM) product development methodology, as part of a larger strategy of product system modeling to improve disciplinary concurrency, realistic part performance, and to place sustainability at the heart of the design process. This paper reports the enhancement of digital manufacturing tools as a means of drawing simulated part manufacturing scenarios, real-time costing mechanisms, and broader lifecycle performance data capture into the design cycle. The work demonstrates predictive processes for sustainable composite product manufacture and how a Product-Process-Resource (PPR) structure can be customised and enhanced to include design intent driven by 'real' part geometry and consequent assembly.

  8. Computer aided microbial safety design of food processes.

    PubMed

    Schellekens, M; Martens, T; Roberts, T A; Mackey, B M; Nicolaï, B M; Van Impe, J F; De Baerdemaeker, J

    1994-12-01

    To reduce the time required for product development, to avoid expensive experimental tests, and to quantify safety risks for fresh products and the consequences of processing, there is a growing interest in computer-aided food process design. This paper discusses the application of hybrid object-oriented and rule-based expert system technology to represent the data and knowledge of microbial experts and food engineers. Finite element models for heat transfer calculation routines, microbial growth and inactivation models and texture kinetics are combined with food composition data, thermophysical properties, process steps and expert knowledge on the type and quantity of microbial contamination. A prototype system has been developed to evaluate the effect of changes in food composition, process steps and process parameters on the microbiological safety and textural quality of foods.
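
    A minimal example of the kind of microbial inactivation model such a system couples with its heat transfer routines is the classical first-order thermal death model with D- and z-values. This is a textbook relationship, not the prototype's actual code; the reference values below are the conventional ones for a 121.1 C sterilization benchmark.

```python
def log_reduction(time_min, temp_c, d_ref=1.0, temp_ref=121.1, z=10.0):
    """Decimal (log10) reductions achieved by first-order thermal inactivation.
    The D-value at temperature T follows the classical z-value model:
    D(T) = d_ref * 10**((temp_ref - T) / z), so a z-degree temperature drop
    multiplies the required heating time by 10."""
    d_t = d_ref * 10 ** ((temp_ref - temp_c) / z)
    return time_min / d_t
```

    For example, 3 minutes at the reference temperature gives 3 decimal reductions, while the same process one z-value (10 C) cooler achieves only a tenth of that.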

  9. Virtual Welded - Joint Design Integrating Advanced Materials and Processing Technology

    SciTech Connect

    Yang, Zhishang; Ludewig, Howard W.; Babu, S. Suresh

    2005-06-30

    Virtual Welded-Joint Design, a systematic modeling approach, has been developed in this project to predict the relationships among welding process, microstructure, properties, residual stress, and ultimate weld fatigue strength. This systematic modeling approach was applied to the welding of high strength steel. A special welding wire was developed in this project to introduce compressive residual stress at the weld toe. Results from both modeling and experiments demonstrated that a more than 10x fatigue life improvement can be achieved in high strength steel welds by combining the compressive residual stress from the special welding wire with the desired weld bead shape from a unique welding process. The results indicate a technology breakthrough in the design of lightweight, high fatigue performance welded structures using high strength steels.

  10. Human visual performance model for crewstation design

    NASA Astrophysics Data System (ADS)

    Larimer, James O.; Prevost, Michael P.; Arditi, Aries R.; Azueta, Steven; Bergen, James R.; Lubin, Jeffrey

    1991-08-01

    In a cockpit, the crewstation of an airplane, the ability of the pilot to unambiguously perceive rapidly changing information both internal and external to the crewstation is critical. To assess the impact of crewstation design decisions on the pilot's ability to perceive information, the designer needs a means of evaluating the trade-offs that result from different designs. The Visibility Modeling Tool (VMT) provides the designer with a CAD tool for assessing these trade-offs. It combines the technologies of computer graphics, computational geometry, human performance modeling and equipment modeling into a computer-based interactive design tool. Through a simple interactive interface, a designer can manipulate design parameters such as the geometry of the cockpit, environmental factors such as ambient lighting, pilot parameters such as point of regard and adaptation state, and equipment parameters such as the location of displays, their size and the contrast of displayed symbology. VMT provides an end-to-end analysis that answers questions such as 'Will the pilot be able to read the display?' Performance data can be projected, in the form of 3D contours, into the crewstation graphic model, providing the designer with a footprint of the operator's visual capabilities, defining, for example, the regions in which fonts of a particular type, size and contrast can be read without error. Geometrical data such as the pilot's volume field of view, occlusions caused by facial geometry, helmet margins, and objects in the crewstation can also be projected into the crewstation graphic model with respect to the coordinates of the aviator's eyes and fixation point. The intersections of the projections with objects in the crewstation delineate the area of coverage, masking, or occlusion associated with the objects. Objects in the crewstation space can be projected onto models of the operator's retinas. These projections can be used to provide the designer with the

  11. Integrating Surface Modeling into the Engineering Design Graphics Curriculum

    ERIC Educational Resources Information Center

    Hartman, Nathan W.

    2006-01-01

    It has been suggested there is a knowledge base that surrounds the use of 3D modeling within the engineering design process and correspondingly within engineering design graphics education. While solid modeling receives a great deal of attention and discussion relative to curriculum efforts, and rightly so, surface modeling is an equally viable 3D…

  12. Verifying and Validating Proposed Models for FSW Process Optimization

    NASA Technical Reports Server (NTRS)

    Schneider, Judith

    2008-01-01

    This slide presentation reviews Friction Stir Welding (FSW) and the attempts to model the process in order to optimize and improve the process. The studies are ongoing to validate and refine the model of metal flow in the FSW process. There are slides showing the conventional FSW process, a couple of weld tool designs and how the design interacts with the metal flow path. The two basic components of the weld tool are shown, along with geometries of the shoulder design. Modeling of the FSW process is reviewed. Other topics include (1) Microstructure features, (2) Flow Streamlines, (3) Steady-state Nature, and (4) Grain Refinement Mechanisms

  13. Thermoplastic matrix composite processing model

    NASA Technical Reports Server (NTRS)

    Dara, P. H.; Loos, A. C.

    1985-01-01

    The effects of the processing parameters pressure, temperature, and time on the quality of continuous graphite fiber reinforced thermoplastic matrix composites were quantitatively assessed by defining the extent to which intimate contact and bond formation have occurred at successive ply interfaces. Two models are presented that predict the extents to which the ply interfaces have achieved intimate contact and cohesive strength. The models are based on experimental observation of compression molded laminates and neat resin conditions, respectively. The theory of autohesion (or self-diffusion) is identified as the mechanism by which the plies bond to themselves. Theoretical predictions from reptation theory relating autohesive strength to contact time are used to explain the effects of the processing parameters on the observed experimental strengths. The application of a time-temperature relationship for autohesive strength predictions is evaluated. A viscoelastic compression molding model of a tow was developed to explain the phenomenon by which the prepreg ply interfaces develop intimate contact.
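
    The reptation-theory prediction invoked here is that autohesive bond strength grows as the one-fourth power of contact time, reaching full strength at the reptation time. A minimal sketch of that scaling follows; the variable names are assumptions, and the paper's own model may differ in detail.

```python
def autohesion_strength(t, t_r, s_inf=1.0):
    """Degree of autohesive bond strength after contact time t.
    Reptation scaling: strength grows as (t / t_r)**0.25 and saturates
    at the full strength s_inf once t reaches the reptation time t_r."""
    if t >= t_r:
        return s_inf
    return s_inf * (t / t_r) ** 0.25
```

    One consequence of the quarter-power law: holding the plies in contact for 1/16 of the reptation time already develops half of the full bond strength, which is why short molding times can still yield substantial interply bonding.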

  14. Quantitative Modeling of Earth Surface Processes

    NASA Astrophysics Data System (ADS)

    Pelletier, Jon D.

    This textbook describes some of the most effective and straightforward quantitative techniques for modeling Earth surface processes. By emphasizing a core set of equations and solution techniques, the book presents state-of-the-art models currently employed in Earth surface process research, as well as a set of simple but practical research tools. Detailed case studies demonstrate application of the methods to a wide variety of processes including hillslope, fluvial, aeolian, glacial, tectonic, and climatic systems. Exercises at the end of each chapter begin with simple calculations and then progress to more sophisticated problems that require computer programming. All the necessary computer codes are available online at www.cambridge.org/9780521855976. Assuming some knowledge of calculus and basic programming experience, this quantitative textbook is designed for advanced geomorphology courses and as a reference book for professional researchers in Earth and planetary science looking for a quantitative approach to Earth surface processes.
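    A representative example of the core techniques the book covers is linear hillslope diffusion, dz/dt = D * d2z/dx2, solved with an explicit finite-difference scheme. The grid size, diffusivity, and initial scarp profile below are illustrative, not taken from the text.

    ```python
    import numpy as np

    def diffuse_hillslope(z, D=0.01, dx=1.0, dt=10.0, steps=500):
        """Evolve an elevation profile z (m) under linear diffusion.

        D is in m^2/yr; endpoints are held fixed (base-level boundary condition).
        """
        assert D * dt / dx**2 <= 0.5, "explicit-scheme stability limit"
        z = z.copy()
        for _ in range(steps):
            # FTCS update: each interior node moves toward its neighbors' mean.
            z[1:-1] += D * dt / dx**2 * (z[2:] - 2.0 * z[1:-1] + z[:-2])
        return z

    x = np.arange(101)
    scarp = np.where(x < 50, 1.0, 0.0)   # initial 1 m fault scarp
    smoothed = diffuse_hillslope(scarp)   # scarp relaxes into a smooth ramp
    ```

    The stability check encodes the usual restriction D*dt/dx^2 <= 1/2 for the explicit scheme; scarp-degradation problems like this one are a standard exercise in quantitative geomorphology.
    
    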

  16. The Application of the NASA Advanced Concepts Office, Launch Vehicle Team Design Process and Tools for Modeling Small Responsive Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Threet, Grady E.; Waters, Eric D.; Creech, Dennis M.

    2012-01-01

    The Advanced Concepts Office (ACO) Launch Vehicle Team at the NASA Marshall Space Flight Center (MSFC) is recognized throughout NASA for launch vehicle conceptual definition and pre-phase A concept design evaluation. The Launch Vehicle Team has been instrumental in defining the vehicle trade space for many of NASA's high-level launch system studies, from the Exploration Systems Architecture Study (ESAS) through the Augustine Report, Constellation, and now the Space Launch System (SLS). The Launch Vehicle Team's approach to rapid turnaround and comparative analysis of multiple launch vehicle architectures has played a large role in narrowing the design options for future vehicle development. Recently the Launch Vehicle Team has been adapting the vetted tools used on large launch vehicles, repackaging the process and capability to apply to smaller, more responsive launch vehicles. Along this development path the team has evaluated trajectory tools and assumptions against sounding rocket trajectories and air-launch systems, begun altering subsystem mass estimating relationships to handle smaller vehicle components, and, as an additional development driver, begun an in-house small launch vehicle study. With the recent interest in small responsive launch systems and the known capability and response time of the ACO Launch Vehicle Team, ACO's launch vehicle assessment capability can be used to rapidly evaluate the vast trade space that small launch vehicles currently encompass. This would greatly benefit the customer by reducing that large trade space to the few alternatives that best fit the customer's payload needs.
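    The stage-sizing step inside such a conceptual design loop can be caricatured with the Tsiolkovsky rocket equation: given a target delta-v, specific impulse, and inert mass fraction, solve for the propellant mass needed to carry a payload. All numbers are illustrative, not values from the ACO tools.

    ```python
    import math

    G0 = 9.80665  # standard gravity, m/s^2

    def stage_propellant(m_payload, dv, isp, inert_frac):
        """Propellant mass (kg) for one stage carrying m_payload (kg).

        inert_frac = m_inert / (m_inert + m_prop). Derived by solving the
        rocket equation dv = G0*isp*ln(m0/mf) for propellant mass.
        """
        r = math.exp(dv / (G0 * isp))            # required mass ratio m0/mf
        a = inert_frac / (1.0 - inert_frac)      # inert mass per unit propellant
        denom = 1.0 + a * (1.0 - r)
        if denom <= 0.0:
            raise ValueError("stage cannot reach this delta-v")
        return (r - 1.0) * m_payload / denom

    # Hypothetical small upper stage: 150 kg payload, 4.5 km/s, Isp 290 s.
    m_prop = stage_propellant(150.0, 4500.0, 290.0, 0.10)
    ```

    The `denom <= 0` guard reflects a real sizing breakpoint: past a critical delta-v, no amount of propellant closes the stage at that inert fraction, which is exactly the kind of limit a trade-space sweep exposes.
    
    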

  17. High Lifetime Solar Cell Processing and Design

    NASA Technical Reports Server (NTRS)

    Swanson, R. M.

    1985-01-01

    In order to maximize efficiency a solar cell must: (1) absorb as much light as possible in electron-hole production, (2) transport as large a fraction as possible of the electrons to the n-type terminal and holes to the p-type terminal without their first recombining, and (3) produce as high a terminal voltage as possible. Step (1) is largely fixed by the spectrum of sunlight and the fundamental absorption characteristics of silicon, although some improvements are possible through texturizing-induced light trapping and back surface reflectors. Steps (2) and (3) are, however, dependent on the recombination mechanisms of the cell. The recombination, in turn, is strongly influenced by cell processing and design. Some of the lessons learned during the development of the point-contact cell are discussed. Cell dependence on recombination, surface recombination, and contact recombination is discussed. Results show the overwhelming influence of contact recombination on the operation of the cell when the other sources of recombination are reduced by careful processing.

  18. VLSI systems design for digital signal processing. Volume 1 - Signal processing and signal processors

    NASA Astrophysics Data System (ADS)

    Bowen, B. A.; Brown, W. R.

    This book is concerned with the design of digital signal processing systems which utilize VLSI (Very Large Scale Integration) components. The presented material is intended for use by electrical engineers at the senior undergraduate or introductory graduate level. It is the purpose of this volume to present an overview of the important elements of background theory, processing techniques, and hardware evolution. Digital signals are considered along with linear systems and digital filters, taking into account the transform analysis of deterministic signals, a statistical signal model, time domain representations of discrete-time linear systems, and digital filter design techniques and implementation issues. Attention is given to aspects of detection and estimation, digital signal processing algorithms and techniques, issues which must be resolved in a processor design methodology, the fundamental concepts of high performance processing in terms of two early super computers, and the extension of these concepts to more recent processors.
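    Among the digital filter design techniques such a text surveys, the windowed-sinc method is the most direct: truncate the ideal lowpass impulse response and taper it with a window. The cutoff frequency and filter length below are arbitrary illustration values.

    ```python
    import math

    def fir_lowpass(cutoff, n_taps):
        """Hamming-windowed sinc lowpass FIR design.

        cutoff is normalized to the sample rate (0 < cutoff < 0.5);
        coefficients are scaled for unity gain at DC.
        """
        m = n_taps - 1
        h = []
        for k in range(n_taps):
            x = k - m / 2
            # Ideal lowpass impulse response 2*fc*sinc(2*fc*x).
            ideal = 2 * cutoff if x == 0 else math.sin(2 * math.pi * cutoff * x) / (math.pi * x)
            window = 0.54 - 0.46 * math.cos(2 * math.pi * k / m)  # Hamming
            h.append(ideal * window)
        s = sum(h)
        return [c / s for c in h]

    taps = fir_lowpass(0.1, 21)   # 21-tap lowpass, cutoff at 0.1*fs
    ```

    The resulting coefficients are symmetric, giving the linear phase that makes FIR structures attractive for the VLSI signal processors the book discusses.
    
    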

  19. Welding process modelling and control

    NASA Technical Reports Server (NTRS)

    Romine, Peter L.; Adenwala, Jinen A.

    1993-01-01

    This report documents the research and analysis performed, software developed, and hardware/software recommendations made during 1992 in development of the PC-based data acquisition system supporting Welding Process Modeling and Control. The Metals Processing Branch of NASA Marshall Space Flight Center identified a need for a mobile data acquisition and analysis system, customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC-based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument, strictly for data acquisition and analysis. Although the WMS supports many of the functions associated with process control, it is not intended to be used for welding process control.

  20. Clutter suppression interferometry system design and processing

    NASA Astrophysics Data System (ADS)

    Knight, Chad; Deming, Ross; Gunther, Jake

    2015-05-01

    Clutter suppression interferometry (CSI) has received extensive attention due to its multi-modal capability to detect slow-moving targets and concurrently form high-resolution synthetic aperture radar (SAR) images from the same data. The ability to continuously augment SAR images with geo-located ground moving target indicators (GMTI) provides valuable real-time situational awareness that is important for many applications. CSI can be accomplished with minimal hardware and processing resources. This makes CSI a natural candidate for applications where size, weight and power (SWaP) are constrained, such as unmanned aerial vehicles (UAVs) and small satellites. This paper will discuss the theory for optimal CSI system configuration, focusing on the sparse time-varying transmit and receive array manifold dictated by SWaP considerations. The underlying signal model will be presented and discussed, as well as the potential benefits that a sparse time-varying transmit/receive manifold provides. The high-level processing objectives will be detailed and examined on simulated data. Then actual SAR data collected with the Space Dynamics Laboratory (SDL) FlexSAR radar system will be analyzed. Contrasting the simulated data with actual SAR data helps illustrate the challenges and limitations found in practice vs. theory. A new approach incorporating sparse signal processing is discussed that has the potential to reduce false-alarm rates and improve detections.
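    The clutter-cancellation idea can be sketched with an idealized (noise-free) along-track interferogram: a stationary scene appears with the same phase in both channels, so the interferogram phase is zero except where a mover imposes a phase shift. All values are illustrative stand-ins, not FlexSAR data.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 1000
    clutter = rng.normal(size=n) + 1j * rng.normal(size=n)

    ch1 = clutter.copy()
    ch2 = clutter.copy()                  # stationary clutter: identical channels
    ch1[500] = 5.0                        # moving target in bin 500...
    ch2[500] = 5.0 * np.exp(-1j * 0.8)    # ...with an along-track phase shift

    interferogram = ch1 * np.conj(ch2)    # clutter cancels to zero phase
    detections = np.nonzero(np.abs(np.angle(interferogram)) > 0.5)[0]
    ```

    With receiver noise and channel imbalance the clutter phase is only approximately zero, so the detection threshold trades false-alarm rate against sensitivity, which is the regime the paper's sparse-processing approach targets.
    
    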

  21. Universal Design: Process, Principles, and Applications

    ERIC Educational Resources Information Center

    Burgstahler, Sheryl

    2009-01-01

    Designing any product or environment involves the consideration of many factors, including aesthetics, engineering options, environmental issues, safety concerns, industry standards, and cost. Typically, designers focus their attention on the average user. In contrast, universal design (UD), according to the Center for Universal Design, is "the…

  1. Design of the HTGR for process heat applications

    SciTech Connect

    Vrable, D.L.; Quade, R.N.

    1980-05-01

    This paper discusses a design study of an advanced 842-MW(t) HTGR with a reactor outlet temperature of 850°C (1562°F), coupled with a chemical process whose product is hydrogen (or a mixture of hydrogen and carbon monoxide) generated by steam reforming of a light hydrocarbon mixture. The paper discusses the plant layout and design for the major components of the primary and secondary heat transfer systems. Typical parametric system study results illustrate the capability of a computer code developed to model the plant performance and economics.

  2. Kinetic and Modeling Investigation to Provide Design Guidelines for the NREL Dilute-Acid Process Aimed at Total Hydrolysis/Fractionation of Lignocellulosic Biomass: July 1998

    SciTech Connect

    Lee, Y. Y.; Iyer, P.; Xiang, Q.; Hayes, J.

    2004-08-01

    Following up on previous work, the subcontractor investigated three aspects of using NREL "pretreatment" technology for total hydrolysis (cellulose as well as hemicellulose) of biomass. Whereas historic hydrolysis of biomass used either dilute-acid or concentrated-acid technology for hydrolysis of both hemicellulose and cellulose, NREL has been pursuing very dilute acid hydrolysis of hemicellulose followed by enzymatic hydrolysis of cellulose. NREL's countercurrent shrinking-bed reactor design for hemicellulose hydrolysis (pretreatment) has, however, shown promise for total hydrolysis. For the first task, the subcontractor developed a mathematical model of countercurrent shrinking-bed reactor operation and, using yellow poplar sawdust as a feedstock, analyzed the effects of initial solid feeding rate, temperature, acid concentration, acid flow rate, Peclet number (a measure of backmixing in liquid flow), and bed shrinking. For the second task, the subcontractor used laboratory trials, with yellow poplar sawdust and 0.07 wt% sulfuric acid at various temperatures, to verify the hydrolysis of cellulose to glucose (desired) and the decomposition of glucose (undesired), and to determine appropriate parameters for use in kinetic models. Unlike cellulose and hemicellulose, lignins, the third major component of biomass, are not carbohydrates that can be broken down into component sugars. They are, however, complex amorphous aromatic phenolic polymers that can likely be converted into low-molecular-weight compounds suitable for production of fuels and chemicals. Oxidative degradation is one pathway for such conversion, and hydrogen peroxide would be an attractive reagent for it, as it would leave no residuals. For the third task, the subcontractor reacted lignin with hydrogen peroxide under various conditions and analyzed the resulting product mix.
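    The kinetic scheme the second task parameterizes is the classic sequential first-order ("Saeman") model, cellulose --k1--> glucose --k2--> decomposition products, which has a closed-form glucose profile. The rate constants below are illustrative placeholders, not the fitted values from this subcontract.

    ```python
    import math

    def glucose_yield(t, C0=1.0, k1=0.05, k2=0.01):
        """Glucose concentration at time t for the series reaction (k1 != k2).

        C0 is the initial cellulose concentration (as potential glucose);
        k1 is the hydrolysis rate and k2 the glucose decomposition rate.
        """
        return C0 * k1 / (k2 - k1) * (math.exp(-k1 * t) - math.exp(-k2 * t))
    ```

    Glucose passes through a maximum at t = ln(k1/k2)/(k1 - k2): harvest too early and cellulose is unconverted, too late and the glucose has decomposed. That trade-off is what the countercurrent reactor design aims to relax.
    
    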

  3. A symbolic methodology to improve disassembly process design.

    PubMed

    Rios, Pedro; Blyler, Leslie; Tieman, Lisa; Stuart, Julie Ann; Grant, Ed

    2003-12-01

    Millions of end-of-life electronic components are retired annually due to the proliferation of new models and their rapid obsolescence. The recovery of resources such as plastics from these goods requires their disassembly. The time required for each disassembly and its associated cost is defined by the operator's familiarity with the product design and its complexity. Since model proliferation serves to complicate an operator's learning curve, it is worthwhile to investigate the benefits to be gained in a disassembly operator's preplanning process. Effective disassembly process design demands the application of green engineering principles, such as those developed by Anastas and Zimmerman (Environ. Sci. Technol. 2003, 37, 94A-101A), which include regard for product complexity, structural commonality, separation energy, material value, and waste prevention. This paper introduces the concept of design symbols to help the operator more efficiently survey product complexity with respect to the location and number of fasteners that must be removed to detach a structure common to all electronics: the housing. With a sample of 71 different computers, printers, and monitors, we demonstrate that appropriate symbols reduce the total disassembly planning time by 13.2 min. Such an improvement could well make efficient the separation of plastic that would otherwise be destined for waste-to-energy or landfill. The symbolic methodology presented may also improve Design for Recycling and Design for Maintenance and Support.

  4. Animal models and conserved processes

    PubMed Central

    2012-01-01

    Background The concept of conserved processes presents unique opportunities for using nonhuman animal models in biomedical research. However, the concept must be examined in the context that humans and nonhuman animals are evolved, complex, adaptive systems. Given that nonhuman animals are examples of living systems that are differently complex from humans, what does the existence of a conserved gene or process imply for inter-species extrapolation? Methods We surveyed the literature including philosophy of science, biological complexity, conserved processes, evolutionary biology, comparative medicine, anti-neoplastic agents, inhalational anesthetics, and drug development journals in order to determine the value of nonhuman animal models when studying conserved processes. Results Evolution through natural selection has employed components and processes not only to produce the same outcomes among species but also to generate different functions and traits. Many genes and processes are conserved, but new combinations of these processes or different regulation of the genes involved in these processes have resulted in unique organisms. Further, there is a hierarchy of organization in complex living systems. At some levels, the components are simple systems that can be analyzed by mathematics or the physical sciences, while at other levels the system cannot be fully analyzed by reducing it to a physical system. The study of complex living systems must alternate between focusing on the parts and examining the intact whole organism while taking into account the connections between the two. Systems biology aims for this holism. We examined the actions of inhalational anesthetic agents and anti-neoplastic agents in order to address what the characteristics of complex living systems imply for inter-species extrapolation of traits and responses related to conserved processes. Conclusion We conclude that even the presence of conserved processes is insufficient for inter-species extrapolation.

  5. Computational Modeling in Structural Materials Processing

    NASA Technical Reports Server (NTRS)

    Meyyappan, Meyya; Arnold, James O. (Technical Monitor)

    1997-01-01

    High temperature materials such as silicon carbide, a variety of nitrides, and ceramic matrix composites find use in the aerospace, automotive, and machine tool industries and in high speed civil transport applications. Chemical vapor deposition (CVD) is widely used in processing such structural materials. Variations of CVD include deposition on substrates, coating of fibers, deposition inside cavities and on complex objects, and infiltration within preforms, called chemical vapor infiltration (CVI). Our current knowledge of the process mechanisms, ability to optimize processes, and scale-up for large-scale manufacturing is limited. In this regard, computational modeling of the processes is valuable, since a validated model can be used as a design tool. The effort is similar to traditional chemically reacting flow modeling, with emphasis on multicomponent diffusion, thermal diffusion, large sets of homogeneous reactions, and surface chemistry. In the case of CVI, models for pore infiltration are needed. In the present talk, examples of SiC, nitride, and boron deposition from the author's past work will be used to illustrate the utility of computational process modeling.

  6. Face Processing: Models For Recognition

    NASA Astrophysics Data System (ADS)

    Turk, Matthew A.; Pentland, Alexander P.

    1990-03-01

    The human ability to process faces is remarkable. We can identify perhaps thousands of faces learned throughout our lifetime and read facial expression to understand such subtle qualities as emotion. These skills are quite robust, despite sometimes large changes in the visual stimulus due to expression, aging, and distractions such as glasses or changes in hairstyle or facial hair. Computers which model and recognize faces will be useful in a variety of applications, including criminal identification, human-computer interface, and animation. We discuss models for representing faces and their applicability to the task of recognition, and present techniques for identifying faces and detecting eye blinks.
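    The representation Turk and Pentland develop is the eigenface approach: faces are projected onto the principal components of a training set and compared in that low-dimensional space. A minimal sketch with random stand-in data follows; real use would load aligned grayscale face images instead.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    faces = rng.random((20, 64 * 64))     # 20 flattened stand-in "images"

    mean_face = faces.mean(axis=0)
    A = faces - mean_face
    # SVD of the mean-centered data; rows of Vt are the principal components.
    U, S, Vt = np.linalg.svd(A, full_matrices=False)
    eigenfaces = Vt[:10]                  # keep the top 10 eigenfaces

    def project(img):
        """Weights of an image in eigenface space."""
        return eigenfaces @ (img - mean_face)

    gallery = np.array([project(f) for f in faces])

    def nearest_face(img):
        """Index of the closest known face in eigenface space."""
        return int(np.argmin(np.linalg.norm(gallery - project(img), axis=1)))
    ```

    Recognition reduces to a nearest-neighbor search over a handful of weights per face rather than thousands of pixels, which is what made the approach practical on 1990-era hardware.
    
    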

  7. Theoretical Models of Astrochemical Processes

    NASA Technical Reports Server (NTRS)

    Charnley, Steven

    2009-01-01

    Interstellar chemistry provides a natural laboratory for studying exotic species and processes at densities, temperatures, and reaction rates that are difficult or impractical to address in the laboratory. Thus, many chemical reactions considered too slow by the standards of terrestrial chemistry can be observed and modeled. Current proposals concerning the nature and chemistry of complex interstellar organic molecules will be described. Catalytic reactions on grain surfaces can, in principle, lead to a large variety of species, and this has motivated many laboratory and theoretical studies. Gas phase processes may also build large species in molecular clouds. Future laboratory data and computational tools needed to construct accurate chemical models of various astronomical sources to be observed by Herschel and ALMA will be outlined.

  8. Process variation analysis for MEMS design

    NASA Astrophysics Data System (ADS)

    Schenato, Luca; Wu, Wei-Chung; El Ghaoui, Laurent; Pister, Kristofer S. J.

    2001-03-01

    Process variations, incurred during the fabrication of MEMS structures, may lead to substantially different performance than the nominal design predicts. This is mainly due to small variations in the geometry of the structure relative to the ideal design. In this paper we propose an approach to estimate performance variations for general planar suspended MEMS structures in low-frequency applications. The approach is based on two complementary techniques, one probabilistic and the other deterministic. The former, based on the Monte Carlo method, defines a random distribution on the geometric variables and evaluates the possible performance outcomes by sampling that distribution. The latter, based on robust optimization and semidefinite programming (SDP) approximations, finds bounds on performance parameters given bounds on the geometric variables, i.e., it considers the worst-case scenario. Both techniques have been integrated with SUGAR, a publicly available simulation tool for MEMS devices, and tested on different types of folded springs.
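    The probabilistic technique can be sketched by sampling fabrication variation of a flexure beam's width and propagating it to bending stiffness k = E*t*(w/L)^3 (a single guided-end beam, up to a geometry factor). The nominal dimensions and etch spread below are illustrative, not from the paper.

    ```python
    import random

    E = 160e9            # polysilicon Young's modulus, Pa
    t = 2e-6             # structural layer thickness, m
    w_nom, L_nom = 2e-6, 200e-6
    sigma_w = 0.1e-6     # assumed over/under-etch spread, m

    def stiffness(w, L):
        """In-plane bending stiffness of a guided-end beam, N/m."""
        return E * t * (w / L) ** 3

    # Monte Carlo: sample the etched width and collect the stiffness outcomes.
    random.seed(0)
    samples = [stiffness(random.gauss(w_nom, sigma_w), L_nom)
               for _ in range(10_000)]
    mean_k = sum(samples) / len(samples)
    ```

    Because stiffness is cubic in beam width, a 5% width spread produces roughly a 15% stiffness spread, which is why the paper's complementary worst-case bounds are worth computing alongside the sampled distribution.
    
    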

  9. A Generic Modeling Process to Support Functional Fault Model Development

    NASA Technical Reports Server (NTRS)

    Maul, William A.; Hemminger, Joseph A.; Oostdyk, Rebecca; Bis, Rachael A.

    2016-01-01

    Functional fault models (FFMs) are qualitative representations of a system's failure space that are used to provide a diagnostic of the modeled system. An FFM simulates the failure effect propagation paths within a system between failure modes and observation points. These models contain a significant amount of information about the system, including its design, operation, and off-nominal behavior. The development and verification of the models can be costly in both time and resources. In addition, models depicting similar components can be distinct, both in appearance and function, when created individually, because there are numerous ways of representing the failure space within each component. Generic application of FFMs has the advantages of software code reuse: reduction of time and resources in both development and verification, and a standard set of component models from which future system models can be generated with common appearance and diagnostic performance. This paper outlines the motivation to develop a generic modeling process for FFMs at the component level and the effort to implement that process through modeling conventions and a software tool. The implementation of this generic modeling process within a fault isolation demonstration for NASA's Advanced Ground System Maintenance (AGSM) Integrated Health Management (IHM) project is presented and the impact discussed.
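    The propagation structure of an FFM can be sketched as a directed graph whose edges are failure-effect propagation paths; diagnosis asks which observation points a given failure mode can reach. The component and sensor names below are hypothetical, not from the AGSM project.

    ```python
    from collections import deque

    # Failure-effect propagation edges: effect -> downstream effects.
    edges = {
        "valve_stuck":     ["low_flow"],
        "low_flow":        ["pressure_sensor", "pump_cavitation"],
        "pump_cavitation": ["vibration_sensor"],
    }

    def affected_observations(failure_mode, edges, observations):
        """Breadth-first propagation of a failure effect to observation points."""
        seen, queue = {failure_mode}, deque([failure_mode])
        while queue:
            node = queue.popleft()
            for nxt in edges.get(node, []):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        return sorted(seen & observations)

    obs = {"pressure_sensor", "vibration_sensor"}
    hits = affected_observations("valve_stuck", edges, obs)
    ```

    Inverting this map (from observed symptoms back to candidate failure modes) is the fault-isolation step; standardizing how each component contributes its edges is what the generic modeling process aims to achieve.
    
    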

  10. A Constructivist Design and Learning Model: Time for a Graphic.

    ERIC Educational Resources Information Center

    Rogers, Patricia L.; Mack, Michael

    At the University of Minnesota, a model, visual representation or "graphic" that incorporated both a systematic design process and a constructivist approach was used as a framework for course design. This paper describes experiences of applying the Instructional Context Design (ICD) framework in both the K-12 and higher education settings. The…

  11. Using Instructional Design Process to Improve Design and Development of Internet Interventions

    PubMed Central

    Hilgart, Michelle M; Thorndike, Frances P; Kinzie, Mable B

    2012-01-01

    Given the wide reach and extensive capabilities of the Internet, it is increasingly being used to deliver comprehensive behavioral and mental health intervention and prevention programs. Their goals are to change user behavior, reduce unwanted complications or symptoms, and improve health status and health-related quality of life. Internet interventions have been found efficacious in addressing a wide range of behavioral and mental health problems, including insomnia, nicotine dependence, obesity, diabetes, depression, and anxiety. Despite the existence of many Internet-based interventions, there is little research to inform their design and development. A model for behavior change in Internet interventions has been published to help guide future Internet intervention development and to help predict and explain behavior changes and symptom improvement outcomes through the use of Internet interventions. An argument is made for grounding the development of Internet interventions within a scientific framework. To that end, the model highlights a multitude of design-related components, areas, and elements, including user characteristics, environment, intervention content, level of intervention support, and targeted outcomes. However, more discussion is needed regarding how the design of the program should be developed to address these issues. While there is little research on the design and development of Internet interventions, there is a rich, related literature in the field of instructional design (ID) that can be used to inform Internet intervention development. ID models are prescriptive models that describe a set of activities involved in the planning, implementation, and evaluation of instructional programs. Using ID process models has been shown to increase the effectiveness of learning programs in a broad range of contexts. ID models specify a systematic method for assessing the needs of learners (intervention users) to determine the gaps between current

  12. Process modeling - Its history, current status, and future

    NASA Astrophysics Data System (ADS)

    Duttweiler, Russell E.; Griffith, Walter M.; Jain, Sulekh C.

    1991-04-01

    The development of process modeling is reviewed to examine the potential of process applications to prevent and solve problems in the aerospace industry. The business and global environments are assessed, and the traditional approach to product/process design is argued to be obsolete. A revised engineering process is described which involves planning and prediction before production by means of process simulation. Process simulation can permit simultaneous engineering of unit processes and complex processes, and examples are given in the cross-coupling of forging-process variance. The implementation of process modeling, CAE, and computer simulation is found to reduce costs and time associated with technological development when incorporated judiciously.

  13. Model-Based Design of Biochemical Microreactors

    PubMed Central

    Elbinger, Tobias; Gahn, Markus; Neuss-Radu, Maria; Hante, Falk M.; Voll, Lars M.; Leugering, Günter; Knabner, Peter

    2016-01-01

    Mathematical modeling of biochemical pathways is an important resource in Synthetic Biology, as the predictive power of simulating synthetic pathways represents an important step in the design of synthetic metabolons. In this paper, we are concerned with the mathematical modeling, simulation, and optimization of metabolic processes in biochemical microreactors able to carry out enzymatic reactions and to exchange metabolites with their surrounding medium. The results of the reported modeling approach are incorporated in the design of the first microreactor prototypes that are under construction. These microreactors consist of compartments separated by membranes carrying specific transporters for the input of substrates and export of products. Inside the compartments of the reactor multienzyme complexes assembled on nano-beads by peptide adapters are used to carry out metabolic reactions. The spatially resolved mathematical model describing the ongoing processes consists of a system of diffusion equations together with boundary and initial conditions. The boundary conditions model the exchange of metabolites with the neighboring compartments and the reactions at the surface of the nano-beads carrying the multienzyme complexes. Efficient and accurate approaches for numerical simulation of the mathematical model and for optimal design of the microreactor are developed. As a proof-of-concept scenario, a synthetic pathway for the conversion of sucrose to glucose-6-phosphate (G6P) was chosen. In this context, the mathematical model is employed to compute the spatio-temporal distributions of the metabolite concentrations, as well as application relevant quantities like the outflow rate of G6P. These computations are performed for different scenarios, where the number of beads as well as their loading capacity are varied. The computed metabolite distributions show spatial patterns, which differ for different experimental arrangements. 
Furthermore, the total output of G6P
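    The core of the model described above, diffusion across a compartment with enzymatic consumption where the beads sit, can be sketched in one dimension with an explicit scheme. All constants and the geometry are illustrative, not the paper's parameters.

    ```python
    import numpy as np

    D, dx, dt = 1e-9, 1e-5, 0.02     # diffusivity (m^2/s), grid (m), step (s)
    vmax, Km = 0.5, 0.1              # Michaelis-Menten parameters (arb. units)
    c = np.full(50, 1.0)             # initial substrate concentration
    produced = 0.0                   # cumulative product (e.g., G6P)

    assert D * dt / dx**2 <= 0.5     # explicit-scheme stability limit
    for _ in range(5000):
        # Diffusion in the compartment interior.
        c[1:-1] += D * dt / dx**2 * (c[2:] - 2.0 * c[1:-1] + c[:-2])
        c[0] = c[1]                  # closed membrane: no-flux wall
        c[-1] = c[-2]                # no-flux, then react in the bead-bearing cell
        rate = vmax * c[-1] / (Km + c[-1])
        c[-1] -= dt * rate           # enzymatic consumption (Michaelis-Menten)
        produced += dt * rate
    ```

    The consumption at one end sustains a concentration gradient across the compartment, the spatial pattern the paper's simulations resolve, and `produced` plays the role of the application-relevant outflow quantity.
    
    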

  15. Multimedia Learning Design Pedagogy: A Hybrid Learning Model

    ERIC Educational Resources Information Center

    Tsoi, Mun Fie; Goh, Ngoh Khang; Chia, Lian Sai

    2005-01-01

    This paper provides insights on a hybrid learning model for multimedia learning design conceptualized from the Piagetian science learning cycle model and the Kolb's experiential learning model. This model represents learning as a cognitive process in a cycle of four phases, namely, Translating, Sculpting, Operationalizing, and Integrating and is…

  16. Sensory processing and world modeling for an active ranging device

    NASA Technical Reports Server (NTRS)

    Hong, Tsai-Hong; Wu, Angela Y.

    1991-01-01

    In this project, we studied world modeling and sensory processing for laser range data. World model data representation and operations were defined. Sensory processing algorithms for point processing and linear feature detection were designed and implemented. The interface between world modeling and sensory processing at the Servo and Primitive levels was investigated and implemented. At the Primitive level, linear feature detectors for edges were implemented, analyzed, and compared. Existing world model representations are surveyed. Also presented is the design and implementation of the Y-frame model, a hierarchical world model. The interfaces between the world model module and the sensory processing module are discussed, as well as the linear feature detectors that were designed and implemented.

  17. Developing a 3D Game Design Authoring Package to Assist Students' Visualization Process in Design Thinking

    ERIC Educational Resources Information Center

    Kuo, Ming-Shiou; Chuang, Tsung-Yen

    2013-01-01

    The teaching of 3D digital game design requires the development of students' meta-skills, from story creativity to 3D model construction, and even the visualization process in design thinking. The characteristics a good game designer should possess have been identified as including the ability to redesign things, creative thinking, and the ability to…

  18. Development of an Integrated Process, Modeling and Simulation Platform for Performance-Based Design of Low-Energy and High IEQ Buildings

    ERIC Educational Resources Information Center

    Chen, Yixing

    2013-01-01

    The objective of this study was to develop a "Virtual Design Studio (VDS)": a software platform for integrated, coordinated and optimized design of green building systems with low energy consumption, high indoor environmental quality (IEQ), and high level of sustainability. The VDS is intended to assist collaborating architects,…

  19. Making designer mutants in model organisms.

    PubMed

    Peng, Ying; Clark, Karl J; Campbell, Jarryd M; Panetta, Magdalena R; Guo, Yi; Ekker, Stephen C

    2014-11-01

    Recent advances in the targeted modification of complex eukaryotic genomes have unlocked a new era of genome engineering. From the pioneering work using zinc-finger nucleases (ZFNs), to the advent of the versatile and specific TALEN systems, and most recently the highly accessible CRISPR/Cas9 systems, we now possess an unprecedented ability to analyze developmental processes using sophisticated designer genetic tools. In this Review, we summarize the common approaches and applications of these still-evolving tools as they are being used in the most popular model developmental systems. Excitingly, these robust and simple genomic engineering tools also promise to revolutionize developmental studies using less well established experimental organisms.

  20. Model-Based Engineering Design for Trade Space Exploration throughout the Design Cycle

    NASA Technical Reports Server (NTRS)

    Lamassoure, Elisabeth S.; Wall, Stephen D.; Easter, Robert W.

    2004-01-01

This paper presents ongoing work to standardize model-based system engineering as a complement to point design development in the conceptual design phase of deep space missions. It summarizes two first steps towards practical application of this capability within the framework of concurrent engineering design teams and their customers. The first step is standard generation of system sensitivity models as the output of concurrent engineering design sessions, representing the local trade space around a point design. A review of the chosen model development process, and the results of three case study examples, demonstrate that a simple update to the concurrent engineering design process can easily capture sensitivities to key requirements. It can serve as a valuable tool for analyzing design drivers and uncovering breakpoints in the design. The second step is development of rough-order-of-magnitude, broad-range-of-validity design models for rapid exploration of the trade space, before selection of a point design. At least one case study demonstrated the feasibility of generating such models in a concurrent engineering session. The experiment indicated that such a capability could yield valid system-level conclusions for a trade space composed of understood elements. Ongoing efforts are assessing the practicality of developing end-to-end system-level design models for use before even convening the first concurrent engineering session, starting with modeling an end-to-end Mars architecture.
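
The local trade space described above can, in the simplest case, be represented by first-order sensitivities around the point design. The Python sketch below is a hypothetical illustration of that idea (the metric names, sensitivity values, and function are invented for illustration, and this is not the JPL tooling):

```python
def local_trade_space(point_design, sensitivities, deltas):
    """First-order (linear) trade-space model around a point design:
    metric = baseline + sum over requirements of (d metric / d req) * delta_req."""
    return {
        metric: baseline + sum(
            sensitivities.get((metric, req), 0.0) * d for req, d in deltas.items()
        )
        for metric, baseline in point_design.items()
    }

# Invented example: how system mass and cost respond to requirement changes
point = {"mass_kg": 450.0, "cost_M$": 120.0}
sens = {
    ("mass_kg", "payload_kg"): 3.2,      # kg of system mass per kg of payload
    ("cost_M$", "payload_kg"): 0.5,      # $M per kg of payload
    ("mass_kg", "delta_v_km_s"): 40.0,   # kg per km/s of delta-v
}
pred = local_trade_space(point, sens, {"payload_kg": 5.0, "delta_v_km_s": 0.1})
```

A model like this is only valid near the point design; the paper's second step (broad-range-of-validity models) exists precisely because such linearizations break down far from the baseline.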

  1. Modeling biochemical transformation processes and information processing with Narrator

    PubMed Central

    Mandel, Johannes J; Fuß, Hendrik; Palfreyman, Niall M; Dubitzky, Werner

    2007-01-01

Background Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs), which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact with, and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is their lack of support for an integrative representation of transport, transformation, and biological information processing. Results Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the system biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI, Narrator provides a flexible feature which makes it relatively easy to map models defined via the graphical notation to mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language or Gillespie's direct method. This powerful feature facilitates reuse, interoperability and conceptual model development.
Conclusion Narrator is a flexible and intuitive systems
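
The abstract notes that Narrator can map graphical models to formalisms such as ordinary differential equations, SBML, or Gillespie's direct method. As a point of reference, here is a minimal, generic Python sketch of Gillespie's direct method for a toy A → B reaction; it illustrates the algorithm only and is unrelated to Narrator's actual implementation:

```python
import random

def gillespie_direct(x, propensities, updates, t_max, seed=0):
    """Gillespie's direct method: draw the time to the next reaction from an
    exponential with the total propensity, then pick which reaction fired."""
    rng = random.Random(seed)
    t, trajectory = 0.0, [(0.0, dict(x))]
    while t < t_max:
        rates = [p(x) for p in propensities]
        total = sum(rates)
        if total == 0:
            break                          # no reaction can fire anymore
        t += rng.expovariate(total)        # time advance
        r = rng.uniform(0.0, total)        # reaction selection
        i, acc = 0, rates[0]
        while acc < r:
            i += 1
            acc += rates[i]
        for species, change in updates[i].items():
            x[species] += change
        trajectory.append((t, dict(x)))
    return trajectory

# Toy system: A -> B with rate constant k = 1.0, starting from 100 A molecules
traj = gillespie_direct(
    {"A": 100, "B": 0},
    propensities=[lambda s: 1.0 * s["A"]],
    updates=[{"A": -1, "B": +1}],
    t_max=100.0,
)
```

Each trajectory entry is a (time, state) pair, so the stochastic decay of A can be inspected or plotted directly.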

  2. Lunar fiberglass: Properties and process design

    NASA Technical Reports Server (NTRS)

    Dalton, Robert; Nichols, Todd

    1987-01-01

    A Clemson University ceramic engineering design for a lunar fiberglass plant is presented. The properties of glass fibers and metal-matrix composites are examined. Lunar geology is also discussed. A raw material and site are selected based on this information. A detailed plant design is presented, and summer experiments to be carried out at Johnson Space Center are reviewed.

  3. Space bioreactor: Design/process flow

    NASA Technical Reports Server (NTRS)

    Cross, John H.

    1987-01-01

    The design of the space bioreactor stems from three considerations. First, and foremost, it must sustain cells in microgravity. Closely related is the ability to take advantage of the weightlessness and microgravity. Lastly, it should fit into a bioprocess. The design of the space bioreactor is described in view of these considerations. A flow chart of the bioreactor is presented and discussed.

  4. Course Design Using an Authentic Studio Model

    ERIC Educational Resources Information Center

    Wilson, Jay R.

    2013-01-01

    Educational Technology and Design 879 is a graduate course that introduces students to the basics of video design and production. In an attempt to improve the learning experience for students a redesign of the course was implemented for the summer of 2011 that incorporated an authentic design studio model. The design studio approach is based on…

  5. ArF processing of 90-nm design rule lithography achieved through enhanced thermal processing

    NASA Astrophysics Data System (ADS)

    Kagerer, Markus; Miller, Daniel; Chang, Wayne; Williams, Daniel J.

    2006-03-01

As the lithography community has moved to ArF processing on 300 mm wafers for 90 nm design rules, the process characterization of the components of variance continues to highlight the thermal requirements for the post exposure bake (PEB) processing step. In particular, as the thermal systems have become increasingly uniform, the transient behavior of the thermal processing system has received the focus of attention. This paper demonstrates how a newly designed and patented thermal processing system was optimized for delivering improved thermal uniformity during a typical 90-second PEB processing cycle, rather than being optimized for steady state performance. This was accomplished with the aid of a wireless temperature measurement wafer system for obtaining real time temperature data and by using a response surface model (RSM) experimental design for optimizing parameters of the temperature controller of the thermal processing system. The new units were field retrofitted seamlessly in <2 days at customer sites without disruption to process recipes or flows. After evaluating certain resist parameters, such as PEB temperature sensitivity and post exposure delay (PED) stability of the baseline process, the new units were benchmarked against the previous PEB plates by processing a split lot experiment. Additional hardware characterization included environmental factors such as air velocity in the vicinity of the PEB plates and transient time between PEB and chill plate. At the completion of the optimization process, the within-wafer CD uniformity displayed a significant improvement when compared to the previous hardware. The demonstrated within-wafer CD uniformity improved by 27% compared to the initial hardware and baseline process. ITRS requirements for the 90 nm node were exceeded.
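
The optimization relied on a response surface model (RSM) experimental design. In the simplest one-factor case, an RSM is a quadratic fit whose stationary point gives the candidate optimum setting. The Python sketch below uses invented data for a single coded controller factor; the actual study optimized multiple temperature-controller parameters at once:

```python
# Invented one-factor RSM data: coded controller setting vs. CD non-uniformity (nm)
xs = [-1.0, -0.5, 0.0, 0.5, 1.0]
ys = [3.1, 2.2, 1.9, 2.3, 3.0]

n = len(xs)
ybar = sum(ys) / n
# The design is symmetric (sum x = 0, sum x^3 = 0), so the linear and
# quadratic coefficients of y = b0 + b1*x + b2*x^2 decouple:
b1 = sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)
z = [x * x for x in xs]
zbar = sum(z) / n
b2 = sum((zi - zbar) * (y - ybar) for zi, y in zip(z, ys)) / sum(
    (zi - zbar) ** 2 for zi in z
)
x_opt = -b1 / (2.0 * b2)   # stationary point: the predicted best setting
```

With a positive quadratic coefficient the fitted surface is a bowl, and `x_opt` lands near the center of the tested range, which is the kind of result an RSM run on a well-centered baseline would be expected to give.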

  6. Hydrothermal processing of Hanford tank wastes: Process modeling and control

    SciTech Connect

    Currier, R.P.

    1994-10-01

    In the Los Alamos National Laboratory (LANL) hydrothermal process, waste streams are first pressurized and heated as they pass through a continuous flow tubular reactor vessel. The waste is maintained at reaction temperature of 300--550 C where organic destruction and sludge reformation occur. This report documents LANL activities in process modeling and control undertaken in FY94 to support hydrothermal process development. Key issues discussed include non-ideal flow patterns (e.g. axial dispersion) and their effect on reactor performance, the use and interpretation of inert tracer experiments, and the use of computational fluid mechanics to evaluate novel hydrothermal reactor designs. In addition, the effects of axial dispersion (and simplifications to rate expressions) on the estimated kinetic parameters are explored by non-linear regression to experimental data. Safety-related calculations are reported which estimate the explosion limits of effluent gases and the fate of hydrogen as it passes through the reactor. Development and numerical solution of a generalized one-dimensional mathematical model is also summarized. The difficulties encountered in using commercially available software to correlate the behavior of high temperature, high pressure aqueous electrolyte mixtures are summarized. Finally, details of the control system and experiments conducted to empirically determine the system response are reported.
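
To see why axial dispersion matters for reactor performance, a standard textbook surrogate is the tanks-in-series model: a single stirred tank represents strong dispersion, while many tanks approach plug flow. The Python sketch below (with a hypothetical Damköhler number, not the LANL model) shows conversion of a first-order reaction climbing toward the plug-flow limit as dispersion decreases:

```python
import math

def tanks_in_series_conversion(damkohler, n_tanks):
    """First-order conversion in N equal stirred tanks in series.
    N = 1 behaves like a fully mixed reactor (strong axial dispersion);
    large N approaches the plug-flow limit 1 - exp(-Da)."""
    return 1.0 - 1.0 / (1.0 + damkohler / n_tanks) ** n_tanks

da = 2.0  # hypothetical Damkohler number: rate constant * residence time
conversions = {n: tanks_in_series_conversion(da, n) for n in (1, 2, 5, 50)}
plug_flow = 1.0 - math.exp(-da)
```

The gap between `conversions[1]` and `plug_flow` is exactly the kind of performance penalty that inert tracer experiments and dispersion modeling are used to quantify.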

  7. Instructional Design Models: What a Revolution!

    ERIC Educational Resources Information Center

    Faryadi, Qais

    2007-01-01

This review examines instructional design models and the construction of knowledge. It further seeks to identify the benefits of these models for the inputs and outputs of knowledge transfer. This assessment also attempts to define instructional design models through the eyes and minds of renowned scholars as well as the most…

  8. The PIC [Process Individualization Curriculum] Model: Structure with Humanistic Goals.

    ERIC Educational Resources Information Center

    Gow, Doris T.

This paper describes a curriculum design model used to train research and development personnel under USOE-NIE funding. This design model, called PIC (Process Individualization Curriculum), was chosen for converting on-campus courses to extra-mural self-instructional courses. The curriculum specialists who work with professors to individualize their…

  9. Development and validation of a building design waste reduction model.

    PubMed

    Llatas, C; Osmani, M

    2016-10-01

Reduction in construction waste is a pressing need in many countries. The design of building elements is considered a pivotal process for achieving waste reduction at source, as it enables an informed prediction of wastage reduction levels. However, the lack of quantitative methods linking design strategies to waste reduction hinders designing-out-waste practice in building projects. Therefore, this paper addresses this knowledge gap through the design and validation of a Building Design Waste Reduction Strategies (Waste ReSt) model that aims to investigate the relationships between design variables and their impact on onsite waste reduction. The Waste ReSt model was validated in a real-world case study involving 20 residential buildings in Spain. The validation process comprises three stages. Firstly, design waste causes were analyzed. Secondly, design strategies were applied, leading to several alternative low-waste building elements. Finally, their potential source reduction levels were quantified and discussed within the context of the literature. The Waste ReSt model could serve as an instrumental tool to simulate designing-out strategies in building projects. The knowledge provided by the model could help project stakeholders to better understand the correlation between the design process and waste sources and subsequently implement design practices for low-waste buildings.

  11. NUMERICAL MODELING OF FINE SEDIMENT PHYSICAL PROCESSES.

    USGS Publications Warehouse

    Schoellhamer, David H.

    1985-01-01

Fine sediment in channels, rivers, estuaries, and coastal waters undergoes several physical processes, including flocculation, floc disruption, deposition, bed consolidation, and resuspension. This paper presents a conceptual model and reviews mathematical models of these physical processes. Several general fine sediment models that simulate some of these processes are reviewed. These general models do not directly simulate flocculation and floc disruption, but the conceptual model and existing functions are shown to adequately model these two processes for one set of laboratory data.
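
The deposition and resuspension processes reviewed here are commonly modeled with Krone-type deposition and Partheniades-type erosion laws, where bed shear stress relative to critical thresholds sets the flux direction. The sketch below is a generic illustration with invented parameter values, not the paper's model:

```python
def bed_exchange_flux(conc, tau, ws=5e-4, tau_d=0.08, tau_e=0.2, m=5e-5):
    """Net sediment flux at the bed (kg/m^2/s), positive toward the bed.
    conc: near-bed concentration (kg/m^3); tau: bed shear stress (Pa).
    Krone-type deposition below tau_d; Partheniades-type erosion above tau_e.
    All parameter defaults are invented illustration values."""
    if tau < tau_d:
        return ws * conc * (1.0 - tau / tau_d)   # deposition
    if tau > tau_e:
        return -m * (tau / tau_e - 1.0)          # erosion (resuspension)
    return 0.0                                   # neither threshold exceeded
```

The dead band between the two critical stresses, where neither deposition nor erosion occurs, is a common feature of such threshold formulations.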

  12. Micromachined accelerometer design, modeling and validation

    SciTech Connect

    Davies, B.R.; Bateman, V.I.; Brown, F.A.; Montague, S.; Murray, J.R.; Rey, D.; Smith, J.H.

    1998-04-01

Micromachining technologies enable the development of low-cost devices capable of sensing motion in a reliable and accurate manner. The development of various surface micromachined accelerometers and gyroscopes to sense motion is an ongoing activity at Sandia National Laboratories. In addition, Sandia has developed a fabrication process for integrating both the micromechanical structures and microelectronics circuitry of Micro-Electro-Mechanical Systems (MEMS) on the same chip. This integrated surface micromachining process provides substantial performance and reliability advantages in the development of MEMS accelerometers and gyros. A Sandia MEMS team developed a single-axis, micromachined silicon accelerometer capable of surviving and measuring very high accelerations, up to 50,000 times the acceleration due to gravity, or 50 k-G (actually measured to 46,000 G). The Sandia integrated surface micromachining process was selected for fabrication of the sensor due to the extreme measurement sensitivity potential associated with integrated microelectronics. Measurement electronics capable of measuring attofarad (10^-18 Farad) changes in capacitance were required due to the very small accelerometer proof mass (< 200 x 10^-9 gram) used in this surface micromachining process. The small proof mass corresponded to small sensor deflections, which in turn required very sensitive electronics to enable accurate acceleration measurement over a range of 1 to 50 k-G. A prototype sensor, based on a suspended plate mass configuration, was developed and the details of the design, modeling, and validation of the device will be presented in this paper. The device was analyzed using both conventional lumped parameter modeling techniques and finite element analysis tools. The device was tested and performed well over its design range.
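
The coupling the abstract describes, a sub-200-nanogram proof mass producing attofarad-scale capacitance changes, follows directly from a lumped mass-spring-capacitor model. The Python sketch below uses a hypothetical spring constant, sense gap, and plate area (only the proof-mass scale is taken from the abstract):

```python
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def parallel_plate_delta_c(mass, accel, k_spring, gap, area):
    """Lumped-parameter model: static proof-mass deflection x = m*a/k and the
    resulting change in a parallel-plate sense capacitance."""
    x = mass * accel / k_spring
    c0 = EPS0 * area / gap
    c1 = EPS0 * area / (gap - x)
    return x, c1 - c0

g = 9.81
m = 200e-12      # 200 nanogram proof mass, in kg (order of the abstract's figure)
k = 100.0        # spring constant, N/m (assumed)
gap = 2e-6       # 2 micron sense gap (assumed)
area = 1e-8      # 100 um x 100 um plate (assumed)

x1, dc1 = parallel_plate_delta_c(m, 1 * g, k, gap, area)           # 1 G input
x50k, dc50k = parallel_plate_delta_c(m, 50_000 * g, k, gap, area)  # 50 k-G input
```

Even with these rough assumed values, the 1 G capacitance change comes out below a femtofarad, which is consistent with the abstract's point that attofarad-resolution electronics are needed at the low end of the range.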

  13. Modeling of pulverized coal combustion processes in a vortex furnace of improved design. Part 2: Combustion of brown coal from the Kansk-Achinsk Basin in a vortex furnace

    NASA Astrophysics Data System (ADS)

    Krasinsky, D. V.; Salomatov, V. V.; Anufriev, I. S.; Sharypov, O. V.; Shadrin, E. Yu.; Anikin, Yu. A.

    2015-03-01

    This paper continues with the description of study results for an improved-design steam boiler vortex furnace, for the full-scale configuration of which the numerical modeling of a three-dimensional turbulent two-phase reacting flow has been performed with allowance for all the principal heat and mass transfer processes in the torch combustion of pulverized Berezovsk brown coal from the Kansk-Achinsk Basin. The detailed distributions of velocity, temperature, concentration, and heat flux fields in different cross sections of the improved vortex furnace have been obtained. The principal thermoengineering and environmental characteristics of this furnace are given.

  14. Hotspot detection and design recommendation using silicon calibrated CMP model

    NASA Astrophysics Data System (ADS)

    Hui, Colin; Wang, Xian Bin; Huang, Haigou; Katakamsetty, Ushasree; Economikos, Laertis; Fayaz, Mohammed; Greco, Stephen; Hua, Xiang; Jayathi, Subramanian; Yuan, Chi-Min; Li, Song; Mehrotra, Vikas; Chen, Kuang Han; Gbondo-Tugbawa, Tamba; Smith, Taber

    2009-03-01

Chemical Mechanical Polishing (CMP) has been used in the manufacturing of the copper (Cu) damascene process. It is well known that dishing and erosion occur during the CMP process, and that they strongly depend on metal density and line width. The inherent thickness and topography variations become an increasing concern for today's designs running through advanced process nodes (sub 65nm). Excessive thickness and topography variations can have major impacts on chip yield and performance; as such, they need to be accounted for during the design stage. In this paper, we will demonstrate an accurate physics-based CMP model and its application for CMP-related hotspot detection. Model-based checking is most useful for identifying highly environment-sensitive layouts that are prone to early process-window limitation and hence failure. Model-based checking, as opposed to rule-based checking, can identify more accurately the weak points in a design and enable designers to provide improved layout for the areas with the highest leverage for manufacturability improvement. Further, CMP modeling has the ability to provide information on interlevel effects, such as copper puddling from underlying topography, that cannot be captured in Design-for-Manufacturing (DfM) recommended rules. The model has been calibrated against the silicon produced with the 45nm process from Common Platform (IBM-Chartered-Samsung) technology. It is one of the earliest 45nm CMP models available today. We will show that the CMP-related hotspots can often occur around the spaces between analog macros and digital blocks in SoC designs. With the help of the CMP model-based prediction, the design, the dummy fill, or the placement of the blocks can be modified to improve planarity and eliminate CMP-related hotspots. The CMP model can be used to pass design recommendations to designers to improve chip yield and performance.

  15. Automating the design process - Progress, problems, prospects, potential.

    NASA Technical Reports Server (NTRS)

    Heldenfels, R. R.

    1973-01-01

    The design process for large aerospace vehicles is discussed, with particular emphasis on structural design. Problems with current procedures are identified. Then, the contributions possible from automating the design process (defined as the best combination of men and computers) are considered. Progress toward automated design in the aerospace and other communities is reviewed, including NASA studies of the potential development of Integrated Programs for Aerospace-Vehicle Design (IPAD). The need for and suggested directions of future research on the design process, both technical and social, are discussed. Although much progress has been made to exploit the computer in design, it is concluded that technology is available to begin using the computer to speed communications and management as well as calculations in the design process and thus build man-computer teams that can design better, faster and cheaper.

  16. Designing control system information models

    NASA Technical Reports Server (NTRS)

    Panin, K. I.; Zinchenko, V. P.

    1973-01-01

Problems encountered in designing information models are discussed. The data cover the condition and functioning of the object of control and the environment involved in the control. Other parameters needed for the model include: (1) information for forming an image of the real situation, (2) data for analyzing and evaluating an evolving situation, (3) planning actions, and (4) data for observing and evaluating the results of model realization.

  17. Designing Educative Curriculum Materials: A Theoretically and Empirically Driven Process

    ERIC Educational Resources Information Center

    Davis, Elizabeth A.; Palincsar, Annemarie Sullivan; Arias, Anna Maria; Bismack, Amber Schultz; Marulis, Loren M.; Iwashyna, Stefanie K.

    2014-01-01

    In this article, the authors argue for a design process in the development of educative curriculum materials that is theoretically and empirically driven. Using a design-based research approach, they describe their design process for incorporating educative features intended to promote teacher learning into existing, high-quality curriculum…

  18. Knowledge and Processes in Design. DPS Final Report.

    ERIC Educational Resources Information Center

    Pirolli, Peter

    Four papers from a project concerning information-processing characterizations of the knowledge and processes involved in design are presented. The project collected and analyzed verbal protocols from instructional designers, architects, and mechanical engineers. A framework was developed for characterizing the problem spaces of design that…

  19. VCM Process Design: An ABET 2000 Fully Compliant Project

    ERIC Educational Resources Information Center

    Benyahia, Farid

    2005-01-01

    A long experience in undergraduate vinyl chloride monomer (VCM) process design projects is shared in this paper. The VCM process design is shown to be fully compliant with ABET 2000 criteria by virtue of its abundance in chemical engineering principles, integration of interpersonal and interdisciplinary skills in design, safety, economics, and…

  20. Learning Objects: A User-Centered Design Process

    ERIC Educational Resources Information Center

    Branon, Rovy F., III

    2011-01-01

Design research systematically creates or improves processes, products, and programs through an iterative progression connecting practice and theory (Reinking, 2008; van den Akker, 2006). Developing a new instructional systems design (ISD) process through design research is necessary when new technologies emerge that challenge existing practices…

  1. Designing user models in a virtual cave environment

    SciTech Connect

    Brown-VanHoozer, S.; Hudson, R.; Gokhale, N.

    1995-12-31

In this paper, the results of a first study into the use of virtual reality for human factor studies and design of simple and complex models of control systems, components, and processes are described. The objective was to design a model in a virtual environment that would reflect more characteristics of the user's mental model of a system and fewer of the designer's. The technology of a CAVE(TM) virtual environment and the methodology of Neuro Linguistic Programming were employed in this study.

  2. Computational models of natural language processing

    SciTech Connect

    Bara, B.G.; Guida, G.

    1984-01-01

    The main concern in this work is the illustration of models for natural language processing, and the discussion of their role in the development of computational studies of language. Topics covered include the following: competence and performance in the design of natural language systems; planning and understanding speech acts by interpersonal games; a framework for integrating syntax and semantics; knowledge representation and natural language: extending the expressive power of proposition nodes; viewing parsing as word sense discrimination: a connectionist approach; a propositional language for text representation; from topic and focus of a sentence to linking in a text; language generation by computer; understanding the Chinese language; semantic primitives or meaning postulates: mental models of propositional representations; narrative complexity based on summarization algorithms; using focus to constrain language generation; and towards an integral model of language competence.

  3. Optimality criteria design and stress constraint processing

    NASA Technical Reports Server (NTRS)

    Levy, R.

    1982-01-01

    Methods for pre-screening stress constraints into either primary or side-constraint categories are reviewed; a projection method, which is developed from prior cycle stress resultant history, is introduced as an additional screening parameter. Stress resultant projections are also employed to modify the traditional stress-ratio, side-constraint boundary. A special application of structural modification reanalysis is applied to the critical stress constraints to provide feasible designs that are preferable to those obtained by conventional scaling. Sample problem executions show relatively short run times and fewer design cycle iterations to achieve low structural weights; those attained are comparable to the minimum values developed elsewhere.
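
The "traditional stress-ratio" resizing the abstract refers to is the classic fully-stressed-design update, in which each member's cross-sectional area is scaled by the ratio of its current stress to the allowable stress. A minimal generic sketch (the two-member example is invented):

```python
def stress_ratio_resize(areas, stresses, allowable):
    """Fully-stressed-design update: scale each member area by its stress
    ratio so that member stresses are driven toward the allowable value."""
    return [a * (s / allowable) for a, s in zip(areas, stresses)]

# Invented two-member example: member 1 is already fully stressed and keeps
# its area; member 2 is understressed and gets its area halved
new_areas = stress_ratio_resize([1.0, 2.0], [100.0, 50.0], allowable=100.0)
```

In practice the update is iterated, since resizing one member redistributes load to the others; the paper's projection and reanalysis techniques aim to reduce how many such cycles are needed.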

  4. Erlang Behaviours: Programming with Process Design Patterns

    NASA Astrophysics Data System (ADS)

    Cesarini, Francesco; Thompson, Simon

    Erlang processes run independently of each other, each using separate memory and communicating with each other by message passing. These processes, while executing different code, do so following a number of common patterns. By examining different examples of Erlang-style concurrency in client/server architectures, we identify the generic and specific parts of the code and extract the generic code to form a process skeleton. In Erlang, the most commonly used patterns have been implemented in library modules, commonly referred to as OTP behaviours. They contain the generic code framework for concurrency and error handling, simplifying the complexity of concurrent programming and protecting the developer from many common pitfalls.
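
The generic/specific split behind OTP behaviours can be illustrated outside Erlang as well. The Python sketch below loosely mimics a gen_server: the `Server` class holds the generic receive loop and synchronous call plumbing, while the behaviour-specific logic is a plain callback. This is an analogy only, with none of Erlang's process isolation, error handling, or supervision:

```python
import queue
import threading

class Server:
    """A minimal gen_server-style skeleton: the generic message loop lives
    here; the specific behaviour is supplied as a handle_call callback."""

    def __init__(self, handle_call, state):
        self.inbox = queue.Queue()
        self.handle_call = handle_call
        self.state = state
        threading.Thread(target=self._loop, daemon=True).start()

    def _loop(self):
        while True:
            msg, reply_box = self.inbox.get()
            if msg == "stop":
                reply_box.put("stopped")
                return
            reply, self.state = self.handle_call(msg, self.state)
            reply_box.put(reply)

    def call(self, msg):
        # Synchronous request/reply, analogous to gen_server:call
        reply_box = queue.Queue()
        self.inbox.put((msg, reply_box))
        return reply_box.get()

# Specific part: a counter "behaviour" returning (reply, new_state)
def counter(msg, n):
    if msg == "inc":
        return n + 1, n + 1
    return n, n  # any other message reads the current count

s = Server(counter, 0)
```

Everything in `Server` is reusable across behaviours; only `counter` would change for a different service, which is exactly the code split the OTP library modules formalize.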

  5. Development of a dynamic thermal model process

    SciTech Connect

    Smith, F. R.

    1996-04-01

A dynamic electrical-thermal modeling simulation technique was developed to allow up-front design of thermal and electronic packaging with a high degree of accuracy and confidence. We are developing a hybrid multichip module output driver built around power MOSFET driver circuits. These MOSFET circuits will dissipate from 13 to 26 watts per driver in a physical package less than two square inches. The power dissipation, plus an operating temperature range of -55 °C to 100 °C, makes an accurate thermal package design critical. The project goal was to develop a simulation process to dynamically model the electrical/thermal characteristics of the power MOSFETs using the SABER analog simulator and the ABAQUS finite element simulator. SABER would simulate the electrical characteristics of the multi-chip module design while co-simulation is done with ABAQUS simulating the solid-model thermal characteristics of the MOSFET package. The dynamic parameters, MOSFET power and chip temperature, would be actively passed between simulators to effect a coupled-simulator modeling technique. The project required development of a SABER template for the analog ASIC controller circuit, a dynamic electrical/thermal template for the IRF150 and IRF9130 power MOSFETs, a solid model of the multi-chip module package, FORTRAN code to handle I/O between an HP755 workstation and SABER, and I/O between a CRAY J90 computer and ABAQUS. The simulation model was certified by measured electrical characteristics of the circuits and real-time thermal imaging of the output multichip module.

  6. Understanding the Processes behind Student Designing: Cases from Singapore

    ERIC Educational Resources Information Center

    Lim, Susan Siok Hiang; Lim-Ratnam, Christina; Atencio, Matthew

    2013-01-01

    A common perception of designing is that it represents a highly complex activity that is manageable by only a few. However it has also been argued that all individuals are innately capable of designing. Taking up this latter view, we explored the processes behind student designing in the context of Design and Technology (D&T), a subject taught at…

  7. Processes and Knowledge in Designing Instruction.

    ERIC Educational Resources Information Center

    Greeno, James G.; And Others

    Results from a study of problem solving in the domain of instructional design are presented. Subjects were eight teacher trainees who were recent graduates of or were enrolled in the Stanford Teacher Education Program at Stanford University (California). Subjects studied a computer-based tutorial--the VST2000--about a fictitious vehicle. The…

  8. Model medication management process in Australian nursing homes using business process modeling.

    PubMed

    Qian, Siyu; Yu, Ping

    2013-01-01

One of the reasons for end-user avoidance or rejection of health information systems is poor alignment of the system with the healthcare workflow, likely caused by system designers' lack of thorough understanding of the healthcare process. Therefore, understanding the healthcare workflow is the essential first step in the design of optimal technologies that will enable care staff to complete the intended tasks faster and better. The frequent use of multiple or "high risk" medicines by older people in nursing homes has the potential to increase the medication error rate. To facilitate the design of information systems with the most potential to improve patient safety, this study aims to understand the medication management process in nursing homes using a business process modeling method. The paper presents the study design and preliminary findings from interviews with two registered nurses, who were team leaders in two nursing homes. Although there were subtle differences in medication management between the two homes, the major medication management activities were similar. Further field observation will be conducted. Based on the data collected from observations, an as-is process model for medication management will be developed.

  10. Biochemical Engineering. Part II: Process Design

    ERIC Educational Resources Information Center

    Atkinson, B.

    1972-01-01

    Describes types of industrial techniques involving biochemical products, specifying the advantages and disadvantages of batch and continuous processes, and contrasting biochemical and chemical engineering. See SE 506 318 for Part I. (AL)

  11. Reducing Design Cycle Time and Cost Through Process Resequencing

    NASA Technical Reports Server (NTRS)

    Rogers, James L.

    2004-01-01

    In today's competitive environment, companies are under enormous pressure to reduce the time and cost of their design cycle. One method for reducing both time and cost is to develop an understanding of the flow of the design processes and the effects of the iterative subcycles that are found in complex design projects. Once these aspects are understood, the design manager can make decisions that take advantage of decomposition, concurrent engineering, and parallel processing techniques to reduce the total time and the total cost of the design cycle. One software tool that can aid in this decision-making process is the Design Manager's Aid for Intelligent Decomposition (DeMAID). The DeMAID software minimizes the feedback couplings that create iterative subcycles, groups processes into iterative subcycles, and decomposes the subcycles into a hierarchical structure. The real benefits of producing the best design in the least time and at a minimum cost are obtained from sequencing the processes in the subcycles.
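    The resequencing that DeMAID performs on a design structure matrix (DSM) can be sketched in a few lines. The task names, dependencies, and exhaustive search below are illustrative stand-ins, not DeMAID's actual data or algorithm:

```python
# Sketch of design-structure-matrix (DSM) resequencing: reorder tasks so that
# as few dependencies as possible point "upstream" (feedback couplings).
# Hypothetical tasks; exhaustive search stands in for DeMAID's heuristics.
from itertools import permutations

# deps[t] = set of tasks whose output task t consumes
deps = {
    "loads":    {"geometry"},
    "geometry": set(),
    "stress":   {"loads", "sizing"},   # stress needs sizing -> feedback
    "sizing":   {"stress"},            # sizing needs stress -> iterative subcycle
}

def feedback_count(order):
    """Count dependencies that point at a task scheduled later (feedback)."""
    pos = {t: i for i, t in enumerate(order)}
    return sum(1 for t in order for d in deps[t] if pos[d] > pos[t])

# Exhaustive search is fine for a handful of tasks; real tools scale better.
best = min(permutations(deps), key=feedback_count)
print(best, feedback_count(best))
```

    The mutually dependent stress/sizing pair cannot be fully untangled, so the best ordering still contains one feedback coupling; that remaining coupling identifies the iterative subcycle a design manager would group and decompose.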

  12. NASA Now: Engineering Design Process: Hubble Space Telescope

    NASA Video Gallery

    In this episode of NASA Now, NASA engineer Russ Werneth discusses the continuous nature of the engineering design process and shares what it was like to design and plan the spacewalks that were key...

  13. Estimation of Environment-Related Properties of Chemicals for Design of Sustainable Processes: Development of Group-Contribution+ (GC+) Property Models and Uncertainty Analysis

    EPA Science Inventory

    The aim of this work is to develop group-contribution+ (GC+) method (combined group-contribution (GC) method and atom connectivity index (CI) method) based property models to provide reliable estimations of environment-related properties of organic chemicals together with uncert...
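    The group-contribution idea behind such property models can be sketched briefly. The groups, contribution values, and linking function below are illustrative placeholders, not the GC+ parameters developed in this work:

```python
# Minimal group-contribution (GC) property estimate: a property is modeled as
# a function of summed group contributions, property = f(sum_i n_i * c_i).
# Group set, coefficients, and f are hypothetical, for illustration only.
contrib = {"CH3": 0.8, "CH2": 0.5, "OH": 2.4}   # hypothetical contributions

def gc_estimate(groups, f=lambda s: 100.0 + 20.0 * s):
    """groups: {group_name: count}; returns the modeled property value."""
    s = sum(n * contrib[g] for g, n in groups.items())
    return f(s)

# Ethanol decomposed as CH3 + CH2 + OH:
print(gc_estimate({"CH3": 1, "CH2": 1, "OH": 1}))  # ~174.0
```

    Real GC(+) methods use regressed first- and higher-order group parameters and report the estimation uncertainty alongside the value, which is the focus of the work above.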

  14. Integrating Thermal Tools Into the Mechanical Design Process

    NASA Technical Reports Server (NTRS)

    Tsuyuki, Glenn T.; Siebes, Georg; Novak, Keith S.; Kinsella, Gary M.

    1999-01-01

    The intent of mechanical design is to deliver a hardware product that meets or exceeds customer expectations, while reducing cycle time and cost. To this end, an integrated mechanical design process enables parallel development (concurrent engineering). This represents a shift from the traditional mechanical design process. With such a concurrent process, there are significant issues that have to be identified and addressed before re-engineering the mechanical design process to facilitate concurrent engineering. Addressing these issues also assists in the integration and re-engineering of the thermal design sub-process, since it resides within the entire mechanical design process. With these issues in mind, a thermal design sub-process can be re-defined in a manner that has a higher probability of acceptance, thus enabling an integrated mechanical design process. However, the actual implementation is not always problem-free. Experience in applying the thermal design sub-process to actual situations provides the evidence for improvement but, more importantly, for judging the viability and feasibility of the sub-process.

  15. Conceptual design of distillation-based hybrid separation processes.

    PubMed

    Skiborowski, Mirko; Harwardt, Andreas; Marquardt, Wolfgang

    2013-01-01

    Hybrid separation processes combine different separation principles and constitute a promising design option for the separation of complex mixtures. Particularly, the integration of distillation with other unit operations can significantly improve the separation of close-boiling or azeotropic mixtures. Although the design of single-unit operations is well understood and supported by computational methods, the optimal design of flowsheets of hybrid separation processes is still a challenging task. The large number of operational and design degrees of freedom requires a systematic and optimization-based design approach. To this end, a structured approach, the so-called process synthesis framework, is proposed. This article reviews available computational methods for the conceptual design of distillation-based hybrid processes for the separation of liquid mixtures. Open problems are identified that must be addressed to finally establish a structured process synthesis framework for such processes.

  16. The application of image processing software: Photoshop in environmental design

    NASA Astrophysics Data System (ADS)

    Dong, Baohua; Zhang, Chunmi; Zhuo, Chen

    2011-02-01

    In the process of environmental design and creation, the design sketch holds a very important position: it not only illuminates the design's idea and concept but also shows the design's visual effects to the client. In the field of environmental design, computer-aided design has brought significant improvement. Many types of specialized software for producing environmental design drawings and for post-production artistic processing have been implemented. Additionally, with the use of this software, working efficiency has greatly increased and drawings have become more specific and more specialized. By analyzing the application of Photoshop image processing software in environmental design, and by comparing traditional hand drawing with drawing using modern technology, this essay explores how computer technology can play a bigger role in environmental design.

  17. Enhancing the support of interdisciplinary product design by using design object-oriented modelling

    SciTech Connect

    Yan, Xiu-Tian; MacCallum, K.J.

    1996-12-31

    This paper addresses the modelling difficulties faced by a designer in the development of new interdisciplinary products and describes a novel approach to tackling these problems. The underlying methodology is based on object-oriented technology. A unified design-object-based representation and modelling method is proposed to extend the scope of product modelling to many phases of the design process, and to simplify the modelling complexity associated with the increased number of modelling methods directly adopted from individual disciplines. This unified modelling representation encompasses several low-level modelling methods, and it can integrate the dynamic energy and information system modelling of mechatronic products. These design object models will greatly assist designers, especially those who work on the development of interdisciplinary products such as mechatronic systems. The paper concludes that a design object-oriented modelling method based on a hybrid representation encompassing bond graph notation, block diagrams, and Yourdon diagrams is desirable and feasible for mechatronic system design and modelling.

  18. Cupola Furnace Computer Process Model

    SciTech Connect

    Seymour Katz

    2004-12-31

    The cupola furnace generates more than 50% of the liquid iron used to produce the 9+ million tons of castings annually. The cupola converts iron and steel into cast iron. The main advantages of the cupola furnace are lower energy costs than those of competing furnaces (electric) and the ability to melt less expensive metallic scrap than the competing furnaces. However, the chemical and physical processes that take place in the cupola furnace are highly complex, making it difficult to operate the furnace in an optimal fashion. The results are low energy efficiency and poor recovery of important and expensive alloy elements due to oxidation. Between 1990 and 2004, under the auspices of the Department of Energy, the American Foundry Society, and General Motors Corp., a computer simulation of the cupola furnace was developed that accurately describes the complex behavior of the furnace. When provided with the furnace input conditions, the model provides accurate values of the output conditions in a matter of seconds. It also provides key diagnostics. Using clues from the diagnostics, a trained specialist can infer changes in the operation that will move the system toward higher efficiency. Repeating the process in an iterative fashion leads to near-optimum operating conditions within just a few iterations. More advanced uses of the program have been examined. The program is currently being combined with an "expert system" to permit optimization in real time. The program has been combined with "neural network" programs to effect very easy scanning of a wide range of furnace operations. Rudimentary efforts were successfully made to operate the furnace using a computer. References to these more advanced systems can be found in the "Cupola Handbook," Chapter 27, American Foundry Society, Des Plaines, IL (1999).

  19. Process-Response Modeling and the Scientific Process.

    ERIC Educational Resources Information Center

    Fichter, Lynn S.

    1988-01-01

    Discusses the process-response model (PRM) in its theoretical and practical forms. Describes how geologists attempt to reconstruct the process from the response (the geologic phenomenon) being studied. (TW)

  20. Organizational Learning and Product Design Management: Towards a Theoretical Model.

    ERIC Educational Resources Information Center

    Chiva-Gomez, Ricardo; Camison-Zornoza, Cesar; Lapiedra-Alcami, Rafael

    2003-01-01

    Case studies of four Spanish ceramics companies were used to construct a theoretical model of 14 factors essential to organizational learning. One set of factors is related to the conceptual-analytical phase of the product design process and the other to the creative-technical phase. All factors contributed to efficient product design management…

  1. CAD tool environment for MEMS process design support

    NASA Astrophysics Data System (ADS)

    Schmidt, T.; Wagener, A.; Popp, J.; Hahn, K.; Bruck, R.

    2005-07-01

    MEMS fabrication processes are characterized by numerous usable process steps, materials, and effects for fabricating the intended microstructure. Up to now, CAD support in this domain has concentrated mainly on structural design (e.g., simulation programs on an FEM basis). These tools often assume fixed interfaces to the fabrication process, such as material parameters or design rules. Taking into account that MEMS design requires structural design (defining the lateral 2-dimensional shapes) concurrently with process design (responsible for the third dimension), it turns out that technology interfaces consisting only of sets of static data are no longer sufficient. For successful design flows in these areas, it is necessary to incorporate a higher degree of process-related data. A broader interface between process configuration on the one side and application design on the other is needed. This paper proposes a novel approach: a process management system that allows the specification of processes for specific applications. The system is based on a dedicated database environment that is able to store and manage all process-related design constraints linked to the fabrication process data itself. The interdependencies between application-specific processes and all stages of the design flow are discussed, and the complete software system PRINCE is introduced, meeting the requirements of this new approach. Based on a concurrent design methodology presented at the beginning of this paper, a system is presented that supports application-specific process design. The paper highlights the incorporated tools and the present status of the software system. A complete configuration of an Si thin-film process example demonstrates the usage of PRINCE.

  2. Computer Modeling and Visualization in Design Technology: An Instructional Model.

    ERIC Educational Resources Information Center

    Guidera, Stan

    2002-01-01

    Design visualization can increase awareness of issues related to perceptual and psychological aspects of design that computer-assisted design and computer modeling may not allow. A pilot university course developed core skills in modeling and simulation using visualization. Students were consistently able to meet course objectives. (Contains 16…

  3. Algorithmic Processes for Increasing Design Efficiency.

    ERIC Educational Resources Information Center

    Terrell, William R.

    1983-01-01

    Discusses the role of algorithmic processes as a supplementary method for producing cost-effective and efficient instructional materials. Examines three approaches to problem solving in the context of developing training materials for the Naval Training Command: application of algorithms, quasi-algorithms, and heuristics. (EAO)

  4. Adding Users to the Website Design Process

    ERIC Educational Resources Information Center

    Tomeo, Megan L.

    2012-01-01

    Alden Library began redesigning its website over a year ago. Throughout the redesign process the students, faculty, and staff that make up the user base were added to the conversation by utilizing several usability test methods. This article focuses on the usability testing conducted at Alden Library and delves into future usability testing, which…

  5. Process modeling and industrial energy use

    SciTech Connect

    Howe, S O; Pilati, D A; Sparrow, F T

    1980-11-01

    How the process models developed at BNL are used to analyze industrial energy use is described and illustrated. Following a brief overview of the industry modeling program, the general methodology of process modeling is discussed. The discussion highlights the important concepts, contents, inputs, and outputs of a typical process model. A model of the US pulp and paper industry is then discussed as a specific application of process modeling methodology. Case study results from the pulp and paper model illustrate how process models can be used to analyze a variety of issues. Applications addressed with the case study results include projections of energy demand, conservation technology assessment, energy-related tax policies, and sensitivity analysis. A subsequent discussion of these results supports the conclusion that industry process models are versatile and powerful tools for energy end-use modeling and conservation analysis. Information on the current status of industry models at BNL is tabulated.

  6. Computer-based creativity enhanced conceptual design model for non-routine design of mechanical systems

    NASA Astrophysics Data System (ADS)

    Li, Yutong; Wang, Yuxin; Duffy, Alex H. B.

    2014-11-01

    Computer-based conceptual design for routine design has made great strides, yet non-routine design has not been given due attention and is still poorly automated. Considering that the function-behavior-structure (FBS) model is widely used for modeling the conceptual design process, a computer-based creativity-enhanced conceptual design model (CECD) for non-routine design of mechanical systems is presented. In the model, the leaf functions in the FBS model are decomposed into and represented with fine-grain basic operation actions (BOA), and the corresponding BOA set in the function domain is then constructed. Choosing building blocks from the database and expressing their multiple functions with BOAs, the BOA set in the structure domain is formed. Through rule-based dynamic partition of the BOA set in the function domain, many variants of regenerated functional schemes are generated. To enhance the capability to introduce new design variables into the conceptual design process and to dig out more innovative physical structure schemes, an indirect function-structure matching strategy based on reconstructing combined structure schemes is adopted. By adjusting the tightness of the partition rules and the granularity of the divided BOA subsets, and by making full use of the main and secondary functions of each basic structure when reconstructing the physical structures, new design variables and variants are introduced into the reconstruction process, and a great number of simpler physical structure schemes that organically accomplish the overall function are figured out. The creativity-enhanced conceptual design model presented has a strong capability for introducing new design variables in the function domain and digging out simpler physical structures that accomplish the overall function; therefore it can be utilized to solve non-routine conceptual design problems.

  7. Computer modeling of complete IC fabrication process

    NASA Astrophysics Data System (ADS)

    Dutton, Robert W.

    1987-05-01

    The development of fundamental algorithms for process and device modeling as well as novel integration of the tools for advanced Integrated Circuit (IC) technology design is discussed. The development of the first complete 2D process simulator, SUPREM 4, is reported. The algorithms are discussed as well as application to local-oxidation and extrinsic diffusion conditions which occur in CMOS AND BiCMOS technologies. The evolution of 1D (SEDAN) and 2D (PISCES) device analysis is discussed. The application of SEDAN to a variety of non-silicon technologies (GaAs and HgCdTe) are considered. A new multi-window analysis capability for PISCES which exploits Monte Carlo analysis of hot carriers has been demonstrated and used to characterize a variety of silicon MOSFET and GaAs MESFET effects. A parallel computer implementation of PISCES has been achieved using a Hypercube architecture. The PISCES program has been used for a range of important device studies including: latchup, analog switch analysis, MOSFET capacitance studies and bipolar transient device for ECL gates. The program is broadly applicable to RAM and BiCMOS technology analysis and design. In the analog switch technology area this research effort has produced a variety of important modeling and advances.

  8. Fiber optic sensor design for chemical process and environmental monitoring

    NASA Astrophysics Data System (ADS)

    Mahendran, R. S.; Harris, D.; Wang, L.; Machavaram, V. R.; Chen, R.; Kukureka, St. N.; Fernando, G. F.

    2007-07-01

    Cure monitoring refers to tracking the cross-linking reactions in a thermosetting resin system. Advanced fiber-reinforced composites are being used increasingly in a number of industrial sectors including aerospace, marine, sport, automotive, and civil engineering. There is a general realization that the processing conditions used to manufacture composites can have a major influence on their hot-wet mechanical properties. This paper is concerned with the design and demonstration of a number of sensor designs for in-situ cure monitoring of a model thermosetting resin system. Simple fixtures were constructed to enable a pair of cleaved optical fibers with a defined gap between the end-faces to be held in position. The resin system was introduced into this gap and the cure kinetics were followed by transmission infrared spectroscopy. A semi-empirical model was used to describe the cure process using the data obtained at different cure temperatures. The same sensor system was used to detect the ingress of moisture into the cured resin system.
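    A semi-empirical cure model of the kind fitted from such spectroscopy data can be sketched as an nth-order rate law with an Arrhenius rate constant. The parameter values below are illustrative, not the fitted ones from the paper:

```python
import math

# Semi-empirical nth-order cure kinetics: d(alpha)/dt = k(T) * (1 - alpha)^n,
# with an Arrhenius rate constant k(T) = A * exp(-Ea / (R * T)).
# A, Ea, and n are hypothetical values, not parameters fitted in the paper.
A, Ea, R, n = 1.0e5, 5.0e4, 8.314, 1.5   # 1/s, J/mol, J/(mol K), -

def cure_profile(T, t_end, dt=1.0):
    """Integrate degree of cure alpha(t) at constant temperature T (kelvin)."""
    k = A * math.exp(-Ea / (R * T))
    alpha, t = 0.0, 0.0
    while t < t_end:               # simple forward-Euler integration
        alpha += k * (1.0 - alpha) ** n * dt
        t += dt
    return alpha

# A higher cure temperature gives faster conversion at equal cure time:
print(cure_profile(T=400.0, t_end=3600), cure_profile(T=420.0, t_end=3600))
```

    In practice the degree of cure would be inferred from the infrared absorbance of a reactive band, and the model parameters regressed against isothermal runs at several temperatures.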

  9. Thermoplastics as engineering materials: The mechanics, materials, design, processing link

    SciTech Connect

    Stokes, V.K.

    1995-10-01

    While the use of plastics has been growing at a significant pace because of weight reduction, ease of fabrication of complex shapes, and cost reduction resulting from function integration, the engineering applications of plastics have only become important in the past fifteen years. An inadequate understanding of the mechanics issues underlying the close coupling among the design, the processing (fabrication), and the assembly with these materials is a barrier to their use in structural applications. Recent progress on some issues relating to the engineering uses of plastics is surveyed, highlighting the need for a better understanding of plastics and how processing affects the performance of plastic parts. Topics addressed include the large deformation behavior of ductile resins, fiber orientation in chopped-fiber filled materials, structural foams, random glass mat composites, modeling of thickness distributions in blow-molded and thermoformed parts, dimensional stability (shrinkage, warpage, and residual stresses) in injection-molded parts, and welding of thermoplastics.

  10. A Model-Based Expert System For Digital Systems Design

    NASA Astrophysics Data System (ADS)

    Wu, J. G.; Ho, W. P. C.; Hu, Y. H.; Yun, D. Y. Y.; Parng, T. M.

    1987-05-01

    In this paper, we present a model-based expert system for automatic digital systems design. The goal of digital systems design is to generate a workable and efficient design from high level specifications. The formalization of the design process is a necessity for building an efficient automatic CAD system. Our approach combines model-based, heuristic best-first search, and meta-planning techniques from AI to facilitate the design process. The design process is decomposed into three subprocesses. First, the high-level behavioral specifications are translated into sequences of primitive behavioral operations. Next, primitive operations are grouped to form intermediate-level behavioral functions. Finally, structural function modules are selected to implement these functions. Using model-based reasoning on the primitive behavioral operations level extends the solution space considered in design and provides more opportunity for minimization. Heuristic best-first search and meta-planning techniques control the decision-making in the latter two subprocesses to optimize the final design. They also facilitate system maintenance by separating design strategy from design knowledge.
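    The heuristic best-first search used to steer such design decisions can be sketched generically. The toy problem and heuristic below are stand-ins for a real design search space, not the paper's system:

```python
import heapq

# Generic heuristic best-first search: always expand the most promising
# candidate according to a heuristic h. The problem instance is a toy.
def best_first(start, goal_fn, expand_fn, h):
    frontier = [(h(start), start)]     # min-heap ordered by heuristic value
    seen = {start}
    while frontier:
        _, node = heapq.heappop(frontier)
        if goal_fn(node):
            return node
        for nxt in expand_fn(node):
            if nxt not in seen:
                seen.add(nxt)
                heapq.heappush(frontier, (h(nxt), nxt))
    return None

# Toy instance: reach 13 from 1 using "+1" and "*2" moves, guided by distance.
result = best_first(1, lambda x: x == 13,
                    lambda x: [x + 1, x * 2],
                    lambda x: abs(13 - x))
print(result)  # 13
```

    In a design setting, nodes would be partial designs, the expansion function would apply design transformations, and the heuristic would estimate remaining cost, which is where separating design strategy from design knowledge pays off.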

  11. DESIGNING ENVIRONMENTALLY FRIENDLY CHEMICAL PROCESSES WITH FUGITIVE AND OPEN EMISSIONS

    EPA Science Inventory

    Designing a chemical process normally includes aspects of economic and environmental disciplines. In this work we describe methods to quickly and easily evaluate the economics and potential environmental impacts of a process, with the hydrodealkylation of toluene as an example. ...

  12. Analog modelling of obduction processes

    NASA Astrophysics Data System (ADS)

    Agard, P.; Zuo, X.; Funiciello, F.; Bellahsen, N.; Faccenna, C.; Savva, D.

    2012-04-01

    Obduction corresponds to one of plate tectonics oddities, whereby dense, oceanic rocks (ophiolites) are presumably 'thrust' on top of light, continental ones, as for the short-lived, almost synchronous Peri-Arabic obduction (which took place along thousands of km from Turkey to Oman in c. 5-10 Ma). Analog modelling experiments were performed to study the mechanisms of obduction initiation and test various triggering hypotheses (i.e., plate acceleration, slab hitting the 660 km discontinuity, ridge subduction; Agard et al., 2007). The experimental setup comprises (1) an upper mantle, modelled as a low-viscosity transparent Newtonian glucose syrup filling a rigid Plexiglas tank and (2) high-viscosity silicone plates (Rhodrosil Gomme with PDMS iron fillers to reproduce densities of continental or oceanic plates), located at the centre of the tank above the syrup to simulate the subducting and the overriding plates - and avoid friction on the sides of the tank. Convergence is simulated by pushing on a piston at one end of the model with velocities comparable to those of plate tectonics (i.e., in the range 1-10 cm/yr). The reference set-up includes, from one end to the other (~60 cm): (i) the piston, (ii) a continental margin containing a transition zone to the adjacent oceanic plate, (iii) a weakness zone with variable resistance and dip (W), (iv) an oceanic plate - with or without a spreading ridge, (v) a subduction zone (S) dipping away from the piston and (vi) an upper, active continental margin, below which the oceanic plate is being subducted at the start of the experiment (as is known to have been the case in Oman). Several configurations were tested and over thirty different parametric tests were performed. Special emphasis was placed on comparing different types of weakness zone (W) and the extent of mechanical coupling across them, particularly when plates were accelerated. Displacements, together with along-strike and across-strike internal deformation in all

  13. Ceramic processing: Experimental design and optimization

    NASA Technical Reports Server (NTRS)

    Weiser, Martin W.; Lauben, David N.; Madrid, Philip

    1992-01-01

    The objectives of this paper are to: (1) gain insight into the processing of ceramics and how green processing can affect the properties of ceramics; (2) investigate the technique of slip casting; (3) learn how heat treatment and temperature contribute to density, strength, and effects of under and over firing to ceramic properties; (4) experience some of the problems inherent in testing brittle materials and learn about the statistical nature of the strength of ceramics; (5) investigate orthogonal arrays as tools to examine the effect of many experimental parameters using a minimum number of experiments; (6) recognize appropriate uses for clay based ceramics; and (7) measure several different properties important to ceramic use and optimize them for a given application.
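    The orthogonal-array idea in objective (5) can be illustrated with the smallest two-level array, L4, which screens three factors in four runs instead of the full eight. The factor names and strength data below are hypothetical:

```python
# An L4 orthogonal array screens three two-level factors in only four runs
# instead of the full 2^3 = 8. Factor names and responses are hypothetical.
L4 = [  # columns: slip solids content, firing temperature, hold time (0/1)
    (0, 0, 0),
    (0, 1, 1),
    (1, 0, 1),
    (1, 1, 0),
]

# Balance property: every pair of columns contains each level combination once.
for i in range(3):
    for j in range(i + 1, 3):
        pairs = {(row[i], row[j]) for row in L4}
        assert pairs == {(0, 0), (0, 1), (1, 0), (1, 1)}

strength = [32.0, 41.0, 35.0, 44.0]   # hypothetical measured strengths (MPa)

def main_effect(col):
    """Main effect of a factor: mean response at level 1 minus at level 0."""
    hi = [s for row, s in zip(L4, strength) if row[col] == 1]
    lo = [s for row, s in zip(L4, strength) if row[col] == 0]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

print([main_effect(c) for c in range(3)])  # [3.0, 9.0, 0.0]
```

    The balance of the array is what makes the simple level-mean differences valid main-effect estimates; here the second (firing-temperature) factor would dominate and the third would appear negligible.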

  14. Design Oriented Structural Modeling for Airplane Conceptual Design Optimization

    NASA Technical Reports Server (NTRS)

    Livne, Eli

    1999-01-01

    The main goal of research conducted with the support of this grant was to develop design-oriented structural optimization methods for the conceptual design of airplanes. Traditionally in conceptual design, airframe weight is estimated with statistical equations developed over years of fitting airplane weight data in databases of similar existing airplanes. Utilization of such regression equations for the design of new airplanes can be justified only if the new airplanes use structural technology similar to the technology of the airplanes in those weight databases. If any new structural technology is to be pursued or any new unconventional configurations designed, the statistical weight equations cannot be used. In such cases any structural weight estimation must be based on rigorous "physics-based" structural analysis and optimization of the airframes under consideration. Work under this grant explored airframe design-oriented structural optimization techniques along two lines of research: methods based on "fast" design-oriented finite element technology, and methods based on equivalent plate / equivalent shell models of airframes, in which the vehicle is modelled as an assembly of plate and shell components, each simulating a lifting surface or nacelle / fuselage pieces. Since responses to changes in geometry are essential in the conceptual design of airplanes, as is the capability to optimize the shape itself, research supported by this grant sought to develop efficient techniques for parametrization of airplane shape and sensitivity analysis with respect to shape design variables. Toward the end of the grant period, a prototype automated structural analysis code designed to work with the NASA Aircraft Synthesis conceptual design code ACS was delivered to NASA Ames.

  15. H-Coal process and plant design

    DOEpatents

    Kydd, Paul H.; Chervenak, Michael C.; DeVaux, George R.

    1983-01-01

    A process for converting coal and other hydrocarbonaceous materials into useful and more valuable liquid products. The process comprises: feeding coal and/or other hydrocarbonaceous materials with a hydrogen-containing gas into an ebullated catalyst bed reactor; passing the reaction products from the reactor to a hot separator where the vaporous and distillate products are separated from the residuals; introducing the vaporous and distillate products from the separator directly into a hydrotreater where they are further hydrogenated; passing the residuals from the separator successively through flash vessels at reduced pressures where distillates are flashed off and combined with the vaporous and distillate products to be hydrogenated; transferring the unseparated residuals to a solids concentrating and removal means to remove a substantial portion of solids therefrom and recycling the remaining residual oil to the reactor; and passing the hydrogenated vaporous and distillate products to an atmospheric fractionator where the combined products are fractionated into separate valuable liquid products. The hydrogen-containing gas is generated from sources within the process.

  16. Integrating ergonomics in design processes: a case study within an engineering consultancy firm.

    PubMed

    Sørensen, Lene Bjerg; Broberg, Ole

    2012-01-01

    This paper reports on a case study within an engineering consultancy firm, where engineering designers and ergonomists worked together on the design of a new hospital sterile processing plant. The objective of the paper is to gain a better understanding of the premises for integrating ergonomics into engineering design processes and of how different factors either promote or limit that integration. Based on a grounded theory approach, a model illustrating these factors is developed, and hypotheses about how these factors promote and/or limit the integration of ergonomics into design processes are presented along with the model.

  17. The FEYNMAN tools for quantum information processing: Design and implementation

    NASA Astrophysics Data System (ADS)

    Fritzsche, S.

    2014-06-01

    The FEYNMAN tools have been re-designed with the goal of establishing and implementing a high-level (computer) language capable of dealing with the physics of finite, n-qubit systems, from frequently required computations to mathematically advanced tasks in quantum information processing. In particular, emphasis has been placed on introducing a small but powerful set of keystring-driven commands in order to support both symbolic and numerical computations. Though the current design is again implemented within the framework of MAPLE, it is general and flexible enough to be utilized and combined with other languages and computational environments. The present implementation facilitates a large number of computational tasks, including the definition, manipulation and parametrization of quantum states, the evaluation of quantum measures and quantum operations, the evolution of quantum noise in discrete models, quantum measurements and state estimation, and several others. The design is based on a few high-level commands, with a syntax close to the mathematical notation and its use in the literature, which can be generalized quite readily to solve computational tasks of an even higher degree of complexity. In this work, I present and discuss the re-design of the FEYNMAN tools and make major parts of the code available for public use. Moreover, a few selected examples demonstrate possible applications of this toolbox. The FEYNMAN tools are provided as a MAPLE library and can hence be used on all platforms on which this computer-algebra system is accessible.
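    Although the FEYNMAN tools themselves are a MAPLE library, the kind of n-qubit state manipulation they support can be sketched with NumPy. The gate-application helper below is a generic construction, not part of the toolbox:

```python
import numpy as np

# Apply a single-qubit gate to qubit k of an n-qubit state vector by building
# the full operator as a Kronecker product I (x) ... (x) gate (x) ... (x) I.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I2 = np.eye(2)

def apply_gate(state, gate, k, n):
    """Apply a 2x2 gate to qubit k (0 = most significant) of an n-qubit state."""
    op = np.eye(1)
    for q in range(n):
        op = np.kron(op, gate if q == k else I2)
    return op @ state

state = np.zeros(4)
state[0] = 1.0                          # |00>
state = apply_gate(state, H, k=0, n=2)  # (|00> + |10>)/sqrt(2)
print(np.round(state, 3))
```

    Building the full 2^n x 2^n operator is the simplest formulation and is fine for small n; dedicated toolboxes avoid it by acting on the state tensor directly.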

  18. Making designer mutants in model organisms

    PubMed Central

    Peng, Ying; Clark, Karl J.; Campbell, Jarryd M.; Panetta, Magdalena R.; Guo, Yi; Ekker, Stephen C.

    2014-01-01

    Recent advances in the targeted modification of complex eukaryotic genomes have unlocked a new era of genome engineering. From the pioneering work using zinc-finger nucleases (ZFNs), to the advent of the versatile and specific TALEN systems, and most recently the highly accessible CRISPR/Cas9 systems, we now possess an unprecedented ability to analyze developmental processes using sophisticated designer genetic tools. In this Review, we summarize the common approaches and applications of these still-evolving tools as they are being used in the most popular model developmental systems. Excitingly, these robust and simple genomic engineering tools also promise to revolutionize developmental studies using less well established experimental organisms. PMID:25336735

  19. Dynamic simulation for IGCC process and control design

    SciTech Connect

    Depew, C.; Martinez, A.; Collodi, G.; Meloni, R.

    1998-01-01

    Detailed dynamic simulation analysis is a valuable tool that increases the understanding of unit interactions and control system performance in a complex integrated gasification combined-cycle (IGCC) plant. The Sarlux IGCC plant must simultaneously satisfy electrical power and refinery hydrogen and steam demands (trigeneration gasification). The plant's gasifier, heat recovery, sulfur removal, hydrogen recovery and steam power generation units are highly integrated and require coordinated control. In this study, dynamic simulation provides insights into the behavior of the process and combined cycle units during normal and upset conditions. The dynamic simulation is used to design a control system that drives the gasifiers to satisfy power, steam and hydrogen demands before a load change or upset is detected by the syngas pressure controller. At the study conclusion, the model will demonstrate how the IGCC plant will respond to the contractual maximum load change rate and process upsets. The study tests the basic process and control system design during the project engineering phase to minimize startup troubleshooting and expensive field changes.
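    The feedforward idea described here — acting on a demand change before the syngas pressure controller sees an error — can be sketched with a toy single-integrator model of a syngas header. The model structure and all numbers below are invented for illustration, not taken from the Sarlux study:

    ```python
    def max_pressure_deviation(feedforward, Kc=2.0, dt=0.01, t_end=20.0):
        """Header pressure P responds to the in/out flow imbalance;
        a demand step occurs at t = 5. Returns the worst |P - setpoint|."""
        P_sp = P = 10.0          # syngas header pressure setpoint / state
        F_nom = 5.0              # nominal syngas production rate
        worst = 0.0
        t = 0.0
        while t < t_end:
            F_out = F_nom + (2.0 if t >= 5.0 else 0.0)  # demand step
            F_in = F_nom + Kc * (P_sp - P)              # feedback (P-only)
            if feedforward:
                F_in += F_out - F_nom                   # pass demand through
            P += (F_in - F_out) * dt                    # header inventory
            worst = max(worst, abs(P - P_sp))
            t += dt
        return worst

    dev_fb = max_pressure_deviation(feedforward=False)
    dev_ff = max_pressure_deviation(feedforward=True)
    ```

    With feedback alone the proportional controller leaves a sustained pressure deviation after the load change; feeding the measured demand forward removes it, which is the behavior the study's control design aims for.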

  20. Using the common sense model to design interventions for the prevention and management of chronic illness threats: from description to process.

    PubMed

    McAndrew, Lisa M; Musumeci-Szabó, Tamara J; Mora, Pablo A; Vileikyte, Loretta; Burns, Edith; Halm, Ethan A; Leventhal, Elaine A; Leventhal, Howard

    2008-05-01

    In this article, we discuss how one might use the common sense model of self-regulation (CSM) to develop interventions for improving chronic illness management. We argue that features of the CSM, such as its dynamic, self-regulative (feedback) control and its system structure, provide an important basis for patient-centered interventions. We describe two separate, ongoing interventions with patients with diabetes and asthma to demonstrate the adaptability of the CSM. Finally, we discuss three additional factors that need to be addressed before planning and implementing interventions: (1) the use of top-down versus bottom-up intervention strategies; (2) health care interventions involving multidisciplinary teams; and (3) fidelity of implementation for tailored interventions.

  1. Operational concepts and implementation strategies for the design configuration management process.

    SciTech Connect

    Trauth, Sharon Lee

    2007-05-01

    This report describes operational concepts and implementation strategies for the Design Configuration Management Process (DCMP). It presents a process-based systems engineering model for the successful configuration management of the products generated during the operation of the design organization as a business entity. The DCMP model focuses on Pro/E and associated activities and information. It can serve as the framework for interconnecting all essential aspects of the product design business. A design operation scenario offers a sense of how to do business at a time when DCMP is second nature within the design organization.

  2. Numerical modelling of circulation and dispersion processes in Boulogne-sur-Mer harbour (Eastern English Channel): sensitivity to physical forcing and harbour design

    NASA Astrophysics Data System (ADS)

    Jouanneau, Nicolas; Sentchev, Alexei; Dumas, Franck

    2013-12-01

    The MARS-3D model in conjunction with the particle tracking module Ichthyop is used to study circulation and tracer dynamics under a variety of forcing conditions in the eastern English Channel, and in the Boulogne-sur-Mer harbour (referred to hereafter as BLH). Results of hydrodynamic modelling are validated against tidal gauge data, VHF radar surface velocities and ADCP measurements. Lagrangian tracking experiments are performed with passive particles to study tracer dispersal along the northern French coast, with special emphasis on the BLH. Simulations revealed an anticyclonic eddy generated in the harbour at rising tide. Tracers released during flood tide at the Liane river mouth move northward with a powerful clockwise-rotating current. After high water, the current direction changes to westward, and tracers leave the harbour through the open boundary. During ebb tide, currents converge along the western open boundary but no eddy is formed; surface currents inside the harbour are much weaker and the tracer excursion length is small. After the current reversal at low water, particles are advected shoreward, resulting in a significant increase of the residence time of tracers released during ebb tide. The effect of wind on particle dispersion was found to be particularly strong. Under strong SW wind, the residence time of particles released during flood tide increases from 1.5 to 6 days. For release during ebb tide, SW wind weakens the southward tidally induced drift and thus the residence time decreases. Similar effects are observed when the freshwater inflow to the harbour is increased from 2 to 10 m3/s during the ebb tide flow. For flood tide conditions, the effect of freshwater inflow is less significant. We also demonstrate an example of innovative coastal management targeted at the reduction of the residence time of pathogenic material accidentally released in the harbour.
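    The residence-time logic of such Lagrangian experiments can be caricatured in one dimension: a particle carried by an oscillating tidal current plus a steady residual drift is counted as flushed once it crosses the open boundary. The MARS-3D/Ichthyop set-up is far richer (3-D fields, realistic wind and river forcing); every number in this sketch is invented:

    ```python
    import math

    def residence_time(x0, drift, tide_amp=0.4, period=12.4,
                       dt=0.01, t_max=200.0):
        """Track a particle under u(t) = tide_amp*sin(2*pi*t/period) - drift
        until it crosses the open boundary at x = 0 (forward Euler)."""
        x, t = x0, 0.0
        while t < t_max:
            u = tide_amp * math.sin(2.0 * math.pi * t / period) - drift
            x += u * dt
            if x < 0.0:
                return t          # particle has left the harbour
            t += dt
        return t_max              # never flushed within the window

    # A stronger residual outflow (e.g. ebb-tide drift) flushes the
    # particle sooner, mirroring the residence-time contrasts reported.
    t_weak = residence_time(1.0, drift=0.02)
    t_strong = residence_time(1.0, drift=0.2)
    ```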

  3. Affordable Design: A Methodology to Implement Process-Based Manufacturing Cost into the Traditional Performance-Focused Multidisciplinary Design Optimization

    NASA Technical Reports Server (NTRS)

    Bao, Han P.; Samareh, J. A.

    2000-01-01

    The primary objective of this paper is to demonstrate the use of process-based manufacturing and assembly cost models in a traditional performance-focused multidisciplinary design and optimization process. The use of automated cost-performance analysis is an enabling technology that could bring realistic process-based manufacturing and assembly cost into multidisciplinary design and optimization. In this paper, we present a new methodology for incorporating process costing into a standard multidisciplinary design optimization process. Material, manufacturing process, and assembly process costs could then be used as the objective function for the optimization method. A case study involving forty-six different configurations of a simple wing is presented, indicating that a design based on performance criteria alone may not necessarily be the most affordable as far as manufacturing and assembly cost is concerned.

  4. Modeling and Simulation for Mission Operations Work System Design

    NASA Technical Reports Server (NTRS)

    Sierhuis, Maarten; Clancey, William J.; Seah, Chin; Trimble, Jay P.; Sims, Michael H.

    2003-01-01

    Work system analysis and design is complex and non-deterministic. In this paper we describe Brahms, a multiagent modeling and simulation environment for designing complex interactions in human-machine systems. Brahms was originally conceived as a business process design tool that simulates work practices, including social systems of work. We describe our modeling and simulation method for mission operations work systems design, based on a research case study in which we used Brahms to design mission operations for a proposed discovery mission to the Moon. We then describe the results of an actual application project of the method: the Brahms Mars Exploration Rover. Space mission operations are similar to operations of traditional organizations; we show that the application of Brahms for space mission operations design is relevant and transferable to other types of business processes in organizations.

  5. Bioreactor and process design for biohydrogen production.

    PubMed

    Show, Kuan-Yeow; Lee, Duu-Jong; Chang, Jo-Shu

    2011-09-01

    Biohydrogen is regarded as an attractive future clean energy carrier due to its high energy content and environmentally friendly conversion. It has the potential, as a renewable biofuel, to replace current hydrogen production, which relies heavily on fossil fuels. While biohydrogen production is still in the early stage of development, a variety of laboratory- and pilot-scale systems have been developed with promising potential. This work presents a review of advances in bioreactor and bioprocess design for biohydrogen production. The state of the art of biohydrogen production is discussed, with emphasis on production pathways, factors affecting biohydrogen production, and bioreactor configuration and operation. Challenges and prospects of biohydrogen production are also outlined.

  6. Bioreactor and process design for biohydrogen production.

    PubMed

    Show, Kuan-Yeow; Lee, Duu-Jong; Chang, Jo-Shu

    2011-09-01

    Biohydrogen is regarded as an attractive future clean energy carrier due to its high energy content and environmentally friendly conversion. It has the potential, as a renewable biofuel, to replace current hydrogen production, which relies heavily on fossil fuels. While biohydrogen production is still in the early stage of development, a variety of laboratory- and pilot-scale systems have been developed with promising potential. This work presents a review of advances in bioreactor and bioprocess design for biohydrogen production. The state of the art of biohydrogen production is discussed, with emphasis on production pathways, factors affecting biohydrogen production, and bioreactor configuration and operation. Challenges and prospects of biohydrogen production are also outlined. PMID:21624834

  7. Process correlation analysis model for process improvement identification.

    PubMed

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by a process assessment that identifies strengths and weaknesses; based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used as the base throughout software process improvement. CMMI defines a set of process areas involved in software development and what is to be carried out in those areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. In current practice, however, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort and expertise required. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. The model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.
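    One simple way to surface such correlations from assessment results is to correlate capability scores of process elements across assessments; element pairs with strong correlation are candidates for joint treatment in an improvement plan. The paper's model is built on CMMI structure and empirical improvement data — the scoring scheme below is an invented stand-in:

    ```python
    import math

    def pearson(xs, ys):
        """Pearson correlation coefficient of two equal-length samples."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
        sy = math.sqrt(sum((y - my) ** 2 for y in ys))
        return sxy / (sx * sy)

    # Hypothetical capability scores for two process elements
    # across five assessments:
    req_mgmt = [2, 3, 3, 4, 5]
    proj_plan = [1, 3, 2, 4, 4]
    r = pearson(req_mgmt, proj_plan)
    # |r| close to 1 flags a pair worth planning jointly
    ```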

  8. Process Correlation Analysis Model for Process Improvement Identification

    PubMed Central

    Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by a process assessment that identifies strengths and weaknesses; based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used as the base throughout software process improvement. CMMI defines a set of process areas involved in software development and what is to be carried out in those areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. In current practice, however, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort and expertise required. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. The model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data. PMID:24977170

  9. Xylose fermentation: Analysis, modelling, and design

    SciTech Connect

    Slininger, P.J.W.

    1988-01-01

    Ethanolic fermentation is a means of utilizing xylose-rich industrial wastes, but an optimized bioprocess is lacking. Pachysolen tannophilus NRRL Y-7124 was the first yeast discovered capable of significant ethanol production from xylose and has served as a model for studies of other yeasts mediating this conversion. However, a comparative evaluation of strains led the authors to focus on Pichia stipitis NRRL Y-7124 as the yeast with the highest potential for application. Given 150 g/l xylose in complex medium, strain Y-7124 functioned optimally at 25-26 °C and pH 4-7, accumulating 56 g/l ethanol with negligible xylitol production. Dissolved oxygen concentration was critical to cell growth; in order to measure it accurately, a colorimetric assay was developed to allow calibration of electrodes based on oxygen solubility in media of varying composition. Specific growth rate was a Monod function of limiting substrate concentration (oxygen and/or xylose). Both specific ethanol productivity and oxygen uptake rate were growth-associated, but only the former was maintenance-associated. Both growth and fermentation were inhibited by high xylose and ethanol concentrations. Carbon and cofactor balances supported modelling xylose metabolism as a combination of four processes: assimilation, pentose phosphate oxidation, respiration, and ethanolic fermentation. A mathematical model describing the stoichiometry and kinetics was constructed, and its predictive capacity was confirmed by comparing simulated and experimental batch cultures. Consideration of example processes indicated that this model constitutes an important tool for designing the optimum bioprocess for utilizing xylose-rich wastes.
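    The core structure described — Monod growth kinetics with growth-associated ethanol formation — can be sketched as a minimal batch model. All parameter values below are invented placeholders, not those fitted for strain Y-7124, and the oxygen, inhibition, and maintenance terms of the full model are omitted:

    ```python
    def batch_ferment(S0=150.0, X0=0.5, mu_max=0.25, Ks=5.0,
                      Yxs=0.4, Yps=0.37, dt=0.1, t_end=100.0):
        """Forward-Euler batch culture: S = xylose, X = biomass,
        P = ethanol (all g/L); specific growth rate is a Monod
        function of the limiting substrate S."""
        S, X, P, t = S0, X0, 0.0, 0.0
        while t < t_end and S > 1e-3:
            mu = mu_max * S / (Ks + S)          # Monod kinetics
            dS = min(S, (mu * X / Yxs) * dt)    # xylose consumed this step
            X += Yxs * dS                       # growth
            P += Yps * dS                       # growth-associated ethanol
            S -= dS
            t += dt
        return S, X, P

    S_end, X_end, P_end = batch_ferment()
    ```

    With these placeholder yields, 150 g/L xylose gives an ethanol titre in the mid-50s g/L, the same order as the batch results reported above.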

  10. Space Station Freedom natural environment design models

    NASA Technical Reports Server (NTRS)

    Suggs, Robert M.

    1993-01-01

    The Space Station Freedom program has established a series of natural environment models and databases for utilization in design and operations planning activities. The suite of models and databases that have either been selected from among internationally recognized standards or developed specifically for spacecraft design applications are presented. The models have been integrated with an orbit propagator and employed to compute environmental conditions for planned operations altitudes of Space Station Freedom.

  11. A Review on Mathematical Modeling for Textile Processes

    NASA Astrophysics Data System (ADS)

    Chattopadhyay, R.

    2015-10-01

    A mathematical model is a powerful engineering tool for studying a variety of problems related to the design and development of products and processes, optimization of manufacturing processes, understanding a phenomenon, and predicting a product's behaviour in actual use. An insight into the process and the use of appropriate mathematical tools are necessary for developing models. In the present paper, the types of models, the procedures followed in developing them, and their limitations are reviewed. Modeling techniques used in a few textile processes available in the literature are cited as examples.

  12. Using Storyboards to Integrate Models and Informal Design Knowledge

    NASA Astrophysics Data System (ADS)

    Haesen, Mieke; van den Bergh, Jan; Meskens, Jan; Luyten, Kris; Degrandsart, Sylvain; Demeyer, Serge; Coninx, Karin

    Model-driven development of user interfaces has become increasingly powerful in recent years. Unfortunately, model-driven approaches have the inherent limitation that they cannot handle the informal nature of some of the artifacts used in truly multidisciplinary user interface development such as storyboards, sketches, scenarios and personas. In this chapter, we present an approach and tool support for multidisciplinary user interface development bridging informal and formal artifacts in the design and development process. Key features of the approach are the usage of annotated storyboards, which can be connected to other models through an underlying meta-model, and cross-toolkit design support based on an abstract user interface model.

  13. Mathematical modelling in the computer-aided process planning

    NASA Astrophysics Data System (ADS)

    Mitin, S.; Bochkarev, P.

    2016-04-01

    This paper presents new approaches to the organization of manufacturing preparation and mathematical models related to the development of the computer-aided multi-product process planning (CAMPP) system. The CAMPP system has some distinctive features compared to existing computer-aided process planning (CAPP) systems: fully formalized development of machining operations; the capacity to create and formalize the interrelationships among design, process planning and process implementation; and procedures for taking real manufacturing conditions into account. The paper describes the structure of the CAMPP system and shows the mathematical models and methods used to formalize the design procedures.

  14. Development of Integrated Programs for Aerospace-vehicle design (IPAD): Reference design process

    NASA Technical Reports Server (NTRS)

    Meyer, D. D.

    1979-01-01

    The airplane design process and its interfaces with manufacturing and customer operations are documented to be used as criteria for the development of integrated programs for the analysis, design, and testing of aerospace vehicles. Topics cover: design process management, general purpose support requirements, design networks, and technical program elements. Design activity sequences are given for both supersonic and subsonic commercial transports, naval hydrofoils, and military aircraft.

  15. Improving digital human modelling for proactive ergonomics in design.

    PubMed

    Chaffin, D B

    2005-04-15

    This paper presents the need to improve existing digital human models (DHMs) so they are better able to serve as effective ergonomics analysis and design tools. Existing DHMs are meant to be used by a designer early in a product development process when attempting to improve the physical design of vehicle interiors and manufacturing workplaces. The emphasis in this paper is placed on developing future DHMs that include valid posture and motion prediction models for various populations. It is argued that the posture and motion prediction models now used in DHMs must be grounded in real motion data to assure validity for complex dynamic task simulations. It is further speculated that valid human posture and motion prediction models, once developed and used, can be combined with psychophysical and biomechanical models to provide a much greater understanding of dynamic human performance and population-specific limitations, and that these new DHMs will ultimately provide a powerful ergonomics design tool.

  16. The Use of Executive Control Processes in Engineering Design by Engineering Students and Professional Engineers

    ERIC Educational Resources Information Center

    Dixon, Raymond A.; Johnson, Scott D.

    2012-01-01

    A cognitive construct that is important when solving engineering design problems is executive control process, or metacognition. It is a central feature of human consciousness that enables one "to be aware of, monitor, and control mental processes." The framework for this study was conceptualized by integrating the model for creative design, which…

  17. An Inorganic Microsphere Composite for the Selective Removal of Cesium-137 from Acidic Nuclear Waste Solutions 2: Bench-Scale Column Experiments, Modeling, and Preliminary Process Design

    SciTech Connect

    Troy J. Tranter; T. A. Vereschagina; V. Utgikar

    2009-03-01

    A new inorganic ion exchange composite for removing radioactive cesium from acidic waste streams has been developed. The new material consists of ammonium molybdophosphate, (NH4)3P(Mo3O10)4·3H2O (AMP), synthesized within hollow aluminosilicate microspheres (AMP-C), which are produced as a by-product from coal combustion. The selective cesium exchange capacity of this inorganic composite was evaluated in bench-scale column tests using simulated sodium-bearing waste solution as a surrogate for the acidic tank waste currently stored at the Idaho National Laboratory (INL). Total cesium loading on the columns at saturation agreed very well with equilibrium values predicted from isotherm experiments performed previously. A numerical algorithm for solving the governing partial differential equations (PDE) for cesium uptake was developed using the intraparticle mass transfer coefficient obtained from previous batch kinetic experiments. Solutions to the governing equations were generated to obtain the cesium concentration at the column effluent as a function of throughput volume using the same conditions as those used for the actual column experiments. The numerical solutions of the PDE fit the column breakthrough data quite well for all the experimental conditions in the study. The model should therefore provide a reliable prediction of column performance at larger scales.
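    The shape of such a breakthrough calculation can be illustrated with a 1-D fixed-bed sketch: upwind advection coupled to a linear-driving-force sorption term with a linear isotherm. The study itself solves the governing PDEs with a fitted intraparticle mass-transfer coefficient; every number and the isotherm choice below are invented:

    ```python
    def breakthrough_curve(n=50, L=1.0, v=1.0, k=20.0, K=2.0,
                           c_feed=1.0, dt=0.01, t_end=10.0):
        """Return [(t, c_out)] for a column fed at constant concentration.
        c = fluid-phase Cs, q = sorbed Cs; dq/dt = k (K c - q)."""
        dz = L / n
        c = [0.0] * n
        q = [0.0] * n
        curve = []
        t = 0.0
        while t < t_end:
            dqdt = [k * (K * c[i] - q[i]) for i in range(n)]
            new_c = []
            for i in range(n):
                upstream = c_feed if i == 0 else c[i - 1]
                new_c.append(c[i] - v * dt / dz * (c[i] - upstream)
                             - dt * dqdt[i])
            for i in range(n):
                q[i] += dt * dqdt[i]
            c = new_c
            t += dt
            curve.append((t, c[-1]))
        return curve

    curve = breakthrough_curve()
    # Effluent stays near zero until the retarded front (factor 1 + K)
    # arrives, then rises toward the feed concentration.
    early = curve[99][1]    # t ~ 1
    late = curve[-1][1]     # t ~ 10
    ```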

  18. Systematic Approach to Computational Design of Gene Regulatory Networks with Information Processing Capabilities.

    PubMed

    Moskon, Miha; Mraz, Miha

    2014-01-01

    We present several measures that can be used in the de novo computational design of biological systems with information processing capabilities. Their main purpose is to objectively evaluate the behavior and identify the biological information processing structures with the best dynamical properties. They can be used to define constraints that allow one to simplify the design of more complex biological systems. These measures can be applied to existing computational design approaches in synthetic biology, i.e., rational and automatic design approaches. We demonstrate their use on (a) the computational models of several basic information processing structures implemented with gene regulatory networks and (b) a modular design of a synchronous toggle switch.
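    A toggle switch of the kind mentioned in (b) is a standard test structure: two mutually repressing genes whose dynamics settle into one of two stable states depending on the initial bias. A minimal deterministic sketch (the Hill-function model and all parameters are generic textbook choices, not the paper's):

    ```python
    def simulate_toggle(x0, y0, a=10.0, n=2, dt=0.01, t_end=50.0):
        """Mutual-repression toggle switch (forward Euler):
        dx/dt = a/(1 + y^n) - x,  dy/dt = a/(1 + x^n) - y."""
        x, y, t = x0, y0, 0.0
        while t < t_end:
            dx = a / (1.0 + y ** n) - x
            dy = a / (1.0 + x ** n) - y
            x += dx * dt
            y += dy * dt
            t += dt
        return x, y

    xa, ya = simulate_toggle(1.0, 0.0)   # biased toward the x-high state
    xb, yb = simulate_toggle(0.0, 1.0)   # biased toward the y-high state
    ```

    Bistability — the two runs settling into opposite states — is exactly the kind of dynamical property the proposed measures are meant to quantify when comparing candidate designs.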

  19. Process-Based Modeling of Constructed Wetlands

    NASA Astrophysics Data System (ADS)

    Baechler, S.; Brovelli, A.; Rossi, L.; Barry, D. A.

    2007-12-01

    Constructed wetlands (CWs) are widespread facilities for wastewater treatment. In subsurface flow wetlands, contaminated wastewater flows through a porous matrix, where oxidation and detoxification phenomena occur. Despite the large number of working CWs, system design and optimization are still mainly based upon empirical equations or simplified first-order kinetics. This results from an incomplete understanding of the system functioning, and may in turn hinder the performance and effectiveness of the treatment process. As a result, CWs are often considered unsuitable for meeting high water-quality standards, or for treating water contaminated with recalcitrant anthropogenic contaminants. To date, only a limited number of detailed numerical models have been developed and successfully applied to simulate constructed wetland behavior. Among these, one of the most complete and powerful is CW2D, which is based on Hydrus2D. The aim of this work is to develop a comprehensive simulator tailored to model the functioning of horizontal flow constructed wetlands and in turn provide a reliable design and optimization tool. The model is based upon PHWAT, a general reactive transport code for saturated flow. PHWAT couples MODFLOW, MT3DMS and PHREEQC-2 using an operator-splitting approach. The use of PHREEQC to simulate reactions allows great flexibility in simulating biogeochemical processes. The biogeochemical reaction network is similar to that of CW2D, and is based on the Activated Sludge Model (ASM). Kinetic oxidation of carbon sources and nutrient transformations (primarily nitrogen and phosphorus) are modeled via Monod-type kinetic equations. Oxygen dissolution is accounted for via a first-order mass-transfer equation. While the ASM model only includes a limited number of kinetic equations, the new simulator permits incorporation of an unlimited number of both kinetic and equilibrium reactions. Changes in pH, redox potential and surface reactions can be easily incorporated.
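    The interplay of the two mechanisms named above — first-order oxygen mass transfer feeding Monod-type aerobic degradation — can be sketched for a single well-mixed cell. This is a didactic caricature of the reaction network, with invented parameter values, not the PHWAT/ASM formulation itself:

    ```python
    def simulate_cell(S0=50.0, C0=0.0, C_sat=9.0, kLa=2.0,
                      r_max=5.0, Ks=10.0, Ko=0.5, Y=2.0,
                      dt=0.01, t_end=24.0):
        """One well-mixed cell (mg/L, hours): first-order O2 dissolution
        kLa*(C_sat - C) coupled to dual-Monod aerobic degradation of an
        organic substrate S, integrated with forward Euler."""
        S, C, t = S0, C0, 0.0
        while t < t_end:
            r = r_max * S / (Ks + S) * C / (Ko + C)  # O2 uptake rate
            C += (kLa * (C_sat - C) - r) * dt        # mass transfer - uptake
            S -= Y * r * dt                          # substrate oxidised
            if S < 0.0:
                S = 0.0
            t += dt
        return S, C

    S_end, C_end = simulate_cell()
    ```

    While substrate is abundant, dissolved oxygen holds at a depressed quasi-steady level set by the transfer/uptake balance; once the substrate is exhausted, oxygen relaxes to saturation.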

  20. Low-cost EUV collector development: design, process, and fabrication

    NASA Astrophysics Data System (ADS)

    Venables, Ranju D.; Goldstein, Michael; Engelhaupt, Darell; Lee, Sang H.; Panning, Eric M.

    2007-03-01

    Cost of ownership (COO) is an area of concern that may limit the adoption and usage of Extreme Ultraviolet Lithography (EUVL). One of the key optical components that contribute to the COO budget is the collector. The collectors being fabricated today are based on existing x-ray optic designs and fabrication processes. The main contributors to collector COO are fabrication cost and lifetime. We present experimental data and optical modeling to demonstrate a roadmap for optimized efficiency and a possible approach for significant reduction in collector COO. Current state-of-the-art collectors are based on a Wolter type-1 design and have been adapted from x-ray telescopes. That design uses a long format that is suitable for imaging distant light sources such as stars. As applied to industrial equipment and very bright nearby sources, however, a Wolter collector tends to be expensive and requires significant debris shielding and integrated cooling solutions due to the source proximity and length of the collector shells. Three collector concepts are discussed in this work. The elliptical collector, used as a test bed to demonstrate an alternative cost-effective fabrication method, has been optimized for collection efficiency. However, this fabrication method can be applied to other optical designs as well. The number of shells and their design may be modified to increase the collection efficiency and to accommodate different EUV sources. The fabrication process used in this work starts with a glass mandrel, which is elliptical on the inside. A seed layer is coated on the inside of the glass mandrel, which is then followed by electroplating nickel. The inside/exposed surface of the electroformed nickel is then polished to meet the figure and finish requirements for the particular shell and finally coated with Ru or a multilayer film depending on the angle of incidence of EUV light. Finally the collector shell is released from the inside surface of the mandrel. There are

  1. Process Design Manual for Land Treatment of Municipal Wastewater.

    ERIC Educational Resources Information Center

    Crites, R.; And Others

    This manual presents a procedure for the design of land treatment systems. Slow rate, rapid infiltration, and overland flow processes for the treatment of municipal wastewaters are given emphasis. The basic unit operations and unit processes are discussed in detail, and the design concepts and criteria are presented. The manual includes design…

  2. Debating Professional Designations for Evaluators: Reflections on the Canadian Process

    ERIC Educational Resources Information Center

    Cousins, J. Bradley; Cullen, Jim; Malik, Sumbal; Maicher, Brigitte

    2009-01-01

    This paper provides a reflective account of a consultation process on professional designations for evaluators initiated and coordinated by the Canadian Evaluation Society (CES). Described are: (1) the forces leading CES to generate discussion and debate about professional designations for Canadian evaluators, (2) the process of developing and…

  3. Sampling design for spatially distributed hydrogeologic and environmental processes

    USGS Publications Warehouse

    Christakos, G.; Olea, R.A.

    1992-01-01

    A methodology for the design of sampling networks over space is proposed. The methodology is based on spatial random field representations of nonhomogeneous natural processes, and on optimal spatial estimation techniques. One of the most important results of random field theory for the physical sciences is its rationalization of correlations in the spatial variability of natural processes. This correlation is extremely important both for interpreting spatially distributed observations and for predictive performance. The extent of site sampling and the types of data to be collected will depend on the relationship of subsurface variability to predictive uncertainty. While hypothesis formulation and initial identification of spatial variability characteristics are based on scientific understanding (such as knowledge of the physics of the underlying phenomena, geological interpretations, intuition and experience), the support offered by field data is statistically modelled. This model is not limited by the geometric nature of sampling and covers a wide range of subsurface uncertainties. A factorization scheme of the sampling error variance is derived, which possesses certain attractive properties allowing significant savings in computations. By means of this scheme, a practical sampling design procedure providing suitable indices of the sampling error variance is established. These indices can be used by way of multiobjective decision criteria to obtain the best sampling strategy. Neither the actual implementation of the in-situ sampling nor the solution of the large spatial estimation systems of equations are necessary. The required values of the accuracy parameters involved in the network design are derived using reference charts (readily available for various combinations of data configurations and spatial variability parameters) and certain simple yet accurate analytical formulas.
Insight is gained by applying the proposed sampling procedure to realistic examples related
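    The central idea — ranking candidate sampling layouts by their estimation error variance without collecting any field data — can be sketched with the simple-kriging variance under an assumed covariance model. The exponential covariance and all coordinates below are illustrative choices, not the paper's reference charts or factorization scheme:

    ```python
    import math

    def cov(h, sill=1.0, rng=10.0):
        """Exponential covariance model (an assumed example)."""
        return sill * math.exp(-h / rng)

    def solve(A, b):
        """Gaussian elimination with partial pivoting for small systems."""
        n = len(b)
        M = [row[:] + [b[i]] for i, row in enumerate(A)]
        for col in range(n):
            p = max(range(col, n), key=lambda r: abs(M[r][col]))
            M[col], M[p] = M[p], M[col]
            for r in range(col + 1, n):
                f = M[r][col] / M[col][col]
                for c in range(col, n + 1):
                    M[r][c] -= f * M[col][c]
        x = [0.0] * n
        for r in range(n - 1, -1, -1):
            x[r] = (M[r][n] - sum(M[r][c] * x[c]
                                  for c in range(r + 1, n))) / M[r][r]
        return x

    def kriging_variance(samples, target):
        """Simple-kriging estimation variance at `target` for a candidate
        layout: sigma^2 = C(0) - c0' K^-1 c0. No field data required."""
        d = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
        K = [[cov(d(si, sj)) for sj in samples] for si in samples]
        c0 = [cov(d(si, target)) for si in samples]
        w = solve(K, c0)
        return cov(0.0) - sum(wi * ci for wi, ci in zip(w, c0))

    # Compare two candidate layouts for the same estimation point:
    target = (5.0, 5.0)
    clustered = [(0.0, 0.0), (0.5, 0.0), (0.0, 0.5)]
    spread = [(5.0, 2.0), (2.0, 7.0), (8.0, 6.0)]
    var_clustered = kriging_variance(clustered, target)
    var_spread = kriging_variance(spread, target)
    ```

    The spread layout surrounding the target yields a lower error variance than the redundant cluster, which is the kind of index a multiobjective design criterion would minimize.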

  4. Modeling interdependencies between business and communication processes in hospitals.

    PubMed

    Brigl, Birgit; Wendt, Thomas; Winter, Alfred

    2003-01-01

    The optimization and redesign of business processes in hospitals is an important challenge for hospital information management, which has to design and implement a suitable HIS architecture. Nevertheless, no tools are available that specialize in modeling information-driven business processes and their consequences for the communication between information processing tools. We therefore present an approach that facilitates the representation and analysis of business processes, the resulting communication processes between application components, and their interdependencies. This approach aims not only to visualize those processes, but also to evaluate whether there are weaknesses in the information processing infrastructure that hinder the smooth implementation of the business processes.

  5. Solid propellant processing factor in rocket motor design

    NASA Technical Reports Server (NTRS)

    1971-01-01

    The ways are described by which propellant processing is affected by choices made in designing rocket engines. Tradeoff studies, design proof or scaleup studies, and special design features are presented that are required to obtain high product quality and optimum processing costs. Processing is considered to include the operational steps involved with the lining and preparation of the motor case for the grain; the procurement of propellant raw materials; and propellant mixing, casting or extrusion, curing, machining, and finishing. The design criteria, recommended practices, and propellant formulations are included.

  6. Toward Understanding the Cognitive Processes of Software Design in Novice Programmers

    ERIC Educational Resources Information Center

    Yeh, Kuo-Chuan

    2009-01-01

    This study provides insights with regard to the types of cognitive processes that are involved in the formation of mental models and the way those models change over the course of a semester in novice programmers doing a design task. Eight novice programmers participated in this study for three distinct software design sessions, using the same…

  7. The Use of Computer Graphics in the Design Process.

    ERIC Educational Resources Information Center

    Palazzi, Maria

    This master's thesis examines applications of computer technology to the field of industrial design and ways in which technology can transform the traditional process. Following a statement of the problem, the history and applications of the fields of computer graphics and industrial design are reviewed. The traditional industrial design process…

  8. Freshman Interest Groups: Designing a Model for Success

    ERIC Educational Resources Information Center

    Ratliff, Gerald Lee

    2008-01-01

    Freshman Interest Groups (FIGS) have become a popular model for academic and student affairs colleagues who are concerned that first-year students learn to reflect on life experiences and daily events as part of the learning process. A well-designed FIG model meets the academic, social and career concerns for first-year students by providing an…

  9. Laser processing with specially designed laser beam

    NASA Astrophysics Data System (ADS)

    Asratyan, A. A.; Bulychev, N. A.; Feofanov, I. N.; Kazaryan, M. A.; Krasovskii, V. I.; Lyabin, N. A.; Pogosyan, L. A.; Sachkov, V. I.; Zakharyan, R. A.

    2016-04-01

    The possibility of using laser systems to form beams with special spatial configurations has been studied. The laser systems applied had a self-conjugate cavity based on the elements of copper vapor lasers (LT-5Cu, LT-10Cu, LT-30Cu) with an average power of 5, 10, or 30 W. The active elements were pumped by current pulses of duration 80-100 ns. The duration of laser generation pulses was up to 25 ns. The generator unit included an unstable cavity, where one reflector was a special mirror with a reflecting coating. Various original optical schemes used were capable of exploring spatial configurations and energy characteristics of output laser beams in their interaction with micro- and nanoparticles fabricated from various materials. In these experiments, the beam dimensions of the obtained zones varied from 0.3 to 5 µm, which is comparable with the minimum permissible dimensions determined by the optical elements applied. This method is useful in transforming a large amount of information at the laser pulse repetition rate of 10-30 kHz. It was possible to realize the high-precision micromachining and microfabrication of microscale details by direct writing, cutting and drilling (with the cutting width and through-hole diameters ranging from 3 to 100 µm) and produce microscale, deep, intricate and narrow grooves on substrate surfaces of metals and nonmetal materials. This system is used for producing high-quality microscale details without moving the object under treatment. It can also be used for microcutting and microdrilling in a variety of metals such as molybdenum, copper and stainless steel, with a thickness of up to 300 µm, and in nonmetals such as silicon, sapphire and diamond with a thickness ranging from 10 µm to 1 mm with different thermal parameters and specially designed laser beam.

  10. Integration of MGDS design into the licensing process

    SciTech Connect

    1997-12-01

    This paper presents an overview of how the Mined Geologic Disposal System (MGDS) design for a potential repository is integrated into the licensing process. The integration process employs a two-fold approach: (1) ensure that the MGDS design complies with applicable Nuclear Regulatory Commission (NRC) licensing requirements, and (2) ensure that the MGDS design is appropriately reflected in a license application that is acceptable to the NRC for performing acceptance and compliance reviews.

  11. Geometry Modeling and Grid Generation for Design and Optimization

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.

    1998-01-01

    Geometry modeling and grid generation (GMGG) have played and will continue to play an important role in computational aerosciences. During the past two decades, tremendous progress has occurred in GMGG; however, GMGG is still the biggest bottleneck to routine applications for complicated Computational Fluid Dynamics (CFD) and Computational Structures Mechanics (CSM) models for analysis, design, and optimization. We are still far from incorporating GMGG tools in a design and optimization environment for complicated configurations. It is still a challenging task to parameterize an existing model in today's Computer-Aided Design (CAD) systems, and the models created are not always good enough for automatic grid generation tools. Designers may believe their models are complete and accurate, but unseen imperfections (e.g., gaps, unwanted wiggles, free edges, slivers, and transition cracks) often cause problems in gridding for CSM and CFD. Despite many advances in grid generation, the process is still the most labor-intensive and time-consuming part of the computational aerosciences for analysis, design, and optimization. In an ideal design environment, a design engineer would use a parametric model to evaluate alternative designs effortlessly and optimize an existing design for a new set of design objectives and constraints. For this ideal environment to be realized, the GMGG tools must have the following characteristics: (1) be automated, (2) provide consistent geometry across all disciplines, (3) be parametric, and (4) provide sensitivity derivatives. This paper will review the status of GMGG for analysis, design, and optimization processes, and it will focus on some emerging ideas that will advance the GMGG toward the ideal design environment.

  12. Articulating the Resources for Business Process Analysis and Design

    ERIC Educational Resources Information Center

    Jin, Yulong

    2012-01-01

    Effective process analysis and modeling are important phases of the business process management lifecycle. When many activities and multiple resources are involved, it is very difficult to build a correct business process specification. This dissertation provides a resource perspective of business processes. It aims at a better process analysis…

  13. Model-based software process improvement

    NASA Technical Reports Server (NTRS)

    Zettervall, Brenda T.

    1994-01-01

    The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.

  14. Design and programming of systolic array cells for signal processing

    SciTech Connect

    Smith, R.A.W.

    1989-01-01

    This thesis presents a new methodology for the design, simulation, and programming of systolic arrays in which the algorithms and architecture are simultaneously optimized. The algorithms determine the initial architecture, and simulation is used to optimize the architecture. The simulator provides a register-transfer level model of a complete systolic array computation. To establish the validity of this design methodology two novel programmable systolic array cells were designed and programmed. The cells were targeted for applications in high-speed signal processing and associated matrix computations. A two-chip programmable systolic array cell using a 16-bit multiplier-accumulator chip and a semi-custom VLSI controller chip was designed and fabricated. A low chip count allows large arrays to be constructed, but the cell is flexible enough to be a building-block for either one- or two-dimensional systolic arrays. Another more flexible and powerful cell using a 32-bit floating-point processor and a second VLSI controller chip was also designed. It contains several architectural features that are unique in a systolic array cell: (1) each instruction is 32 bits, yet all resources can be updated every cycle, (2) two on-chip interchangeable memories are used, and (3) one input port can be used as either a global or local port. The key issues involved in programming the cells are analyzed in detail. A set of modules is developed which can be used to construct large programs in an effective manner. The utility of this programming approach is demonstrated with several important examples.

  15. Using experimental design modules for process characterization in manufacturing/materials processes laboratories

    NASA Technical Reports Server (NTRS)

    Ankenman, Bruce; Ermer, Donald; Clum, James A.

    1994-01-01

    Modules dealing with statistical experimental design (SED), process modeling and improvement, and response surface methods have been developed and tested in two laboratory courses. One course was a manufacturing processes course in Mechanical Engineering and the other was a materials processing course in Materials Science and Engineering. Each module is used as an 'experiment' in the course, with the intent that subsequent course experiments will use SED methods for analysis and interpretation of data. Evaluation of the modules' effectiveness has been done both by survey questionnaires and by inclusion of the module methodology in course examination questions. Results of the evaluation have been very positive. Those evaluation results and details of the modules' content and implementation are presented. The modules represent an important component for updating laboratory instruction and for providing training in quality for improved engineering practice.
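
    The statistical experimental design content of such modules can be illustrated with a tiny 2^2 full-factorial example (the factor coding and response values are invented for illustration):

```python
from itertools import product

# Full 2^2 factorial design over two process factors at coded -1/+1 levels
runs = list(product([-1, 1], repeat=2))

# Hypothetical measured responses for each run (illustrative numbers only)
y = {(-1, -1): 10.0, (1, -1): 14.0, (-1, 1): 11.0, (1, 1): 17.0}

def main_effect(factor):
    """Main effect: mean response at the +1 level minus mean at the -1 level."""
    hi = sum(y[r] for r in runs if r[factor] == 1) / 2
    lo = sum(y[r] for r in runs if r[factor] == -1) / 2
    return hi - lo

effect_A = main_effect(0)   # effect of the first factor
effect_B = main_effect(1)   # effect of the second factor
```

    With only four runs, the design separates the two main effects, which is the economy SED methods bring to subsequent course experiments.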

  16. Perspectives on the design of safer nanomaterials and manufacturing processes

    PubMed Central

    Geraci, Charles; Heidel, Donna; Sayes, Christie; Hodson, Laura; Schulte, Paul; Eastlake, Adrienne; Brenner, Sara

    2015-01-01

    A concerted effort is being made to insert Prevention through Design principles into discussions of sustainability, occupational safety and health, and green chemistry related to nanotechnology. Prevention through Design is a set of principles that includes solutions to design out potential hazards in nanomanufacturing, including the design of nanomaterials, and strategies to eliminate exposures and minimize risks that may be related to the manufacturing processes and equipment at various stages of the lifecycle of an engineered nanomaterial. PMID:26435688

  17. Transforming Collaborative Process Models into Interface Process Models by Applying an MDA Approach

    NASA Astrophysics Data System (ADS)

    Lazarte, Ivanna M.; Chiotti, Omar; Villarreal, Pablo D.

    Collaborative business models among enterprises require defining collaborative business processes. Enterprises implement B2B collaborations to execute these processes. In B2B collaborations the integration and interoperability of processes and systems of the enterprises are required to support the execution of collaborative processes. From a collaborative process model, which describes the global view of the enterprise interactions, each enterprise must define the interface process that represents the role it performs in the collaborative process in order to implement the process in a Business Process Management System. Hence, in this work we propose a method for the automatic generation of the interface process model of each enterprise from a collaborative process model. This method is based on a Model-Driven Architecture to transform collaborative process models into interface process models. By applying this method, interface processes are guaranteed to be interoperable and defined according to a collaborative process.
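
    A minimal sketch of the projection from a collaborative process model to one enterprise's interface process (the interaction representation below is a deliberate simplification for illustration, not the authors' MDA metamodel):

```python
from dataclasses import dataclass

@dataclass
class Interaction:
    sender: str
    receiver: str
    message: str

def interface_process(collab, enterprise):
    """Project the global collaborative process onto one enterprise:
    keep only the interactions it participates in, viewed from its side
    as a send or a receive."""
    view = []
    for i in collab:
        if i.sender == enterprise:
            view.append(("send", i.message, i.receiver))
        elif i.receiver == enterprise:
            view.append(("receive", i.message, i.sender))
    return view

collab = [
    Interaction("Buyer", "Seller", "PurchaseOrder"),
    Interaction("Seller", "Buyer", "OrderConfirmation"),
    Interaction("Seller", "Carrier", "ShippingRequest"),
]
buyer_view = interface_process(collab, "Buyer")  # only the Buyer's two interactions
```

    Because every interface process is derived from the same global model, the send/receive pairs match up across enterprises, which is the interoperability guarantee the method aims for.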

  18. Signal Processing Model for Radiation Transport

    SciTech Connect

    Chambers, D H

    2008-07-28

    This note describes the design of a simplified gamma ray transport model for use in designing a sequential Bayesian signal processor for low-count detection and classification. It uses a simple one-dimensional geometry to describe the emitting source, shield effects, and detector (see Fig. 1). At present, only Compton scattering and photoelectric absorption are implemented for the shield and the detector. Other effects may be incorporated in the future by revising the expressions for the probabilities of escape and absorption. Pair production would require a redesign of the simulator to incorporate photon correlation effects. The initial design incorporates the physical effects that were present in the previous event mode sequence simulator created by Alan Meyer. The main difference is that this simulator transports the rate distributions instead of single photons. Event mode sequences and other time-dependent photon flux sequences are assumed to be marked Poisson processes that are entirely described by their rate distributions. Individual realizations can be constructed from the rate distribution using a random Poisson point sequence generator.
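
    Constructing an individual realization from a rate distribution, as the last sentence describes, is commonly done by thinning a dominating homogeneous Poisson process (the Lewis-Shedler method); the decaying rate function below is an arbitrary stand-in, not the simulator's source model:

```python
import math
import random

def poisson_realization(rate, t_max, rate_max, rng=random.Random(0)):
    """Draw event times on [0, t_max] from a nonhomogeneous Poisson process
    with intensity rate(t) <= rate_max, by thinning a homogeneous process
    of intensity rate_max."""
    events, t = [], 0.0
    while True:
        t += rng.expovariate(rate_max)         # next candidate arrival
        if t > t_max:
            return events
        if rng.random() < rate(t) / rate_max:  # accept with prob. rate(t)/rate_max
            events.append(t)

# Example: an exponentially decaying emission rate (illustrative only)
times = poisson_realization(lambda t: 5.0 * math.exp(-t / 10.0),
                            t_max=20.0, rate_max=5.0)
```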

  19. Minority Utility Rate Design Assessment Model

    SciTech Connect

    Poyer, David A.; Butler, John G.

    2003-01-20

    Econometric model simulates consumer demand response to various user-supplied, two-part tariff electricity rate designs and assesses their economic welfare impact on black, hispanic, poor and majority households.

  20. Birth/death process model

    NASA Technical Reports Server (NTRS)

    Solloway, C. B.; Wakeland, W.

    1976-01-01

    First-order Markov model developed on digital computer for population with specific characteristics. System is user interactive, self-documenting, and does not require user to have complete understanding of underlying model details. Contains thorough error-checking algorithms on input and default capabilities.
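
    A first-order birth/death population model of this kind can be sketched in a few lines (the per-individual, per-step birth and death probabilities are illustrative placeholders):

```python
import random

def birth_death(n0, p_birth, p_death, steps, rng=random.Random(42)):
    """Discrete-time birth/death chain: at each step, every individual
    independently gives birth with probability p_birth and dies with
    probability p_death. First-order Markov: the next population size
    depends only on the current one."""
    n, path = n0, [n0]
    for _ in range(steps):
        births = sum(rng.random() < p_birth for _ in range(n))
        deaths = sum(rng.random() < p_death for _ in range(n))
        n = max(n + births - deaths, 0)
        path.append(n)
    return path

path = birth_death(n0=100, p_birth=0.02, p_death=0.02, steps=50)
```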

  1. Mathematical Modeling: A Structured Process

    ERIC Educational Resources Information Center

    Anhalt, Cynthia Oropesa; Cortez, Ricardo

    2015-01-01

    Mathematical modeling, in which students use mathematics to explain or interpret physical, social, or scientific phenomena, is an essential component of the high school curriculum. The Common Core State Standards for Mathematics (CCSSM) classify modeling as a K-12 standard for mathematical practice and as a conceptual category for high school…

  2. A Reflexive Model for Teaching Instructional Design.

    ERIC Educational Resources Information Center

    Shambaugh, Neal; Magliaro, Susan

    2001-01-01

    Documents a five-year study of two instructors who collaborated on formally studying their teaching of a master's level instructional design course. Outlines their views on learning, teaching, and instructional design (ID), describes the ID course, and explains the reflexive instructional model used, in which the teachers examined their teaching…

  3. A Model for Teaching Information Design

    ERIC Educational Resources Information Center

    Pettersson, Rune

    2011-01-01

    The author presents his views on the teaching of information design. The starting point includes some general aspects of teaching and learning. The multidisciplinary structure and content of information design as well as the combined practical and theoretical components influence studies of the discipline. Experiences from working with a model for…

  4. GREENER CHEMICAL PROCESS DESIGN ALTERNATIVES ARE REVEALED USING THE WASTE REDUCTION DECISION SUPPORT SYSTEM (WAR DSS)

    EPA Science Inventory

    The Waste Reduction Decision Support System (WAR DSS) is a Java-based software product providing comprehensive modeling of potential adverse environmental impacts (PEI) predicted to result from newly designed or redesigned chemical manufacturing processes. The purpose of this so...

  5. Evaluating two process scale chromatography column header designs using CFD.

    PubMed

    Johnson, Chris; Natarajan, Venkatesh; Antoniou, Chris

    2014-01-01

    Chromatography is an indispensable unit operation in the downstream processing of biomolecules. Scaling of chromatographic operations typically involves a significant increase in the column diameter. At this scale, the flow distribution within a packed bed could be severely affected by the distributor design in process scale columns. Different vendors offer process scale columns with varying design features. The effect of these design features on the flow distribution in packed beds and the resultant effect on column efficiency and cleanability needs to be properly understood in order to prevent unpleasant surprises on scale-up. Computational Fluid Dynamics (CFD) provides a cost-effective means to explore the effect of various distributor designs on process scale performance. In this work, we present a CFD tool that was developed and validated against experimental dye traces and tracer injections. Subsequently, the tool was employed to compare and contrast two commercially available header designs.

  7. Design of experiments in Biomedical Signal Processing Course.

    PubMed

    Li, Ling; Li, Bin

    2008-01-01

    Biomedical Signal Processing is one of the most important major subjects in Biomedical Engineering. Its contents include the theories of digital signal processing, knowledge of different biomedical signals, physiology, and the ability to program computers. Based on our past five years of teaching experience, we found that, for students to master the signal processing algorithms well, the design of experiments to follow each algorithm was very important. In this paper we present the ideas and aims behind the design of these experiments. The results showed that our methods facilitated the study of abstract signal processing algorithms and made the biomedical signals easier to understand.

  8. A Comparative of business process modelling techniques

    NASA Astrophysics Data System (ADS)

    Tangkawarow, I. R. H. T.; Waworuntu, J.

    2016-04-01

    There are now many business process modelling techniques, and this article reports research on the differences between them. For each technique, the definition and the structure are explained. The paper presents a comparative analysis of some popular business process modelling techniques, using a framework based on two criteria: the notation, and how the technique works when implemented in Somerleyton Animal Park. The discussion of each technique closes with its advantages and disadvantages. The conclusion recommends the business process modelling technique that is easiest to use, and serves as a basis for evaluating further modelling techniques.

  9. Negative Binomial Process Count and Mixture Modeling.

    PubMed

    Zhou, Mingyuan; Carin, Lawrence

    2013-10-17

    The seemingly disjoint problems of count and mixture modeling are united under the negative binomial (NB) process. A gamma process is employed to model the rate measure of a Poisson process, whose normalization provides a random probability measure for mixture modeling and whose marginalization leads to an NB process for count modeling. A draw from the NB process consists of a Poisson distributed finite number of distinct atoms, each of which is associated with a logarithmic distributed number of data samples. We reveal relationships between various count- and mixture-modeling distributions, and construct a Poisson-logarithmic bivariate distribution that connects the NB and Chinese restaurant table distributions. Fundamental properties of the models are developed, and we derive efficient Bayesian inference. It is shown that with augmentation and normalization, the NB process and gamma-NB process can be reduced to the Dirichlet process and hierarchical Dirichlet process, respectively. These relationships highlight theoretical, structural and computational advantages of the NB process. A variety of NB processes, including the beta-geometric, beta-NB, marked-beta-NB, marked-gamma-NB and zero-inflated-NB processes, with distinct sharing mechanisms, are also constructed. These models are applied to topic modeling, with connections made to existing algorithms under Poisson factor analysis. Example results show the importance of inferring both the NB dispersion and probability parameters. PMID:24144977

  10. Negative Binomial Process Count and Mixture Modeling.

    PubMed

    Zhou, Mingyuan; Carin, Lawrence

    2015-02-01

    The seemingly disjoint problems of count and mixture modeling are united under the negative binomial (NB) process. A gamma process is employed to model the rate measure of a Poisson process, whose normalization provides a random probability measure for mixture modeling and whose marginalization leads to an NB process for count modeling. A draw from the NB process consists of a Poisson distributed finite number of distinct atoms, each of which is associated with a logarithmic distributed number of data samples. We reveal relationships between various count- and mixture-modeling distributions and construct a Poisson-logarithmic bivariate distribution that connects the NB and Chinese restaurant table distributions. Fundamental properties of the models are developed, and we derive efficient Bayesian inference. It is shown that with augmentation and normalization, the NB process and gamma-NB process can be reduced to the Dirichlet process and hierarchical Dirichlet process, respectively. These relationships highlight theoretical, structural, and computational advantages of the NB process. A variety of NB processes, including the beta-geometric, beta-NB, marked-beta-NB, marked-gamma-NB and zero-inflated-NB processes, with distinct sharing mechanisms, are also constructed. These models are applied to topic modeling, with connections made to existing algorithms under Poisson factor analysis. Example results show the importance of inferring both the NB dispersion and probability parameters. PMID:26353243
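
    The gamma-Poisson construction at the heart of the NB process can be checked with a short simulation (a minimal sketch; the dispersion r and probability p values are arbitrary):

```python
import math
import random

def nb_via_gamma_poisson(r, p, n, rng=random.Random(1)):
    """Draw n counts from NegativeBinomial(r, p) via the mixture
    lambda ~ Gamma(shape=r, scale=p/(1-p)), count ~ Poisson(lambda).
    The marginal mean is r*p/(1-p)."""
    counts = []
    for _ in range(n):
        lam = rng.gammavariate(r, p / (1 - p))   # draw from the gamma rate measure
        # Poisson(lam) via Knuth's multiplication method (adequate for small lam)
        threshold, k, prod = math.exp(-lam), 0, rng.random()
        while prod > threshold:
            k += 1
            prod *= rng.random()
        counts.append(k)
    return counts

counts = nb_via_gamma_poisson(r=2.0, p=0.5, n=5000)
mean = sum(counts) / len(counts)   # should be near r*p/(1-p) = 2.0
```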

  11. Concurrent materials and process selection in conceptual design

    SciTech Connect

    Kleban, S.D.

    1998-07-01

    The sequential manner in which materials and processes for a manufactured product are selected is inherently less than optimal. Designers' tendency to choose processes and materials with which they are familiar exacerbates this problem. A method for concurrent selection of materials and a joining process, based on product requirements and using a knowledge-based, constraint satisfaction approach, is presented.
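
    The concurrent selection idea can be sketched as constraint filtering over joint (material, joining process) candidates rather than choosing a material first and a process second; the candidate table and requirement values below are invented for illustration:

```python
# Hypothetical (material, joining process) pairs with simple joint attributes
candidates = [
    {"material": "aluminum", "process": "TIG welding",   "max_temp": 300, "cost": 4},
    {"material": "steel",    "process": "MIG welding",   "max_temp": 500, "cost": 3},
    {"material": "titanium", "process": "laser welding", "max_temp": 600, "cost": 9},
    {"material": "ABS",      "process": "adhesive bond", "max_temp": 80,  "cost": 1},
]

def select(requirements, candidates):
    """Concurrent selection: a pair survives only if the material AND the
    joining process jointly satisfy every product requirement."""
    return [c for c in candidates
            if c["max_temp"] >= requirements["service_temp"]
            and c["cost"] <= requirements["max_cost"]]

feasible = select({"service_temp": 400, "max_cost": 5}, candidates)
# Only the steel / MIG-welding pair satisfies both constraints here.
```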

  12. Context-Aware Design for Process Flexibility and Adaptation

    ERIC Educational Resources Information Center

    Yao, Wen

    2012-01-01

    Today's organizations face continuous and unprecedented changes in their business environment. Traditional process design tools tend to be inflexible and can only support rigidly defined processes (e.g., order processing in the supply chain). This considerably restricts their real-world applications value, especially in the dynamic and…

  13. Design driven test patterns for OPC models calibration

    NASA Astrophysics Data System (ADS)

    Al-Imam, Mohamed

    2009-03-01

    In the modern photolithography process for manufacturing integrated circuits, geometry dimensions need to be realized on silicon which are much smaller than the exposure wavelength. Thus Resolution Enhancement Techniques play an indispensable role in the implementation of a successful technology process node. Finding an appropriate RET recipe that answers the needs of a certain fabrication process usually involves intensive computational simulations. These simulations have to reflect how different elements in the lithography process under study will behave. In order to achieve this, accurate models are needed that truly represent the transmission of patterns from mask to silicon. A common practice in calibrating lithography models is to collect data for the dimensions of some test structures created on the exposure mask, along with the corresponding dimensions of these test structures on silicon after exposure. This data is used to tune the models for good predictions. The models are guaranteed to accurately predict the test structures that have been used in their tuning. However, real designs might have a much greater variety of structures that might not have been included in the test structures. This paper explores a method for compiling the test structures to be used in the model calibration process using design layouts as an input. The method relies on reducing structures in the design layout to the essential unique structures from the lithography model's point of view, thus ensuring that the test structures represent what the model would actually have to predict during the simulations.

  14. Groundwater modeling and remedial optimization design using graphical user interfaces

    SciTech Connect

    Deschaine, L.M.

    1997-05-01

    The ability to accurately predict the behavior of chemicals in groundwater systems, under natural flow circumstances or remedial screening and design conditions, is the cornerstone of the environmental industry. The ability to do this efficiently, and to effectively communicate the information to the client and regulators, is what differentiates effective consultants from ineffective ones. Recent advances in groundwater modeling graphical user interfaces (GUIs) are doing for numerical modeling what Windows™ did for DOS™. GUIs facilitate both the modeling process and the information exchange. This Test Drive evaluates the performance of two GUIs--Groundwater Vistas and ModIME--on an actual groundwater model calibration and remedial design optimization project. In the early days of numerical modeling, data input consisted of large arrays of numbers that required intensive labor to input and troubleshoot. Model calibration was also manual, as was interpreting the reams of computer output for each of the tens or hundreds of simulations required to calibrate and perform optimal groundwater remedial design. During this period, the majority of the modeler's effort (and budget) was spent just getting the model running, as opposed to solving the environmental challenge at hand. GUIs take the majority of the grunt work out of the modeling process, thereby allowing the modeler to focus on designing optimal solutions.

  15. A Hierarchical Process-Dissociation Model

    ERIC Educational Resources Information Center

    Rouder, Jeffrey N.; Lu, Jun; Morey, Richard D.; Sun, Dongchu; Speckman, Paul L.

    2008-01-01

    In fitting the process-dissociation model (L. L. Jacoby, 1991) to observed data, researchers aggregate outcomes across participant, items, or both. T. Curran and D. L. Hintzman (1995) demonstrated how biases from aggregation may lead to artifactual support for the model. The authors develop a hierarchical process-dissociation model that does not…

  16. Designers Workbench: Towards Real-Time Immersive Modeling

    SciTech Connect

    Kuester, F; Duchaineau, M A; Hamann, B; Joy, K I; Ma, K L

    2001-10-03

    This paper introduces the DesignersWorkbench, a semi-immersive virtual environment for two-handed modeling, sculpting and analysis tasks. The paper outlines the fundamental tools, design metaphors and hardware components required for an intuitive real-time modeling system. As companies focus on streamlining productivity to cope with global competition, the migration to computer-aided design (CAD), computer-aided manufacturing (CAM), and computer-aided engineering (CAE) systems has established a new backbone of modern industrial product development. However, traditionally a product design frequently originates from a clay model that, after digitization, forms the basis for the numerical description of CAD primitives. The DesignersWorkbench aims at closing this technology or "digital gap" experienced by design and CAD engineers by transforming the classical design paradigm into its fully integrated digital and virtual analog, allowing collaborative development in a semi-immersive virtual environment. This project emphasizes two key components from the classical product design cycle: freeform modeling and analysis. In the freeform modeling stage, the emphasis is on content creation in the form of two-handed sculpting of arbitrary objects using polygonal, volumetric or mathematically defined primitives, whereas the analysis component provides the tools required for pre- and post-processing steps for finite element analysis tasks applied to the created models.

  17. Hairy root culture: bioreactor design and process intensification.

    PubMed

    Stiles, Amanda R; Liu, Chun-Zhao

    2013-01-01

    The cultivation of hairy roots for the production of secondary metabolites offers numerous advantages; hairy roots have a fast growth rate, are genetically stable, and are relatively simple to maintain in phytohormone-free media. Hairy roots provide a continuous source of secondary metabolites, and are useful for the production of chemicals for pharmaceuticals, cosmetics, and food additives. In order for hairy roots to be utilized on a commercial scale, it is necessary to scale up their production. Over the last several decades, significant research has been conducted on the cultivation of hairy roots in various types of bioreactor systems. In this review, we discuss the advantages and disadvantages of various bioreactor systems and the major factors related to large-scale bioreactor cultures, describe process intensification technologies, and give an overview of the mathematical models and computer-aided methods that have been utilized for bioreactor design and development.

  18. fMRI paradigm designing and post-processing tools

    PubMed Central

    James, Jija S; Rajesh, PG; Chandran, Anuvitha VS; Kesavadas, Chandrasekharan

    2014-01-01

    In this article, we first review some aspects of functional magnetic resonance imaging (fMRI) paradigm designing for major cognitive functions by using stimulus delivery systems like Cogent, E-Prime, Presentation, etc., along with their technical aspects. We also review the stimulus presentation possibilities (block, event-related) for visual or auditory paradigms and their advantages in both clinical and research settings. The second part mainly focuses on various fMRI data post-processing tools such as Statistical Parametric Mapping (SPM) and Brain Voyager, and discusses the particulars of the various preprocessing steps involved (realignment, co-registration, normalization, smoothing) in these software packages, as well as the statistical analysis principles of General Linear Modeling for the final interpretation of a functional activation result. PMID:24851001
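
    The General Linear Model fit at the core of the analysis above can be sketched in a few lines. This is an illustrative sketch only, not SPM or Brain Voyager code: the paradigm, effect size, and noise level are invented, and real pipelines additionally convolve the regressor with a hemodynamic response function and apply temporal filtering before the least-squares step.

```python
import numpy as np

# Toy per-voxel GLM: y = X @ beta + noise, with X built from the paradigm.
rng = np.random.default_rng(0)
n_scans = 120

# Block paradigm: three cycles of 20 rest scans followed by 20 task scans.
task = np.tile(np.repeat([0.0, 1.0], 20), 3)
X = np.column_stack([task, np.ones(n_scans)])  # design matrix: task + baseline

# Synthetic "active voxel": true task effect of 2.0 on a baseline of 100.
y = 100.0 + 2.0 * task + rng.normal(0, 0.5, n_scans)

# Ordinary least squares: beta-hat minimizes ||y - X @ beta||^2.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
residual_var = np.sum((y - X @ beta) ** 2) / (n_scans - X.shape[1])

print(beta.round(2))  # [task effect, baseline], close to [2.0, 100.0]
```

    Statistical packages then form t- or F-statistics from `beta` and `residual_var` to produce the activation maps discussed above.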

  19. Computational Process Modeling for Additive Manufacturing

    NASA Technical Reports Server (NTRS)

    Bagg, Stacey; Zhang, Wei

    2014-01-01

    Computational Process and Material Modeling of Powder Bed additive manufacturing of IN 718. Optimize material build parameters with reduced time and cost through modeling. Increase understanding of build properties. Increase reliability of builds. Decrease time to adoption of process for critical hardware. Potential to decrease post-build heat treatments. Conduct single-track and coupon builds at various build parameters. Record build parameter information and QM Meltpool data. Refine Applied Optimization powder bed AM process model using data. Report thermal modeling results. Conduct metallography of build samples. Calibrate STK models using metallography findings. Run STK models using AO thermal profiles and report STK modeling results. Validate modeling with additional build. Photodiode Intensity measurements highly linear with power input. Melt Pool Intensity highly correlated to Melt Pool Size. Melt Pool size and intensity increase with power. Applied Optimization will use data to develop powder bed additive manufacturing process model.

  20. Nursing job process analysis from viewpoint of process design by job diagram.

    PubMed

    Dannoue, Hideo; Tsuru, Satoko; Munechika, Masahiko; Iizuka, Yoshinori

    2006-01-01

    Japan increasingly demands quality assurance in clinical practice. Several issues have been discussed to provide significant suggestions for nursing quality assurance. In the quality management field, Process Design is an important framework known to contribute to quality assurance. This study attempts to analyze the nursing job process from the viewpoint of process design. As a result, some knowledge of the nursing job process could be comprehended. Process analysis from the viewpoint of Process Design is considered significant in nursing practice, and further improvement of its technique and application is a challenge for the future.

  1. Universal process modeling with VTRE for OPC

    NASA Astrophysics Data System (ADS)

    Granik, Yuri; Cobb, Nicolas B.; Do, Thuy

    2002-07-01

    In previous work, Cobb and Zakhor (SPIE, 2726, pp. 208-222, 1996) introduced the VTR (Variable Threshold Resist) model and demonstrated its accuracy for fitting empirical data for 365 nm illumination (SPIE, 3051, pp. 458-468, 1997). The original work showed how EPE can be modeled as a function of a peak local image intensity and the slope of the adjacent cutline. Since then, authors such as J. Randall et al. (Microel. Engineering, 46, pp. 59-63, 1999) have analyzed the VTR model including other parameters such as dose. In the current approach, the original VTR was enhanced to the VTR-Enhanced (VTRE) model in 1999 and the VT-5 model in 2002 for production OPC applications; these include other image intensity parameters. Here we present a comprehensive report on VT (Variable Threshold) process modeling. It has the demonstrated ability to accurately capture resist and etching responses, alone or in combination with the experimental VEB (Variable Etch Bias, SPIE, 4346, p. 98, 2001) model, for a wide range of process conditions used in contemporary IC manufacturing. We analyzed experimental setups from 14 different semiconductor company processes, totaling 3000 CD measurements, to prove this point. We considered 248, 193, and 157 nm annular and standard illumination sources for poly, metal, and active layers. We report the accuracy of VT family models under a wide range of conditions, show usage methodology, and introduce a novel method for calculating VTRE wafer predictions on a dense image intensity grid. We use a multiple regression method to fit VT models and discuss methods for calculating regression coefficients. It is shown that models with too many eigenvectors exhibit a tendency to overfit CD curves. Sub-sample cross-validation and overfitting criteria are derived to avoid this problem. The section on test pattern and usage methodology describes practical issues needed for VT usage in OPC modeling. In particular, we discuss the effects of metrology errors on
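
    The overfitting and sub-sample cross-validation point can be illustrated with a toy regression. The data, polynomial form, and degrees below are hypothetical stand-ins for the VT model's regression terms, not the VTRE formulation itself: a hold-out split exposes a heavily parameterized fit that chases noise in the CD curve.

```python
import numpy as np

# Hypothetical CD-versus-image-slope response with measurement noise.
rng = np.random.default_rng(1)
slope = np.linspace(0.5, 3.0, 40)
cd = 50.0 + 8.0 * slope - 1.5 * slope**2 + rng.normal(0, 0.4, slope.size)

# Interleaved sub-sample split: fit on even points, validate on odd points.
train, test = np.arange(0, 40, 2), np.arange(1, 40, 2)

def holdout_rmse(degree):
    coef = np.polyfit(slope[train], cd[train], degree)
    pred = np.polyval(coef, slope[test])
    return np.sqrt(np.mean((pred - cd[test]) ** 2))

errors = {d: holdout_rmse(d) for d in (1, 2, 9)}
# The quadratic (the true form here) predicts held-out points better than the
# underfit line, while the degree-9 fit carries many redundant terms.
print({d: round(e, 3) for d, e in errors.items()})
```

    Comparing hold-out errors across model complexities, rather than fit errors alone, is the overfitting criterion the abstract alludes to.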

  2. On the optimal design of the disassembly and recovery processes

    SciTech Connect

    Xanthopoulos, A.; Iakovou, E.

    2009-05-15

    This paper tackles the problem of the optimal design of the recovery processes of the end-of-life (EOL) electric and electronic products, with a special focus on the disassembly issues. The objective is to recover as much ecological and economic value as possible, and to reduce the overall produced quantities of waste. In this context, a medium-range tactical problem is defined and a novel two-phased algorithm is presented for a remanufacturing-driven reverse supply chain. In the first phase, we propose a multicriteria/goal-programming analysis for the identification and the optimal selection of the most 'desirable' subassemblies and components to be disassembled for recovery, from a set of different types of EOL products. In the second phase, a multi-product, multi-period mixed-integer linear programming (MILP) model is presented, which addresses the optimization of the recovery processes, while taking into account explicitly the lead times of the disassembly and recovery processes. Moreover, a simulation-based solution approach is proposed for capturing the uncertainties in reverse logistics. The overall approach leads to an easy-to-use methodology that could support effectively middle level management decisions. Finally, the applicability of the developed methodology is illustrated by its application on a specific case study.
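
    The first-phase multicriteria selection can be caricatured as a simple weighted scoring rule. The components, criteria values, weights, and threshold below are invented and far simpler than the paper's goal-programming formulation; they only illustrate ranking subassemblies by recovered value and ecological benefit net of disassembly cost.

```python
# Hypothetical EOL product components: name -> (recovery value, eco benefit,
# disassembly cost), all in arbitrary comparable units.
components = {
    "PCB": (12.0, 8.0, 3.0),
    "housing": (2.0, 5.0, 1.0),
    "CRT glass": (1.0, 9.0, 6.0),
    "cabling": (3.0, 4.0, 2.0),
}
w_value, w_eco, w_cost = 0.5, 0.3, 0.4  # invented criteria weights

def score(v, e, c):
    # Weighted desirability: reward value and eco benefit, penalize cost.
    return w_value * v + w_eco * e - w_cost * c

# Keep components scoring above a threshold, ranked most desirable first.
selected = sorted(
    (name for name, (v, e, c) in components.items() if score(v, e, c) > 2.0),
    key=lambda n: -score(*components[n]),
)
print(selected)
```

    The second-phase MILP would then schedule disassembly and recovery of the selected components over the planning horizon.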

  3. Geometric modeling for computer aided design

    NASA Technical Reports Server (NTRS)

    Schwing, James L.

    1993-01-01

    Over the past several years, it has been the primary goal of this grant to design and implement software to be used in the conceptual design of aerospace vehicles. The work carried out under this grant was performed jointly with members of the Vehicle Analysis Branch (VAB) of NASA LaRC, Computer Sciences Corp., and Vigyan Corp. This has resulted in the development of several packages and design studies. Primary among these are the interactive geometric modeling tool, the Solid Modeling Aerospace Research Tool (SMART), and the integration and execution tools provided by the Environment for Application Software Integration and Execution (EASIE). In addition, the personnel of this grant provide consultation in the areas of structural design, algorithm development, and software development and implementation, particularly in the areas of computer aided design, geometric surface representation, and parallel algorithms.

  4. Design of a tomato packing system by image processing and optimization processing

    NASA Astrophysics Data System (ADS)

    Li, K.; Kumazaki, T.; Saigusa, M.

    2016-02-01

    In recent years, with the development of environmental control systems in plant factories, tomato production has rapidly increased in Japan. However, with the decline in the availability of agricultural labor, there is a need to automate grading, sorting and packing operations. In this research, we designed an automatic packing program that estimates tomato weight by image processing and packs the tomatoes in an optimized configuration. The weight was estimated by using the pixel area properties after an L*a*b* color model conversion, noise rejection, filling of holes and boundary preprocessing. The packing optimization program was designed with a 0-1 knapsack algorithm for dynamic combinatorial optimization.
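
    The 0-1 knapsack step can be sketched with a small dynamic program. The weights (grams, as a vision system might estimate them), the box capacity, and the function name are invented for illustration:

```python
# Pick the subset of tomatoes whose total weight best fills a box without
# exceeding its capacity (classic 0-1 knapsack over reachable totals).
def pack_box(weights, capacity):
    """Return (total, indices) maximizing total weight <= capacity."""
    best = {0: []}  # reachable total weight -> indices achieving it
    for i, w in enumerate(weights):
        # iterate over a snapshot so each tomato is used at most once
        for total, chosen in list(best.items()):
            t = total + w
            if t <= capacity and t not in best:
                best[t] = chosen + [i]
    filled = max(best)
    return filled, best[filled]

weights = [152, 148, 160, 155, 149, 151]  # estimated weights in grams
total, chosen = pack_box(weights, 450)
print(total, sorted(chosen))  # heaviest feasible fill and which tomatoes to use
```

    Repeating this greedily over the remaining tomatoes fills one box at a time; a production system would also bound per-box count and weight tolerance.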

  5. Integrating rock mechanics issues with repository design through design process principles and methodology

    SciTech Connect

    Bieniawski, Z.T.

    1996-04-01

    A good designer needs not only knowledge for designing (technical know-how that is used to generate alternative design solutions) but also must have knowledge about designing (appropriate principles and systematic methodology to follow). Concepts such as "design for manufacture" or "concurrent engineering" are widely used in industry. In the field of rock engineering, only limited attention has been paid to the design process because design of structures in rock masses presents unique challenges to the designers as a result of the uncertainties inherent in characterization of geologic media. However, a stage has now been reached where we are able to characterize rock masses sufficiently for engineering purposes and identify the rock mechanics issues involved but are still lacking engineering design principles and methodology to maximize our design performance. This paper discusses the principles and methodology of the engineering design process directed to integrating site characterization activities with design, construction and performance of an underground repository. Using the latest information from the Yucca Mountain Project on geology, rock mechanics and starter tunnel design, the current lack of integration is pointed out and it is shown how rock mechanics issues can be effectively interwoven with repository design through a systematic design process methodology leading to improved repository performance. In essence, the design process is seen as the use of design principles within an integrating design methodology, leading to innovative problem solving. In particular, a new concept of "Design for Constructibility and Performance" is introduced. This is discussed with respect to ten rock mechanics issues identified for repository design and performance.

  6. Stimulus design for model selection and validation in cell signaling.

    PubMed

    Apgar, Joshua F; Toettcher, Jared E; Endy, Drew; White, Forest M; Tidor, Bruce

    2008-02-01

    Mechanism-based chemical kinetic models are increasingly being used to describe biological signaling. Such models serve to encapsulate current understanding of pathways and to enable insight into complex biological processes. One challenge in model development is that, with limited experimental data, multiple models can be consistent with known mechanisms and existing data. Here, we address the problem of model ambiguity by providing a method for designing dynamic stimuli that, in stimulus-response experiments, distinguish among parameterized models with different topologies, i.e., reaction mechanisms, in which only some of the species can be measured. We develop the approach by presenting two formulations of a model-based controller that is used to design the dynamic stimulus. In both formulations, an input signal is designed for each candidate model and parameterization so as to drive the model outputs through a target trajectory. The quality of a model is then assessed by the ability of the corresponding controller, informed by that model, to drive the experimental system. We evaluated our method on models of antibody-ligand binding, mitogen-activated protein kinase (MAPK) phosphorylation and de-phosphorylation, and larger models of the epidermal growth factor receptor (EGFR) pathway. For each of these systems, the controller informed by the correct model is the most successful at designing a stimulus to produce the desired behavior. Using these stimuli we were able to distinguish between models with subtle mechanistic differences or where input and outputs were multiple reactions removed from the model differences. An advantage of this method of model discrimination is that it does not require novel reagents, or altered measurement techniques; the only change to the experiment is the time course of stimulation. Taken together, these results provide a strong basis for using designed input stimuli as a tool for the development of cell signaling models. PMID

  7. Model reduction methods for control design

    NASA Technical Reports Server (NTRS)

    Dunipace, K. R.

    1988-01-01

    Several different model reduction methods are developed and detailed implementation information is provided for those methods. Command files to implement the model reduction methods in a proprietary control law analysis and design package are presented. A comparison and discussion of the various reduction techniques is included.

  8. Interior Design Research: A Human Ecosystem Model.

    ERIC Educational Resources Information Center

    Guerin, Denise A.

    1992-01-01

    The interior ecosystems model illustrates effects on the human organism of the interaction of the natural, behavioral, and built environment. Examples of interior lighting and household energy consumption show the model's flexibility for organizing study variables in interior design research. (SK)

  9. Instructional Design in Education: New Model

    ERIC Educational Resources Information Center

    Isman, Aytekin

    2011-01-01

    The main goal of the new instructional design model is to organize long term and full learning activities. The new model is based on the theoretical foundation of behaviorism, cognitivism and constructivism. During teaching and learning activities, learners are active and use cognitive, constructivist, or behaviorist learning to construct new…

  10. Designers' models of the human-computer interface

    NASA Technical Reports Server (NTRS)

    Gillan, Douglas J.; Breedin, Sarah D.

    1993-01-01

    Understanding design models of the human-computer interface (HCI) may produce two types of benefits. First, interface development often requires input from two different types of experts: human factors specialists and software developers. Given the differences in their backgrounds and roles, human factors specialists and software developers may have different cognitive models of the HCI. Yet, they have to communicate about the interface as part of the design process. If they have different models, their interactions are likely to involve a certain amount of miscommunication. Second, the design process in general is likely to be guided by designers' cognitive models of the HCI, as well as by their knowledge of the user, tasks, and system. Designers do not start with a blank slate; rather they begin with a general model of the object they are designing. The author's approach to a design model of the HCI was to have three groups make judgments of categorical similarity about the components of an interface: human factors specialists with HCI design experience, software developers with HCI design experience, and a baseline group of computer users with no experience in HCI design. The components of the user interface included both display components such as windows, text, and graphics, and user interaction concepts, such as command language, editing, and help. The judgments of the three groups were analyzed using hierarchical cluster analysis and Pathfinder. These methods indicated, respectively, how the groups categorized the concepts, and network representations of the concepts for each group. The Pathfinder analysis provides greater information about local, pairwise relations among concepts, whereas the cluster analysis shows global, categorical relations to a greater extent.

  11. Modeling and observer design for recombinant Escherichia coli strain.

    PubMed

    Nadri, M; Trezzani, I; Hammouri, H; Dhurjati, P; Longin, R; Lieto, J

    2006-03-01

    A mathematical model for recombinant bacteria which includes foreign protein production is developed. The experimental system consists of an Escherichia coli strain and plasmid pIT34 containing genes for bioluminescence and production of a protein, beta-galactosidase. This recombinant strain is constructed to facilitate on-line estimation and control in a complex bioprocess. Several batch experiments are designed and performed to validate the developed model. The design of a model structure, the identification of the model parameters, and the estimation problem are three parts of a joint design problem. A nonlinear observer is designed and an experimental evaluation is performed on a batch fermentation process to estimate the substrate consumption. PMID:16411071
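
    The observer idea can be sketched with a classical asymptotic observer on a toy batch model. The kinetics, yield, and initial conditions below are invented and are not the paper's E. coli/pIT34 model: with biomass measured (e.g. via luminescence) and substrate unmeasured, the quantity z = s + x/Y is conserved in a batch, so it can be set once from the known initial medium and used to reconstruct substrate without knowing the uncertain growth kinetics.

```python
Y = 0.5    # assumed yield coefficient (g biomass per g substrate)
dt = 0.01  # time step, hours

def mu(s):  # "true" Monod growth rate, hidden from the observer
    return 0.4 * s / (0.2 + s)

x, s = 0.1, 5.0            # true initial biomass and substrate (g/L)
z_hat = s + x / Y          # observer state, set from the known initial medium
for _ in range(1000):      # simulate 10 h of batch growth (Euler steps)
    growth = mu(s) * x
    x += dt * growth       # measured state evolves
    s -= dt * growth / Y   # unmeasured state evolves
    s_hat = z_hat - x / Y  # substrate estimate from the biomass measurement

print(round(s, 3), round(s_hat, 3))  # estimate tracks the true substrate
```

    The appeal of this construction, noise and model mismatch aside, is that the estimate never uses the growth-rate expression, which is the least reliable part of a fermentation model.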

  12. Simulation Modeling of Software Development Processes

    NASA Technical Reports Server (NTRS)

    Calavaro, G. F.; Basili, V. R.; Iazeolla, G.

    1996-01-01

    A simulation modeling approach is proposed for the prediction of software process productivity indices, such as cost and time-to-market, and for the sensitivity analysis of such indices to changes in organization parameters and user requirements. The approach uses a timed Petri net and an object-oriented top-down model specification. Results demonstrate the model's representativeness and its usefulness in verifying process conformance to expectations and in performing continuous process improvement and optimization.
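
    A timed Petri net of this kind can be sketched as a small discrete-event simulation. The places, transitions, and durations below are hypothetical, not the paper's model: a transition fires when its input place holds a token, consuming it and depositing a token in its output place after the transition's delay.

```python
import heapq

# Toy develop -> code -> test pipeline with three work items.
marking = {"specified": 3, "coded": 0, "tested": 0}
transitions = [  # (name, input place, output place, duration in days)
    ("implement", "specified", "coded", 5.0),
    ("verify", "coded", "tested", 2.0),
]

clock, pending = 0.0, []  # pending completions: (finish time, output place)
while True:
    for _, src, dst, dur in transitions:
        while marking[src] > 0:           # fire every enabled transition
            marking[src] -= 1
            heapq.heappush(pending, (clock + dur, dst))
    if not pending:
        break
    clock, place = heapq.heappop(pending)  # advance to the next completion
    marking[place] += 1

print(round(clock, 1), marking)  # time-to-market for all three work items
```

    Productivity indices such as time-to-market fall out of the final clock value; sensitivity analysis amounts to re-running the simulation with perturbed durations or markings.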

  13. SR-7A aeroelastic model design report

    NASA Technical Reports Server (NTRS)

    Nagle, D.; Auyeung, S.; Turnberg, J.

    1986-01-01

    A scale model was designed to simulate the aeroelastic characteristics and performance of the 2.74 meter (9 ft.) diameter SR-7L blade. The procedures used in this model blade design are discussed. Included in this synopsis is background information concerning scaling parameters and an explanation of manufacturing limitations. A description of the final composite model blade, made of titanium, fiberglass, and graphite, is provided. Analytical methods for determining the blade stresses, natural frequencies and mode shapes, and stability are discussed at length.

  14. Innovation Process Design: A Change Management and Innovation Dimension Perspective

    NASA Astrophysics Data System (ADS)

    Peisl, Thomas; Reger, Veronika; Schmied, Juergen

    The authors propose an innovative approach to the management of innovation integrating business, process, and maturity dimensions. The core element of the concept is the adaptation of ISO/IEC 15504 to the innovation process, including 14 innovation drivers. Two managerial models are applied to conceptualize and visualize the respective innovation strategies: the Balanced Scorecard and a Barriers in Change Processes model. An illustrative case study shows a practical implementation process.

  15. Geometric modeling for computer aided design

    NASA Technical Reports Server (NTRS)

    Schwing, James L.

    1992-01-01

    The goal was the design and implementation of software to be used in the conceptual design of aerospace vehicles. Several packages and design studies were completed, including two software tools currently used in the conceptual level design of aerospace vehicles. These tools are the Solid Modeling Aerospace Research Tool (SMART) and the Environment for Software Integration and Execution (EASIE). SMART provides conceptual designers with a rapid prototyping capability and additionally provides initial mass property analysis. EASIE provides a set of interactive utilities that simplify the task of building and executing computer aided design systems consisting of diverse, stand alone analysis codes that result in the streamlining of the exchange of data between programs, reducing errors and improving efficiency.

  16. Reload design process at Yankee Atomic Electric Company

    SciTech Connect

    Weader, R.J.

    1986-01-01

    Yankee Atomic Electric Company (YAEC) performs reload design and licensing for their nuclear power plants: Yankee Rowe, Maine Yankee, and Vermont Yankee. Significant savings in labor and computer costs have been achieved in the reload design process by the use of the SIMULATE nodal code using the CASMO assembly burnup code or LEOPARD pin cell burnup code inputs to replace the PDQ diffusion theory code in many required calculations for the Yankee Rowe and Maine Yankee pressurized water reactors (PWRs). An efficient process has evolved for the design of reloads for the Vermont Yankee boiling water reactor (BWR). Due to the major differences in the core design of the three plants, different reload design processes have evolved for each plant.

  17. Propulsion System Models for Rotorcraft Conceptual Design

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne

    2014-01-01

    The conceptual design code NDARC (NASA Design and Analysis of Rotorcraft) was initially implemented to model conventional rotorcraft propulsion systems, consisting of turboshaft engines burning jet fuel, connected to one or more rotors through a mechanical transmission. The NDARC propulsion system representation has been extended to cover additional propulsion concepts, including electric motors and generators, rotor reaction drive, turbojet and turbofan engines, fuel cells and solar cells, batteries, and fuel (energy) used without weight change. The paper describes these propulsion system components, the architecture of their implementation in NDARC, and the form of the models for performance and weight. Requirements are defined for improved performance and weight models of the new propulsion system components. With these new propulsion models, NDARC can be used to develop environmentally-friendly rotorcraft designs.

  18. The New Digital Engineering Design and Graphics Process.

    ERIC Educational Resources Information Center

    Barr, R. E.; Krueger, T. J.; Aanstoos, T. A.

    2002-01-01

    Summarizes the digital engineering design process using software widely available for the educational setting. Points out that newer technology used in the field is not used in engineering graphics education. (DDR)

  19. Using GREENSCOPE for Sustainable Process Design: An Educational Opportunity

    EPA Science Inventory

    Increasing sustainability can be approached through the education of those who design, construct, and operate facilities. As chemical engineers learn elements of process systems engineering, they can be introduced to sustainability concepts. The EPA’s GREENSCOPE methodology and...

  20. SYSTEMATIC PROCEDURE FOR DESIGNING PROCESSES WITH MULTIPLE ENVIRONMENTAL OBJECTIVES

    EPA Science Inventory

    Evaluation of multiple objectives is very important in designing environmentally benign processes. It requires a systematic procedure for solving multiobjective decision-making problems, due to the complex nature of the problems, the need for complex assessments, and complicated ...

  1. Materials measurement and accounting in an operating plutonium conversion and purification process. Phase I. Process modeling and simulation. [PUCSF code

    SciTech Connect

    Thomas, C.C. Jr.; Ostenak, C.A.; Gutmacher, R.G.; Dayem, H.A.; Kern, E.A.

    1981-04-01

    A model of an operating conversion and purification process for the production of reactor-grade plutonium dioxide was developed as the first component in the design and evaluation of a nuclear materials measurement and accountability system. The model accurately simulates process operation and can be used to identify process problems and to predict the effect of process modifications.

  2. Radiation Environment Modeling for Spacecraft Design: New Model Developments

    NASA Technical Reports Server (NTRS)

    Barth, Janet; Xapsos, Mike; Lauenstein, Jean-Marie; Ladbury, Ray

    2006-01-01

    A viewgraph presentation on various new space radiation environment models for spacecraft design is described. The topics include: 1) The Space Radiation Environment; 2) Effects of Space Environments on Systems; 3) Space Radiation Environment Model Use During Space Mission Development and Operations; 4) Space Radiation Hazards for Humans; 5) "Standard" Space Radiation Environment Models; 6) Concerns about Standard Models; 7) Inadequacies of Current Models; 8) Development of New Models; 9) New Model Developments: Proton Belt Models; 10) Coverage of New Proton Models; 11) Comparison of TPM-1, PSB97, AP-8; 12) New Model Developments: Electron Belt Models; 13) Coverage of New Electron Models; 14) Comparison of "Worst Case" POLE, CRESELE, and FLUMIC Models with the AE-8 Model; 15) New Model Developments: Galactic Cosmic Ray Model; 16) Comparison of NASA, MSU, CIT Models with ACE Instrument Data; 17) New Model Developments: Solar Proton Model; 18) Comparison of ESP, JPL91, King/Stassinopoulos, and PSYCHIC Models; 19) New Model Developments: Solar Heavy Ion Model; 20) Comparison of CREME96 to CREDO Measurements During 2000 and 2002; 21) PSYCHIC Heavy Ion Model; 22) Model Standardization; 23) Working Group Meeting on New Standard Radiation Belt and Space Plasma Models; and 24) Summary.

  3. A verification and validation process for model-driven engineering

    NASA Astrophysics Data System (ADS)

    Delmas, R.; Pires, A. F.; Polacsek, T.

    2013-12-01

    Model Driven Engineering practitioners already benefit from many well established verification tools, for Object Constraint Language (OCL), for instance. Recently, constraint satisfaction techniques have been brought to Model-Driven Engineering (MDE) and have shown promising results on model verification tasks. With all these tools, it becomes possible to provide users with formal support from early model design phases to model instantiation phases. In this paper, a selection of such tools and methods is presented, and an attempt is made to define a verification and validation process for model design and instance creation centered on UML (Unified Modeling Language) class diagrams and declarative constraints, and involving the selected tools. The suggested process is illustrated with a simple example.

  4. Geometric modeling for computer aided design

    NASA Technical Reports Server (NTRS)

    Schwing, James L.; Olariu, Stephen

    1995-01-01

    The primary goal of this grant has been the design and implementation of software to be used in the conceptual design of aerospace vehicles, particularly focused on the elements of geometric design, graphical user interfaces, and the interaction of the multitude of software typically used in this engineering environment. This has resulted in the development of several analysis packages and design studies. These include two major software systems currently used in the conceptual level design of aerospace vehicles. These tools are SMART, the Solid Modeling Aerospace Research Tool, and EASIE, the Environment for Software Integration and Execution. Additional software tools were designed and implemented to address the needs of the engineer working in the conceptual design environment. SMART provides conceptual designers with a rapid prototyping capability and several engineering analysis capabilities. In addition, SMART has a carefully engineered user interface that makes it easy to learn and use. Finally, a number of specialty characteristics have been built into SMART which allow it to be used efficiently as a front end geometry processor for other analysis packages. EASIE provides a set of interactive utilities that simplify the task of building and executing computer aided design systems consisting of diverse, stand-alone analysis codes, streamlining the exchange of data between programs, reducing errors, and improving efficiency. EASIE provides both a methodology and a collection of software tools to ease the task of coordinating engineering design and analysis codes.

  5. Modeling the Reading Process: Promise and Problems.

    ERIC Educational Resources Information Center

    Geyer, John J.

    The problems of modeling a process as complex as reading are discussed, including such factors as the lack of agreement surrounding definitions of modeling, varying levels of rigor within and between models, the disjunctive categories within which models fall, and the difficulty of synthesis across fields which employ very different technical…

  6. PROCESS DESIGN MANUAL: LAND TREATMENT OF MUNICIPAL WASTEWATER

    EPA Science Inventory

    The manual presents a rational procedure for the design of land treatment systems. Slow rate, rapid infiltration, and overland flow processes for the treatment of municipal wastewaters are discussed in detail, and the design concepts and criteria are presented. A two-phased plann...

  7. Risk Informed Design as Part of the Systems Engineering Process

    NASA Technical Reports Server (NTRS)

    Deckert, George

    2010-01-01

    This slide presentation reviews the importance of Risk Informed Design (RID) as an important feature of the systems engineering process. RID is based on the principle that risk is a design commodity such as mass, volume, cost or power. It also reviews Probabilistic Risk Assessment (PRA) as it is used in the product life cycle in the development of NASA's Constellation Program.

  8. METHODS FOR INTEGRATING ENVIRONMENTAL CONSIDERATIONS INTO CHEMICAL PROCESS DESIGN DECISIONS

    EPA Science Inventory

    The objective of this cooperative agreement was to postulate a means by which an engineer could routinely include environmental considerations in day-to-day conceptual design problems; a means that could easily integrate with existing design processes, and thus avoid massive retr...

  9. Applying the ID Process to the Guided Design Teaching Strategy.

    ERIC Educational Resources Information Center

    Coscarelli, William C.; White, Gregory P.

    1982-01-01

    Describes the application of the instructional development process to a teaching technique called Guided Design in a Production-Operations Management course. In Guided Design, students are self-instructed in course content and use class time to apply this knowledge to self-instruction; in-class problem-solving is stressed. (JJD)

  10. Relating Right Brain Studies to the Design Process.

    ERIC Educational Resources Information Center

    Hofland, John

    Intended for teachers of theatrical design who need to describe a design process for their students, this paper begins by giving a brief overview of recent research that has described the different functions of the right and left cerebral hemispheres. It then notes that although the left hemisphere tends to dominate the right hemisphere, it is the…

  11. Process Design Report for Stover Feedstock: Lignocellulosic Biomass to Ethanol Process Design and Economics Utilizing Co-Current Dilute Acid Prehydrolysis and Enzymatic Hydrolysis for Corn Stover

    SciTech Connect

    Aden, A.; Ruth, M.; Ibsen, K.; Jechura, J.; Neeves, K.; Sheehan, J.; Wallace, B.; Montague, L.; Slayton, A.; Lukas, J.

    2002-06-01

    The U.S. Department of Energy (DOE) is promoting the development of ethanol from lignocellulosic feedstocks as an alternative to conventional petroleum-based transportation fuels. DOE funds both fundamental and applied research in this area and needs a method for predicting cost benefits of many research proposals. To that end, the National Renewable Energy Laboratory (NREL) has modeled many potential process designs and estimated the economics of each process during the last 20 years. This report is an update of the ongoing process design and economic analyses at NREL.

  12. Optimum Design Of Addendum Surfaces In Sheet Metal Forming Process

    NASA Astrophysics Data System (ADS)

    Debray, K.; Sun, Z. C.; Radjai, R.; Guo, Y. Q.; Dai, L.; Gu, Y. X.

    2004-06-01

    The design of addendum surfaces in the sheet forming process is very important for product quality, but it is very time-consuming and requires tedious trial-and-error corrections. In this paper, we propose a methodology to automatically generate the addendum surfaces and then to optimize them using a forming modelling solver. The surfaces' parameters are taken as design variables and modified in the course of optimization. The finite element mesh is created on the initial addendum surfaces and mapped onto the modified surfaces without a remeshing operation. Feasible Sequential Quadratic Programming (FSQP) is adopted as the optimization algorithm. Two objective functions are used: the first is the thickness function, which minimizes the thickness variation on the workpiece; the second is the appearance function, which aims to avoid scratching defects on the external surfaces of panels. FSQP is combined with our "Inverse Approach" or "One Step Approach", which is a very fast forming solver. This leads to a very efficient optimization procedure. The present methodology is applied to a square box. The addendum surfaces are characterised by four geometrical variables. The influence of the optimization criteria is studied and discussed.
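
    The optimization loop described above can be sketched with a generic SQP solver. This is a minimal sketch only: it uses SciPy's SLSQP in place of FSQP, and a purely hypothetical surrogate thickness field stands in for the "Inverse Approach" forming solver; the four design variables and their bounds are invented for illustration.

```python
import numpy as np
from scipy.optimize import minimize

def thickness_field(p):
    # Hypothetical surrogate: sheet thickness along a section as a smooth
    # function of four addendum-surface parameters p[0..3] (not the
    # paper's actual forming solver).
    s = np.linspace(0.0, 1.0, 50)   # normalized positions along the section
    t0 = 1.0                        # nominal sheet thickness (mm)
    return (t0 - 0.1 * p[0] * s**2 + 0.05 * p[1] * s
            - 0.02 * p[2] * np.sin(3 * s) + 0.01 * p[3])

def thickness_objective(p):
    # "Thickness function": minimize thickness variation on the workpiece.
    return float(np.var(thickness_field(p)))

# Bounds stand in for geometric feasibility of the addendum surfaces.
res = minimize(thickness_objective, x0=[0.5] * 4, method="SLSQP",
               bounds=[(0.0, 1.0)] * 4)
print(res.x, res.fun)
```

    The appearance function of the abstract would enter the same way, either as a second weighted term in the objective or as a constraint passed to the solver.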

  13. Sketching in Design Journals: An Analysis of Visual Representations in the Product Design Process

    ERIC Educational Resources Information Center

    Lau, Kimberly; Oehlberg, Lora; Agogino, Alice

    2009-01-01

    This paper explores the sketching behavior of designers and the role of sketching in the design process. Observations from a descriptive study of sketches provided in design journals, characterized by a protocol measuring sketching activities, are presented. A distinction is made between journals that are entirely tangible and those that contain…

  14. Reducing the complexity of the software design process with object-oriented design

    NASA Technical Reports Server (NTRS)

    Schuler, M. P.

    1991-01-01

    Designing software is a complex process. How object-oriented design (OOD), coupled with formalized documentation and tailored object diagraming techniques, can reduce the complexity of the software design process is described and illustrated. The described OOD methodology uses a hierarchical decomposition approach in which parent objects are decomposed into layers of lower level child objects. A method of tracking the assignment of requirements to design components is also included. Increases in the reusability, portability, and maintainability of the resulting products are also discussed. This method was built on a combination of existing technology, teaching experience, consulting experience, and feedback from design method users. The discussed concepts are applicable to hierarchal OOD processes in general. Emphasis is placed on improving the design process by documenting the details of the procedures involved and incorporating improvements into those procedures as they are developed.

  15. Panoramic imaging perimeter sensor design and modeling

    SciTech Connect

    Pritchard, D.A.

    1993-12-31

    This paper describes the conceptual design and preliminary performance modeling of a 360-degree imaging sensor. This sensor combines automatic perimeter intrusion detection with immediate visual assessment and is intended to be used for fast deployment around fixed or temporary high-value assets. The sensor requirements, compiled from various government agencies, are summarized. The conceptual design includes longwave infrared and visible linear array technology. An auxiliary millimeter-wave sensing technology is also considered for use during periods of infrared and visible obscuration. The infrared detectors proposed for the sensor design are similar to the Standard Advanced Dewar Assembly Types Three A and B (SADA-IIIA/B). An overview of the sensor and processor is highlighted. The infrared performance of this sensor design has been predicted using existing thermal imaging system models and is described in the paper. Future plans for developing a prototype are also presented.

  16. A Biologically Inspired Network Design Model

    PubMed Central

    Zhang, Xiaoge; Adamatzky, Andrew; Chan, Felix T.S.; Deng, Yong; Yang, Hai; Yang, Xin-She; Tsompanas, Michail-Antisthenis I.; Sirakoulis, Georgios Ch.; Mahadevan, Sankaran

    2015-01-01

    A network design problem is to select a subset of links in a transport network that satisfy passengers or cargo transportation demands while minimizing the overall costs of the transportation. We propose a mathematical model of the foraging behaviour of slime mould P. polycephalum to solve the network design problem and construct optimal transport networks. In our algorithm, a traffic flow between any two cities is estimated using a gravity model. The flow is imitated by the model of the slime mould. The algorithm model converges to a steady state, which represents a solution of the problem. We validate our approach on examples of major transport networks in Mexico and China. By comparing networks developed in our approach with the man-made highways, networks developed by the slime mould, and a cellular automata model inspired by slime mould, we demonstrate the flexibility and efficiency of our approach. PMID:26041508
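
    The gravity-model flow estimate mentioned in the abstract can be sketched in a few lines. The populations, distances, constant k, and distance exponent beta below are illustrative assumptions; the paper's actual calibration is not given.

```python
import numpy as np

def gravity_flows(pop, dist, k=1.0, beta=2.0):
    """Estimate traffic flow between every pair of cities with a simple
    gravity model: flow_ij = k * pop_i * pop_j / dist_ij**beta."""
    P = np.outer(pop, pop).astype(float)
    D = np.asarray(dist, dtype=float)
    F = k * P / D**beta
    np.fill_diagonal(F, 0.0)        # no self-flow between a city and itself
    return F

# Three hypothetical cities: populations and pairwise distances (km).
pop = [1.0e6, 5.0e5, 2.0e5]
dist = [[1, 100, 200],
        [100, 1, 150],
        [200, 150, 1]]
F = gravity_flows(pop, dist)
print(F)
```

    In the paper's algorithm, a matrix like F would then drive the slime-mould model, which reinforces links carrying high flow.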

  17. Model-based risk analysis of coupled process steps.

    PubMed

    Westerberg, Karin; Broberg-Hansen, Ernst; Sejergaard, Lars; Nilsson, Bernt

    2013-09-01

    A section of a biopharmaceutical manufacturing process involving the enzymatic coupling of a polymer to a therapeutic protein was characterized with regards to the process parameter sensitivity and design space. To minimize the formation of unwanted by-products in the enzymatic reaction, the substrate was added in small amounts and unreacted protein was separated using size-exclusion chromatography (SEC) and recycled to the reactor. The quality of the final recovered product was thus a result of the conditions in both the reactor and the SEC, and a design space had to be established for both processes together. This was achieved by developing mechanistic models of the reaction and SEC steps, establishing the causal links between process conditions and product quality. Model analysis was used to complement the qualitative risk assessment, and design space and critical process parameters were identified. The simulation results gave an experimental plan focusing on the "worst-case regions" in terms of product quality and yield. In this way, the experiments could be used to verify both the suggested process and the model results. This work demonstrates the necessary steps of model-assisted process analysis, from model development through experimental verification.

  18. Engineered Barrier System Degradation, Flow, and Transport Process Model Report

    SciTech Connect

    E.L. Hardin

    2000-07-17

    The Engineered Barrier System Degradation, Flow, and Transport Process Model Report (EBS PMR) is one of nine PMRs supporting the Total System Performance Assessment (TSPA) being developed by the Yucca Mountain Project for the Site Recommendation Report (SRR). The EBS PMR summarizes the development and abstraction of models for processes that govern the evolution of conditions within the emplacement drifts of a potential high-level nuclear waste repository at Yucca Mountain, Nye County, Nevada. Details of these individual models are documented in 23 supporting Analysis/Model Reports (AMRs). Nineteen of these AMRs are for process models, and the remaining four describe the abstraction of results for application in TSPA. The process models themselves cluster around four major topics: "Water Distribution and Removal Model, Physical and Chemical Environment Model, Radionuclide Transport Model, and Multiscale Thermohydrologic Model". One AMR (Engineered Barrier System-Features, Events, and Processes/Degradation Modes Analysis) summarizes the formal screening analysis used to select the Features, Events, and Processes (FEPs) included in TSPA and those excluded from further consideration. Performance of a potential Yucca Mountain high-level radioactive waste repository depends on both the natural barrier system (NBS) and the engineered barrier system (EBS) and on their interactions. Although the waste packages are generally considered as components of the EBS, the EBS as defined in the EBS PMR includes all engineered components outside the waste packages. The principal function of the EBS is to complement the geologic system in limiting the amount of water contacting nuclear waste. A number of alternatives were considered by the Project for different EBS designs that could provide better performance than the design analyzed for the Viability Assessment. The design concept selected was Enhanced Design Alternative II (EDA II).

  19. Theory and Practice Meets in Industrial Process Design -Educational Perspective-

    NASA Astrophysics Data System (ADS)

    Aramo-Immonen, Heli; Toikka, Tarja

    A software engineer should see himself as a business process designer in an enterprise resource planning (ERP) re-engineering project, and software engineers and managers should engage in design dialogue. The objective of this paper is to discuss the motives for studying design research in connection with management education, in order to envision and understand the soft human issues in the management context. A second goal is to develop means of practicing social skills between designers and managers. This article explores the affective components of design thinking in the industrial management domain. The conceptual part of this paper discusses the concepts of network and project economy, creativity, communication, use of metaphors, and design thinking. Finally, an empirical research plan and first empirical results are introduced from design-method experiments among multidisciplinary groups of master's-level students of industrial engineering and management and of software engineering.

  20. Application of hazard assessment techniques in the CISF design process

    SciTech Connect

    Thornton, J.R.; Henry, T.

    1997-10-29

    The Department of Energy has submitted to the NRC staff for review a topical safety analysis report (TSAR) for a Centralized Interim Storage Facility (CISF). The TSAR will be used in licensing the CISF when and if a site is designated. CISF design events are identified based on thorough review of design basis events (DBEs) previously identified by dry storage system suppliers and licensees and through the application of hazard assessment techniques. A Preliminary Hazards Assessment (PHA) is performed to identify design events applicable to a Phase 1 non-site-specific CISF. A PHA is deemed necessary since the Phase 1 CISF is distinguishable from previous dry storage applications in several significant operational scope and design basis aspects. In addition to assuring that all design events applicable to the Phase 1 CISF are identified, the PHA served as an integral part of the CISF design process by identifying potential important-to-safety and defense-in-depth facility design and administrative control features. This paper describes the Phase 1 CISF design event identification process and summarizes significant PHA contributions to the CISF design.

  1. The start up as a phase of architectural design process.

    PubMed

    Castro, Iara Sousa; Lima, Francisco de Paula Antunes; Duarte, Francisco José de Castro Moura

    2012-01-01

    Alterations made in the architectural design can be considered a continuous process, from conception to the moment the built environment is already in use. This article focuses on the "moving phase", which is the initial moment of occupation of the environment and the start-up of services. It aims to show that the continuity of ergonomics interventions during the "moving phase", or start-up, may reveal inadequacies of the built environment, clearly showing needs not met by the design and allowing instant decisions to solve unforeseen problems. The results reveal some lessons experienced by users during a critical stage not usually included in the design process.

  2. Drought processes, modeling, and mitigation

    NASA Astrophysics Data System (ADS)

    Mishra, Ashok K.; Sivakumar, Bellie; Singh, Vijay P.

    2015-07-01

    Accurate assessment of droughts is crucial for proper planning and management of our water resources, environment, and ecosystems. The combined influence of increasing water demands and the anticipated impacts of global climate change has already raised serious concerns about worsening drought conditions in the future and their social, economic, and environmental impacts. As a result, studies on droughts are currently a major focal point for a broad range of research communities, including civil engineers, hydrologists, environmentalists, ecologists, meteorologists, geologists, agricultural scientists, economists, policy makers, and water managers. There is, therefore, an urgent need for enhancing our understanding of droughts (e.g. occurrence, modeling), making more reliable assessments of their impacts on various sectors of our society (e.g. domestic, agricultural, industrial), and undertaking appropriate adaptation and mitigation measures, especially in the face of global climate change.

  3. Maintaining Compliance in Customizable Process Models

    NASA Astrophysics Data System (ADS)

    Schleicher, Daniel; Anstett, Tobias; Leymann, Frank; Mietzner, Ralph

    Compliance of business processes has gained importance during the last years. The growing number of internal and external regulations that companies need to obey has led to this state. This paper presents a practical concept of ensuring compliance during design time of customizable business processes.

  4. DESIGNING CHEMICAL PROCESSES WITH OPEN AND FUGITIVE EMISSIONS

    EPA Science Inventory

    Designing a chemical process normally includes aspects of economic and environmental disciplines. In this work we describe methods to quickly and easily evaluate the economics and potential environmental impacts of a process, with the hydrodealkylation of toluene as an example. Th...

  5. Design requirements for operational earth resources ground data processing

    NASA Technical Reports Server (NTRS)

    Baldwin, C. J.; Bradford, L. H.; Burnett, E. S.; Hutson, D. E.; Kinsler, B. A.; Kugle, D. R.; Webber, D. S.

    1972-01-01

    Realistic tradeoff data and evaluation techniques were studied that permit conceptual design of operational earth resources ground processing systems. Methodology for determining user requirements that utilize the limited information available from users is presented along with definitions of sensor capabilities projected into the shuttle/station era. A tentative method is presented for synthesizing candidate ground processing concepts.

  6. Declarative business process modelling: principles and modelling languages

    NASA Astrophysics Data System (ADS)

    Goedertier, Stijn; Vanthienen, Jan; Caron, Filip

    2015-02-01

    The business process literature has proposed a multitude of business process modelling approaches or paradigms, each in response to a different business process type with a unique set of requirements. Two polar paradigms, i.e. the imperative and the declarative paradigm, appear to define the extreme positions on the paradigm spectrum. While imperative approaches focus on explicitly defining how an organisational goal should be reached, the declarative approaches focus on the directives, policies and regulations restricting the potential ways to achieve the organisational goal. In between, a variety of hybrid paradigms can be distinguished, e.g. advanced and adaptive case management. This article focuses on the less-exposed declarative approach to process modelling. An outline of declarative process modelling and the modelling approaches is presented, followed by an overview of the observed declarative process modelling principles and an evaluation of the declarative process modelling approaches.

  7. Lithography process window analysis with calibrated model

    NASA Astrophysics Data System (ADS)

    Zhou, Wenzhan; Yu, Jin; Lo, James; Liu, Johnson

    2004-05-01

    As critical dimensions shrink below 0.13 μm, statistical process control (SPC) based on critical dimension (CD) control in the lithography process becomes more difficult. The shrinking process window demands a more accurate determination of the process window center. In practical fabrication, however, we found that systematic error introduced by metrology and/or the resist process can significantly impact the process window analysis result. In particular, when simple polynomial functions are used to fit the lithographic data from a focus exposure matrix (FEM), the model will fit these systematic errors rather than filter them out. This will definitely impact the process window analysis and the determination of the best process condition. In this paper, we propose using a calibrated first-principles model for process window analysis. With this method, the systematic metrology error can be filtered out efficiently, giving a more reasonable window analysis result.
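
    For concreteness, the polynomial fitting that the abstract cautions against looks like the hedged sketch below: a toy focus-exposure matrix is generated and CD(focus, dose) is fitted by least squares. All numbers are invented; in the paper's proposal, a calibrated physical resist model would replace the plain polynomial design matrix.

```python
import numpy as np

# Hypothetical focus-exposure matrix (FEM): CD measured on a grid of
# focus (um) and dose (mJ/cm^2) values.
focus = np.linspace(-0.3, 0.3, 7)
dose = np.linspace(20.0, 30.0, 6)
F, D = np.meshgrid(focus, dose)
cd_true = 180.0 - 80.0 * F**2 - 2.0 * D          # toy Bossung-like behavior (nm)
rng = np.random.default_rng(0)
cd_meas = cd_true + rng.normal(0.0, 0.5, cd_true.shape)  # metrology noise

# Plain polynomial fit CD ~ 1 + focus + focus^2 + dose; such a fit
# absorbs systematic metrology error along with the signal.
A = np.column_stack([np.ones(F.size), F.ravel(), F.ravel()**2, D.ravel()])
coef, *_ = np.linalg.lstsq(A, cd_meas.ravel(), rcond=None)
print(coef)   # [intercept, focus, focus^2, dose] coefficients
```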

  8. Applying learning theories and instructional design models for effective instruction.

    PubMed

    Khalil, Mohammed K; Elkhider, Ihsan A

    2016-06-01

    Faculty members in higher education are involved in many instructional design activities without formal training in learning theories and the science of instruction. Learning theories provide the foundation for the selection of instructional strategies and allow for reliable prediction of their effectiveness. To achieve effective learning outcomes, the science of instruction and instructional design models are used to guide the development of instructional design strategies that elicit appropriate cognitive processes. Here, the major learning theories are discussed and selected examples of instructional design models are explained. The main objective of this article is to present the science of learning and instruction as theoretical evidence for the design and delivery of instructional materials. In addition, this article provides a practical framework for implementing those theories in the classroom and laboratory.

  9. Development of a Comprehensive Weld Process Model

    SciTech Connect

    Radhakrishnan, B.; Zacharia, T.

    1997-05-01

    This cooperative research and development agreement (CRADA) between Concurrent Technologies Corporation (CTC) and Lockheed Martin Energy Systems (LMES) combines CTC's expertise in the welding area with that of LMES to develop computer models and simulation software for welding processes. This development is of significant impact to the industry, including materials producers and fabricators. The main thrust of the research effort was to develop a comprehensive welding simulation methodology. A substantial amount of work has been done by several researchers to numerically model several welding processes. The primary drawback of most of the existing models is the lack of sound linkages between the mechanistic aspects (e.g., heat transfer, fluid flow, and residual stress) and the metallurgical aspects (e.g., microstructure development and control). A comprehensive numerical model which can be used to elucidate the effect of welding parameters/conditions on the temperature distribution, weld pool shape and size, solidification behavior, and microstructure development, as well as stresses and distortion, did not exist. It was therefore imperative to develop a comprehensive model which would predict all of the above phenomena during welding. The CRADA built upon an already existing three-dimensional (3-D) welding simulation model developed by LMES, which is capable of predicting weld pool shape and the temperature history in 3-D single-pass welds. However, that model does not account for multipass welds, microstructural evolution, distortion, and residual stresses. Additionally, it requires large amounts of computing time, which limits its use for practical applications. To overcome this, CTC and LMES have developed through this CRADA the comprehensive welding simulation model described above. As part of the CRADA, the LMES welding code has been ported to the Intel Paragon parallel computer at ORNL.

  10. Model development for naphthenic acids ozonation process.

    PubMed

    Al Jibouri, Ali Kamel H; Wu, Jiangning

    2015-02-01

    Naphthenic acids (NAs) are toxic constituents of oil sands process-affected water (OSPW), which is generated during the extraction of bitumen from oil sands. NAs consist mainly of carboxylic acids, which are generally biorefractory. For the treatment of OSPW, ozonation is a very beneficial method: it can significantly reduce the concentration of NAs, and it can also convert NAs from biorefractory to biodegradable. In this study, a 2^4 factorial design was used for the ozonation of OSPW to study the influences of the operating parameters (ozone concentration, oxygen/ozone flow rate, pH, and mixing) on the removal of model NAs in a semi-batch reactor. It was found that ozone concentration had the most significant effect on the NAs concentration compared to the other parameters. An empirical model was developed to correlate the concentration of NAs with ozone concentration, oxygen/ozone flow rate, and pH. In addition, a theoretical analysis was conducted to gain insight into the relationship between the removal of NAs and the operating parameters. PMID:25189805
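
    A 2^4 factorial design simply enumerates every low/high combination of the four operating parameters, giving 16 experimental runs. A minimal sketch (the factor names are taken from the abstract; the coded -1/+1 levels are the standard convention, not values from the paper):

```python
from itertools import product

# Build the 2^4 full-factorial design matrix: each ozonation parameter
# is run at a coded low (-1) and high (+1) level.
factors = ["ozone_conc", "gas_flow_rate", "pH", "mixing"]
runs = [dict(zip(factors, levels)) for levels in product([-1, +1], repeat=4)]
print(len(runs))   # 16 experimental runs
```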

  11. Computer-aided design tools for economical MEMS fabrication processes

    NASA Astrophysics Data System (ADS)

    Schneider, Christian; Priebe, Andreas; Brueck, Rainer; Hahn, Kai

    1999-03-01

    Since the early 1970s, when microsystem technology (MST) was first introduced, an enormous market for MST products has developed: airbag sensors, micropumps, ink-jet nozzles, etc., and the market is just about to take off. Offering these products at a reasonable price requires mass production. Meanwhile, computer-based design tools have also been developed to reduce the expense of MST design. In contrast to other physical design processes, e.g. in microelectronics, MEMS physical design is characterized by the fact that each product requires a tailored sequence of fabrication steps, usually selected from a variety of processing alternatives. The selection from these alternatives is based on economical constraints. Therefore, the design has a strong influence on the money and time spent to take an MST product to market.

  12. The shielding design process--new plants to decommissioning.

    PubMed

    Jeffries, Graham; Cooper, Andrew; Hobson, John

    2005-01-01

    BNFL has over 25 years' experience of designing nuclear plant for the whole fuel cycle. In the UK, a Nuclear Decommissioning Authority (NDA) is to be set up to ensure that Britain's nuclear legacy is cleaned up safely, securely, and cost-effectively. The resulting challenges and opportunities for shielding design will be substantial, as the shielding design process was originally devised for the design of new plants. Although its underlying principles are equally applicable to the decommissioning and remediation of old plants, there are many aspects of detailed application that need to adapt to this radically different operating environment. The paper describes both the common issues and the different challenges of shielding design at different operational phases. Sample applications are presented from both new plant and decommissioning projects that illustrate not only the robust nature of the processes being used, but also how they lead to cost-effective solutions making a substantive and appropriate contribution to radiological protection goals. PMID:16604700

  13. Optimal design of the satellite constellation arrangement reconfiguration process

    NASA Astrophysics Data System (ADS)

    Fakoor, Mahdi; Bakhtiari, Majid; Soleymani, Mahshid

    2016-08-01

    In this article, a novel approach is introduced for satellite constellation reconfiguration based on Lambert's theorem. Several critical problems arise in the reconfiguration phase, such as minimization of the overall fuel cost, collision avoidance between the satellites on the final orbital pattern, and the maneuvers necessary for the satellites to be deployed in the desired positions in the target constellation. To implement the reconfiguration phase of the satellite constellation arrangement at minimal cost, the hybrid Invasive Weed Optimization/Particle Swarm Optimization (IWO/PSO) algorithm is used to design sub-optimal transfer orbits for the satellites in the constellation. The dynamics of the problem are also modeled in such a way that the optimal assignment of satellites to the initial and target orbits and the optimal orbital transfer are combined in one step. Finally, we claim that the presented idea, i.e. coupled non-simultaneous flight of satellites from the initial orbital pattern, leads to minimal cost. The obtained results show that by employing the presented method, the cost of the reconfiguration process is clearly reduced.

  14. Distillation modeling for a uranium refining process

    SciTech Connect

    Westphal, B.R.

    1996-03-01

    As part of the spent fuel treatment program at Argonne National Laboratory, a vacuum distillation process is being employed for the recovery of uranium following an electrorefining process. Distillation of a salt electrolyte, containing a eutectic mixture of lithium and potassium chlorides, from uranium is achieved by a simple batch operation and is termed "cathode processing". The incremental distillation of electrolyte salt will be modeled both by an equilibrium expression and on a molecular basis, since the operation is conducted under moderate vacuum conditions. As processing continues, the two models will be compared and analyzed for correlation with actual operating results. Possible factors that may contribute to deviations from the models include impurities at the vapor-liquid boundary, distillate reflux, anomalous pressure gradients, and mass transport phenomena at the evaporating surface. Ultimately, the purpose of either process model is to enable the parametric optimization of the process.
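
    A "molecular basis" model for evaporation under vacuum is commonly written with the Hertz-Knudsen maximum-flux expression from kinetic theory. The sketch below uses that standard expression with placeholder numbers (the report's actual model and operating values are not given; only the KCl molar mass is a real constant).

```python
import math

def hertz_knudsen_flux(p_sat, T, M, alpha=1.0):
    """Maximum evaporation mass flux (kg m^-2 s^-1) from kinetic theory:
    J = alpha * p_sat * sqrt(M / (2*pi*R*T)),
    with p_sat in Pa, T in K, molar mass M in kg/mol, and evaporation
    coefficient alpha (<= 1)."""
    R = 8.314  # gas constant, J mol^-1 K^-1
    return alpha * p_sat * math.sqrt(M / (2.0 * math.pi * R * T))

# Placeholder saturation pressure and temperature, not Argonne data;
# M = 0.0745 kg/mol is the molar mass of KCl.
J = hertz_knudsen_flux(p_sat=10.0, T=1200.0, M=0.0745)
print(J)
```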

  15. Natural gas operations: considerations on process transients, design, and control.

    PubMed

    Manenti, Flavio

    2012-03-01

    This manuscript highlights tangible benefits deriving from the dynamic simulation and control of operational transients of natural gas processing plants. Relevant improvements in safety, controllability, operability, and flexibility are obtained not only within the traditional applications, i.e. plant start-up and shutdown, but also in certain fields that are apparently time-independent, such as feasibility studies of gas processing plant layout and process design. Specifically, this paper contrasts the myopic steady-state approach and its main shortcomings with more detailed studies that take non-steady-state behaviors into consideration. A portion of a gas processing facility is considered as a case study. Process transient, design, and control solutions that appear more appealing from a steady-state viewpoint are compared to the corresponding dynamic simulation solutions.

  16. VARTM Process Modeling of Aerospace Composite Structures

    NASA Technical Reports Server (NTRS)

    Song, Xiao-Lan; Grimsley, Brian W.; Hubert, Pascal; Cano, Roberto J.; Loos, Alfred C.

    2003-01-01

    A three-dimensional model was developed to simulate the VARTM composite manufacturing process. The model considers the two important mechanisms that occur during the process: resin flow, and compaction and relaxation of the preform. The model was used to simulate infiltration of a carbon preform with an epoxy resin by the VARTM process. The predicted flow patterns and preform thickness changes agreed qualitatively with the measured values. However, the predicted total infiltration times were much longer than measured, most likely due to inaccurate preform permeability values used in the simulation.

  17. Concurrent materials and process selection in conceptual design

    SciTech Connect

    Kleban, Stephen D.; Knorovsky, Gerald A.

    2000-08-16

    A method for the concurrent selection of materials and a joining process, based on product requirements and using a knowledge-based, constraint-satisfaction approach, facilitates the product design and manufacturing process. Using a Windows-based computer video display and a database of materials and their properties, the designer can ascertain the preferred composition of two parts based on various operating/environmental constraints such as load, temperature, lifetime, etc. Optimum joining of the two parts may simultaneously be determined using a joining-process database, based upon the selected composition of the components as well as the operating/environmental constraints.

  18. A new design concept for an automated peanut processing facility

    SciTech Connect

    Ertas, A.; Tanju, B.T.; Fair, W.T.; Butts, C.

    1996-12-31

    Peanut quality is a major concern in all phases of the peanut industry, from production to manufacturing. Postharvest processing of peanuts can have profound effects on the quality and safety of peanut food products. Curing is a key step in postharvest processing: curing peanuts improperly can significantly reduce quality and result in significant losses to both farmers and processors. The conventional drying system designed in the 1960s is still being used in the processing of peanuts today. The objective of this paper is to design and develop a new automated peanut drying system for dry climates, capable of handling approximately 20 million lbm of peanuts per harvest season.

  19. Learning from Experts: Fostering Extended Thinking in the Early Phases of the Design Process

    ERIC Educational Resources Information Center

    Haupt, Grietjie

    2015-01-01

    Empirical evidence on the way in which expert designers from different domains cognitively connect their internal processes with external resources is presented in the context of an extended cognition model. The article focuses briefly on the main trends in the extended design cognition theory and in particular on recent trends in information…

  20. Multidisciplinary Design Optimization (MDO) Methods: Their Synergy with Computer Technology in Design Process

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw

    1998-01-01

    The paper identifies speed, agility, human interface, generation of sensitivity information, task decomposition, and data transmission (including storage) as important attributes for a computer environment to have in order to support engineering design effectively. It is argued that when examined in terms of these attributes the presently available environment can be shown to be inadequate. A radical improvement is needed, and it may be achieved by combining new methods that have recently emerged from multidisciplinary design optimization (MDO) with massively parallel processing computer technology. The caveat is that, for successful use of that technology in engineering computing, new paradigms for computing will have to be developed - specifically, innovative algorithms that are intrinsically parallel so that their performance scales up linearly with the number of processors. It may be speculated that the idea of simulating a complex behavior by interaction of a large number of very simple models may be an inspiration for the above algorithms; the cellular automata are an example. Because of the long lead time needed to develop and mature new paradigms, development should begin now, even though the widespread availability of massively parallel processing is still a few years away.

  1. Multidisciplinary Design Optimisation (MDO) Methods: Their Synergy with Computer Technology in the Design Process

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw

    1999-01-01

    The paper identifies speed, agility, human interface, generation of sensitivity information, task decomposition, and data transmission (including storage) as important attributes for a computer environment to have in order to support engineering design effectively. It is argued that when examined in terms of these attributes the presently available environment can be shown to be inadequate. A radical improvement is needed, and it may be achieved by combining new methods that have recently emerged from multidisciplinary design optimisation (MDO) with massively parallel processing computer technology. The caveat is that, for successful use of that technology in engineering computing, new paradigms for computing will have to be developed - specifically, innovative algorithms that are intrinsically parallel so that their performance scales up linearly with the number of processors. It may be speculated that the idea of simulating a complex behaviour by interaction of a large number of very simple models may be an inspiration for the above algorithms; the cellular automata are an example. Because of the long lead time needed to develop and mature new paradigms, development should begin now, even though the widespread availability of massively parallel processing is still a few years away.

  2. Statistical Design Model (SDM) of satellite thermal control subsystem

    NASA Astrophysics Data System (ADS)

    Mirshams, Mehran; Zabihian, Ehsan; Aarabi Chamalishahi, Mahdi

    2016-07-01

Satellite thermal control is the subsystem whose main task is keeping the satellite's components within their survival and operating temperature ranges. The capability of the thermal control subsystem plays a key role in satisfying a satellite's operational requirements, and designing this subsystem is part of overall satellite design. On the other hand, because of the limited information disclosed by companies and designers, this subsystem still lacks a well-defined design process, even though it is one of the fundamental subsystems. The aim of this paper is to identify and extract statistical design models of the spacecraft thermal control subsystem using the SDM design method, which analyzes statistical data with a particular procedure. Implementing the SDM method requires a complete database; therefore, we first collect spacecraft data and create a database, then extract statistical graphs using Microsoft Excel, from which we derive mathematical models. The input parameters of the method are the mass, mission, and lifetime of the satellite. To this end, the thermal control subsystem is first introduced, and the hardware used in this subsystem and its variants is surveyed. Next, different statistical models are described and briefly compared. Finally, a particular statistical model is extracted from the collected statistical data. The accuracy of the method is tested and verified through a case study: comparisons between the specifications of the thermal control subsystem of a fabricated satellite and the analysis results show the methodology to be effective. Key Words: Thermal control subsystem design, Statistical design model (SDM), Satellite conceptual design, Thermal hardware

  3. Design of a distributed CORBA based image processing server.

    PubMed

    Giess, C; Evers, H; Heid, V; Meinzer, H P

    2000-01-01

This paper presents the design and implementation of a distributed image processing server based on CORBA. Existing image processing tools were encapsulated in a common way within this server. Data exchange and conversion are performed automatically inside the server, hiding these tasks from the user. The different image processing tools appear as one large collection of algorithms and, through the use of CORBA, are accessible via intranet/internet.

  4. Design Models as Emergent Features: An Empirical Study in Communication and Shared Mental Models in Instructional

    ERIC Educational Resources Information Center

    Botturi, Luca

    2006-01-01

This paper reports the results of an empirical study that investigated the instructional design process of three teams involved in the development of an e-learning unit. The teams declared they were using the same fast-prototyping design and development model, and were composed of the same roles (although with a different number of SMEs).…

  5. Critical Zone Experimental Design to Assess Soil Processes and Function

    NASA Astrophysics Data System (ADS)

    Banwart, Steve

    2010-05-01

The experimental design studies soil processes across the temporal evolution of the soil profile, from its formation on bare bedrock, through managed use as productive land, to its degradation under longstanding pressures from intensive land use. To understand this conceptual life cycle of soil, we have selected 4 European field sites as Critical Zone Observatories. These are to provide data sets of soil parameters, processes and functions which will be incorporated into the mathematical models. The field sites are 1) the BigLink field station which is located in the chronosequence of the Damma Glacier forefield in alpine Switzerland and is established to study the initial stages of soil development on bedrock; 2) the Lysina Catchment in the Czech Republic which is representative of productive soils managed for intensive forestry, 3) the Fuchsenbigl Field Station in Austria which is an agricultural research site that is representative of productive soils managed as arable land and 4) the Koiliaris Catchment in Crete, Greece which represents degraded Mediterranean region soils, heavily impacted by centuries of intensive grazing and farming, under severe risk of desertification.

  6. A novel process for recovery of fermentation-derived succinic acid: process design and economic analysis.

    PubMed

    Orjuela, Alvaro; Orjuela, Andrea; Lira, Carl T; Miller, Dennis J

    2013-07-01

Recovery and purification of organic acids produced in fermentation constitutes a significant fraction of total production cost. In this paper, the design and economic analysis of a process to recover succinic acid (SA) via dissolution and acidification of succinate salts in ethanol, followed by reactive distillation to form succinate esters, is presented. Process simulation was performed for a range of plant capacities (13-55 million kg/yr SA) and SA fermentation titers (50-100 kg/m³). Economics were evaluated for a recovery system installed within an existing fermentation facility producing succinate salts at a cost of $0.66/kg SA. For a SA processing capacity of 54.9 million kg/yr and a titer of 100 kg/m³ SA, the model predicts a capital investment of $75 million and a net processing cost of $1.85 per kg SA. Required selling price of diethyl succinate for a 30% annual return on investment is $1.57 per kg.
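    As a rough illustration of how the reported figures fit together, the sketch below recomputes a required selling price from the abstract's numbers under a simple return-on-capital assumption. The two-term cost model and the molecular-weight conversion are assumptions made here for illustration, not taken from the paper.

```python
# Back-of-envelope check of the reported succinate-ester economics.
# Assumptions (not from the paper): selling price = net processing cost per kg SA
# plus a 30% annual return on capital, converted to a per-kg-DES basis by
# molecular weight.

capital = 75e6          # capital investment, USD
capacity_sa = 54.9e6    # plant capacity, kg SA per year
cost_sa = 1.85          # net processing cost, USD per kg SA
annual_return = 0.30    # target annual return on investment

MW_SA, MW_DES = 118.09, 174.19  # molar masses of succinic acid / diethyl succinate, g/mol

# Required revenue per kg of SA processed:
price_per_kg_sa = cost_sa + annual_return * capital / capacity_sa

# Each kg of SA yields roughly MW_DES/MW_SA kg of diethyl succinate,
# so the per-kg-DES price is lower by that mass ratio:
price_per_kg_des = price_per_kg_sa * MW_SA / MW_DES

print(f"required DES price: ${price_per_kg_des:.2f}/kg")  # close to the reported $1.57/kg
```

    The small gap from the paper's $1.57/kg presumably reflects yield and working-capital details not captured by this two-term model.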

  7. Preliminary shuttle structural dynamics modeling design study

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The design and development of a structural dynamics model of the space shuttle are discussed. The model provides for early study of structural dynamics problems, permits evaluation of the accuracy of the structural and hydroelastic analysis methods used on test vehicles, and provides for efficiently evaluating potential cost savings in structural dynamic testing techniques. The discussion is developed around the modes in which major input forces and responses occur and the significant structural details in these modes.

  8. Communicative processes: a model of communication

    SciTech Connect

    Kimura, T.D.; Gillett, W.D.

    1982-01-01

    The authors introduce a conceptual model of communicative organization as a part of the formal semantic study of distributed computation. The model includes, as communication primitives, three independent modes of communication: mailing, posting and broadcasting. Mailing models thin-wire communication, and posting models shared memory communication. While broadcasting is not prominent in today's parallel programming languages, it has an important role to play in distributed computation. Other fundamental notions in the model are process, symbol, site, process class, symbol class and site class. 8 references.
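    The three primitives can be caricatured in a few lines. The class and method names below are an illustrative reading of the abstract (mailing as point-to-point queues, posting as shared memory, broadcasting as one-to-all fan-out), not the authors' formal model.

```python
from collections import defaultdict, deque

class Network:
    """Toy model of the three communication modes: mailing, posting, broadcasting."""
    def __init__(self):
        self.mailboxes = defaultdict(deque)   # mailing: thin-wire, point-to-point
        self.board = {}                       # posting: shared-memory key/value store
        self.subscribers = []                 # broadcasting: one-to-all fan-out

    def mail(self, recipient, msg):
        self.mailboxes[recipient].append(msg)  # only the addressee sees it

    def receive(self, recipient):
        return self.mailboxes[recipient].popleft()

    def post(self, key, value):
        self.board[key] = value                # visible to every reader

    def read(self, key):
        return self.board[key]

    def subscribe(self, process):
        self.subscribers.append(process)

    def broadcast(self, msg):
        for p in self.subscribers:             # every subscriber gets a copy
            self.mailboxes[p].append(msg)

net = Network()
net.mail("p1", "hello")
net.post("status", "ready")
net.subscribe("p1")
net.subscribe("p2")
net.broadcast("shutdown")
print(net.receive("p1"), net.read("status"), net.receive("p2"))  # → hello ready shutdown
```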

  9. Models of Problem Solving Processes and Abilities.

    ERIC Educational Resources Information Center

    Feldhusen, John F.; Guthrie, Virginia A.

    1979-01-01

    This paper reviews current models of problem solving to identify results relevant to teachers or instructional developers. Four areas are covered: information processing models, approaches stressing human abilities and factors, creative problem solving models, and other aspects of problem solving. Part of a theme issue on intelligence. (Author/SJL)

  10. POLLUTION PREVENTION IN THE DESIGN OF CHEMICAL PROCESSES USING HIERARCHICAL DESIGN AND SIMULATION

    EPA Science Inventory

    The design of chemical processes is normally an interactive process of synthesis and analysis. When one also desires or needs to limit the amount of pollution generated by the process the difficulty of the task can increase substantially. In this work, we show how combining hier...

  11. Physics and Process Modeling (PPM) and Other Propulsion R and T. Volume 1; Materials Processing, Characterization, and Modeling; Lifting Models

    NASA Technical Reports Server (NTRS)

    1997-01-01

    This CP contains the extended abstracts and presentation figures of 36 papers presented at the PPM and Other Propulsion R&T Conference. The focus of the research described in these presentations is on materials and structures technologies that are parts of the various projects within the NASA Aeronautics Propulsion Systems Research and Technology Base Program. These projects include Physics and Process Modeling; Smart, Green Engine; Fast, Quiet Engine; High Temperature Engine Materials Program; and Hybrid Hyperspeed Propulsion. Also presented were research results from the Rotorcraft Systems Program and work supported by the NASA Lewis Director's Discretionary Fund. Authors from NASA Lewis Research Center, industry, and universities conducted research in the following areas: material processing, material characterization, modeling, life, applied life models, design techniques, vibration control, mechanical components, and tribology. Key issues, research accomplishments, and future directions are summarized in this publication.

  12. Coupling entropy of co-processing model on social networks

    NASA Astrophysics Data System (ADS)

    Zhang, Zhanli

    2015-08-01

The coupling entropy of a co-processing model on social networks is investigated in this paper. As one crucial factor determining the processing ability of nodes, information flow with potential time lag is modeled by co-processing diffusion, which couples continuous-time processing with discrete diffusion dynamics. Exact results for the master equation and the stationary state are obtained to disclose how the co-processing forms. To understand the evolution of the co-processing and to design the optimal routing strategy according to maximal entropic diffusion on networks, we propose a coupling entropy that comprehends both the structural characteristics and the information propagation on the social network. Based on the analysis of the co-processing model, we analyze the coupled impact of the structural factor and the information-propagating factor on the coupling entropy; the analytical results fit well with the numerical ones on scale-free social networks.
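    A concrete quantity in this spirit is the entropy rate of an unbiased random walk on a network, h = -Σᵢ πᵢ Σⱼ pᵢⱼ log pᵢⱼ, which for an undirected graph reduces to Σᵢ (kᵢ/2m) log kᵢ. The sketch below computes it for a small example graph; this is a generic diffusion-entropy calculation, not the paper's specific coupling entropy.

```python
import math
from collections import defaultdict

# Entropy rate of an unbiased random walk on an undirected graph:
# stationary distribution pi_i = k_i / (2m), transitions p_ij = 1/k_i,
# so h = -sum_i pi_i * sum_j p_ij * log(1/k_i) = sum_i (k_i / 2m) * log k_i.

edges = [(0, 1), (0, 2), (1, 2), (2, 3)]  # a small example network
degree = defaultdict(int)
for u, v in edges:
    degree[u] += 1
    degree[v] += 1

two_m = sum(degree.values())  # = 2 * number of edges
entropy_rate = sum((k / two_m) * math.log(k) for k in degree.values())
print(f"entropy rate: {entropy_rate:.4f} nats/step")
```

    Maximal-entropy routing strategies bias the walk so that this rate is as large as the network structure allows.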

  13. Power of experimental design studies for the validation of pharmaceutical processes: case study of a multilayer tablet manufacturing process.

    PubMed

    Goutte, F; Guemguem, F; Dragan, C; Vergnault, G; Wehrlé, P

    2002-08-01

Experimental design studies (EDS) are already widely used in the pharmaceutical industry for drug formulation and process optimization. Rare are the situations in which this methodology is applied for validation purposes. The power of this statistical tool, a key element of a global validation strategy, is demonstrated for a multilayer tablet manufacturing process. Applied to the Geomatrix system, generally composed of one compression and three granulation processes, the gains in time and rigor are non-negligible. Experimental design studies are not used in this work for modeling. Introduced at each important step of process development, they allow for the evaluation of process ruggedness at pilot scale and of specifications for full production. A demonstration of the complete control of the key process parameters, identified throughout preliminary studies, is given.
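    As an illustration of the kind of EDS used in such work, the sketch below builds a two-level full factorial design and estimates main effects from hypothetical responses. The factor names and the response data are invented for illustration, not taken from the study.

```python
from itertools import product

# Two-level full factorial design for three process factors (coded -1/+1).
# Factor names and responses are hypothetical, for illustration only.
factors = ["compression_force", "granulation_time", "binder_level"]
design = list(product([-1, 1], repeat=len(factors)))  # 2^3 = 8 runs

# Hypothetical measured responses (e.g., tablet hardness) for the 8 runs:
y = [48.0, 52.0, 50.0, 55.0, 49.0, 54.0, 51.0, 57.0]

# Main effect of each factor: mean response at +1 minus mean response at -1.
effects = {}
for j, name in enumerate(factors):
    hi = [yi for run, yi in zip(design, y) if run[j] == +1]
    lo = [yi for run, yi in zip(design, y) if run[j] == -1]
    effects[name] = sum(hi) / len(hi) - sum(lo) / len(lo)
    print(f"{name}: effect = {effects[name]:+.2f}")
```

    In a validation setting, effects that are small relative to noise across the studied ranges are evidence of process ruggedness.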

  14. Preform Characterization in VARTM Process Model Development

    NASA Technical Reports Server (NTRS)

    Grimsley, Brian W.; Cano, Roberto J.; Hubert, Pascal; Loos, Alfred C.; Kellen, Charles B.; Jensen, Brian J.

    2004-01-01

    Vacuum-Assisted Resin Transfer Molding (VARTM) is a Liquid Composite Molding (LCM) process where both resin injection and fiber compaction are achieved under pressures of 101.3 kPa or less. Originally developed over a decade ago for marine composite fabrication, VARTM is now considered a viable process for the fabrication of aerospace composites (1,2). In order to optimize and further improve the process, a finite element analysis (FEA) process model is being developed to include the coupled phenomenon of resin flow, preform compaction and resin cure. The model input parameters are obtained from resin and fiber-preform characterization tests. In this study, the compaction behavior and the Darcy permeability of a commercially available carbon fabric are characterized. The resulting empirical model equations are input to the 3- Dimensional Infiltration, version 5 (3DINFILv.5) process model to simulate infiltration of a composite panel.

  15. Design of a Pu-238 Waste Incineration Process

    SciTech Connect

    Charlesworth, D.L.

    2001-05-29

    Combustible Pu-238 waste is generated as a result of normal operation and decommissioning activity at the Savannah River Plant and is being retrievably stored there. As part of the long-term plan to process the stored waste and current waste in preparation for future disposition, a Pu-238 incineration process is being cold-tested at Savannah River Laboratory (SRL). The incineration process consists of a continuous-feed preparation system, a two-stage, electrically fired incinerator, and a filtration off-gas system. Process equipment has been designed, fabricated, and installed for nonradioactive testing and cold run-in. Design features to maximize the ability to remotely maintain the equipment were incorporated into the process. Interlock, alarm, and control functions are provided by a programmable controller. Cold testing is scheduled to be completed in 1986.

  16. Mould design and casting process improvement on vibrator shell

    NASA Astrophysics Data System (ADS)

    Zhang, Lipan; Fang, Ligao; Chen, Zhong; Song, Kai

    2011-12-01

The vibrator shell is a part with a complex structure. When the vibrator shell is designed and manufactured by the traditional sand casting process, more than 80% of the castings exhibit defects of porosity, shrinkage, and pouring shortage at the top. To address these problems of traditional sand casting, this paper focuses on improving the casting structure and optimizing the casting process. The casting structure is improved by designing a process bar in the gate-channel region connected to the gate, and the casting process is optimized by adopting low-speed filling and solidification under high pressure, carried out on a self-made four-column hydraulic machine. The results show that casting quality can be greatly improved by these process improvements.

  17. Nanowire growth process modeling and reliability models for nanodevices

    NASA Astrophysics Data System (ADS)

    Fathi Aghdam, Faranak

This work is an early attempt that uses a physical-statistical modeling approach to studying selective nanowire growth for the improvement of process yield. In the second research work, the reliability of nano-dielectrics is investigated. As electronic devices get smaller, reliability issues pose new challenges due to unknown underlying physics of failure (i.e., failure mechanisms and modes). This necessitates new reliability analysis approaches related to nano-scale devices. One of the most important nano-devices is the transistor that is subject to various failure mechanisms. Dielectric breakdown is known to be the most critical one and has become a major barrier for reliable circuit design in nano-scale. Due to the need for aggressive downscaling of transistors, dielectric films are being made extremely thin, and this has led to adopting high permittivity (k) dielectrics as an alternative to widely used SiO2 in recent years. Since most time-dependent dielectric breakdown test data on bilayer stacks show significant deviations from a Weibull trend, we have proposed two new approaches to modeling the time to breakdown of bi-layer high-k dielectrics. In the first approach, we have used a marked space-time self-exciting point process to model the defect generation rate. A simulation algorithm is used to generate defects within the dielectric space, and an optimization algorithm is employed to minimize the Kullback-Leibler divergence between the empirical distribution obtained from the real data and the one based on the simulated data to find the best parameter values and to predict the total time to failure. The novelty of the presented approach lies in using a conditional intensity for trap generation in dielectric that is a function of time, space and size of the previous defects. In addition, in the second approach, a k-out-of-n system framework is proposed to estimate the total failure time after the generation of more than one soft breakdown.

  18. Computational Modeling as a Design Tool in Microelectronics Manufacturing

    NASA Technical Reports Server (NTRS)

    Meyyappan, Meyya; Arnold, James O. (Technical Monitor)

    1997-01-01

Plans to introduce pilot lines or fabs for 300 mm processing are in progress. The IC technology is simultaneously moving towards 0.25/0.18 micron. The convergence of these two trends places unprecedented stringent demands on processes and equipment. More than ever, computational modeling is called upon to play a complementary role in equipment and process design. The pace in hardware/process development needs a matching pace in software development: an aggressive move towards developing "virtual reactors" is desirable and essential to reduce design cycles and costs. This goal has three elements: a reactor-scale model, a feature-level model, and a database of physical/chemical properties. With these elements coupled, the complete model should function as a design aid in a CAD environment. This talk aims at describing these various elements. At the reactor level, continuum, DSMC (or particle) and hybrid models will be discussed and compared using examples of plasma and thermal process simulations. In microtopography evolution, approaches such as level set methods compete with conventional geometric models. Regardless of the approach, the reliance on empiricism is to be eliminated through coupling to the reactor model and computational surface science. This coupling poses challenging issues of orders-of-magnitude variation in length and time scales. Finally, database development has fallen behind; the current situation is rapidly aggravated by the ever newer chemistries emerging to meet process metrics. The virtual reactor would be a useless concept without an accompanying reliable database that consists of: thermal reaction pathways and rate constants, electron-molecule cross sections, thermochemical properties, transport properties, and finally, surface data on the interaction of radicals, atoms and ions with various surfaces. Large scale computational chemistry efforts are critical as experiments alone cannot meet database needs due to the difficulties associated with such

  19. Modeling Cellular Processes in 3-D

    PubMed Central

    Mogilner, Alex; Odde, David

    2011-01-01

    Summary Recent advances in photonic imaging and fluorescent protein technology offer unprecedented views of molecular space-time dynamics in living cells. At the same time, advances in computing hardware and software enable modeling of ever more complex systems, from global climate to cell division. As modeling and experiment become more closely integrated, we must address the issue of modeling cellular processes in 3-D. Here, we highlight recent advances related to 3-D modeling in cell biology. While some processes require full 3-D analysis, we suggest that others are more naturally described in 2-D or 1-D. Keeping the dimensionality as low as possible reduces computational time and makes models more intuitively comprehensible; however, the ability to test full 3-D models will build greater confidence in models generally and remains an important emerging area of cell biological modeling. PMID:22036197

  20. Simulation models and designs for advanced Fischer-Tropsch technology

    SciTech Connect

    Choi, G.N.; Kramer, S.J.; Tam, S.S.

    1995-12-31

Process designs and economics were developed for three grass-roots indirect Fischer-Tropsch coal liquefaction facilities. A baseline and an alternate upgrading design were developed for a mine-mouth plant located in southern Illinois using Illinois No. 6 coal, and one for a mine-mouth plant located in Wyoming using Powder River Basin coal. The alternate design used close-coupled ZSM-5 reactors to upgrade the vapor stream leaving the Fischer-Tropsch reactor. ASPEN process simulation models were developed for all three designs. These results have been reported previously. In this study, the ASPEN process simulation model was enhanced to improve the vapor/liquid equilibrium calculations for the products leaving the slurry bed Fischer-Tropsch reactors. This significantly improved the predictions for the alternate ZSM-5 upgrading design. Another model was developed for the Wyoming coal case using ZSM-5 upgrading of the Fischer-Tropsch reactor vapors. To date, this is the best indirect coal liquefaction case. Sensitivity studies showed that additional cost reductions are possible.

  1. Probabilistic models of language processing and acquisition.

    PubMed

    Chater, Nick; Manning, Christopher D

    2006-07-01

    Probabilistic methods are providing new explanatory approaches to fundamental cognitive science questions of how humans structure, process and acquire language. This review examines probabilistic models defined over traditional symbolic structures. Language comprehension and production involve probabilistic inference in such models; and acquisition involves choosing the best model, given innate constraints and linguistic and other input. Probabilistic models can account for the learning and processing of language, while maintaining the sophistication of symbolic models. A recent burgeoning of theoretical developments and online corpus creation has enabled large models to be tested, revealing probabilistic constraints in processing, undermining acquisition arguments based on a perceived poverty of the stimulus, and suggesting fruitful links with probabilistic theories of categorization and ambiguity resolution in perception.
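    A minimal instance of such a probabilistic model is a maximum-likelihood bigram model over symbol sequences, sketched below. The toy corpus is invented for illustration, and no smoothing is applied, so unseen bigrams get probability zero.

```python
from collections import Counter

# Maximum-likelihood bigram model: P(w_i | w_{i-1}) = c(w_{i-1}, w_i) / c(w_{i-1}).
# Toy corpus for illustration; real models add smoothing for unseen bigrams.
corpus = [
    "<s> the cat sat </s>",
    "<s> the dog sat </s>",
    "<s> the cat ran </s>",
]

unigrams, bigrams = Counter(), Counter()
for sent in corpus:
    tokens = sent.split()
    unigrams.update(tokens[:-1])              # count left contexts
    bigrams.update(zip(tokens, tokens[1:]))

def prob(prev, word):
    return bigrams[(prev, word)] / unigrams[prev]

def sentence_prob(sentence):
    tokens = sentence.split()
    p = 1.0
    for prev, word in zip(tokens, tokens[1:]):
        p *= prob(prev, word)
    return p

# P(the|<s>) * P(cat|the) * P(sat|cat) * P(</s>|sat) = 1 * 2/3 * 1/2 * 1 = 1/3
print(sentence_prob("<s> the cat sat </s>"))
```

    Comprehension and production then correspond to inference in such a model; acquisition corresponds to choosing the model (here, the counts) that best fits the input.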

  2. Fuel Conditioning Facility Electrorefiner Process Model

    SciTech Connect

    DeeEarl Vaden

    2005-10-01

The Fuel Conditioning Facility at the Idaho National Laboratory processes spent nuclear fuel from the Experimental Breeder Reactor II using electro-metallurgical treatment. To process fuel without waiting for periodic sample analyses to assess process conditions, an electrorefiner process model predicts the composition of the electrorefiner inventory and effluent streams. For the chemical equilibrium portion of the model, the two common methods for solving chemical equilibrium problems, stoichiometric and non-stoichiometric, were investigated. In conclusion, the stoichiometric method produced equilibrium compositions close to the measured results, whereas the non-stoichiometric method did not.
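    The stoichiometric method parameterizes composition by an extent of reaction and solves K = Q(ξ). The sketch below does this by bisection for a generic A + B ⇌ C exchange on a mole-fraction basis, purely to illustrate the method; the species and the value of K are placeholders, not the electrorefiner's actual chemistry.

```python
# Stoichiometric equilibrium method, illustrated on a generic reaction
# A + B <-> C with equilibrium constant K (mole-fraction basis).
# Species and K are placeholders, not the actual electrorefiner chemistry.

def equilibrium_extent(n_a0, n_b0, n_c0, K, tol=1e-12):
    """Solve K = x_C / (x_A * x_B) for the extent of reaction xi by bisection."""
    def residual(xi):
        n_a, n_b, n_c = n_a0 - xi, n_b0 - xi, n_c0 + xi
        n_tot = n_a + n_b + n_c
        return (n_c / n_tot) / ((n_a / n_tot) * (n_b / n_tot)) - K

    lo, hi = 0.0, min(n_a0, n_b0) - 1e-9  # xi cannot exhaust a reactant
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if residual(mid) < 0:   # Q < K: reaction proceeds further forward
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

xi = equilibrium_extent(n_a0=1.0, n_b0=1.0, n_c0=0.0, K=10.0)
print(f"extent of reaction: {xi:.4f} mol")
```

    The non-stoichiometric alternative instead minimizes total Gibbs energy over all species amounts subject to element balances, without writing explicit reactions.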

  3. OSIRIS Multi-Object Spectroscopy: Mask Design Process

    NASA Astrophysics Data System (ADS)

    Gómez-Velarde, G.; García-Alvarez, D.; Cabrerra-Lavers, A.

    2016-10-01

    The OSIRIS (Optical System for Imaging and Low-Intermediate Resolution Integrated Spectroscopy) instrument at the 10.4 m GTC has offered a multi-object spectroscopic mode since March 2014. In this paper we describe the detailed process of designing a MOS mask for OSIRIS by using the Mask Designer Tool, and give some numbers on the accuracy of the mask manufacture achievable at the telescope for its scientific use.

  4. Aspect-Oriented Design with Reusable Aspect Models

    NASA Astrophysics Data System (ADS)

    Kienzle, Jörg; Al Abed, Wisam; Fleurey, Franck; Jézéquel, Jean-Marc; Klein, Jacques

    The idea behind Aspect-Oriented Modeling (AOM) is to apply aspect-oriented techniques to (software) models with the aim of modularizing crosscutting concerns. This can be done within different modeling notations, at different levels of abstraction, and at different moments during the software development process. This paper demonstrates the applicability of AOM during the software design phase by presenting parts of an aspect-oriented design of a crisis management system. The design solution proposed in this paper is based on the Reusable Aspect Models (RAM) approach, which allows a modeler to express the structure and behavior of a complex system using class, state and sequence diagrams encapsulated in several aspect models. The paper describes how the model of the "create mission" functionality of the server backend can be decomposed into 23 inter-dependent aspect models. The presentation of the design is followed by a discussion on the lessons learned from the case study. Next, RAM is compared to 8 other AOM approaches according to 6 criteria: language, concern composition, asymmetric and symmetric composition, maturity, and tool support. To conclude the paper, a discussion section points out the features of RAM that specifically support reuse.

  5. Robust Design of Sheet Metal Forming Process Based on Kriging Metamodel

    NASA Astrophysics Data System (ADS)

    Xie, Yanmin

    2011-08-01

Nowadays, sheet metal forming process design is not a trivial task due to the complex issues to be taken into account (conflicting design goals, complex shapes to form, and so on). Optimization methods have also been widely applied in sheet metal forming. Therefore, proper design methods to reduce time and costs have to be developed, mostly based on computer-aided procedures. At the same time, variations arising during manufacturing may significantly influence final product quality, rendering nominally optimal solutions non-robust. In this paper, a small design of experiments is conducted to investigate how the stochastic behavior of noise factors affects drawing quality. The finite element software LS-DYNA is used to simulate the complex sheet metal stamping processes. The Kriging metamodel is adopted to map the relation between input process parameters and part quality. The robust design model for the sheet metal forming process integrates adaptive importance sampling with the Kriging model in order to minimize the impact of the variations and achieve reliable process parameters. In the adaptive sampling, an improved criterion is used to indicate directions in which additional training samples can be added to improve the Kriging model. Nonlinear test functions and a square stamping example (NUMISHEET'93) are employed to verify the proposed method. Final results indicate the feasibility of applying the proposed method to multi-response robust design.
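    A minimal form of the Kriging metamodel is Gaussian-process interpolation with a squared-exponential correlation. The numpy sketch below fits one to a toy 1-D response standing in for the finite-element simulations; the kernel hyperparameters are fixed rather than estimated, as an illustration only.

```python
import numpy as np

# Minimal Kriging-style interpolator: zero-mean GP with a squared-exponential
# kernel and a small nugget. In practice the length scale and process variance
# would be estimated (e.g., by maximum likelihood); here they are fixed.

def kernel(X1, X2, length=0.2):
    d = X1[:, None] - X2[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

X_train = np.linspace(0.0, 1.0, 8)       # sampled process parameters
y_train = np.sin(2 * np.pi * X_train)    # toy "simulation" response

K = kernel(X_train, X_train) + 1e-10 * np.eye(len(X_train))  # nugget for stability
alpha = np.linalg.solve(K, y_train)

def predict(X_new):
    """Kriging prediction at new parameter settings."""
    return kernel(X_new, X_train) @ alpha

print(predict(np.array([0.25])))  # should be close to sin(pi/2) = 1
```

    The cheap `predict` surrogate is what makes the adaptive importance sampling affordable: each robustness evaluation queries the metamodel instead of rerunning the stamping simulation.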

  6. A Framework for Preliminary Design of Aircraft Structures Based on Process Information. Part 1

    NASA Technical Reports Server (NTRS)

    Rais-Rohani, Masoud

    1998-01-01

    This report discusses the general framework and development of a computational tool for preliminary design of aircraft structures based on process information. The described methodology is suitable for multidisciplinary design optimization (MDO) activities associated with integrated product and process development (IPPD). The framework consists of three parts: (1) product and process definitions; (2) engineering synthesis, and (3) optimization. The product and process definitions are part of input information provided by the design team. The backbone of the system is its ability to analyze a given structural design for performance as well as manufacturability and cost assessment. The system uses a database on material systems and manufacturing processes. Based on the identified set of design variables and an objective function, the system is capable of performing optimization subject to manufacturability, cost, and performance constraints. The accuracy of the manufacturability measures and cost models discussed here depend largely on the available data on specific methods of manufacture and assembly and associated labor requirements. As such, our focus in this research has been on the methodology itself and not so much on its accurate implementation in an industrial setting. A three-tier approach is presented for an IPPD-MDO based design of aircraft structures. The variable-complexity cost estimation methodology and an approach for integrating manufacturing cost assessment into design process are also discussed. This report is presented in two parts. In the first part, the design methodology is presented, and the computational design tool is described. In the second part, a prototype model of the preliminary design Tool for Aircraft Structures based on Process Information (TASPI) is described. Part two also contains an example problem that applies the methodology described here for evaluation of six different design concepts for a wing spar.

  7. Jovian plasma modeling for mission design

    NASA Technical Reports Server (NTRS)

    Garrett, Henry B.; Kim, Wousik; Belland, Brent; Evans, Robin

    2015-01-01

The purpose of this report is to address uncertainties in the plasma models at Jupiter responsible for surface charging and to update the jovian plasma models using the most recent data available. The updated plasma environment models were then used to evaluate two proposed Europa mission designs for spacecraft charging effects using the Nascap-2k code. The original Divine/Garrett jovian plasma model (or "DG1"; T. N. Divine and H. B. Garrett, "Charged particle distributions in Jupiter's magnetosphere," J. Geophys. Res., vol. 88, pp. 6889-6903, 1983) has not been updated in 30 years, and there are known errors in the model. As an example, the cold ion plasma temperatures between approx. 5 and 10 Jupiter radii (Rj) were found by the experimenters who originally published the data to have been underestimated by a factor of approx. 2 shortly after publication of the original DG1 model. As knowledge of the plasma environment is critical to any evaluation of surface charging at Jupiter, the original DG1 model needed to be updated to correct for this and other changes in our interpretation of the data so that charging levels could be properly estimated using the Nascap-2k charging code. As an additional task, the Nascap-2k spacecraft charging tool has been adapted to incorporate the so-called Kappa plasma distribution function - an important component of the plasma model necessary to compute the particle fluxes between approx. 5 keV and 100 keV (at the outset of this study, Nascap-2k did not directly incorporate this common representation of the plasma, thus limiting the accuracy of our charging estimates). The updating of the DG1 model and its integration into the Nascap-2k design tool means that charging concerns can now be more efficiently evaluated and mitigated. (We note that, given the subsequent decision by the Europa project to utilize solar arrays for its baseline design, surface charging effects have become even more of an issue for its mission design.) The modifications and
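    For reference, the isotropic Kappa distribution used in such plasma models has the standard energy dependence f(E) ∝ (1 + E/(κE₀))^-(κ+1), which reduces to the Maxwellian exp(-E/E₀) as κ → ∞. The sketch below evaluates that form and checks the limit numerically; normalization is omitted, and the parameter values are illustrative, not the report's.

```python
import math

# Unnormalized Kappa distribution in energy: f(E) ~ (1 + E/(kappa*E0))**-(kappa+1).
# As kappa -> infinity this tends to the Maxwellian exp(-E/E0).

def kappa_dist(E, E0, kappa):
    return (1.0 + E / (kappa * E0)) ** (-(kappa + 1.0))

E0 = 1.0  # characteristic energy (illustrative units, e.g. keV)
for E in [0.5, 1.0, 5.0]:
    k5 = kappa_dist(E, E0, kappa=5.0)
    k_large = kappa_dist(E, E0, kappa=1e4)
    maxwell = math.exp(-E / E0)
    print(f"E={E}: kappa=5 -> {k5:.4f}, kappa=1e4 -> {k_large:.4f}, Maxwellian -> {maxwell:.4f}")
```

    Note the enhanced high-energy tail at small κ relative to the Maxwellian; that suprathermal tail is precisely why incorporating the Kappa form matters for fluxes in the approx. 5-100 keV range.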

  8. Jovian Plasma Modeling for Mission Design

    NASA Technical Reports Server (NTRS)

    Garrett, Henry B.; Kim, Wousik; Belland, Brent; Evans, Robin

    2015-01-01

    The purpose of this report is to address uncertainties in the plasma models at Jupiter responsible for surface charging and to update the jovian plasma models using the most recent data available. The updated plasma environment models were then used to evaluate two proposed Europa mission designs for spacecraft charging effects using the Nascap-2k code. The original Divine/Garrett jovian plasma model (or "DG1", T. N. Divine and H. B. Garrett, "Charged particle distributions in Jupiter's magnetosphere," J. Geophys. Res., vol. 88, pp. 6889-6903, 1983) has not been updated in 30 years, and there are known errors in the model. As an example, the cold ion plasma temperatures between approx. 5 and 10 Jupiter radii (Rj) were found by the experimenters who originally published the data to have been underestimated by approx. 2 shortly after publication of the original DG1 model. As knowledge of the plasma environment is critical to any evaluation of the surface charging at Jupiter, the original DG1 model needed to be updated to correct for this and other changes in our interpretation of the data so that charging levels could be properly estimated using the Nascap-2k charging code. As an additional task, the Nascap-2k spacecraft charging tool has been adapted to incorporate the so-called Kappa plasma distribution function--an important component of the plasma model necessary to compute the particle fluxes between approx. 5 keV and 100 keV (at the outset of this study, Nascap-2k did not directly incorporate this common representation of the plasma, thus limiting the accuracy of our charging estimates). The updating of the DG1 model and its integration into the Nascap-2k design tool means that charging concerns can now be more efficiently evaluated and mitigated. (We note that, given the subsequent decision by the Europa project to utilize solar arrays for its baseline design, surface charging effects have become even more of an issue for its mission design.) The modifications and
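
    The Kappa distribution mentioned above has a standard isotropic form that tends to a Maxwellian as kappa grows large. A minimal sketch of evaluating it, assuming the common normalization from the space-plasma literature (the report's exact parameterization is not given here):

```python
import math

def kappa_distribution(v, n, theta, kappa):
    """Isotropic kappa velocity distribution f(v).

    v     : particle speed [m/s]
    n     : number density [m^-3]
    theta : effective thermal speed [m/s]
    kappa : spectral index (> 3/2); tends to a Maxwellian as kappa -> infinity
    """
    # Normalization n * (pi*kappa*theta^2)^(-3/2) * Gamma(kappa+1)/Gamma(kappa-1/2);
    # lgamma is used so the gamma-function ratio does not overflow for large kappa.
    norm = n * math.exp(math.lgamma(kappa + 1.0) - math.lgamma(kappa - 0.5))
    norm /= (math.pi * kappa * theta ** 2) ** 1.5
    return norm * (1.0 + v ** 2 / (kappa * theta ** 2)) ** (-(kappa + 1.0))
```

    The power-law tail of this form (rather than the Maxwellian's exponential cutoff) is what carries the 5-100 keV fluxes the abstract identifies as important for charging.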

  9. Design of launch systems using continuous improvement process

    NASA Technical Reports Server (NTRS)

    Brown, Richard W.

    1995-01-01

    The purpose of this paper is to identify a systematic process for improving ground operations for future launch systems. This approach is based on the Total Quality Management (TQM) continuous improvement process. While the continuous improvement process is normally identified with making incremental changes to an existing system, it can be used on new systems if they use past experience as a knowledge base. In the case of the Reusable Launch Vehicle (RLV), the Space Shuttle operations provide many lessons. The TQM methodology used for this paper will be borrowed from the United States Air Force 'Quality Air Force' Program. There is a general overview of the continuous improvement process, with concentration on the formulation phase. During this phase critical analyses are conducted to determine the strategy and goals for the remaining development process. These analyses include analyzing the mission from the customer's point of view, developing an operations concept for the future, assessing current capabilities, and determining the gap to be closed between current capabilities and future needs and requirements. A brief analysis of the RLV, relative to the Space Shuttle, will be used to illustrate the concept. Using the continuous improvement design concept has many advantages. These include a customer-oriented process, which will develop a more marketable product, and better integration of operations and systems during the design phase. But the use of TQM techniques will require changes, including more discipline in the design process and more emphasis on data gathering for operational systems. The benefits will far outweigh the additional effort.

  10. System design considerations for free-fall materials processing

    NASA Technical Reports Server (NTRS)

    Seidensticker, R. G.

    1974-01-01

    The design constraints for orbiting materials processing systems are dominated by the limitations of the flight vehicle/crew and not by the processes themselves. Although weight, size and power consumption are all factors in the design of normal laboratory equipment, their importance is increased by orders of magnitude when the equipment must be used in an orbital facility. As a result, equipment intended for space flight may have little resemblance to normal laboratory apparatus although the function to be performed may be identical. The same considerations influence the design of the experiment itself. The processing requirements must be carefully understood in terms of basic physical parameters rather than defined in terms of equipment operation. Preliminary experiments and analysis are much more vital to the design of a space experiment than they are on Earth, where iterative development is relatively easy. These considerations are illustrated with examples from the M518 and MA-010 systems. While these are specific systems, the conclusions apply to the design of flight materials processing systems both present and future.

  11. EUV Focus Sensor: Design and Modeling

    SciTech Connect

    Goldberg, Kenneth A.; Teyssier, Maureen E.; Liddle, J. Alexander

    2005-05-01

    We describe performance modeling and design optimization of a prototype EUV focus sensor (FS) designed for use with existing 0.3-NA EUV projection-lithography tools. At 0.3 NA and 13.5-nm wavelength, the depth of focus shrinks to 150 nm, increasing the importance of high-sensitivity focal-plane detection tools. The FS is a free-standing Ni grating structure that works in concert with a simple mask pattern of regular lines and spaces at constant pitch. The FS pitch matches that of the image-plane aerial-image intensity: it transmits the light with high efficiency when the grating is aligned with the aerial image laterally and longitudinally. Using a single-element photodetector to detect the transmitted flux, the FS is scanned laterally and longitudinally so the plane of peak aerial-image contrast can be found. The design under consideration has a fixed image-plane pitch of 80 nm, with aperture widths of 12-40 nm (1-3 wavelengths) and aspect ratios of 2-8. TEMPEST-3D is used to model the light transmission. Careful attention is paid to the annular, partially coherent, unpolarized illumination and to the annular pupil of the Micro-Exposure Tool (MET) optics for which the FS is designed. The system design balances the opposing needs of high sensitivity and high throughput, optimizing the signal-to-noise ratio in the measured intensity contrast.
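
    The 150-nm figure quoted above is consistent with the usual scalar depth-of-focus estimate DOF ~ k2 * lambda / NA^2 with k2 taken as 1 (an assumption; the abstract does not state its k2). A quick check:

```python
def depth_of_focus(wavelength_nm: float, na: float, k2: float = 1.0) -> float:
    """Scalar depth-of-focus estimate: DOF ~ k2 * wavelength / NA^2."""
    return k2 * wavelength_nm / na ** 2

# MET parameters from the abstract: 13.5-nm wavelength, 0.3 NA
dof = depth_of_focus(13.5, 0.3)  # approx. 150 nm
```
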

  12. EUV focus sensor: design and modeling

    NASA Astrophysics Data System (ADS)

    Goldberg, Kenneth A.; Teyssier, Maureen E.; Liddle, J. Alexander

    2005-05-01

    We describe performance modeling and design optimization of a prototype EUV focus sensor (FS) designed for use with existing 0.3-NA EUV projection-lithography tools. At 0.3 NA and 13.5-nm wavelength, the depth of focus shrinks to 150 nm, increasing the importance of high-sensitivity focal-plane detection tools. The FS is a free-standing Ni grating structure that works in concert with a simple mask pattern of regular lines and spaces at constant pitch. The FS pitch matches that of the image-plane aerial-image intensity: it transmits the light with high efficiency when the grating is aligned with the aerial image laterally and longitudinally. Using a single-element photodetector to detect the transmitted flux, the FS is scanned laterally and longitudinally so the plane of peak aerial-image contrast can be found. The design under consideration has a fixed image-plane pitch of 80 nm, with aperture widths of 12-40 nm (1-3 wavelengths) and aspect ratios of 2-8. TEMPEST-3D is used to model the light transmission. Careful attention is paid to the annular, partially coherent, unpolarized illumination and to the annular pupil of the Micro-Exposure Tool (MET) optics for which the FS is designed. The system design balances the opposing needs of high sensitivity and high throughput, optimizing the signal-to-noise ratio in the measured intensity contrast.

  13. Improving Software Development Process through Economic Mechanism Design

    NASA Astrophysics Data System (ADS)

    Yilmaz, Murat; O'Connor, Rory V.; Collins, John

    We introduce the novel concept of applying economic mechanism design to the software development process, and aim to find ways to adjust the incentives and disincentives of the software organization to align them with the motivations of the participants, in order to maximize the delivered value of a software project. We envision a set of principles to design processes that allow people to be self-motivated while constantly working toward project goals. The resulting economic mechanism will rely on game-theoretic principles (i.e. Stackelberg games) for leveraging the incentives, goals and motivation of the participants in the service of project and organizational goals.
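
    The Stackelberg structure invoked above can be illustrated with a toy leader-follower sketch: a manager (leader) commits to an incentive level, and a developer (follower) best-responds with effort. All payoff functions below are illustrative assumptions, not the paper's model:

```python
def follower_best_effort(incentive: float, effort_cost: float = 1.0) -> float:
    # The developer maximizes incentive*e - (effort_cost/2)*e^2,
    # whose closed-form optimum is e* = incentive / effort_cost.
    return incentive / effort_cost

def leader_payoff(incentive: float, value_per_effort: float = 4.0) -> float:
    # The manager anticipates the follower's best response (backward induction)
    # and earns the delivered value minus the incentive paid out.
    effort = follower_best_effort(incentive)
    return value_per_effort * effort - incentive * effort

# The leader moves first, searching its own choice over a grid.
best_incentive = max((i / 100 for i in range(801)), key=leader_payoff)
# Analytically: payoff = 4i - i^2, maximized at incentive = 2.
```

    The point of the construction is that the leader's optimum is computed *through* the follower's rational response, which is exactly the alignment-of-incentives question the abstract raises.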

  14. Distributed processing techniques: interface design for interactive information sharing.

    PubMed

    Wagner, J R; Krumbholz, S D; Silber, L K; Aniello, A J

    1978-01-01

    The Information Systems Division of the University of Iowa Hospitals and Clinics has successfully designed and implemented a set of generalized interface data-handling routines that control message traffic between a satellite minicomputer in a clinical laboratory and a large mainframe computer. A special queue status inquiry transaction has also been developed that displays the current message-processing backlog and other system performance information. The design and operation of these programs are discussed in detail, with special emphasis on the message-queuing and verification techniques required in a distributed processing environment.
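
    The message-queuing, verification, and queue-status-inquiry ideas described above can be sketched as follows; the class and method names are hypothetical, not the Iowa system's actual interface:

```python
import queue
import zlib

class MessageLink:
    """Toy satellite-to-mainframe link: queued messages with verification."""

    def __init__(self):
        self._q = queue.Queue()
        self.sent = 0
        self.acked = 0

    def enqueue(self, payload: bytes) -> None:
        # Store a checksum with each message so the receiver can verify it.
        self._q.put((payload, zlib.crc32(payload)))
        self.sent += 1

    def deliver_one(self) -> bool:
        # Verification step: recompute the checksum on receipt.
        payload, crc = self._q.get()
        ok = zlib.crc32(payload) == crc
        if ok:
            self.acked += 1
        return ok

    def status(self) -> dict:
        # The "queue status inquiry": current backlog plus throughput counters.
        return {"backlog": self._q.qsize(), "sent": self.sent, "acked": self.acked}
```
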

  15. Waste receiving and processing facility module 1, detailed design report

    SciTech Connect

    Not Available

    1993-10-01

    WRAP 1 baseline documents which guided the technical development of the Title design included: (a) A/E Statement of Work (SOW) Revision 4C: This DOE-RL contractual document specified the workscope, deliverables, schedule, method of performance and reference criteria for the Title design preparation. (b) Functional Design Criteria (FDC) Revision 1: This DOE-RL technical criteria document specified the overall operational criteria for the facility. The document was at Revision 0 at the beginning of the design and advanced to Revision 1 during the tenure of the Title design. (c) Supplemental Design Requirements Document (SDRD) Revision 3: This baseline criteria document prepared by WHC for DOE-RL augments the FDC by providing further definition of the process, operational safety, and facility requirements to the A/E for guidance in preparing the design. The document was at a very preliminary stage at the onset of Title design and was revised in concert with the results of the engineering studies that were performed to resolve the numerous technical issues that the project faced when Title I was initiated, as well as by requirements established during the course of the Title II design.

  16. Biostereometric Data Processing In ERGODATA: Choice Of Human Body Models

    NASA Astrophysics Data System (ADS)

    Pineau, J. C.; Mollard, R.; Sauvignon, M.; Amphoux, M.

    1983-07-01

    The definition of human body models was elaborated with anthropometric data from ERGODATA. The first model reduces the human body to a series of points and lines. The second model is well adapted to representing the volumes of each segmentary element. The third is an original model built from the conventional anatomical points. Each segment is defined in space by a triangular plane located with its 3-D coordinates. This new model supports all the processing possibilities in the field of computer-aided design (C.A.D.) not only in ergonomics but also in biomechanics and orthopaedics.
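
    The third model's triangular segment planes can be illustrated with a short sketch: given three landmark coordinates, the segment's plane is characterized by its unit normal. The landmark choice here is purely illustrative; the model's actual anatomical point set is not given in the abstract:

```python
def plane_normal(p1, p2, p3):
    """Unit normal of the triangular plane through three 3-D points."""
    # Two edge vectors of the triangle...
    u = [b - a for a, b in zip(p1, p2)]
    v = [b - a for a, b in zip(p1, p3)]
    # ...and their cross product, normalized to unit length.
    n = [u[1] * v[2] - u[2] * v[1],
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0]]
    length = sum(c * c for c in n) ** 0.5
    return [c / length for c in n]
```
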

  17. Integrating O/S models during conceptual design, part 1

    NASA Technical Reports Server (NTRS)

    Ebeling, Charles E.

    1994-01-01

    The University of Dayton is pleased to submit this report to the National Aeronautics and Space Administration (NASA), Langley Research Center, which integrates a set of models for determining operational capabilities and support requirements during the conceptual design of proposed space systems. This research provides for the integration of the reliability and maintainability (R&M) model, both new and existing simulation models, and existing operations and support (O&S) costing equations in arriving at a complete analysis methodology. Details concerning the R&M model and the O&S costing model may be found in previous reports accomplished under this grant (NASA Research Grant NAG1-1327). In the process of developing this comprehensive analysis approach, significant enhancements were made to the R&M model, updates to the O&S costing model were accomplished, and a new simulation model was developed. This is the first part of a three-part technical report.

  18. Development of a reburning boiler process model

    SciTech Connect

    Wu, K.T.

    1992-01-30

    The overall objective of this program is to integrate EER's expertise in boiler reburning performance evaluation into a package of analytical computer tools. Specific objectives of the program are to develop a computational capability with the following features: (1) can be used to predict the impact of gas reburning application on thermal conditions in the boiler radiant furnace, and on overall boiler performance; (2) can estimate gas reburning NO{sub x} reduction effectiveness based on specific reburning configurations and furnace/boiler configurations; (3) can be used as an analytical tool to evaluate the impact of boiler process parameters (e.g., fuel switching and changes in boiler operating conditions) on boiler thermal performance; (4) is adaptable to most boiler designs (tangential and wall-fired boilers) and a variety of fuels (solid, liquid, gaseous and slurried fuels); (5) is sufficiently user friendly to be exercisable by engineers with a reasonable knowledge of boilers, and with reasonable computer skills. Here, "user friendly" means that the user will be guided by computer codes during the course of setting up individual input files for the boiler performance model.

  19. Using CASE to Exploit Process Modeling in Technology Transfer

    NASA Technical Reports Server (NTRS)

    Renz-Olar, Cheryl

    2003-01-01

    A successful business will be one that has processes in place to run that business. Creating processes, reengineering processes, and continually improving processes can be accomplished through extensive modeling. Casewise(R) Corporate Modeler(TM) is a computer-aided software engineering (CASE) tool that will enable the Technology Transfer Department (TT) at NASA Marshall Space Flight Center (MSFC) to capture these abilities. After successful implementation of CASE, it could then go on to be applied in other departments at MSFC and other centers at NASA. The success of a business process is dependent upon the players working as a team and continuously improving the process. A good process fosters customer satisfaction as well as internal satisfaction in the organizational infrastructure. CASE provides a method for business process success through functions consisting of system and process business models; specialized diagrams; matrix management; simulation; report generation and publishing; and linking, importing, and exporting documents and files. The software has an underlying repository or database to support these functions. The Casewise manual informs us that dynamics modeling is a technique used in business design and analysis. Feedback is used as a tool for the end users and generates different ways of dealing with the process. Feedback on this project resulted from collection of issues through a systems analyst interface approach of interviews with process coordinators and Technical Points of Contact (TPOCs).

  20. Model for Simulating a Spiral Software-Development Process

    NASA Technical Reports Server (NTRS)

    Mizell, Carolyn; Curley, Charles; Nayak, Umanath

    2010-01-01

    A discrete-event simulation model, and a computer program that implements the model, have been developed as means of analyzing a spiral software-development process. This model can be tailored to specific development environments for use by software project managers in making quantitative cases for deciding among different software-development processes, courses of action, and cost estimates. A spiral process can be contrasted with a waterfall process, which is a traditional process that consists of a sequence of activities that include analysis of requirements, design, coding, testing, and support. A spiral process is an iterative process that can be regarded as a repeating modified waterfall process. Each iteration includes assessment of risk, analysis of requirements, design, coding, testing, delivery, and evaluation. A key difference between a spiral and a waterfall process is that a spiral process can accommodate changes in requirements at each iteration, whereas in a waterfall process, requirements are considered to be fixed from the beginning and, therefore, a waterfall process is not flexible enough for some projects, especially those in which requirements are not known at the beginning or may change during development. For a given project, a spiral process may cost more and take more time than does a waterfall process, but may better satisfy a customer's expectations and needs. Models for simulating various waterfall processes have been developed previously, but until now, there have been no models for simulating spiral processes. The present spiral-process-simulating model and the software that implements it were developed by extending a discrete-event simulation process model of the IEEE 12207 Software Development Process, which was built using commercially available software known as the Process Analysis Tradeoff Tool (PATT). Typical inputs to PATT models include industry-average values of product size (expressed as number of lines of code