Science.gov

Sample records for process modeling design

  1. Managing Analysis Models in the Design Process

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2006-01-01

    Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.

  2. Total Ship Design Process Modeling

    DTIC Science & Technology

    2012-04-30

    …Microsoft Project® or Primavera®, and perform process simulations that can investigate risk, cost, and schedule trade-offs. Prior efforts to capture…planning in the face of disruption, delay, and late-changing requirements. ADePT is interfaced with Primavera, the AEC industry's favorite program…

  3. Perceptions of Instructional Design Process Models.

    ERIC Educational Resources Information Center

    Branch, Robert Maribe

    Instructional design is a process that is creative, active, iterative and complex; however, many diagrams of instructional design are interpreted as stifling, passive, lock-step and simple because of the visual elements used to model the process. The purpose of this study was to determine the expressed perceptions of the types of flow diagrams…

  4. Reinventing The Design Process: Teams and Models

    NASA Technical Reports Server (NTRS)

    Wall, Stephen D.

    1999-01-01

    The future of space mission designing will be dramatically different from the past. Formerly, performance-driven paradigms emphasized data return with cost and schedule being secondary issues. Now and in the future, costs are capped and schedules fixed; these two variables must be treated as independent in the design process. Accordingly, JPL has redesigned its design process. At the conceptual level, design times have been reduced by properly defining the required design depth, improving the linkages between tools, and managing team dynamics. In implementation-phase design, system requirements will be held in crosscutting models, linked to subsystem design tools through a central database that captures the design and supplies needed configuration management and control. Mission goals will then be captured in timelining software that drives the models, testing their capability to execute the goals. Metrics are used to measure and control both processes and to ensure that design parameters converge through the design process within schedule constraints. This methodology manages margins controlled by acceptable risk levels. Thus, teams can evolve risk tolerance (and cost) as they would any engineering parameter. This new approach allows more design freedom for a longer time, which tends to encourage revolutionary and unexpected improvements in design.

  5. Application of Process Modeling Tools to Ship Design

    DTIC Science & Technology

    2011-05-01

    Frank Waldman (Lattix), NAVSEA, May 2011. …design teams, long design schedules, and complicated acquisition procedures. We are applying commercial process modeling techniques for: better…

  6. Process Model Construction and Optimization Using Statistical Experimental Design

    DTIC Science & Technology

    1988-04-01

    Memo No. 88-442, March 1988. Emanuel Sachs and George Prueger. Abstract: A methodology is presented for the construction of process models by the combination of physically based mechanistic…

  7. Modeling Web-Based Educational Systems: Process Design Teaching Model

    ERIC Educational Resources Information Center

    Rokou, Franca Pantano; Rokou, Elena; Rokos, Yannis

    2004-01-01

    Using modeling languages is essential to the construction of educational systems based on software engineering principles and methods. Furthermore, the instructional design is undoubtedly the cornerstone of the design and development of educational systems. Although several methodologies and languages have been proposed for the specification of…

  8. Mechanistic Fermentation Models for Process Design, Monitoring, and Control.

    PubMed

    Mears, Lisa; Stocks, Stuart M; Albaek, Mads O; Sin, Gürkan; Gernaey, Krist V

    2017-08-21

    Mechanistic models require a significant investment of time and resources, but their application to multiple stages of fermentation process development and operation can make this investment highly valuable. This Opinion article discusses how an established fermentation model may be adapted for application to different stages of fermentation process development: planning, process design, monitoring, and control. Although a longer development time is required for such modeling methods in comparison to purely data-based model techniques, the wide range of applications makes them a highly valuable tool for fermentation research and development. In addition, in a research environment, where collaboration is important, developing mechanistic models provides a platform for knowledge sharing and consolidation of existing process understanding. Copyright © 2017 Elsevier Ltd. All rights reserved.
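
    To make the idea of a mechanistic fermentation model concrete, the sketch below implements a minimal unstructured batch model with Monod growth kinetics. This is only a sketch of the class of model the abstract describes: the kinetic parameters, initial conditions, and yield value are illustrative assumptions, not taken from the article.

    ```python
    # Minimal unstructured batch fermentation model (Monod kinetics).
    # All parameter values are illustrative placeholders.
    import numpy as np
    from scipy.integrate import solve_ivp

    MU_MAX = 0.4   # 1/h, maximum specific growth rate (assumed)
    KS     = 0.5   # g/L, Monod half-saturation constant (assumed)
    YXS    = 0.5   # gX/gS, biomass yield on substrate (assumed)

    def batch_rhs(t, y):
        X, S = y                      # biomass and substrate concentrations, g/L
        mu = MU_MAX * S / (KS + S)    # Monod specific growth rate
        dX = mu * X                   # biomass growth
        dS = -mu * X / YXS            # substrate consumption
        return [dX, dS]

    sol = solve_ivp(batch_rhs, (0.0, 24.0), [0.1, 20.0], t_eval=np.linspace(0, 24, 49))
    print(f"final biomass: {sol.y[0, -1]:.2f} g/L, final substrate: {sol.y[1, -1]:.2f} g/L")
    ```

    The same state equations could then be reused for design (sizing, feed strategy), monitoring (state estimation), and control, which is the multi-use argument the abstract makes.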

  9. Process Cost Modeling for Multi-Disciplinary Design Optimization

    NASA Technical Reports Server (NTRS)

    Bao, Han P.; Freeman, William (Technical Monitor)

    2002-01-01

    For early design concepts, the conventional approach to cost is normally some kind of parametric weight-based cost model. There is now ample evidence that this approach can be misleading and inaccurate. By the nature of its development, a parametric cost model requires historical data and is valid only if the new design is analogous to those for which the model was derived. Advanced aerospace vehicles have no historical production data and are nowhere near the vehicles of the past. Using an existing weight-based cost model would only lead to errors and distortions of the true production cost. This report outlines the development of a process-based cost model in which the physical elements of the vehicle are costed according to a first-order dynamics model. This theoretical cost model, first advocated by early work at MIT, has been expanded to cover the basic structures of an advanced aerospace vehicle. Elemental costs based on the geometry of the design can be summed up to provide an overall estimation of the total production cost for a design configuration. This capability to directly link any design configuration to realistic cost estimation is a key requirement for high payoff MDO problems. Another important consideration in this report is the handling of part or product complexity. Here the concept of cost modulus is introduced to take into account variability due to different materials, sizes, shapes, precision of fabrication, and equipment requirements. The most important implication of the development of the proposed process-based cost model is that different design configurations can now be quickly related to their cost estimates in a seamless calculation process easily implemented on any spreadsheet tool. In successive sections, the report addresses the issues of cost modeling as follows. First, an introduction is presented to provide the background for the research work. Next, a quick review of cost estimation techniques is made with the intention to…
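
    The accounting pattern the abstract describes, per-element costs derived from geometry and scaled by a complexity "cost modulus", can be sketched in a few lines. The element list, labor rate, and modulus values below are hypothetical placeholders, not values from the report.

    ```python
    # Process-based cost roll-up: per-element fabrication cost from geometry,
    # scaled by a complexity "cost modulus", then summed. Numbers are hypothetical.
    LABOR_RATE = 120.0  # $/h, assumed shop rate

    elements = [
        # name,            area_m2, base_h_per_m2, modulus (material/shape/precision)
        ("skin panel",     12.0,    1.5,           1.0),
        ("spar",            3.0,    4.0,           1.4),   # tighter tolerances
        ("cryotank dome",   6.0,    5.0,           2.2),   # exotic material, curvature
    ]

    total = 0.0
    for name, area, base_rate, modulus in elements:
        cost = area * base_rate * modulus * LABOR_RATE
        total += cost
        print(f"{name:14s} ${cost:10,.0f}")
    print(f"{'total':14s} ${total:10,.0f}")
    ```

    Because each row depends only on geometry and a modulus, the whole roll-up maps directly onto a spreadsheet, which is the implementation the report emphasizes.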

  10. Designing Multi-target Compound Libraries with Gaussian Process Models.

    PubMed

    Bieler, Michael; Reutlinger, Michael; Rodrigues, Tiago; Schneider, Petra; Kriegl, Jan M; Schneider, Gisbert

    2016-05-01

    We present the application of machine learning models to selecting G protein-coupled receptor (GPCR)-focused compound libraries. The library design process was realized by ant colony optimization. A proprietary Boehringer-Ingelheim reference set consisting of 3519 compounds tested in dose-response assays at 11 GPCR targets served as training data for machine learning and activity prediction. We compared the usability of the proprietary data with a public data set from ChEMBL. Gaussian process models were trained to prioritize compounds from a virtual combinatorial library. We obtained meaningful models for three of the targets (5-HT2c, MCH, A1), which were experimentally confirmed for 12 of 15 selected and synthesized or purchased compounds. Overall, the models trained on the public data predicted the observed assay results more accurately. The results of this study motivate the use of Gaussian process regression on public data for virtual screening and target-focused compound library design. © 2016 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA. This is an open access article under the terms of the Creative Commons Attribution Non-Commercial NoDerivs License, which permits use and distribution in any medium, provided the original work is properly cited, the use is non-commercial and no modifications or adaptations are made.
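
    A minimal sketch of Gaussian process regression used to prioritize a virtual library, in the spirit of the study but not reproducing it: the descriptors and activities below are random stand-ins, scikit-learn is assumed as the GP implementation, and the paper's actual kernel choice and ant colony optimization step are omitted.

    ```python
    # Rank a virtual compound library by GP-predicted activity.
    # Descriptors and activities are random stand-ins, not real assay data.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, ConstantKernel

    rng = np.random.default_rng(0)
    X_train = rng.normal(size=(300, 16))     # 300 "tested" compounds, 16 descriptors
    y_train = X_train[:, 0] - 0.5 * X_train[:, 1] + 0.2 * rng.normal(size=300)

    gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), normalize_y=True)
    gp.fit(X_train, y_train)

    X_lib = rng.normal(size=(5000, 16))      # virtual combinatorial library
    mean, std = gp.predict(X_lib, return_std=True)
    score = mean + 0.5 * std                 # optimistic score: favor predicted actives
    top = np.argsort(score)[::-1][:15]       # shortlist for synthesis/purchase
    print(top)
    ```

    The predictive uncertainty that GPs provide (the `std` term) is one practical reason to prefer them for library selection over plain regressors.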

  11. Type-2 fuzzy model based controller design for neutralization processes.

    PubMed

    Kumbasar, Tufan; Eksin, Ibrahim; Guzelkaya, Mujde; Yesil, Engin

    2012-03-01

    In this study, an inverse controller based on a type-2 fuzzy model control design strategy is introduced, and this main controller is embedded within an internal model control structure. The overall proposed control structure is then implemented on a pH neutralization experimental setup. The inverse fuzzy control signal generation is handled as an optimization problem and solved online at each sampling time. Although inverse fuzzy model controllers may produce perfect control under a perfect model match and in the absence of disturbances, such open-loop control is not sufficient when modeling mismatches or disturbances are present. Therefore, an internal model control structure, in which the basic controller is an inverse type-2 fuzzy model, is proposed to compensate for these errors. This feature improves closed-loop disturbance rejection, as shown through real-time control of the pH neutralization process. Experimental results demonstrate the superiority of the inverse type-2 fuzzy model controller structure over the inverse type-1 fuzzy model controller and conventional control structures. Copyright © 2011 ISA. Published by Elsevier Ltd. All rights reserved.
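
    The control structure described (an inverse-model controller wrapped in an internal model control loop) can be illustrated with a deliberately simplified sketch. Here a linear first-order model stands in for the paper's type-2 fuzzy model so the inverse has a closed form instead of requiring per-step optimization, and all process numbers are invented.

    ```python
    # Internal model control with an inverse-model controller (simplified sketch).
    # A linear model replaces the paper's type-2 fuzzy model; its inverse is closed-form.
    a_m, b_m = 0.90, 0.10          # internal model: y[k+1] = a_m*y + b_m*u (assumed)
    a_p, b_p = 0.85, 0.12          # "true" plant, deliberately mismatched

    r = 1.0                        # set point
    y_p = y_m = 0.0
    for k in range(60):
        d_hat = y_p - y_m          # IMC feedback: estimated mismatch/disturbance
        r_mod = r - d_hat          # modified reference seen by the inverse controller
        u = (r_mod - a_m * y_m) / b_m   # exact inverse of the internal model
        y_p = a_p * y_p + b_p * u + (0.1 if k >= 30 else 0.0)  # step disturbance at k=30
        y_m = a_m * y_m + b_m * u
    print(f"plant output after disturbance rejection: {y_p:.3f} (set point {r})")
    ```

    At steady state the IMC feedback forces the plant output to the set point despite the model mismatch and the injected disturbance, which is the deficiency of pure open-loop inverse control that the abstract points out.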

  12. A Concept Transformation Learning Model for Architectural Design Learning Process

    ERIC Educational Resources Information Center

    Wu, Yun-Wu; Weng, Kuo-Hua; Young, Li-Ming

    2016-01-01

    Generally, in the foundation course of architectural design, much emphasis is placed on teaching of the basic design skills without focusing on teaching students to apply the basic design concepts in their architectural designs or promoting students' own creativity. Therefore, this study aims to propose a concept transformation learning model to…

  13. Coal conversion systems design and process modeling. Volume 1: Application of MPPR and Aspen computer models

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The development of a coal gasification system design and mass and energy balance simulation program for the TVA and other similar facilities is described. The materials-process-product model (MPPM) and the advanced system for process engineering (ASPEN) computer program were selected from available steady state and dynamic models. The MPPM was selected to serve as the basis for development of the system-level design model structure because it provided the capability for process block material and energy balance and high-level systems sizing and costing. The ASPEN simulation serves as the basis for assessing detailed component models for the system design modeling program. The ASPEN components were analyzed to identify particular process blocks and data packages (physical properties) which could be extracted and used in the system design modeling program. While ASPEN physical property calculation routines are capable of generating the physical properties required for process simulation, not all required physical property data are available; missing data must be user-entered.

  14. The engineering design process as a model for STEM curriculum design

    NASA Astrophysics Data System (ADS)

    Corbett, Krystal Sno

    Engaging pedagogics have been proven to be effective in the promotion of deep learning for science, technology, engineering, and mathematics (STEM) students. In many cases, academic institutions have shown a desire to improve education by implementing more engaging techniques in the classroom. The research framework established in this dissertation has been governed by the axiom that students should obtain a deep understanding of fundamental topics while being motivated to learn through engaging techniques. This research lays a foundation for future analysis and modeling of the curriculum design process where specific educational research questions can be considered using standard techniques. Further, a clear curriculum design process is a key step towards establishing an axiomatic approach for engineering education. A danger is that poor implementation of engaging techniques will counteract the intended effects. Poor implementation might provide students with a "fun" project, but not the desired deep understanding of the fundamental STEM content. Knowing that proper implementation is essential, this dissertation establishes a model for STEM curriculum design, based on the well-established engineering design process. Using this process as a perspective to model curriculum design allows for a structured approach. Thus, the framework for STEM curriculum design, established here, provides a guided approach for seamless integration of fundamental topics and engaging pedagogics. The main steps, or phases, in engineering design are: Problem Formulation, Solution Generation, Solution Analysis, and Solution Implementation. Layering engineering design with education curriculum theory, this dissertation establishes a clear framework for curriculum design. Through ethnographic engagement by this researcher, several overarching themes are revealed through the creation of curricula using the design process. The application of the framework to specific curricula was part of this…

  15. Predictive Modeling in Plasma Reactor and Process Design

    NASA Technical Reports Server (NTRS)

    Hash, D. B.; Bose, D.; Govindan, T. R.; Meyyappan, M.; Arnold, James O. (Technical Monitor)

    1997-01-01

    Research continues toward the improvement and increased understanding of high-density plasma tools. Such reactor systems are lauded for their independent control of ion flux and energy, enabling high etch rates with low ion damage, and for their improved ion velocity anisotropy resulting from thin collisionless sheaths and low neutral pressures. Still, with the transition to 300 mm processing, achieving etch uniformity and high etch rates concurrently may be a formidable task for such large diameter wafers, for which computational modeling can play an important role in successful reactor and process design. The inductively coupled plasma (ICP) reactor is the focus of the present investigation. The present work attempts to understand the fundamental physical phenomena of such systems through computational modeling. Simulations will be presented using both computational fluid dynamics (CFD) techniques and the direct simulation Monte Carlo (DSMC) method for argon and chlorine discharges. ICP reactors generally operate at pressures on the order of 1 to 10 mTorr. At such low pressures, rarefaction can be significant to the degree that the constitutive relations used in typical CFD techniques become invalid and a particle simulation must be employed. This work will assess the extent to which CFD can be applied and evaluate the degree to which accuracy is lost in predicting the phenomenon of interest, i.e., etch rate. If the CFD approach is found reasonably accurate and benchmarked against DSMC and experimental results, it has the potential to serve as a design tool due to its rapid turnaround relative to DSMC. The continuum CFD simulation solves the governing equations for plasma flow using a finite difference technique with an implicit Gauss-Seidel Line Relaxation method for time marching toward a converged solution. The equation set consists of mass conservation for each species, separate energy equations for the electrons and heavy species, and momentum equations for the gas…
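
    As a back-of-the-envelope companion to the CFD-versus-DSMC question, the sketch below estimates the Knudsen number for argon over the stated 1-10 mTorr operating range using the standard hard-sphere mean-free-path formula; the gas temperature and chamber length scale are assumed values, not from the abstract.

    ```python
    # Continuum-validity check for a low-pressure plasma reactor:
    # Kn = mean free path / chamber scale. Kn >~ 0.1 suggests continuum (CFD)
    # assumptions break down and a particle method (DSMC) is needed.
    import math

    k_B = 1.380649e-23        # J/K, Boltzmann constant
    T   = 400.0               # gas temperature, K (assumed)
    d   = 3.6e-10             # argon hard-sphere diameter, m (approximate)
    L   = 0.30                # chamber/wafer length scale, m (300 mm class)

    for p_mtorr in (1.0, 10.0):
        p = p_mtorr * 0.13332                                  # mTorr -> Pa
        mfp = k_B * T / (math.sqrt(2) * math.pi * d**2 * p)    # hard-sphere mean free path
        print(f"{p_mtorr:4.0f} mTorr: mean free path = {mfp*100:.1f} cm, Kn = {mfp/L:.2f}")
    ```

    With these assumptions Kn is of order 0.2 at 1 mTorr but only ~0.02 at 10 mTorr, which is consistent with the abstract's point that rarefaction matters at the low end of the operating range.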

  16. Design, experimentation, and modeling of a novel continuous biodrying process

    NASA Astrophysics Data System (ADS)

    Navaee-Ardeh, Shahram

    Massive production of sludge in the pulp and paper industry has made effective sludge management an increasingly critical issue for the industry, due to high landfill and transportation costs and complex regulatory frameworks for options such as sludge landspreading and composting. Sludge dewatering challenges are exacerbated at many mills due to improved in-plant fiber recovery coupled with increased production of secondary sludge, leading to a mixed sludge with a high proportion of biological matter which is difficult to dewater. In this thesis, a novel continuous biodrying reactor was designed and developed for drying pulp and paper mixed sludge to an economic dry solids level, so that the dried sludge can be economically and safely combusted in a biomass boiler for energy recovery. In all experimental runs the economic dry solids level was achieved, proving the process successful. In the biodrying process, in addition to the forced aeration, the drying rates are enhanced by biological heat generated through the microbial activity of mesophilic and thermophilic microorganisms naturally present in the porous matrix of mixed sludge. This makes the biodrying process more attractive compared to conventional drying techniques because the reactor is self-heating. The reactor is divided into four nominal compartments and the mixed sludge dries as it moves downward in the reactor. The residence times were 4-8 days, which is 2-3 times shorter than the residence times achieved in a batch biodrying reactor previously studied by our research group for mixed sludge drying. A process variable analysis was performed to determine the key variable(s) in the continuous biodrying reactor. Several variables were investigated, namely: type of biomass feed, pH of biomass, nutrition level (C/N ratio), residence times, recycle ratio of biodried sludge, and outlet relative humidity profile along the reactor height. The key variables that were identified in the continuous…

  17. Aerospace structural design process improvement using systematic evolutionary structural modeling

    NASA Astrophysics Data System (ADS)

    Taylor, Robert Michael

    2000-10-01

    A multidisciplinary team tasked with an aircraft design problem must understand the problem requirements and metrics to produce a successful design. This understanding entails not only knowledge of what these requirements and metrics are, but also how they interact, which are most important (to the customer as well as to aircraft performance), and who in the organization can provide pertinent knowledge for each. In recent years, product development researchers and organizations have developed and successfully applied a variety of tools such as Quality Function Deployment (QFD) to coordinate multidisciplinary team members. The effectiveness of these methods, however, depends on the quality and fidelity of the information that team members can input. In conceptual aircraft design, structural information is of lower quality compared to aerodynamics or performance because it is based on experience rather than theory. This dissertation shows how advanced structural design tools can be used in a multidisciplinary team setting to improve structural information generation and communication through a systematic evolution of structural detail. When applied to conceptual design, finite element-based structural design tools elevate structural information to the same level as other computationally supported disciplines. This improved ability to generate and communicate structural information enables a design team to better identify and meet structural design requirements, consider producibility issues earlier, and evaluate structural concepts. A design process experiment of a wing structural layout in collaboration with an industrial partner illustrates and validates the approach.

  18. Model-based design of peptide chromatographic purification processes.

    PubMed

    Gétaz, David; Stroehlein, Guido; Butté, Alessandro; Morbidelli, Massimo

    2013-04-05

    In this work we present a general procedure for the model-based optimization of a polypeptide crude mixture purification process through its application to a case of industrial relevance. This is done to show how beneficial modeling can be in optimizing complex chromatographic processes in the industrial environment. The target peptide elution profile was modeled with a two-site adsorption equilibrium isotherm exhibiting two inflection points. The variation of the isotherm parameters with the modifier concentration was accounted for. The adsorption isotherm parameters of the target peptide were obtained by the inverse method. The elution of the impurities was approximated by lumping them into pseudo-impurities and by regressing their adsorption isotherm parameters directly as a function of the corresponding parameters of the target peptide. After model calibration and validation by comparison with suitable experimental data, Pareto optimizations of the process were carried out so as to select the optimal batch process.
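
    The abstract's key modeling ingredient, a two-site adsorption isotherm whose parameters vary with the modifier concentration, can be written compactly. The sketch below uses a plain two-site Langmuir form with a power-law modifier dependence; the functional forms and all numbers are illustrative assumptions (the paper's calibrated isotherm additionally exhibits two inflection points).

    ```python
    # Two-site adsorption isotherm with modifier-dependent equilibrium constants.
    # Parameter values and the power-law modifier dependence are illustrative.
    def equilibrium_constant(H0, s, c_mod):
        """Power-law dependence of a site's equilibrium constant on modifier level."""
        return H0 * c_mod ** (-s)

    def q_adsorbed(c_pep, c_mod):
        """Adsorbed peptide concentration q(c) for a two-site Langmuir-type isotherm."""
        K1 = equilibrium_constant(H0=50.0, s=4.0, c_mod=c_mod)   # strong sites (assumed)
        K2 = equilibrium_constant(H0=5.0,  s=2.0, c_mod=c_mod)   # weak sites (assumed)
        qs1, qs2 = 10.0, 80.0                                    # site capacities, g/L
        return (qs1 * K1 * c_pep / (1 + K1 * c_pep)
                + qs2 * K2 * c_pep / (1 + K2 * c_pep))

    for c in (0.01, 0.1, 1.0, 5.0):
        print(f"c = {c:5.2f} g/L -> q = {q_adsorbed(c, c_mod=0.25):6.2f} g/L")
    ```

    In a column simulator this isotherm closes the mass balance, and the modifier dependence is what lets one model the gradient elution behavior described above.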

  19. Rethinking Design Process: Using 3D Digital Models as an Interface in Collaborative Session

    ERIC Educational Resources Information Center

    Ding, Suining

    2008-01-01

    This paper describes a pilot study for an alternative design process by integrating a designer-user collaborative session with digital models. The collaborative session took place in a 3D AutoCAD class for a real world project. The 3D models served as an interface for designer-user collaboration during the design process. Students not only learned…

  20. Design of interchannel MRF model for probabilistic multichannel image processing.

    PubMed

    Koo, Hyung Il; Cho, Nam Ik

    2011-03-01

    In this paper, we present a novel framework that exploits an informative reference channel in the processing of another channel. We formulate the problem as a maximum a posteriori estimation problem considering a reference channel and develop a probabilistic model encoding the interchannel correlations based on Markov random fields. Interestingly, the proposed formulation results in an image-specific and region-specific linear filter for each site. The strength of filter response can also be controlled in order to transfer the structural information of a channel to the others. Experimental results on satellite image fusion and chrominance image interpolation with denoising show that our method provides improved subjective and objective performance compared with conventional approaches.
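
    The flavor of an "image-specific and region-specific linear filter" derived from a reference channel can be illustrated with a local linear model. Note this is a strong simplification in the spirit of the abstract, not the paper's MRF/MAP formulation; the window size and regularizer below are arbitrary choices.

    ```python
    # Local linear filtering of a target channel guided by a reference channel.
    # A simplification illustrating the per-region linear-filter idea, not the
    # paper's MRF formulation.
    import numpy as np
    from scipy.ndimage import uniform_filter

    def guided_linear_filter(ref, tgt, size=7, eps=1e-2):
        mean_r = uniform_filter(ref, size)
        mean_t = uniform_filter(tgt, size)
        cov_rt = uniform_filter(ref * tgt, size) - mean_r * mean_t
        var_r  = uniform_filter(ref * ref, size) - mean_r ** 2
        a = cov_rt / (var_r + eps)        # per-pixel linear coefficient
        b = mean_t - a * mean_r           # per-pixel offset
        return uniform_filter(a, size) * ref + uniform_filter(b, size)

    rng = np.random.default_rng(1)
    ref = np.tile(np.linspace(0, 1, 64), (64, 1))        # clean reference channel
    tgt = ref + 0.3 * rng.normal(size=ref.shape)         # noisy correlated channel
    print("mean abs error after filtering:",
          float(np.abs(guided_linear_filter(ref, tgt) - ref).mean()))
    ```

    The `eps` term plays the role of controlling how strongly the reference channel's structure is transferred, loosely analogous to the filter-response strength control mentioned in the abstract.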

  21. A Digital Tool Set for Systematic Model Design in Process-Engineering Education

    ERIC Educational Resources Information Center

    van der Schaaf, Hylke; Tramper, Johannes; Hartog, Rob J.M.; Vermue, Marian

    2006-01-01

    One of the objectives of the process technology curriculum at Wageningen University is that students learn how to design mathematical models in the context of process engineering, using a systematic problem analysis approach. Students find it difficult to learn to design a model and little material exists to meet this learning objective. For these…

  22. Cost model relationships between textile manufacturing processes and design details for transport fuselage elements

    NASA Technical Reports Server (NTRS)

    Metschan, Stephen L.; Wilden, Kurtis S.; Sharpless, Garrett C.; Andelman, Rich M.

    1993-01-01

    Textile manufacturing processes offer potential cost and weight advantages over traditional composite materials and processes for transport fuselage elements. In the current study, design cost modeling relationships between textile processes and element design details were developed. Such relationships are expected to help future aircraft designers make timely decisions on the effect of design details and overall configurations on textile fabrication costs. The fundamental advantage of a design cost model is to ensure that the element design is cost effective for the intended process. Trade studies on the effects of processing parameters also help to optimize the manufacturing steps for a particular structural element. Two methods of analyzing design detail/process cost relationships developed for the design cost model were pursued in the current study. The first makes use of existing databases and alternative cost modeling methods (e.g., detailed estimating). The second compares design cost model predictions with data collected during the fabrication of seven-foot circumferential frames for ATCAS crown test panels. The process used in this case involves 2D dry braiding and resin transfer molding of curved 'J' cross-section frame members having design details characteristic of the baseline ATCAS crown design.

  23. High School Student Modeling in the Engineering Design Process

    ERIC Educational Resources Information Center

    Mentzer, Nathan; Huffman, Tanner; Thayer, Hilde

    2014-01-01

    A diverse group of 20 high school students from four states in the US were individually provided with an engineering design challenge. Students chosen were in capstone engineering courses and had taken multiple engineering courses. As students considered the problem and developed a solution, observational data were recorded and artifacts…

  24. Business Process Design Method Based on Business Event Model for Enterprise Information System Integration

    NASA Astrophysics Data System (ADS)

    Kobayashi, Takashi; Komoda, Norihisa

    The traditional business process design methods, of which the use case is the most typical, provide no useful framework for designing the activity sequence. Therefore, design efficiency and quality vary widely according to the designer’s experience and skill. In this paper, to solve this problem, we propose business events and their state transition model (a basic business event model) based on the language/action perspective, a result from the cognitive science domain. In business process design using this model, we decide event occurrence conditions so that all events synchronize with each other. We also propose a design pattern for deciding the event occurrence conditions (a business event improvement strategy). Lastly, we apply the business process design method based on the business event model and the business event improvement strategy to the credit card issue process and evaluate its effect.
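
    A minimal reading of the proposed model is a set of business events whose occurrence conditions reference the states of other events, so that the process synchronizes itself. The toy sketch below fires an event only when its occurrence condition (here simply a set of prerequisite events) is satisfied; the event names and conditions are invented, loosely following the paper's credit-card issue example.

    ```python
    # Toy business-event model: an event may occur only when its occurrence
    # condition (a set of prerequisite events) is satisfied. Names are invented.
    events = {
        "application_received": set(),
        "credit_checked":       {"application_received"},
        "card_embossed":        {"credit_checked"},
        "pin_mailed":           {"credit_checked"},
        "card_activated":       {"card_embossed", "pin_mailed"},  # synchronizes two flows
    }

    occurred, fired = set(), True
    while fired:
        fired = False
        for event, condition in events.items():
            if event not in occurred and condition <= occurred:
                occurred.add(event)          # state transition: waiting -> occurred
                print("event occurred:", event)
                fired = True
    ```

    Writing occurrence conditions explicitly, rather than hand-drawing an activity sequence, is the design discipline the paper argues reduces dependence on designer skill.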

  25. The Effect of Alternative Approaches to Design Instruction (Structural or Functional) on Students' Mental Models of Technological Design Processes

    ERIC Educational Resources Information Center

    Mioduser, David; Dagan, Osnat

    2007-01-01

    The study aimed to examine the relationship between alternative approaches towards problem solving/design teaching (structural or functional), students' mental modeling of the design process, and the quality of their solutions to design tasks. The "structural" approach emphasizes the need for an ordered and systematic learning of the design…

  26. An Analytic Process Model for Systems Design and Measurement

    DTIC Science & Technology

    1985-02-01

    The objective of this model development effort is to provide a uniform, thorough…developed in great detail, applied in a sample fashion to an existing system (the Bradley Infantry Fighting Vehicle Training System) and a newly planned…

  27. SmartWeld/SmartProcess - intelligent model based system for the design and validation of welding processes

    SciTech Connect

    Mitchner, J.

    1996-04-01

    Diagrams are presented on an intelligent model based system for the design and validation of welding processes. Key capabilities identified include `right the first time` manufacturing, continuous improvement, and on-line quality assurance.

  28. How computer solid modeling is altering the process of detail design

    SciTech Connect

    Krueger, T.J.

    1995-12-31

    Computer solid modeling has enabled concurrent engineering to become the new approach to the process of detail design. It is possible to consider many facets of design, product testing, and manufacturing simultaneously. Development of an integrated Computer-Aided Design and Manufacturing (CAD/CAM) database from the computer solid model is the key element in communication and successful implementation of concurrent engineering. While generating a computer solid model, a design representation or model of the product is derived and a computer database is generated. This database may be directly applied to all phases of design, analysis, prototyping, and manufacturing, as well as marketing and packaging. This paper discusses how the database generated from the solid model impacts the various processes of design.

  29. Elementary teachers' mental models of engineering design processes: A comparison of two communities of practice

    NASA Astrophysics Data System (ADS)

    McMahon, Ann P.

    Educating K-12 students in the processes of design engineering is gaining popularity in public schools. Several states have adopted standards for engineering design despite the fact that no common agreement exists on what should be included in the K-12 engineering design process. Furthermore, little pre-service and in-service professional development exists that will prepare teachers to teach a design process that is fundamentally different from the science teaching process found in typical public schools. This study provides a glimpse into what teachers think happens in engineering design compared to articulated best practices in engineering design. Wenger's communities of practice work and van Dijk's multidisciplinary theory of mental models provide the theoretical bases for comparing the mental models of two groups of elementary teachers (one group that teaches engineering and one that does not) to the mental models of design engineers (including this engineer/researcher/educator and professionals described elsewhere). The elementary school teachers and this engineer/researcher/educator observed the design engineering process enacted by professionals, then answered questions designed to elicit their mental models of the process they saw in terms of how they would teach it to elementary students. The key finding is this: Both groups of teachers embedded the cognitive steps of the design process into the matrix of the social and emotional roles and skills of students. Conversely, the engineers embedded the social and emotional aspects of the design process into the matrix of the cognitive steps of the design process. In other words, teachers' mental models show that they perceive that students' social and emotional communicative roles and skills in the classroom drive their cognitive understandings of the engineering process, while the mental models of this engineer/researcher/educator and the engineers in the video show that we perceive that cognitive…

  30. Modeling and Analysis of Power Processing Systems. [use of a digital computer for designing power plants]

    NASA Technical Reports Server (NTRS)

    Fegley, K. A.; Hayden, J. H.; Rehmann, D. W.

    1974-01-01

    The feasibility of formulating a methodology for the modeling and analysis of aerospace electrical power processing systems is investigated. It is shown that a digital computer may be used in an interactive mode for the design, modeling, analysis, and comparison of power processing systems.

  31. Improving the Aircraft Design Process Using Web-Based Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.; Follen, Gregory J. (Technical Monitor)

    2000-01-01

    Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.

  32. Improving the Aircraft Design Process Using Web-based Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.

    2003-01-01

    Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.

  33. Model-based design space determination of peptide chromatographic purification processes.

    PubMed

    Gétaz, David; Butté, Alessandro; Morbidelli, Massimo

    2013-04-05

    Operating a chemical process at fixed operating conditions often leads to suboptimal process performance. It is important to be able to vary the process operating conditions depending upon possible changes in feed composition, product requirements, or economics. This flexibility in the manufacturing process was facilitated by the publication of the PAT initiative by the U.S. FDA [1]. In this work, the implementation of Quality-by-Design in the development of a chromatographic purification process is discussed. A procedure to determine the design space of the process using chromatographic modeling is presented. Moreover, the risk of batch failure and the critical process parameters (CPPs) are assessed by modeling. The ideal cut strategy is adopted, and therefore only yield and productivity are considered as critical quality attributes (CQAs). The general trends in CQA variations within the design space are discussed. The effect of process disturbances is also considered. It is shown that process disturbances significantly shrink the design space, and that only simultaneous and specific changes in multiple process parameters (i.e., critical process parameters (CPPs)) lead to batch failure. The reliability of the obtained results is proven by comparing the model predictions to suitable experimental data. The case study presented in this work proves the reliability of process development using a model-based approach.
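
    The design-space idea can be illustrated with a toy chromatography model: sweep two process parameters (here the start and end of the product collection window), evaluate the CQAs with a simple Gaussian-peak elution model, and keep the region where specifications are met. The peak positions, widths, impurity load, and acceptance limits below are invented for illustration.

    ```python
    # Toy design-space mapping: fraction-collection window vs. yield and purity
    # for two Gaussian elution peaks (product + impurity). All numbers invented.
    import numpy as np
    from scipy.special import erf

    def peak_fraction(a, b, mu, sigma):
        """Fraction of a Gaussian peak eluting inside the cut window [a, b]."""
        z = lambda x: (x - mu) / (np.sqrt(2) * sigma)
        return 0.5 * (erf(z(b)) - erf(z(a)))

    starts = np.linspace(8.0, 12.0, 41)      # window start, min
    ends   = np.linspace(10.0, 16.0, 61)     # window end, min
    design_space = []
    for a in starts:
        for b in ends:
            if b <= a:
                continue
            yld = peak_fraction(a, b, mu=11.0, sigma=0.8)           # product peak
            imp = peak_fraction(a, b, mu=13.5, sigma=1.0) * 0.2     # impurity load
            purity = yld / (yld + imp + 1e-12)
            if yld >= 0.80 and purity >= 0.95:                      # CQA acceptance limits
                design_space.append((a, b))
    print(f"{len(design_space)} operating points meet both CQAs")
    ```

    Re-running the sweep with the peaks shifted (a "disturbance") shrinks the acceptable region, which is the qualitative effect the abstract reports.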

  34. Overall challenges in incorporating micro-mechanical models into materials design process

    NASA Astrophysics Data System (ADS)

    Bennoura, M.; Aboutajeddine, A.

    2016-10-01

    Using materials in engineering design has historically been handled using the paradigm of selecting appropriate materials from the finite set of available material databases. Recent trends, however, have moved toward the tailoring of materials that meet overall system performance requirements, based on a process called material design. An important building block of this process is micromechanical models that relate microstructure to properties. Unfortunately, these models fall short and include many uncertainties arising from assumptions and idealizations, which unavoidably impact the material design strategy. In this work, candidate methods to deal with micromechanical model uncertainties and their drawbacks in material design are investigated. Robust design methods for quantifying uncertainty and managing or mitigating its impact on design performance are reviewed first. These methods include principles for classifying uncertainty, mathematical techniques for evaluating its degree, and design methods for generating design alternatives that are relatively insensitive to sources of uncertainty and flexible enough to admit design changes or variations. The last section of this paper addresses the limits of the existing approaches from a material modelling perspective and identifies research opportunities to overcome the impediments to incorporating micromechanical models in the material design process.

  35. Process modelling and die design concepts for forming aircraft sheet parts

    NASA Astrophysics Data System (ADS)

    Hatipoğlu, H. A.; Alkaş, C. O.

    2016-08-01

    This study is about typical sheet metal forming processes applied in aerospace industry including flexform, stretch form and stretch draw. Each process is modelled by using finite element method for optimization. Tensile, bulge, forming limit and friction tests of commonly used materials are conducted for defining the hardening curves, yield loci, anisotropic constants, forming limit curves and friction coefficients between die and sheet. Process specific loadings and boundary conditions are applied to each model. The models are then validated by smartly designed experiments that characterize the related forming processes. Lastly, several examples are given in which those models are used to predict the forming defects before physical forming and necessary die design and process parameter changes are applied accordingly for successful forming operations.
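
    As a small example of the material characterization step mentioned above, the sketch below fits a Hollomon hardening law to synthetic tensile data with SciPy. The hardening law, parameter values, and data points are illustrative assumptions; the paper does not state which hardening model was used.

    ```python
    # Fit a Hollomon hardening law (sigma = K * eps^n) to tensile test data,
    # the kind of material input sheet-forming FE models need.
    # Data points are synthetic stand-ins for real test measurements.
    import numpy as np
    from scipy.optimize import curve_fit

    def hollomon(eps, K, n):
        """Power-law flow stress as a function of plastic strain."""
        return K * eps ** n

    eps = np.linspace(0.02, 0.20, 10)                    # plastic strain levels
    noise = 1 + 0.01 * np.random.default_rng(2).normal(size=10)
    sigma = 500.0 * eps ** 0.25 * noise                  # synthetic flow stress, MPa

    (K, n), _ = curve_fit(hollomon, eps, sigma, p0=(400.0, 0.2))
    print(f"K = {K:.1f} MPa, n = {n:.3f}")
    ```

    The fitted curve (together with yield loci, forming limit curves, and friction coefficients from the other tests listed) is what parameterizes the finite element models for flexform, stretch form, and stretch draw.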

  36. The Impact of Building Information Modeling on the Architectural Design Process

    NASA Astrophysics Data System (ADS)

    Moreira, P. F.; Silva, Neander F.; Lima, Ecilamar M.

    Many benefits of Building Information Modeling, BIM, have been suggested by several authors and by software vendors. In this paper we describe an experiment in which two groups of designers were observed developing an assigned design task. One of the groups used a BIM system, while the other used a standard computer-aided drafting system. The results show that some of the promises of BIM hold true, such as consistency maintenance and error avoidance in the design documentation process. Other promises, such as changing the design process itself, also seem plausible, but more research is needed to determine the depth of such changes.

  37. A novel double loop control model design for chemical unstable processes.

    PubMed

    Cong, Er-Ding; Hu, Ming-Hui; Tu, Shan-Tung; Xuan, Fu-Zhen; Shao, Hui-He

    2014-03-01

    In this manuscript, based on the Smith predictor control scheme for unstable industrial processes, an improved double loop control model is proposed for unstable chemical processes. The inner loop stabilizes the unstable process and transforms it into a stable first-order plus dead-time process. The outer loop enhances the set-point response, and a disturbance controller is designed to enhance the disturbance response. The improved control system is simple and has a clear physical interpretation, and its characteristic equation is easy to stabilize. The three controllers are designed separately, so each is easy to design and yields good performance for its respective closed-loop transfer function. The robust stability of the proposed control scheme is analyzed. Finally, case studies illustrate that the improved method can give better system performance than existing design methods.
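
    The two-loop idea (an inner loop that first stabilizes the unstable process, then an outer loop tuned for set-point response) is easy to demonstrate on a discrete-time toy plant. The pole locations and gains below are invented, and a simple PI controller stands in for the paper's outer-loop and disturbance-controller designs.

    ```python
    # Double-loop control sketch: an inner proportional loop stabilizes an
    # unstable plant, an outer PI loop shapes the set-point response.
    a = 1.10            # open-loop pole > 1: unstable plant y[k+1] = a*y + u
    k_inner = 0.35      # inner stabilizing gain -> effective pole a - k_inner = 0.75

    kp, ki = 0.20, 0.05 # outer PI gains (hand-tuned for this toy model)
    r, y, integ = 1.0, 0.0, 0.0
    for k in range(80):
        e = r - y
        integ += e
        v = kp * e + ki * integ        # outer-loop command
        u = v - k_inner * y            # inner loop: stabilizing feedback
        y = a * y + u                  # unstable plant dynamics
    print(f"output after 80 steps: {y:.3f} (set point {r})")
    ```

    Trying the same PI gains without the inner loop (set `k_inner = 0`) makes the loop diverge, which is the motivation for stabilizing first and tuning second.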

  38. Functional Fault Model Development Process to Support Design Analysis and Operational Assessment

    NASA Technical Reports Server (NTRS)

    Melcher, Kevin J.; Maul, William A.; Hemminger, Joseph A.

    2016-01-01

    A functional fault model (FFM) is an abstract representation of the failure space of a given system. As such, it simulates the propagation of failure effects along paths between the origin of the system failure modes and points within the system capable of observing the failure effects. As a result, FFMs may be used to diagnose the presence of failures in the modeled system. FFMs necessarily contain a significant amount of information about the design, operations, and failure modes and effects. One of the important benefits of FFMs is that they may be qualitative, rather than quantitative and, as a result, may be implemented early in the design process when there is more potential to positively impact the system design. FFMs may therefore be developed and matured throughout the monitored system's design process and may subsequently be used to provide real-time diagnostic assessments that support system operations. This paper provides an overview of a generalized NASA process that is being used to develop and apply FFMs. FFM technology has been evolving for more than 25 years. The FFM development process presented in this paper was refined during NASA's Ares I, Space Launch System, and Ground Systems Development and Operations programs (i.e., from about 2007 to the present). Process refinement took place as new modeling, analysis, and verification tools were created to enhance FFM capabilities. In this paper, standard elements of a model development process (i.e., knowledge acquisition, conceptual design, implementation & verification, and application) are described within the context of FFMs. Further, newer tools and analytical capabilities that may benefit the broader systems engineering process are identified and briefly described. The discussion is intended as a high-level guide for future FFM modelers.
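
    Functionally, an FFM is a directed graph from failure modes through propagation paths to observation points. The sketch below propagates a failure effect through such a graph and reports which monitors would observe it; the graph, node names, and monitors are hypothetical, not drawn from any NASA model.

    ```python
    # Functional fault model sketch: propagate a failure effect through a
    # directed graph and list the monitors that would observe it.
    from collections import deque

    effects = {   # edge u -> v: a failure effect at u propagates to v
        "valve_stuck":          ["fuel_flow_low"],
        "fuel_flow_low":        ["chamber_pressure_low", "flow_sensor_A"],
        "chamber_pressure_low": ["thrust_low", "pressure_sensor_B"],
        "thrust_low":           [],
        "flow_sensor_A":        [],   # observation point
        "pressure_sensor_B":    [],   # observation point
    }
    monitors = {"flow_sensor_A", "pressure_sensor_B"}

    def observed_by(failure_mode):
        """Breadth-first propagation from a failure mode to reachable monitors."""
        seen, queue, hits = {failure_mode}, deque([failure_mode]), set()
        while queue:
            node = queue.popleft()
            if node in monitors:
                hits.add(node)
            for nxt in effects.get(node, []):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        return hits

    print(observed_by("valve_stuck"))   # -> {'flow_sensor_A', 'pressure_sensor_B'}
    ```

    Because the graph is qualitative (reachability, not physics), it can be built early in design, which is the benefit the abstract emphasizes; inverting the same reachability map is what supports diagnosis during operations.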

  39. Studies in process modeling, design, monitoring, and control, with applications to polymer composites manufacturing

    NASA Astrophysics Data System (ADS)

    Srinivasagupta, Deepak

    2002-01-01

    High material and manufacturing costs have hindered the introduction of advanced polymer composite materials into mainstream civilian applications such as automotive. Even though high-fidelity models for several polymer composite manufacturing processes have become available over the past several years and offer significant benefits in manufacturing cost reduction, concerns about their inflexibility and maintenance have adversely affected their widespread usage. This research seeks to advance process modeling and design in polymer composites manufacturing to address these concerns. Other more general issues in measurement validation and distributed control are also addressed. Using a rigorous 3-D model of the injected pultrusion (IP) process validated recently, an algorithm was developed for process and equipment design with integrated economic, operability, and environmental considerations. The optimum design promised enhanced throughput as well as reduction in the time and expenses of the current purely experimental approaches. Scale-up issues in IP were analyzed, and refinements to overcome some drawbacks in the model were suggested. The process model was then extended to simulate the co-injection resin transfer molding (CIRTM) process used for manufacture of foam-core sandwich composites. A 1-D isothermal model for real-time control was also developed. Process optimization using these models and experimental parametric studies increased the debond fracture toughness of sandwiches by 78% over current technology. To ensure the availability of validated measurements from process instrumentation, a novel in-situ sensor modeling approach to sensor validation was proposed. Both active and passive, time and frequency domain techniques were developed and experimentally verified using temperature and flow sensors. A model-based dynamic estimator to predict the true measurement online was also validated. The effect of network communication delay on stability and control…

  40. A frequency response model matching method for PID controller design for processes with dead-time.

    PubMed

    Anwar, Md Nishat; Pan, Somnath

    2015-03-01

    In this paper, a PID controller design method for the integrating processes based on frequency response matching is presented. Two approaches are proposed for the controller design. In the first approach, a double feedback loop configuration is considered where the inner loop is designed with a stabilizing gain. In the outer loop, the parameters of the PID controller are obtained by frequency response matching between the closed-loop system with the PID controller and a reference model with desired specifications. In the second approach, the design is directly carried out considering a desired load-disturbance rejection model of the system. In both the approaches, two low frequency points are considered for matching the frequency response, which yield linear algebraic equations, solution of which gives the controller parameters. Several examples are taken from the literature to demonstrate the effectiveness and to compare with some well known design methods. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
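
    The core computation the abstract describes (matching the controller's frequency response to that implied by a reference model at two low frequencies, which yields linear equations in the PID gains) can be sketched directly. The plant, reference model, and matching frequencies below are invented for illustration and are not the paper's examples.

    ```python
    # PID tuning by frequency-response matching at two low frequencies.
    # Plant, reference model, and matching frequencies are illustrative choices.
    import numpy as np

    L, T, lam = 1.0, 2.0, 1.5    # dead time, lag, desired closed-loop time constant

    def G(w):   # integrating plant with dead time: e^{-Ls} / (s (T s + 1))
        s = 1j * w
        return np.exp(-L * s) / (s * (T * s + 1))

    def M(w):   # desired closed-loop reference model: e^{-Ls} / (lam s + 1)^2
        s = 1j * w
        return np.exp(-L * s) / (lam * s + 1) ** 2

    # Ideal controller C = M / (G (1 - M)); a PID gives
    # Re C = Kp and Im C = Kd*w - Ki/w, linear in (Kp, Ki, Kd).
    rows, rhs = [], []
    for w in (0.01, 0.05):                   # two low matching frequencies
        C = M(w) / (G(w) * (1 - M(w)))
        rows += [[1.0, 0.0, 0.0], [0.0, -1.0 / w, w]]
        rhs  += [C.real, C.imag]

    (kp, ki, kd), *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    print(f"Kp = {kp:.4f}, Ki = {ki:.4f}, Kd = {kd:.4f}")
    ```

    Matching at low frequencies targets good steady-state and load-disturbance behavior, which is why the method restricts attention to two small values of w rather than the whole frequency axis.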

  41. Model of Values-Based Management Process in Schools: A Mixed Design Study

    ERIC Educational Resources Information Center

    Dogan, Soner

    2016-01-01

    The aim of this paper is to evaluate the school administrators' values-based management behaviours according to the teachers' perceptions and opinions and, accordingly, to build a model of values-based management process in schools. The study was conducted using explanatory design which is inclusive of both quantitative and qualitative methods.…

  42. A Problem-Based Learning Model for Teaching the Instructional Design Business Acquisition Process.

    ERIC Educational Resources Information Center

    Kapp, Karl M.; Phillips, Timothy L.; Wanner, Janice H.

    2002-01-01

    Outlines a conceptual framework for using a problem-based learning model for teaching the Instructional Design Business Acquisition Process. Discusses writing a response to a request for proposal, developing a working prototype, orally presenting the solution, and the impact of problem-based learning on students' perception of their confidence in…

  43. Letter Report. Defense Waste Processing Facility Pour Spout Heaters - Conceptual Designs and Modeling

    SciTech Connect

    SK Sundaram; JM Perez, Jr.

    2000-09-06

    The Tanks Focus Area (TFA) identified a major task to address performance limitations and deficiencies of the Defense Waste Processing Facility (DWPF), now in its sixth year of operation. Design, installation, testing, monitoring, operability, and a number of other characteristics were studied collaboratively by research personnel at a number of facilities: the Savannah River Technology Center (SRTC), the Clemson Environmental Technologies Laboratory (CETL), Pacific Northwest National Laboratory (PNNL), and the Idaho National Engineering and Environmental Laboratory (INEEL). Because the pour spout/riser heater was identified as the potential limiting feature of the DWPF, alternative design concepts originally proposed in the past were revisited. In the original work, finite element modeling was performed to evaluate the temperature distribution and stress of the design currently used at the DWPF. Studies were also made to define the requirements of the design and to consider approaches for remote removal/replacement. The five proposed alternative designs were characterized by their heater type and location, their remotely replaceable thermocouples, and their capabilities for remote handling. Review comments on the alternative designs indicated a relatively wide range of advantages and disadvantages. The present report provides an overview of the design criteria, modeling results, and alternative designs. Based on a review of past design optimization activities and an assessment of recent experience, recommendations are proposed for future consideration and improvement.

  44. A Conceptual Aerospace Vehicle Structural System Modeling, Analysis and Design Process

    NASA Technical Reports Server (NTRS)

    Mukhopadhyay, Vivek

    2007-01-01

    A process for aerospace structural concept analysis and design is presented, with examples of a blended-wing-body fuselage, a multi-bubble fuselage concept, a notional crew exploration vehicle, and a high altitude long endurance aircraft. Aerospace vehicle structures must withstand all anticipated mission loads, yet must be designed to have optimal structural weight with the required safety margins. For a viable systems study of advanced concepts, these conflicting requirements must be imposed and analyzed early in the conceptual design cycle, preferably with a high degree of fidelity. In this design process, integrated multidisciplinary analysis tools are used in a collaborative engineering environment. First, parametric solid and surface models including the internal structural layout are developed for detailed finite element analyses. Multiple design scenarios are generated for analyzing several structural configurations and material alternatives. The structural stress, deflection, strain, and margins of safety distributions are visualized and the design is improved. Over several design cycles, the refined vehicle parts and assembly models are generated. The accumulated design data are used for structural mass comparison and concept ranking. The present application focuses on the blended-wing-body vehicle structure; advanced composite materials are also discussed.

  45. Adapting Rational Unified Process (RUP) approach in designing a secure e-Tendering model

    NASA Astrophysics Data System (ADS)

    Mohd, Haslina; Robie, Muhammad Afdhal Muhammad; Baharom, Fauziah; Darus, Norida Muhd; Saip, Mohamed Ali; Yasin, Azman

    2016-08-01

    e-Tendering is the electronic processing of tender documents via the internet, allowing tenderers to publish, communicate, access, receive, and submit all tender-related information and documentation online. This study aims to design an e-Tendering system using the Rational Unified Process (RUP) approach. RUP provides a disciplined approach to assigning tasks and responsibilities within the software development process. RUP has four phases that help researchers adjust to the requirements of projects with different scopes, problems, and sizes. RUP is characterized as a use-case-driven, architecture-centered, iterative, and incremental process model. However, the scope of this study covers only the Inception and Elaboration phases as steps toward developing the model, and performs only three of the nine workflows (business modeling, requirements, and analysis and design). RUP has a strong focus on documents, and the activities in the inception and elaboration phases mainly concern the creation of diagrams and the writing of textual descriptions. The UML notation and the StarUML software are used to support the design of e-Tendering. The e-Tendering design based on the RUP approach can benefit e-Tendering developers and researchers in the e-Tendering domain. In addition, this study shows that RUP is a system development methodology that can also serve as a research methodology in the Software Engineering domain for the secure design of any observed application. This methodology has been tested in various studies in domains such as simulation-based decision support, security requirement engineering, business modeling, and secure system requirements. In conclusion, these studies show that RUP is a good research methodology that can be adapted in any Software Engineering (SE) research domain that requires artifacts to be generated, such as use case modeling, misuse case modeling, activity…

  12. Error detection process - Model, design, and its impact on computer performance

    NASA Technical Reports Server (NTRS)

    Shin, K. G.; Lee, Y.-H.

    1984-01-01

    An analytical model is developed for computer error detection processes and applied to estimate their influence on system performance. Faults in the hardware, not in the design, are assumed to be the potential cause of transition to erroneous states during normal operations. The classification properties and associated recovery methods of error detection are discussed. The probability of obtaining an unreliable result is evaluated, along with the resulting computational loss. Error detection during design is considered and a feasible design space is outlined. Extension of the methods to account for the effects of extant multiple faults is indicated.

  13. Parameter-Level Data Flow Modeling Oriented to Product Design Process

    NASA Astrophysics Data System (ADS)

    Li, Shen; Shao, Xiao Dong; Zhang, Zhi Hua; Ge, Xiao Bo

    2015-12-01

    In this paper, a parameter-oriented method of data flow modeling for the product design process is proposed. Data parameters are defined and classified as basic or complex. A mechanism mapping different forms of documents onto basic data parameters, together with a parameter-based data transmission scheme, is constructed. To suit the iterative nature of the design process, a parameter versioning mechanism that records node modifications and iteration information is proposed. The data parameter transmission relationships are represented by a parameters network model (PNM) based on a directed graph. Finally, using the table mapping data parameters onto workflow nodes together with the PNM, the data ports and data links in the data flow model are generated automatically by the program. Validation of the "Reflector, back frame and center part design" data flow model in the design process of a 15-meter-diameter S/Ka-band antenna shows that the method can effectively shorten data flow modeling time and improve data transmission efficiency.
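
    As a rough illustration of the idea, the sketch below (Python, using networkx; all node names are hypothetical) represents a PNM as a directed acyclic graph: data links fall out of the edge list, a topological sort gives a valid transmission order, and a version counter on each node propagates downstream on modification, loosely mirroring the paper's versioning mechanism.

```python
# Minimal PNM-as-directed-graph sketch; node names are invented, and the
# paper's automatic port/link generation is far richer than this.
import networkx as nx

pnm = nx.DiGraph()
# basic data parameters (leaf inputs) and complex parameters (derived)
pnm.add_node("reflector_diameter", kind="basic", version=1)
pnm.add_node("panel_count", kind="basic", version=1)
pnm.add_node("back_frame_mass", kind="complex", version=1)
pnm.add_node("center_part_load", kind="complex", version=1)
# edges encode parameter-level data transmission between workflow nodes
pnm.add_edge("reflector_diameter", "back_frame_mass")
pnm.add_edge("panel_count", "back_frame_mass")
pnm.add_edge("back_frame_mass", "center_part_load")

# "data links" fall out of the edge list; a topological sort gives a
# valid parameter transmission order
assert nx.is_directed_acyclic_graph(pnm)
print("links:", list(pnm.edges))
print("transmission order:", list(nx.topological_sort(pnm)))

# iteration: bump the version of a modified node and everything downstream
def touch(graph, node):
    for n in {node} | nx.descendants(graph, node):
        graph.nodes[n]["version"] += 1

touch(pnm, "reflector_diameter")
print({n: d["version"] for n, d in pnm.nodes(data=True)})
```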

  14. Ares Upper Stage Processes to Implement Model Based Design - Going Paperless

    NASA Technical Reports Server (NTRS)

    Gregory, Melanie

    2012-01-01

    Computer-Aided Design (CAD) has all but replaced the drafting board for design work. Increased productivity and accuracy should be natural outcomes of using CAD. Going from paper drawings only to paper drawings based on CAD models to CAD models and no drawings, or Model Based Design (MBD), is a natural progression in today's world. There are many advantages to MBD over traditional design methods. To make the most of those advantages, standards should be in place and the proper foundation should be laid prior to transitioning to MBD. However, without a full understanding of the implications of MBD and the proper control of the data, the advantages are greatly diminished. Transitioning from a paper design world to an electronic design world means re-thinking how information gets controlled at its origin and distributed from one point to another. It means design methodology is critical, especially for large projects. It means preparation of standardized parts and processes as well as strong communication between all parties in order to maximize the benefits of MBD.

  15. Future integrated design process

    NASA Technical Reports Server (NTRS)

    Meyer, D. D.

    1980-01-01

    The design process is one of the sources used to produce requirements for a computer system to integrate and manage product design data, program management information, and technical computation and engineering data management activities of the aerospace design process. Design activities were grouped chronologically and explored for activity type, activity interface, data quantity, and data flow. The work was based on analysis of the design process of several typical aerospace products, including both conventional and supersonic airplanes and a hydrofoil design. Activities examined included research, preliminary design, detail design, manufacturing interface, product verification, and product support. The design process was then described in an IPAD environment--the future.

  16. A Model of Creative Design Process for Fostering Creativity of Students in Design Education

    ERIC Educational Resources Information Center

    Wong, Yi Lin; Siu, Kin Wai Michael

    2012-01-01

    Creativity, which is concerned with problem solving, is essential if we are to generate new solutions to the massive and complex problems in the unknown future. Our next generation needs an educational platform where they can be taught to possess creativity. Design education is such a way to foster students' creativity. Therefore, it is essential…

  18. Introducing uncertainty analysis of nucleation and crystal growth models in Process Analytical Technology (PAT) system design of crystallization processes.

    PubMed

    Samad, Noor Asma Fazli Abdul; Sin, Gürkan; Gernaey, Krist V; Gani, Rafiqul

    2013-11-01

    This paper presents the application of uncertainty and sensitivity analysis as part of a systematic model-based process monitoring and control (PAT) system design framework for crystallization processes. For the uncertainty analysis, the Monte Carlo procedure is used to propagate input uncertainty, while for sensitivity analysis, global methods including the standardized regression coefficients (SRC) and Morris screening are used to identify the most significant parameters. The potassium dihydrogen phosphate (KDP) crystallization process is used as a case study, both in open-loop and closed-loop operation. In the uncertainty analysis, the impact on the predicted output of uncertain parameters related to the nucleation and the crystal growth model has been investigated for both a one- and two-dimensional crystal size distribution (CSD). The open-loop results show that the input uncertainties lead to significant uncertainties on the CSD, with appearance of a secondary peak due to secondary nucleation for both cases. The sensitivity analysis indicated that the most important parameters affecting the CSDs are the nucleation and growth order constants. In the proposed PAT system design (closed-loop), the target CSD variability was successfully reduced compared to the open-loop case, even when considering uncertainty in nucleation and crystal growth model parameters. This result is a strong indication of the robustness of the proposed PAT system design in achieving the target CSD and encourages its transfer to full-scale implementation.
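
    The following minimal sketch illustrates the Monte Carlo propagation and SRC machinery on a one-line stand-in for the real population-balance model; the kinetic-parameter uncertainties and the toy size expression are assumptions for illustration only.

```python
# Monte Carlo input-uncertainty propagation plus standardized regression
# coefficients (SRC). The "process model" below is a hypothetical stand-in
# for a real population-balance model of KDP crystallization.
import numpy as np

rng = np.random.default_rng(1)
n = 2000
# uncertain kinetic parameters (assumed distributions)
kg = rng.normal(1.0, 0.10, n)   # growth rate constant
g  = rng.normal(1.5, 0.10, n)   # growth order
kb = rng.normal(0.5, 0.10, n)   # nucleation rate constant
b  = rng.normal(2.0, 0.15, n)   # nucleation order
S, t = 1.4, 60.0                # fixed supersaturation and batch time (toy)

# toy output: mean crystal size ~ growth / nucleation^(1/3)
L = kg * S**g * t / (kb * S**b * t) ** (1.0 / 3.0)
print(f"mean size {L.mean():.2f}, 95% band "
      f"({np.percentile(L, 2.5):.2f}, {np.percentile(L, 97.5):.2f})")

# SRC: regress standardized output on standardized inputs
X = np.column_stack([kg, g, kb, b])
Xs = (X - X.mean(0)) / X.std(0)
ys = (L - L.mean()) / L.std()
src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
for name, c in zip(["kg", "g", "kb", "b"], src):
    print(f"SRC[{name}] = {c:+.2f}")
```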

  19. Towards Application of NASA Standard for Models and Simulations in Aeronautical Design Process

    NASA Astrophysics Data System (ADS)

    Vincent, Luc; Dunyach, Jean-Claude; Huet, Sandrine; Pelissier, Guillaume; Merlet, Joseph

    2012-08-01

    Even powerful computational techniques like simulation face limitations in their domain of validity. Using simulation models therefore requires caution, to avoid making biased design decisions for new aeronautical products on the basis of inadequate simulation results. The fidelity, accuracy and validity of simulation models must thus be monitored in context all along the design phases, to build confidence that the goals of modelling and simulation are achieved. In the CRESCENDO project, we adapt the Credibility Assessment Scale method from the NASA standard for models and simulations, developed for the space programme, to aircraft design, in order to assess the quality of simulations. The proposed eight quality-assurance metrics aggregate information to indicate the level of confidence in results. They are displayed in a management dashboard and can secure design trade-off decisions at programme milestones. The application of this technique is illustrated in an aircraft design context with a specific thermal Finite Element Analysis. This use case shows how to judge the fitness-for-purpose of simulation as a virtual testing means and then green-light the continuation of the Simulation Lifecycle Management (SLM) process.

  20. Design process and tools for dynamic neuromechanical models and robot controllers.

    PubMed

    Szczecinski, Nicholas S; Hunt, Alexander J; Quinn, Roger D

    2017-02-01

    We present a serial design process with associated tools to select parameter values for a posture and locomotion controller for simulation of a robot. The controller is constructed from dynamic neuron and synapse models and simulated with the open-source neuromechanical simulator AnimatLab 2. Each joint has a central pattern generator (CPG), whose neurons possess persistent sodium channels. The CPG rhythmically inhibits motor neurons that control the servomotor's velocity. Sensory information coordinates the joints in the leg into a cohesive stepping motion. The parameter value design process is intended to run on a desktop computer, and has three steps. First, our tool FEEDBACKDESIGN uses classical control methods to find neural and synaptic parameter values that stably and robustly control servomotor output. This method is fast, testing over 100 parameter value variations per minute. Next, our tool CPGDESIGN generates bifurcation diagrams and phase response curves for the CPG model. This reveals neural and synaptic parameter values that produce robust oscillation cycles, whose phase can be rapidly entrained to sensory feedback. It also designs the synaptic conductance of inter-joint pathways. Finally, to understand sensitivity to parameters and how descending commands affect a leg's stepping motion, our tool SIMSCAN runs batches of neuromechanical simulations with specified parameter values, which is useful for searching the parameter space of a complicated simulation. These design tools are demonstrated on a simulation of a robot, but may be applied to neuromechanical animal models or physical robots as well.
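
    The sketch below shows the kind of alternating rhythm such a controller is built around, using a generic Matsuoka-style half-center oscillator rather than the paper's AnimatLab persistent-sodium neurons or its FEEDBACKDESIGN/CPGDESIGN/SIMSCAN tools; all parameter values are assumed.

```python
# Two leaky-integrator neurons with slow self-adaptation and mutual
# inhibition: a generic half-center CPG, not the paper's neuron model.
import numpy as np

dt, T = 1e-3, 10.0
tau_x, tau_v = 0.1, 0.6   # membrane and adaptation time constants
a, b, s = 2.5, 2.5, 1.0   # mutual inhibition, adaptation gain, tonic drive
x = np.array([0.1, 0.0])  # membrane states (asymmetric start breaks the tie)
v = np.zeros(2)           # adaptation states
out = []
for _ in range(int(T / dt)):
    y = np.maximum(x, 0.0)               # rectified firing rates
    dx = (-x - b * v - a * y[::-1] + s) / tau_x
    dv = (-v + y) / tau_v
    x += dt * dx
    v += dt * dv
    out.append(y.copy())
out = np.array(out)

# crude rhythm check: onset times of the first neuron's bursts
on = np.flatnonzero(np.diff((out[:, 0] > 0.01).astype(int)) == 1) * dt
print("last burst onsets (s):", np.round(on[-4:], 2))
```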

  1. Physics-based process modeling, reliability prediction, and design guidelines for flip-chip devices

    NASA Astrophysics Data System (ADS)

    Michaelides, Stylianos

    Flip Chip on Board (FCOB) and Chip-Scale Packages (CSPs) are relatively new technologies that are being increasingly used in the electronic packaging industry. Compared to the more widely used face-up wirebonding and TAB technologies, flip-chips and most CSPs provide the shortest possible leads, lower inductance, higher frequency, better noise control, higher density, greater input/output (I/O), smaller device footprint and lower profile. However, due to the short history and due to the introduction of several new electronic materials, designs, and processing conditions, very limited work has been done to understand the role of material, geometry, and processing parameters on the reliability of flip-chip devices. Also, with the ever-increasing complexity of semiconductor packages and with the continued reduction in time to market, it is too costly to wait until the later stages of design and testing to discover that the reliability is not satisfactory. The objective of the research is to develop integrated process-reliability models that will take into consideration the mechanics of assembly processes to be able to determine the reliability of face-down devices under thermal cycling and long-term temperature dwelling. The models incorporate the time and temperature-dependent constitutive behavior of various materials in the assembly to be able to predict failure modes such as die cracking and solder cracking. In addition, the models account for process-induced defects and macro-micro features of the assembly. Creep-fatigue and continuum-damage mechanics models for the solder interconnects and fracture-mechanics models for the die have been used to determine the reliability of the devices. The results predicted by the models have been successfully validated against experimental data. The validated models have been used to develop qualification and test procedures for implantable medical devices. In addition, the research has helped develop innovative face

  2. A fuzzy model based adaptive PID controller design for nonlinear and uncertain processes.

    PubMed

    Savran, Aydogan; Kahraman, Gokalp

    2014-03-01

    We develop a novel adaptive tuning method that adjusts the gains of a classical proportional-integral-derivative (PID) controller for nonlinear processes, a problem that is very difficult to overcome with fixed-gain classical PID control. By incorporating classical PID control, which is well known in industry, into the control of nonlinear processes, we introduce a method that can readily be used by industry. In this method, controller design does not require a first-principles model of the process, which is usually very difficult to obtain. Instead, it depends on a fuzzy process model constructed from measured input-output data of the process. A soft limiter is used to impose industrial limits on the control input. The performance of the system is successfully tested on a bioreactor, a highly nonlinear process involving instabilities. Several tests showed the method's success in tracking, robustness to noise, and adaptation. We also compared performance on a plant with altered parameters under measurement noise, and obtained less ringing and better tracking. In conclusion, we present a novel adaptive control method, built upon the well-known PID architecture, that successfully controls highly nonlinear industrial processes even under strong parameter variations, noise, and instabilities.
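
    A minimal sketch of the loop structure follows: a discrete PID with a tanh soft limiter driving an invented nonlinear first-order plant. The paper's fuzzy process model and adaptive gain tuning are not reproduced here; the gains are fixed and hand-picked.

```python
# PID loop with a soft limiter on the control input; the plant is a
# hypothetical nonlinear first-order process, not the bioreactor.
import numpy as np

def plant(x, u, dt=0.05):
    # invented nonlinearity: saturating actuator gain plus a mild
    # destabilizing term that switches off above x = 1.5
    return x + dt * (-x + 2.0 * np.tanh(u) + 0.2 * x**2 * (x < 1.5))

kp, ki, kd = 2.0, 1.0, 0.1
u_max = 3.0                          # industrial input limit (soft-enforced)
setpoint, dt = 1.0, 0.05
x, integ, e_prev = 0.0, 0.0, 0.0
for _ in range(200):
    e = setpoint - x
    integ += e * dt
    deriv = (e - e_prev) / dt
    u_raw = kp * e + ki * integ + kd * deriv
    u = u_max * np.tanh(u_raw / u_max)   # soft limiter on control input
    x = plant(x, u, dt)
    e_prev = e
print(f"final output {x:.3f} (setpoint {setpoint})")
```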

  3. New process modeling [sic], design, and control strategies for energy efficiency, high product quality, and improved productivity in the process industries. Final project report

    SciTech Connect

    Ray, W. Harmon

    2002-06-05

    This project was concerned with the development of process design and control strategies for improving energy efficiency, product quality, and productivity in the process industries. In particular, (i) the resilient design and control of chemical reactors, and (ii) the operation of complex processing systems, was investigated. Specific topics studied included new process modeling procedures, nonlinear controller designs, and control strategies for multiunit integrated processes. Both fundamental and immediately applicable results were obtained. The new design and operation results from this project were incorporated into computer-aided design software and disseminated to industry. The principles and design procedures have found their way into industrial practice.

  4. Statistical metrology—measurement and modeling of variation for advanced process development and design rule generation

    NASA Astrophysics Data System (ADS)

    Boning, Duane S.; Chung, James E.

    1998-11-01

    Advanced process technology will require more detailed understanding and tighter control of variation in devices and interconnects. The purpose of statistical metrology is to provide methods to measure and characterize variation, to model systematic and random components of that variation, and to understand the impact of variation on both yield and performance of advanced circuits. Of particular concern are spatial or pattern-dependencies within individual chips; such systematic variation within the chip can have a much larger impact on performance than wafer-level random variation. Statistical metrology methods will play an important role in the creation of design rules for advanced technologies. For example, a key issue in multilayer interconnect is the uniformity of interlevel dielectric (ILD) thickness within the chip. For the case of ILD thickness, we describe phases of statistical metrology development and application to understanding and modeling thickness variation arising from chemical-mechanical polishing (CMP). These phases include screening experiments including design of test structures and test masks to gather electrical or optical data, techniques for statistical decomposition and analysis of the data, and approaches to calibrating empirical and physical variation models. These models can be integrated with circuit CAD tools to evaluate different process integration or design rule strategies. One focus for the generation of interconnect design rules is guidelines for the use of "dummy fill" or "metal fill" to improve the uniformity of underlying metal density and thus improve the uniformity of oxide thickness within the die. Trade-offs that can be evaluated via statistical metrology include the improvements to uniformity possible versus the effect of increased capacitance due to additional metal.
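
    The decomposition-by-averaging idea can be sketched in a few lines on synthetic thickness data: averaging over dice isolates the within-die pattern component, averaging over sites isolates the die-to-die component, and what remains is random noise. All magnitudes below are invented.

```python
# ANOVA-style decomposition of ILD thickness variation into die-to-die,
# within-die systematic, and random components; data are synthetic.
import numpy as np

rng = np.random.default_rng(7)
n_die, n_site = 40, 25
wafer = rng.normal(0, 20, (n_die, 1))      # die-to-die offset (nm)
pattern = rng.normal(0, 35, (1, n_site))   # layout-dependent within-die map
noise = rng.normal(0, 10, (n_die, n_site))
t = 800 + wafer + pattern + noise          # measured thickness (nm)

grand = t.mean()
within_die = t.mean(axis=0) - grand        # average over dice -> pattern
die_to_die = t.mean(axis=1) - grand        # average over sites -> wafer level
resid = t - grand - within_die[None, :] - die_to_die[:, None]
print(f"within-die systematic sd: {within_die.std():5.1f} nm")
print(f"die-to-die sd:            {die_to_die.std():5.1f} nm")
print(f"random residual sd:       {resid.std():5.1f} nm")
```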

  5. Application of Design of Experiments and Surrogate Modeling within the NASA Advanced Concepts Office, Earth-to-Orbit Design Process

    NASA Technical Reports Server (NTRS)

    Zwack, Mathew R.; Dees, Patrick D.; Holt, James B.

    2016-01-01

    Decisions made during early conceptual design have a large impact upon the expected life-cycle cost (LCC) of a new program. It is widely accepted that up to 80% of such cost is committed during these early design phases [1]. Therefore, to help minimize LCC, decisions made during conceptual design must be based upon as much information as possible. To aid in the decision making for new launch vehicle programs, the Advanced Concepts Office (ACO) at NASA Marshall Space Flight Center (MSFC) provides rapid turnaround pre-phase A and phase A concept definition studies. The ACO team utilizes a proven set of tools to provide customers with a full vehicle mass breakdown to tertiary subsystems, preliminary structural sizing based upon worst-case flight loads, and trajectory optimization to quantify integrated vehicle performance for a given mission [2]. Although the team provides rapid turnaround for single vehicle concepts, the scope of the trade space can be limited due to analyst availability and the manpower requirements for manual execution of the analysis tools. In order to enable exploration of a broader design space, the ACO team has implemented an advanced design methods (ADM) based approach. This approach applies the concepts of design of experiments (DOE) and surrogate modeling to more exhaustively explore the trade space and provide the customer with additional design information to inform decision making. This paper will first discuss the automation of the ACO tool set, which represents a majority of the development effort. In order to fit a surrogate model within tolerable error bounds a number of DOE cases are needed. This number will scale with the number of variable parameters desired and the complexity of the system's response to those variables. For all but the smallest design spaces, the number of cases required cannot be produced within an acceptable timeframe using a manual process. Therefore, automation of the tools was a key enabler for the successful
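
    A toy version of the DOE-plus-surrogate approach is sketched below: a Latin hypercube design samples a hypothetical two-variable sizing analysis, a quadratic response surface is fit by least squares, and hold-out points check the fit. The ACO tool chain itself is not represented.

```python
# Latin hypercube DOE + quadratic surrogate over a stand-in analysis.
import numpy as np

rng = np.random.default_rng(3)

def analysis(X):            # hypothetical expensive tool: gross mass (t)
    thrust, isp = X[:, 0], X[:, 1]
    return 50 + 8 * thrust + 0.5 * isp + 2 * thrust**2 - 0.3 * thrust * isp

def lhs(n, d, rng):         # simple Latin hypercube in [0, 1)^d
    return (np.argsort(rng.random((n, d)), axis=0) + rng.random((n, d))) / n

def features(X):            # quadratic basis: 1, x, x^2, cross term
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1**2, x2**2, x1 * x2])

X = lhs(20, 2, rng)                          # 20 DOE cases, 2 variables
beta, *_ = np.linalg.lstsq(features(X), analysis(X), rcond=None)

Xt = rng.random((200, 2))                    # hold-out accuracy check
err = features(Xt) @ beta - analysis(Xt)
print(f"max surrogate error: {np.abs(err).max():.2e}")
```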

  6. Bioreactor process parameter screening utilizing a Plackett-Burman design for a model monoclonal antibody.

    PubMed

    Agarabi, Cyrus D; Schiel, John E; Lute, Scott C; Chavez, Brittany K; Boyne, Michael T; Brorson, Kurt A; Khan, Mansoora; Read, Erik K

    2015-06-01

    Consistent high-quality antibody yield is a key goal for cell culture bioprocessing. This endpoint is typically achieved in commercial settings through product and process engineering of bioreactor parameters during development. When the process is complex and not optimized, small changes in composition and control may yield a finished product of less desirable quality. Therefore, changes proposed to currently validated processes usually require justification and are reported to the US FDA for approval. Recently, design-of-experiments-based approaches have been explored to rapidly and efficiently achieve this goal of optimized yield with a better understanding of product and process variables that affect a product's critical quality attributes. Here, we present a laboratory-scale model culture where we apply a Plackett-Burman screening design to parallel cultures to study the main effects of 11 process variables. This exercise allowed us to determine the relative importance of these variables and identify the most important factors to be further optimized in order to control both desirable and undesirable glycan profiles. We found engineering changes relating to culture temperature and nonessential amino acid supplementation significantly impacted glycan profiles associated with fucosylation, β-galactosylation, and sialylation. All of these are important for monoclonal antibody product quality. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.
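
    The screening logic can be sketched with the standard 12-run Plackett-Burman design for 11 factors; the simulated response and its three active factors are invented, not the bioreactor data.

```python
# 12-run Plackett-Burman design (11 factors) and main-effect estimation.
import numpy as np

row = np.array([1, 1, -1, 1, 1, 1, -1, -1, -1, 1, -1])  # standard N=12 generator
D = np.array([np.roll(row, k) for k in range(11)] + [-np.ones(11, int)])

rng = np.random.default_rng(5)
true = np.zeros(11)
true[[0, 3, 7]] = [4.0, -3.0, 2.0]          # three active factors (invented)
y = 100 + D @ true + rng.normal(0, 1.0, 12) # e.g. a glycan-profile response

# main effect = (mean at +1) - (mean at -1); 6 runs at each level
effects = D.T @ y / 6
for i in np.argsort(-np.abs(effects))[:4]:
    print(f"factor {i:2d}: effect {effects[i]:+.2f}")
```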

  7. Design and tuning of standard additive model based fuzzy PID controllers for multivariable process systems.

    PubMed

    Harinath, Eranda; Mann, George K I

    2008-06-01

    This paper describes a design and two-level tuning method for fuzzy proportional-integral-derivative (FPID) controllers for a multivariable process, where the fuzzy inference uses a standard additive model. The proposed method can be used for any n x n multi-input-multi-output process and guarantees closed-loop stability. In the two-level tuning scheme, the tuning follows two steps: low-level tuning followed by high-level tuning. The low-level tuning adjusts apparent linear gains, whereas the high-level tuning changes the nonlinearity in the normalized fuzzy output. In this paper, two types of FPID configurations are considered, and their performances are evaluated using a real-time multizone temperature control problem with a 3 x 3 process system.

  10. Process modeling and supply chain design for advanced biofuel production based on bio-oil gasification

    NASA Astrophysics Data System (ADS)

    Li, Qi

    As a potential substitute for petroleum-based fuel, second generation biofuels are playing an increasingly important role due to their economic, environmental, and social benefits. With the rapid development of the biofuel industry, there has been an increasing literature on techno-economic analysis and supply chain design for biofuel production based on a variety of production pathways. A recently proposed production pathway for advanced biofuel is to convert biomass to bio-oil at widely distributed small-scale fast pyrolysis plants, then gasify the bio-oil to syngas and upgrade the syngas to transportation fuels in a centralized biorefinery. This thesis investigates two types of assessments of this bio-oil gasification pathway: techno-economic analysis based on process modeling and literature data, and supply chain design with a focus on optimal decisions for the number of facilities to build, facility capacities, and logistics, considering uncertainties. A detailed process model with corn stover as feedstock and liquid fuels as the final products is presented. Techno-economic analysis of the bio-oil gasification pathway is also discussed to assess economic feasibility. Preliminary results show a capital investment of $438 million and a minimum fuel selling price (MSP) of $5.6 per gallon of gasoline equivalent. The sensitivity analysis finds that MSP is most sensitive to internal rate of return (IRR), biomass feedstock cost, and fixed capital cost. A two-stage stochastic program is formulated to solve the supply chain design problem considering uncertainties in biomass availability, technology advancement, and biofuel price. The first stage makes the capital investment decisions, including the locations and capacities of the decentralized fast pyrolysis plants and the centralized biorefinery, while the second stage determines the biomass and biofuel flows. The numerical results and case study illustrate that considering uncertainties can be
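
    A toy two-stage stochastic program of this shape, with invented costs, prices, and three biomass-availability scenarios, can be written directly as a linear program (sketch below using scipy.optimize.linprog): stage 1 picks capacity before uncertainty resolves, and stage 2 picks per-scenario production.

```python
# Two-stage stochastic program via scenario enumeration; all numbers invented.
import numpy as np
from scipy.optimize import linprog

cap_cost = 3.0                         # $ per unit of capacity (stage 1)
price = np.array([8.0, 8.0, 8.0])      # fuel price per scenario
avail = np.array([40.0, 70.0, 100.0])  # biomass-limited output per scenario
prob = np.array([0.3, 0.4, 0.3])       # scenario probabilities

# variables z = [x, y1, y2, y3]; minimize cap_cost*x - sum(p_s * price_s * y_s)
c = np.concatenate([[cap_cost], -prob * price])
# production cannot exceed built capacity: y_s - x <= 0
A_ub = np.hstack([-np.ones((3, 1)), np.eye(3)])
b_ub = np.zeros(3)
bounds = [(0, None)] + [(0, a) for a in avail]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
x, y = res.x[0], res.x[1:]
print(f"build capacity {x:.1f}; scenario production {np.round(y, 1)}")
# here capacity 70 is optimal: the 30% chance of high availability does not
# justify the marginal capacity cost beyond the middle scenario
```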

  11. Lyophilization process design space.

    PubMed

    Patel, Sajal Manubhai; Pikal, Michael J

    2013-11-01

    The application of key elements of quality by design (QbD), such as risk assessment, process analytical technology, and design space, is discussed widely as it relates to freeze-drying process design and development. However, this commentary focuses on constructing the Design and Control Space, particularly for the primary drying step of the freeze-drying process. Also, practical applications and considerations of claiming a process Design Space under the QbD paradigm have been discussed. © 2013 Wiley Periodicals, Inc. and the American Pharmacists Association.

  12. Process Design of Wastewater Treatment for the NREL Cellulosic Ethanol Model

    SciTech Connect

    Steinwinder, T.; Gill, E.; Gerhardt, M.

    2011-09-01

    This report describes a preliminary process design for treating the wastewater from NREL's cellulosic ethanol production process to quality levels required for recycle. In this report Brown and Caldwell report on three main tasks: 1) characterization of the effluent from NREL's ammonia-conditioned hydrolyzate fermentation process; 2) development of the wastewater treatment process design; and 3) development of a capital and operational cost estimate for the treatment concept option. This wastewater treatment design was incorporated into NREL's cellulosic ethanol process design update published in May 2011 (NREL/TP-5100-47764).

  13. The Sulfur-Iodine Cycle: Process Analysis and Design Using Comprehensive Phase Equilibrium Measurements and Modeling

    SciTech Connect

    Thies, Mark C.; O'Connell, J. P.; Gorensek, Maximilian B.

    2010-01-10

    Of the 100+ thermochemical hydrogen cycles that have been proposed, the Sulfur-Iodine (S-I) Cycle is a primary target of international interest for the centralized production of hydrogen from nuclear power. However, the cycle involves complex and highly nonideal phase behavior at extreme conditions that is only beginning to be understood and modeled for process simulation. The consequence is that current designs and efficiency projections have large uncertainties, as they are based on incomplete data that must be extrapolated from property models. This situation prevents reliable assessment of the potential viability of the system and, even more, a basis for efficient process design. The goal of this NERI award (05-006) was to generate phase-equilibrium data, property models, and comprehensive process simulations so that an accurate evaluation of the S-I Cycle could be made. Our focus was on Section III of the Cycle, where the hydrogen is produced by decomposition of hydroiodic acid (HI) in the presence of water and iodine (I2) in a reactive distillation (RD) column. The results of this project were to be transferred to the nuclear hydrogen community in the form of reliable flowsheet models for the S-I process. Many of the project objectives were achieved. At Clemson University, a unique, tantalum-based, phase-equilibrium apparatus incorporating a view cell was designed and constructed for measuring fluid-phase equilibria for mixtures of iodine, HI, and water (known as HIx) at temperatures to 350 °C and pressures to 100 bar. Such measurements were of particular interest for developing a working understanding of the expected operation of the RD column in Section III. The view cell allowed for the IR observation and discernment of vapor-liquid (VL), liquid-liquid, and liquid-liquid-vapor (LLVE) equilibria for HIx systems. For the I2-H2O system, liquid-liquid equilibrium (LLE) was discovered to exist at temperatures up to 310-315 °C, in contrast to the models and

  14. Statistics-enhanced multistage process models for integrated design &manufacturing of poly (vinyl alcohol) treated buckypaper

    NASA Astrophysics Data System (ADS)

    Wang, Kan

    Carbon nanotube (CNT) is considered a promising engineering material because of its exceptional mechanical, electrical, and thermal properties. Buckypaper (BP), a thin sheet of assembled CNTs, is an effective way to handle CNTs at macro scale. Pristine BP is a fragile material which is held together by weak van der Waals attractions among CNTs. This dissertation introduces a modified filtration-based manufacturing process which uses poly (vinyl alcohol) (PVA) to treat BP. This treatment greatly improves the handleability of BP, reduces spoilage during transfer, and shortens the production time. The multistage manufacturing process of PVA-treated BP is discussed in this dissertation, and process models are developed to predict the nanostructure of final products from the process parameters. Based on the nanostructure, a finite element based physical model for prediction of Young's modulus is also developed. The accuracy of this physical model is further improved by statistical methods. The aim of this study is to investigate and improve the scalability of the manufacturing process of PVA-treated BP. To achieve this goal, various statistical tools are employed. The unique issues in nanomanufacturing also motivate the development of new statistical tools and modification of existing tools. Those issues include the uncertainties in nanostructure characterization due to the scale, the limited number of experimental data points due to the high cost of raw materials, large variation in the final product due to the random nature of the structure, and the high complexity of physical models due to the small scale of the structural building blocks. This dissertation addresses those issues by combining engineering field knowledge and statistical methods. The resulting statistics-enhanced physical model provides an approach to design the manufacturing process of PVA-treated BP for a target property and tailor the robustness of the final product by manipulating the process parameters. In addition

  15. Optimization of Spectralon through numerical modeling and improved processes and designs

    NASA Astrophysics Data System (ADS)

    Chang, Bob Y.; Huppe, Ronald M.; Chase, Christina; D'Amato, Dante P.

    2007-09-01

    The demand for progressively more powerful lasers has caused those employing side-pumped laser designs to become acutely aware of pumping efficiency and performance. Additionally, precision applications demand beam stability and uniformity for the lifetime of the laser flash lamp. The use of highly diffuse, high reflectance pump chamber reflectors such as Spectralon (R) has been shown to amplify overall power and performance. Spectralon is used in a wide range of side-pumped applications for its superior optical characteristics and design flexibility, but stated damage thresholds of approximately 4 J/cm2 have limited it to lower power applications. To increase energy tolerances, initial damage thresholds are first defined through mathematical simulation: a general form of the heat equation is studied numerically to develop a theoretical model of Spectralon's damage threshold, and the heat equation is discretized using the Euler method. Second, process modifications are performed to test for increased material durability and to physically reproduce the initially defined theoretical parameters.
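
    A minimal sketch of such a discretization is the explicit (FTCS) scheme for the 1-D heat equation, u_i^(n+1) = u_i^n + alpha*dt/dx^2 * (u_(i+1)^n - 2 u_i^n + u_(i-1)^n), stable for alpha*dt/dx^2 <= 1/2. The material values below are placeholders, not Spectralon property data.

```python
# Explicit Euler (FTCS) 1-D heat equation sketch; values are assumed.
import numpy as np

L, nx = 1e-3, 101                 # 1 mm slab, grid points
alpha = 1e-7                      # thermal diffusivity (m^2/s, assumed)
dx = L / (nx - 1)
dt = 0.4 * dx**2 / alpha          # satisfies stability: alpha*dt/dx^2 <= 0.5
u = np.full(nx, 300.0)            # initial temperature (K)
u[:5] = 800.0                     # surface heated by a pulse (toy condition)

r = alpha * dt / dx**2
for _ in range(2000):
    u[1:-1] += r * (u[2:] - 2 * u[1:-1] + u[:-2])
    u[0], u[-1] = u[1], 300.0     # insulated front face, fixed back face
print(f"peak temperature after {2000 * dt:.2f} s: {u.max():.1f} K")
```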

  16. Modern meta-heuristics based on nonlinear physics processes: A review of models and design procedures

    NASA Astrophysics Data System (ADS)

    Salcedo-Sanz, S.

    2016-10-01

    Meta-heuristic algorithms are problem-solving methods which try to find good-enough solutions to very hard optimization problems, at a reasonable computation time, where classical approaches fail or cannot even be applied. Many existing meta-heuristic approaches are nature-inspired techniques, which work by simulating or modeling different natural processes in a computer. Historically, many of the most successful meta-heuristic approaches have had a biological inspiration, such as evolutionary computation or swarm intelligence paradigms, but in the last few years new approaches based on nonlinear physics processes modeling have been proposed and applied with success. Nonlinear physics processes, modeled as optimization algorithms, are able to produce completely new search procedures, with extremely effective exploration capabilities in many cases, which are able to outperform existing optimization approaches. In this paper we review the most important optimization algorithms based on nonlinear physics, how they have been constructed from specific modeling of real phenomena, and their novelty in comparison with alternative existing algorithms for optimization. We first review important concepts on optimization problems, search spaces and problem difficulty. Then, the usefulness of heuristic and meta-heuristic approaches in facing hard optimization problems is introduced, and some of the main existing classical versions of these algorithms are reviewed. The mathematical framework of different nonlinear physics processes is then introduced as a preparatory step to reviewing in detail the most important meta-heuristics based on them. A discussion of the novelty of these approaches, their main computational implementation and design issues, and the evaluation of a novel meta-heuristic based on Strange Attractors mutation will be carried out to complete the review of these techniques. We also describe some of the most important application areas, in
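
    As a concrete reference point, the sketch below implements the archetypal physics-based meta-heuristic, simulated annealing with Metropolis acceptance and geometric cooling, on the Rastrigin test function; it is a generic textbook version, not any one algorithm surveyed in the review.

```python
# Simulated annealing on the 2-D Rastrigin function (global optimum 0 at origin).
import numpy as np

rng = np.random.default_rng(11)

def rastrigin(x):
    return 10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

x = rng.uniform(-5.12, 5.12, 2)
fx, T = rastrigin(x), 10.0
best, fbest = x.copy(), fx
for _ in range(20000):
    cand = np.clip(x + rng.normal(0, 0.5, 2), -5.12, 5.12)
    fc = rastrigin(cand)
    # Metropolis acceptance: always downhill, uphill with prob exp(-dE/T)
    if fc < fx or rng.random() < np.exp(-(fc - fx) / T):
        x, fx = cand, fc
        if fx < fbest:
            best, fbest = x.copy(), fx
    T *= 0.9995                    # geometric cooling schedule
print(f"best value {fbest:.3f} at {np.round(best, 3)}")
```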

  18. Robust Brain-Machine Interface Design Using Optimal Feedback Control Modeling and Adaptive Point Process Filtering.

    PubMed

    Shanechi, Maryam M; Orsborn, Amy L; Carmena, Jose M

    2016-04-01

    Much progress has been made in brain-machine interfaces (BMI) using decoders such as Kalman filters and finding their parameters with closed-loop decoder adaptation (CLDA). However, current decoders do not model the spikes directly, and hence may limit the processing time-scale of BMI control and adaptation. Moreover, while specialized CLDA techniques for intention estimation and assisted training exist, a unified and systematic CLDA framework that generalizes across different setups is lacking. Here we develop a novel closed-loop BMI training architecture that allows for processing, control, and adaptation using spike events, enables robust control and extends to various tasks. Moreover, we develop a unified control-theoretic CLDA framework within which intention estimation, assisted training, and adaptation are performed. The architecture incorporates an infinite-horizon optimal feedback-control (OFC) model of the brain's behavior in closed-loop BMI control, and a point process model of spikes. The OFC model infers the user's motor intention during CLDA, a process termed intention estimation. OFC is also used to design an autonomous and dynamic assisted training technique. The point process model allows for neural processing, control and decoder adaptation with every spike event and at a faster time-scale than current decoders; it also enables dynamic spike-event-based parameter adaptation unlike current CLDA methods that use batch-based adaptation on much slower adaptation time-scales. We conducted closed-loop experiments in a non-human primate over tens of days to dissociate the effects of these novel CLDA components. The OFC intention estimation improved BMI performance compared with current intention estimation techniques. OFC assisted training allowed the subject to consistently achieve proficient control. Spike-event-based adaptation resulted in faster and more consistent performance convergence compared with batch-based methods, and was robust to parameter
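
    The spike-by-spike update at the heart of such decoders can be sketched for the simplest case: a one-dimensional random-walk state observed through a single neuron with a log-linear rate, using the standard Gaussian-approximation point-process filter recursions. The OFC model and CLDA machinery of the paper are not represented, and all parameter values are assumed.

```python
# Point-process filter sketch: 1-D random-walk state, one Poisson neuron
# with rate lambda(x) = exp(a*x + b); all parameters invented.
import numpy as np

rng = np.random.default_rng(0)
dt, T = 0.005, 2000               # 5 ms bins, 10 s of data
a, b = 1.2, np.log(10.0)          # log-linear tuning (baseline 10 Hz)
q = 5e-4                          # random-walk state noise per bin

# simulate latent state and spike counts
x = np.zeros(T); y = np.zeros(T)
for k in range(1, T):
    x[k] = x[k - 1] + np.sqrt(q) * rng.standard_normal()
    y[k] = rng.poisson(np.exp(a * x[k] + b) * dt)

# filter: Gaussian approximation to the posterior at each spike bin
xh = np.zeros(T); P = np.zeros(T); P[0] = 1.0
for k in range(1, T):
    xp, Pp = xh[k - 1], P[k - 1] + q            # one-step prediction
    lam = np.exp(a * xp + b)                    # predicted rate
    P[k] = 1.0 / (1.0 / Pp + a**2 * lam * dt)   # posterior variance
    xh[k] = xp + P[k] * a * (y[k] - lam * dt)   # posterior mean update
print(f"decoding RMSE: {np.sqrt(np.mean((x - xh) ** 2)):.3f}")
```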

  19. A Process Model for Developing Learning Design Patterns with International Scope

    ERIC Educational Resources Information Center

    Lotz, Nicole; Law, Effie Lai-Chong; Nguyen-Ngoc, Anh Vu

    2014-01-01

    This paper investigates the process of identifying design patterns in international collaborative learning environments. In this context, design patterns are referred to as structured descriptions of best practice with pre-defined sections such as problem, solution and consequences. We pay special attention to how the scope of a design pattern is…

  1. A practical approach for exploration and modeling of the design space of a bacterial vaccine cultivation process.

    PubMed

    Streefland, M; Van Herpen, P F G; Van de Waterbeemd, B; Van der Pol, L A; Beuvery, E C; Tramper, J; Martens, D E; Toft, M

    2009-10-15

    A licensed pharmaceutical process is required to be executed within its validated ranges throughout the lifetime of product manufacturing. Changes to the process, especially for biological products, usually require the manufacturer to demonstrate through new or additional clinical testing that the safety and efficacy of the product remain unchanged. Recent changes in the regulations for pharmaceutical processing allow broader ranges of process settings to be submitted for regulatory approval, the so-called process design space; a manufacturer can then optimize the process within the submitted ranges after the product has entered the market, allowing more flexible processes. In this article, the applicability of the process design space concept is investigated for the cultivation step of a vaccine against whooping cough. An experimental design (DoE) is applied to investigate the ranges of critical process parameters that still result in a product meeting specifications. The on-line process data, including near-infrared spectroscopy, are used to build a descriptive model of the processes in the experimental design. Finally, the data from all processes are integrated into a multivariate batch monitoring model that represents the investigated process design space. This article demonstrates how the general principles of PAT and process design space can be applied to an undefined biological product such as a whole-cell vaccine. The modeling approach described here allows on-line monitoring and control of cultivation batches, to assure in real time that a process is running within the process design space.
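
    A minimal sketch of the multivariate batch-monitoring step follows: PCA is fit on batches run inside the explored space, and a new batch is flagged with Hotelling's T-squared statistic. The data are synthetic stand-ins for the on-line and NIR measurements.

```python
# PCA-based batch monitoring with a Hotelling T^2 control limit; synthetic data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
X = rng.normal(0, 1, (30, 8)) @ rng.normal(0, 1, (8, 8))  # 30 batches x 8 vars

mu, sd = X.mean(0), X.std(0)
Z = (X - mu) / sd
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
k = 2
P, lam = Vt[:k].T, s[:k] ** 2 / (len(X) - 1)   # loadings, score variances

def t2(batch):
    scores = ((batch - mu) / sd) @ P
    return float(np.sum(scores**2 / lam))

limit = stats.chi2.ppf(0.99, df=k)             # approximate 99% control limit
good = X[0]                                    # batch inside the design space
drifted = (Z[0] + 8 * P[:, 0]) * sd + mu       # batch pushed along the first PC
print(f"T2 good {t2(good):.1f}  drifted {t2(drifted):.1f}  limit {limit:.1f}")
```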

  2. State of the art in pathology business process analysis, modeling, design and optimization.

    PubMed

    Schrader, Thomas; Blobel, Bernd; García-Rojo, Marcial; Daniel, Christel; Słodkowska, Janina

    2012-01-01

    For analyzing current workflows and processes, for improving them, for quality management and quality assurance, for integrating hardware and software components, and also for education, training, and communication between experts from different domains, modeling business processes in a pathology department is indispensable. The authors highlight three main processes in pathology: general diagnostics, cytology diagnostics, and autopsy. In this chapter, those processes are formally modeled and described in detail. Finally, specialized processes such as immunohistochemistry and frozen section are also considered.

  4. Gaussian process based modeling and experimental design for sensor calibration in drifting environments.

    PubMed

    Geng, Zongyu; Yang, Feng; Chen, Xi; Wu, Nianqiang

    2015-09-01

    It remains a challenge to accurately calibrate a sensor subject to environmental drift. The calibration task for such a sensor is to quantify the relationship between the sensor's response and its exposure condition, which is specified by not only the analyte concentration but also the environmental factors such as temperature and humidity. This work developed a Gaussian Process (GP)-based procedure for the efficient calibration of sensors in drifting environments. Adopted as the calibration model, GP is not only able to capture the possibly nonlinear relationship between the sensor responses and the various exposure-condition factors, but also able to provide valid statistical inference for uncertainty quantification of the target estimates (e.g., the estimated analyte concentration of an unknown environment). Built on GP's inference ability, an experimental design method was developed to achieve efficient sampling of calibration data in a batch sequential manner. The resulting calibration procedure, which integrates the GP-based modeling and experimental design, was applied on a simulated chemiresistor sensor to demonstrate its effectiveness and its efficiency over the traditional method.
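
    The sketch below shows the two ingredients on a synthetic sensor: a GP fit of response versus (concentration, temperature) with scikit-learn, and a simple batch-sequential design step that adds the candidate exposure condition with the largest predictive standard deviation. The response function and all ranges are invented.

```python
# GP calibration plus uncertainty-driven sequential sampling; synthetic sensor.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(4)

def sensor(X):  # hypothetical drifting response: conc effect + temp drift
    c, t = X[:, 0], X[:, 1]
    return np.log1p(c) + 0.05 * (t - 25) + rng.normal(0, 0.02, len(X))

X = np.column_stack([rng.uniform(0, 10, 12), rng.uniform(15, 35, 12)])
y = sensor(X)

kernel = RBF(length_scale=[2.0, 5.0]) + WhiteKernel(1e-3)
for _ in range(3):                           # batch-sequential augmentation
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)
    cand = np.column_stack([rng.uniform(0, 10, 200), rng.uniform(15, 35, 200)])
    _, sd = gp.predict(cand, return_std=True)
    pick = cand[np.argmax(sd)][None, :]      # most uncertain exposure condition
    X, y = np.vstack([X, pick]), np.append(y, sensor(pick))

gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)
_, sd_final = gp.predict(cand, return_std=True)
print(f"max predictive sd after augmentation: {sd_final.max():.3f}")
```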

  5. Thermal system design and modeling of meniscus controlled silicon growth process for solar applications

    NASA Astrophysics Data System (ADS)

    Wang, Chenlei

    The direct conversion of solar radiation to electricity by photovoltaics has a number of significant advantages as an electricity generator: solar photovoltaic conversion systems tap an inexhaustible resource that is free of charge and available anywhere in the world. Rooftop photovoltaic generation, for example, saves excess thermal heat and preserves the local heat balance, so a considerable reduction of thermal pollution in densely populated city areas can be attained. A semiconductor can convert only photons with energies near its band gap with good efficiency; silicon is not at the maximum efficiency but is relatively close to it. The principal photovoltaic materials include single- and poly-crystalline silicon, ribbon silicon, crystalline thin-film silicon, amorphous silicon, copper indium diselenide and related compounds, and cadmium telluride. In this dissertation, we focus on melt growth of single- and poly-crystalline silicon manufactured by the Czochralski (Cz) crystal growth process, and on ribbon silicon produced by the edge-defined film-fed growth (EFG) process. These two methods are the most commonly used techniques for growing photovoltaic semiconductors. For each crystal growth process, we introduce the growth mechanism, growth system design, general applications, and progress in numerical simulation. Simulation results are shown for both Czochralski and EFG systems, including the temperature distribution of the growth system, the velocity field inside the silicon melt, and the electromagnetic field in the EFG growth system. A magnetic field is applied to the Cz system to reduce melt convection inside the crucible, and this has been simulated in our numerical model. Parametric studies are performed through numerical and analytical models to investigate the relationship between heater power levels and the movement and shape of the solidification interface. An inverse-problem control scheme is developed to

  6. Extracting insights from electronic health records: case studies, a visual analytics process model, and design recommendations.

    PubMed

    Wang, Taowei David; Wongsuphasawat, Krist; Plaisant, Catherine; Shneiderman, Ben

    2011-10-01

    Current electronic health record (EHR) systems facilitate the storage, retrieval, persistence, and sharing of patient data. However, the way physicians interact with EHRs has not changed much. More specifically, support for temporal analysis of a large number of EHRs has been lacking. A number of information visualization techniques have been proposed to alleviate this problem. Unfortunately, due to their limited application to a single case study, the results are often difficult to generalize across medical scenarios. We present the usage data of Lifelines2 (Wang et al. 2008), our information visualization system, and user comments, both collected over eight different medical case studies. We generalize our experience into a visual analytics process model for multiple EHRs. Based on our analysis, we make seven design recommendations to information visualization tools to explore EHR systems.

  7. The PhOCoe Model--ergonomic pattern mapping in participatory design processes.

    PubMed

    Silva e Santos, Marcello

    2012-01-01

    The discipline and practice of human factors and ergonomics is quite rich in analysis, development, and evaluation tools and methods for its various processes. However, we lack effective instruments to map or regulate comprehensively the cognitive, organizational, and especially environmental impacts involved. Moreover, when ergonomic transformation through design (such as a new workstation or even an entire new facility) is at play, ergonomics professionals tend to remain on the sidelines, relying solely on design professionals and engineers. There is vast empirical evidence that the participation of ergonomists as project facilitators may contribute to an effective professional synergy among the various stakeholders in a multidisciplinary setting. When that happens, everyone wins, users and designers alike, because conflicts raised in the midst of option selection are dissipated in exchange for more convergent design alternatives. This paper presents a method for participatory design in which users are encouraged to participate actively in the whole design process by sharing their real work activities with the design team. The negotiated results inferred from the ergonomic action and translated into a new design are then compiled into an "Ergonomic Pattern Manual". This handbook of ergonomics-oriented design guidelines contains essential guidance to be consulted in recurrent design project situations in which similar patterns might be used. The main drive is simple: nobody knows better than the workers themselves what an adequate workplace design solution (equipment, workstation, office layout) should be.

  8. Standardization Process for Space Radiation Models Used for Space System Design

    NASA Technical Reports Server (NTRS)

    Barth, Janet; Daly, Eamonn; Brautigam, Donald

    2005-01-01

    The space system design community has three concerns related to models of the radiation belts and plasma: 1) AP-8 and AE-8 models are not adequate for modern applications; 2) Data that have become available since the creation of AP-8 and AE-8 are not being fully exploited for modeling purposes; 3) When new models are produced, there is no authorizing organization identified to evaluate the models or their datasets for accuracy and robustness. This viewgraph presentation provided an overview of the roadmap adopted by the Working Group Meeting on New Standard Radiation Belt and Space Plasma Models.

  9. Automating the automobile design process

    SciTech Connect

    Smith, M.R.

    1986-03-01

    Traditional CAD/CAM speeds product design, analysis, and manufacturing by giving engineers and designers the ability to view and manipulate computer models of automobiles from a variety of perspectives, such as interiors, exteriors, and cross sections. Computer-aided styling (CAS) hastens the automobile design process in the same manner by allowing data to be captured earlier in the design cycle. The goal of CAS is to be able to determine in advance if a design can be aesthetically pleasing - without having to build even the first prototype. Just like CAD/CAM, styling is an iterative process, with CAS techniques speeding the design. Faster iterations mean that more designs can be reviewed and that designers can react more quickly to changing market trends.

  10. Modeling hospital surgical delivery process design using system simulation: optimizing patient flow and bed capacity as an illustration.

    PubMed

    Kumar, Sameer

    2011-01-01

    It is increasingly recognized that hospital operation is an intricate system with limited resources and many interacting sources of both positive and negative feedback. The purpose of this study is to design a surgical delivery process for a county hospital in the U.S. in which patient flow through the surgical ward is optimized. System simulation modeling is used to address questions of capacity planning, throughput management, and interacting resources, which together constitute the constantly changing complexity that characterizes designing a contemporary surgical delivery process in a hospital. The steps in building a system simulation model are demonstrated using the example of a county hospital in a small U.S. city, illustrating modular simulation modeling of patient surgery process flows. The resulting model shows planners and designers how to build overall efficiencies into a healthcare facility through optimal bed capacity for peak flows of emergency and routine patients.
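
    A minimal discrete-event sketch of this kind of model, with invented arrival rates, stays, and bed count, can be written with SimPy: emergency and routine patients compete for a fixed pool of ward beds, and waiting times fall out of the run.

```python
# Discrete-event patient-flow sketch with SimPy; all numbers are invented.
import numpy as np
import simpy

rng = np.random.default_rng(9)
waits = {"emergency": [], "routine": []}

def patient(env, beds, kind, los_mean):
    arrived = env.now
    with beds.request() as req:
        yield req                                      # queue for a free bed
        waits[kind].append(env.now - arrived)
        yield env.timeout(rng.exponential(los_mean))   # length of stay (h)

def arrivals(env, beds, kind, rate_per_h, los_mean):
    while True:
        yield env.timeout(rng.exponential(1.0 / rate_per_h))
        env.process(patient(env, beds, kind, los_mean))

env = simpy.Environment()
beds = simpy.Resource(env, capacity=30)   # trial bed capacity
env.process(arrivals(env, beds, "emergency", 0.3, 30.0))
env.process(arrivals(env, beds, "routine", 0.4, 40.0))
env.run(until=24 * 90)                    # simulate ~90 days

for kind, w in waits.items():
    print(f"{kind}: n={len(w)}, mean wait {np.mean(w):.1f} h")
```

    Re-running the sketch with different capacities gives the kind of bed-count versus waiting-time trade-off the study describes.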

  11. A mathematical examination of the press model for atmospheric turbulence. [aircraft design/random processes

    NASA Technical Reports Server (NTRS)

    Sidwell, K.

    1975-01-01

    The random process used to model atmospheric turbulence in aircraft response problems is examined. The first, second, and higher order probability density and characteristic functions were developed. The concepts of the Press model lead to an approximate procedure for the analysis of the response of linear dynamic systems to a class of non-Gaussian random processes. The Press model accounts for both the Gaussian and non-Gaussian forms of measured turbulence data. The nonstationary aspects of measured data are explicitly described by the transition properties of the random process. The effects of the distribution of the intensity process upon calculated exceedances are examined. It is concluded that the Press model with a Gaussian intensity distribution gives a conservative prediction of limit load values.
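
    The Press model's key construction, a Gaussian process modulated by a slowly varying random intensity, can be illustrated numerically. The sketch below (all parameters assumed, not taken from the report) compares the exceedance curve of a fixed-intensity Gaussian process, via Rice's formula, against the intensity-averaged non-Gaussian curve:

        import numpy as np

        rng = np.random.default_rng(0)

        # Gust model: w = a * g, with g a unit-variance Gaussian process and a
        # a slowly varying random intensity (half-Gaussian here, scale sigma_a).
        sigma_a = 1.0
        a = np.maximum(np.abs(rng.normal(0.0, sigma_a, 100_000)), 1e-9)

        # Rice's formula: for a fixed intensity a, the expected rate of upward
        # crossings of level y is proportional to exp(-y^2 / (2 a^2)).
        y = np.linspace(0.0, 6.0, 7)
        press = np.exp(-y[:, None] ** 2 / (2 * a[None, :] ** 2)).mean(axis=1)
        gauss = np.exp(-y ** 2 / (2 * sigma_a ** 2))   # single-intensity reference

        for yi, g, p in zip(y, gauss, press):
            print(f"y = {yi:.0f}:  Gaussian {g:.2e}   intensity-averaged {p:.2e}")

    At large load levels the intensity-averaged curve dominates the single-intensity Gaussian one, which is the heavier-tailed, conservative behavior the abstract refers to.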

  12. Optimized aerodynamic design process for subsonic transport wing fitted with winglets. [wind tunnel model

    NASA Technical Reports Server (NTRS)

    Kuhlman, J. M.

    1979-01-01

    The aerodynamic design of a wind-tunnel model of a wing representative of that of a subsonic jet transport aircraft, fitted with winglets, was performed using two recently developed optimal wing-design computer programs. Both potential flow codes use a vortex lattice representation of the near-field of the aerodynamic surfaces for determination of the required mean camber surfaces for minimum induced drag, and both codes use far-field induced drag minimization procedures to obtain the required spanloads. One code uses a discrete vortex wake model for this far-field drag computation, while the second uses a 2-D advanced panel wake model. Wing camber shapes for the two codes are very similar, but the resulting winglet camber shapes differ widely. Design techniques and considerations for these two wind-tunnel models are detailed, including a description of the necessary modifications of the design geometry to format it for use by a numerically controlled machine for the actual model construction.

  13. Using a logical information model-driven design process in healthcare.

    PubMed

    Cheong, Yu Chye; Bird, Linda; Tun, Nwe Ni; Brooks, Colleen

    2011-01-01

    A hybrid standards-based approach has been adopted in Singapore to develop a Logical Information Model (LIM) for healthcare information exchange. The Singapore LIM uses a combination of international standards, including ISO13606-1 (a reference model for electronic health record communication), ISO21090 (healthcare datatypes), SNOMED CT (healthcare terminology) and HL7 v2 (healthcare messaging). This logic-based design approach also incorporates mechanisms for achieving bi-directional semantic interoperability.

  14. Rapid Processing of Turner Designs Model 10-Au-005 Internally Logged Fluorescence Data

    EPA Science Inventory

    Continuous recording of dye fluorescence using field fluorometers at selected sampling sites facilitates acquisition of real-time dye tracing data. The Turner Designs Model 10-AU-005 field fluorometer allows for frequent fluorescence readings, data logging, and easy downloading t...

  16. Self-optimisation and model-based design of experiments for developing a C–H activation flow process

    PubMed Central

    Echtermeyer, Alexander; Amar, Yehia; Zakrzewski, Jacek

    2017-01-01

    A recently described C(sp3)–H activation reaction to synthesise aziridines was used as a model reaction to demonstrate the methodology of developing a process model using model-based design of experiments (MBDoE) and self-optimisation approaches in flow. The two approaches are compared in terms of experimental efficiency. The self-optimisation approach required the least number of experiments to reach the specified objectives of cost and product yield, whereas the MBDoE approach enabled a rapid generation of a process model. PMID:28228856

  17. Self-optimisation and model-based design of experiments for developing a C-H activation flow process.

    PubMed

    Echtermeyer, Alexander; Amar, Yehia; Zakrzewski, Jacek; Lapkin, Alexei

    2017-01-01

    A recently described C(sp3)-H activation reaction to synthesise aziridines was used as a model reaction to demonstrate the methodology of developing a process model using model-based design of experiments (MBDoE) and self-optimisation approaches in flow. The two approaches are compared in terms of experimental efficiency. The self-optimisation approach required the least number of experiments to reach the specified objectives of cost and product yield, whereas the MBDoE approach enabled a rapid generation of a process model.

  18. A design process for using normative models in shared decision making: a case study in the context of prenatal testing.

    PubMed

    Rapaport, Sivan; Leshno, Moshe; Fink, Lior

    2014-12-01

    Shared decision making (SDM) encourages the patient to play a more active role in the process of medical consultation and its primary objective is to find the best treatment for a specific patient. Recent findings, however, show that patient preferences cannot be easily or accurately judged on the basis of communicative exchange during routine office visits, even for patients who seek to expand their role in medical decision making (MDM). The objective of this study is to improve the quality of patient-physician communication by developing a novel design process for SDM and then demonstrating, through a case study, the applicability of this process in enabling the use of a normative model for a specific medical situation. Our design process goes through the following stages: definition of medical situation and decision problem, development/identification of normative model, adaptation of normative model, empirical analysis and development of decision support systems (DSS) tools that facilitate the SDM process in the specific medical situation. This study demonstrates the applicability of the process through the implementation of the general normative theory of MDM under uncertainty for the medical-financial dilemma of choosing a physician to perform amniocentesis. The use of normative models in SDM raises several issues, such as the goal of the normative model, the relation between the goals of prediction and recommendation, and the general question of whether it is valid to use a normative model for people who do not behave according to the model's assumptions. © 2012 John Wiley & Sons Ltd.

  19. A process: development of a model multiculturalism curriculum designed for mobility across geographic borders.

    PubMed

    Frels, L; Scott, J; Schramm, M A

    1997-08-01

    The Council on Accreditation Project, Nurse Anesthesia Educational Requirements and Mobility Between North American Free Trade Agreement (NAFTA) Countries, has as one of its outcomes the development of a model curriculum that would minimize educational barriers for mobility of nurse anesthetists across NAFTA geographical borders with a focus on the blending of professional and technical expertise with issues of human diversity and/or cultural differences. The overall long-term outcome of the project is to test a process. The manuscript discusses the process used in year III of the project to integrate cultural concepts into a nurse anesthesia model curriculum.

  20. Strategy for design NIR calibration sets based on process spectrum and model space: An innovative approach for process analytical technology.

    PubMed

    Cárdenas, V; Cordobés, M; Blanco, M; Alcalà, M

    2015-10-10

    The pharmaceutical industry is under stringent regulations on quality control of its products, because quality is critical for both the production process and consumer safety. Within the framework of "process analytical technology" (PAT), a complete understanding of the process and stepwise monitoring of manufacturing are required. Near-infrared spectroscopy (NIRS) combined with chemometrics has lately proven efficient, useful and robust for pharmaceutical analysis. One crucial step in developing effective NIRS-based methodologies is selecting an appropriate calibration set to construct models affording accurate predictions. In this work, we developed calibration models for a pharmaceutical formulation during its three manufacturing stages: blending, compaction and coating. A novel methodology is proposed for selecting the calibration set, the "process spectrum", into which physical changes in the samples at each stage are algebraically incorporated. Also, we established a "model space" defined by Hotelling's T² and Q-residuals statistics for outlier identification (inside/outside the defined space) in order to select objectively the factors to be used in calibration set construction. The results obtained confirm the efficacy of the proposed methodology for stepwise pharmaceutical quality control, and the relevance of the study as a guideline for the implementation of this easy and fast methodology in the pharma industry. Copyright © 2015 Elsevier B.V. All rights reserved.
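
    The "model space" idea can be sketched with ordinary PCA-based statistics: Hotelling's T² measures a sample's distance within the model subspace, and Q (squared prediction error) measures its distance from it. The code below is a generic illustration on synthetic data, not the authors' procedure; the number of components and the 95% limits are assumed:

        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(0)
        X = rng.normal(size=(120, 200))        # stand-in for NIR spectra (samples x wavelengths)

        k = 5                                  # number of principal components (assumed)
        pca = PCA(n_components=k).fit(X)
        scores = pca.transform(X)

        # Hotelling's T2: squared Mahalanobis distance of each sample in score space.
        t2 = ((scores ** 2) / pca.explained_variance_).sum(axis=1)

        # Q residuals (SPE): squared reconstruction error outside the model subspace.
        resid = X - pca.inverse_transform(scores)
        q = (resid ** 2).sum(axis=1)

        # Simple empirical 95% limits; samples beyond either limit fall outside
        # the "model space" and are flagged as calibration-set outliers.
        t2_lim, q_lim = np.percentile(t2, 95), np.percentile(q, 95)
        outliers = (t2 > t2_lim) | (q > q_lim)
        print(outliers.sum(), "candidate outliers")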

  1. Quality-by-Design: Multivariate Model for Multicomponent Quantification in Refining Process of Honey

    PubMed Central

    Li, Xiaoying; Wu, Zhisheng; Feng, Xin; Liu, Shanshan; Yu, Xiaojie; Ma, Qun; Qiao, Yanjiang

    2017-01-01

    Objective: A method for rapid analysis of the refining process of honey was developed based on near-infrared (NIR) spectroscopy. Methods: Partial least squares calibration models were built for the four components after selection of the optimal spectral pretreatment method and number of latent factors. Results: The models covered samples of different temperatures and time points; therefore, the models were robust and universal. Conclusions: These results highlighted that NIR technology could extract information about the critical process and provide essential process knowledge of the honey refining process. Abbreviations used: NIR: near-infrared; 5-HMF: 5-hydroxymethylfurfural; RMSEP: root mean square error of prediction; R: correlation coefficient; PRESS: predicted residual error sum of squares; TCM: traditional Chinese medicine; HPLC: high-performance liquid chromatography; HPLC-DAD: HPLC-diode array detector; PLS: partial least squares; MSC: multiplicative scatter correction; RMSECV: root mean square error of cross-validation; RPD: residual predictive deviation; 1D: 1st-order derivative; SG: Savitzky-Golay smoothing; 2D: 2nd-order derivative. PMID:28216906
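
    As an illustration of the PLS calibration step described here, the following sketch selects the number of latent factors by cross-validated RMSE (RMSECV). The spectra and analyte values are synthetic; the real work used measured NIR spectra of honey:

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_predict

        rng = np.random.default_rng(1)
        X = rng.normal(size=(80, 150))                    # synthetic "spectra"
        y = 0.5 * X[:, 10] + 0.2 * X[:, 40] + rng.normal(scale=0.05, size=80)

        # Pick the number of latent factors that minimizes RMSECV.
        for n in range(1, 8):
            pred = cross_val_predict(PLSRegression(n_components=n), X, y, cv=5).ravel()
            rmsecv = np.sqrt(np.mean((pred - y) ** 2))
            print(f"{n} latent factors: RMSECV = {rmsecv:.4f}")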

  2. Incorporating Eco-Evolutionary Processes into Population Models:Design and Applications

    EPA Science Inventory

    Eco-evolutionary population models are powerful new tools for exploring how evolutionary processes influence plant and animal population dynamics and vice-versa. The need to manage for climate change and other dynamic disturbance regimes is creating a demand for the incorporation of...

  4. Design of RTDA controller for industrial process using SOPDT model with minimum or non-minimum zero.

    PubMed

    Anbarasan, K; Srinivasan, K

    2015-07-01

    This research paper focuses on the design and development of simplified RTDA control law computation formulae for SOPDT processes with a minimum- or non-minimum-phase zero. The RTDA control scheme consists of three main components, namely process output prediction, model prediction update and control action computation. A systematic approach for computing these three components for an SOPDT process with a minimum- or non-minimum-phase zero is developed in this paper. The design, implementation and performance evaluation of the developed controller are demonstrated via simulation examples. The closed-loop equation, block diagram representation and theoretical stability derivation for the RTDA controller are developed. The performance of the proposed controller is compared with IMC, SPC, MPC and PID controllers, and it is demonstrated on an industrial nonlinear CSTR process. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
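
    The RTDA law itself is not reproduced in the abstract; what can be sketched safely is the process class it targets. Below, an assumed SOPDT transfer function K(βs+1)e^(-θs)/((τ₁s+1)(τ₂s+1)) is stepped for a minimum-phase (β>0) and a non-minimum-phase (β<0) zero; the latter shows the inverse initial response that complicates controller design. All parameter values are illustrative:

        import numpy as np
        from scipy import signal

        K, tau1, tau2, theta = 2.0, 5.0, 3.0, 1.5     # assumed SOPDT parameters
        t = np.linspace(0.0, 40.0, 801)
        den = np.polymul([tau1, 1.0], [tau2, 1.0])    # (tau1*s + 1)(tau2*s + 1)

        for beta, label in [(2.0, "minimum-phase zero"), (-2.0, "non-minimum-phase zero")]:
            _, y = signal.step(signal.TransferFunction([K * beta, K], den), T=t)
            y = np.interp(t - theta, t, y, left=0.0)  # apply the dead time theta
            i0 = np.searchsorted(t, theta)
            print(label, "-> response just after the delay:", np.round(y[i0:i0 + 4], 3))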

  5. Model Guided Design and Development Process for an Electronic Health Record Training Program

    PubMed Central

    He, Ze; Marquard, Jenna; Henneman, Elizabeth

    2016-01-01

    Effective user training is important to ensure electronic health record (EHR) implementation success. Though many previous studies report best practice principles and success and failure stories, current EHR training is largely empirically-based and often lacks theoretical guidance. In addition, the process of training development is underemphasized and underreported. A white paper by the American Medical Informatics Association called for models of user training for clinical information system implementation; existing instructional development models from learning theory provide a basis to meet this call. We describe in this paper our experiences and lessons learned as we adapted several instructional development models to guide our development of EHR user training. Specifically, we focus on two key aspects of this training development: training content and training process. PMID:28269940

  7. Fracture design modelling

    SciTech Connect

    Crichlow, H.B.

    1980-02-07

    A design tool is discussed whereby the various components that enter the design process of a hydraulic fracturing job are combined to provide a realistic appraisal of a stimulation job in the field. An interactive computer model is used to solve the problem numerically to obtain the effects of various parameters on the overall behavior of the system.

  8. Testing the Theoretical Design of a Health Risk Message: Reexamining the Major Tenets of the Extended Parallel Process Model

    ERIC Educational Resources Information Center

    Gore, Thomas D.; Bracken, Cheryl Campanella

    2005-01-01

    This study examined the fear control/danger control responses that are predicted by the Extended Parallel Process Model (EPPM). In a campaign designed to inform college students about the symptoms and dangers of meningitis, participants were given either a high-threat/no-efficacy or high-efficacy/no-threat health risk message, thus testing the…

  10. Paediatric CT protocol optimisation: a design of experiments to support the modelling and optimisation process.

    PubMed

    Rani, K; Jahnen, A; Noel, A; Wolf, D

    2015-07-01

    In the last decade, several studies have emphasised the need to understand and optimise computed tomography (CT) procedures in order to reduce the radiation dose applied to paediatric patients. To evaluate the influence of the technical parameters on radiation dose and image quality, a statistical model has been developed using the design of experiments (DOE) method, which has been used successfully in various fields (industry, biology and finance), applied here to CT procedures for the abdomen of paediatric patients. A Box-Behnken DOE was used in this study. Three mathematical models (contrast-to-noise ratio, noise and CTDIvol) depending on three factors (tube current, tube voltage and level of iterative reconstruction) were developed and validated. They will serve as a basis for the development of a CT protocol optimisation model.
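
    A Box-Behnken design for three factors consists of the twelve midpoints of the edges of the factor cube plus replicated centre points. A small generator in coded units (a generic construction, not the authors' software) might look like this:

        import itertools
        import numpy as np

        def box_behnken(k, center_points=3):
            """Box-Behnken design in coded units for k factors: every pair of
            factors takes the 2x2 factorial levels (+/-1) while the rest are 0."""
            runs = []
            for i, j in itertools.combinations(range(k), 2):
                for a, b in itertools.product((-1, 1), repeat=2):
                    row = [0] * k
                    row[i], row[j] = a, b
                    runs.append(row)
            runs += [[0] * k] * center_points
            return np.array(runs)

        # Three CT factors, e.g. tube current, tube voltage, iterative-reconstruction level.
        design = box_behnken(3)
        print(design.shape)   # (15, 3): 12 edge runs + 3 centre runs
        print(design)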

  11. Enhancing the Design Process for Complex Space Systems through Early Integration of Risk and Variable-Fidelity Modeling

    NASA Technical Reports Server (NTRS)

    Mavris, Dimitri; Osburg, Jan

    2005-01-01

    An important enabler of the new national Vision for Space Exploration is the ability to rapidly and efficiently develop optimized concepts for the manifold future space missions that this effort calls for. The design of such complex systems requires a tight integration of all the engineering disciplines involved, in an environment that fosters interaction and collaboration. The research performed under this grant explored areas where the space systems design process can be enhanced: by integrating risk models into the early stages of the design process, and by including rapid-turnaround variable-fidelity tools for key disciplines. Enabling early assessment of mission risk will allow designers to perform trades between risk and design performance during the initial design space exploration. Entry into planetary atmospheres will require an increased emphasis of the critical disciplines of aero- and thermodynamics. This necessitates the pulling forward of EDL disciplinary expertise into the early stage of the design process. Radiation can have a large potential impact on overall mission designs, in particular for the planned nuclear-powered robotic missions under Project Prometheus and for long-duration manned missions to the Moon, Mars and beyond under Project Constellation. This requires that radiation and associated risk and hazards be assessed and mitigated at the earliest stages of the design process. Hence, RPS is another discipline needed to enhance the engineering competencies of conceptual design teams. Researchers collaborated closely with NASA experts in those disciplines, and in overall space systems design, at Langley Research Center and at the Jet Propulsion Laboratory. This report documents the results of this initial effort.

  12. Signal Processing Suite Design

    NASA Technical Reports Server (NTRS)

    Sahr, John D.; Mir, Hasan; Morabito, Andrew; Grossman, Matthew

    2003-01-01

    Our role in this project was to participate in the design of the signal processing suite to analyze plasma density measurements on board a small constellation (3 or 4) of satellites in Low Earth Orbit. As we are new to spacecraft experiments, one of the challenges was simply to gain an understanding of the quantity of data which would flow from the satellites, and possibly to interact with the design teams in generating optimal sampling patterns. For example, as the fleet of satellites was intended to fly through the same volume of space (displaced slightly in time and space), the bulk plasma structure should be common among the spacecraft. Therefore, an optimal, limited-bandwidth data downlink would take advantage of this commonality. Also, motivated by techniques in ionospheric radar, we hoped to investigate the possibility of employing aperiodic sampling in order to gain access to a wider spatial spectrum without suffering aliasing in k-space.
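
    The aperiodic-sampling idea mentioned at the end can be demonstrated with a Lomb-Scargle periodogram, which handles irregular sample times directly. In this sketch (hypothetical tone and rates), a 40 Hz signal is recovered from irregular samples taken at an average rate of only 25 Hz, well below the uniform-sampling Nyquist requirement:

        import numpy as np
        from scipy.signal import lombscargle

        rng = np.random.default_rng(0)

        # A 40 Hz tone, sampled aperiodically at an *average* rate of 25 Hz.
        # Uniform 25 Hz sampling would alias this tone; jittered sampling does not.
        f_true = 40.0
        t = np.sort(rng.uniform(0.0, 10.0, 250))          # ~25 samples/s, irregular
        x = np.sin(2 * np.pi * f_true * t)

        freqs = np.linspace(1.0, 60.0, 2000)              # search grid in Hz
        pgram = lombscargle(t, x, 2 * np.pi * freqs)      # takes angular frequencies
        print("periodogram peak at %.1f Hz" % freqs[np.argmax(pgram)])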

  14. Using the Knowledge, Process, Practice (KPP) model for driving the design and development of online postgraduate medical education.

    PubMed

    Shaw, Tim; Barnet, Stewart; Mcgregor, Deborah; Avery, Jennifer

    2015-01-01

    Online learning is a primary delivery method for continuing health education programs. It is critical that programs have curriculum objectives linked to educational models that support learning. Using a proven educational modelling process ensures that curriculum objectives are met and that a solid basis for learning and assessment is achieved. The aim was to develop an educational design model that produces an educationally sound program development plan for use by anyone involved in online course development. We describe the development of a generic educational model designed for continuing health education programs. The Knowledge, Process, Practice (KPP) model is founded on recognised educational theory and online education practice. This paper presents a step-by-step guide to using the model for program development that supports reliable learning and evaluation. The model follows the three-step KPP approach, based on learning outcomes and supported by appropriate assessment activities. It provides a program structure for online or blended learning that is explicit, educationally defensible, and supports multiple assessment points for health professionals. The KPP model is based on best-practice educational design and uses a structure that can be adapted for a variety of online or flexibly delivered postgraduate medical education programs.

  15. Designer's unified cost model

    NASA Technical Reports Server (NTRS)

    Freeman, William T.; Ilcewicz, L. B.; Swanson, G. D.; Gutowski, T.

    1992-01-01

    A conceptual and preliminary designers' cost prediction model has been initiated. The model will provide a technically sound method for evaluating the relative cost of different composite structural designs, fabrication processes, and assembly methods that can be compared to equivalent metallic parts or assemblies. The feasibility of developing cost prediction software in a modular form for interfacing with state of the art preliminary design tools and computer aided design programs is being evaluated. The goal of this task is to establish theoretical cost functions that relate geometric design features to summed material cost and labor content in terms of process mechanics and physics. The output of the designers' present analytical tools will be input for the designers' cost prediction model to provide the designer with a database and deterministic cost methodology that allows one to trade and synthesize designs with both cost and weight as objective functions for optimization. The approach, goals, plans, and progress are presented for development of COSTADE (Cost Optimization Software for Transport Aircraft Design Evaluation).

  16. Designers' unified cost model

    NASA Technical Reports Server (NTRS)

    Freeman, W.; Ilcewicz, L.; Swanson, G.; Gutowski, T.

    1992-01-01

    The Structures Technology Program Office (STPO) at NASA LaRC has initiated development of a conceptual and preliminary designers' cost prediction model. The model will provide a technically sound method for evaluating the relative cost of different composite structural designs, fabrication processes, and assembly methods that can be compared to equivalent metallic parts or assemblies. The feasibility of developing cost prediction software in a modular form for interfacing with state-of-the-art preliminary design tools and computer aided design programs is being evaluated. The goal of this task is to establish theoretical cost functions that relate geometric design features to summed material cost and labor content in terms of process mechanics and physics. The output of the designers' present analytical tools will be input for the designers' cost prediction model to provide the designer with a database and deterministic cost methodology that allows one to trade and synthesize designs with both cost and weight as objective functions for optimization. This paper presents the team members, approach, goals, plans, and progress to date for development of COSTADE (Cost Optimization Software for Transport Aircraft Design Evaluation).

  17. Prior Design for Dependent Dirichlet Processes: An Application to Marathon Modeling

    PubMed Central

    Pradier, Melanie F.; Ruiz, Francisco J. R.; Perez-Cruz, Fernando

    2016-01-01

    This paper presents a novel application of Bayesian nonparametrics (BNP) for marathon data modeling. We make use of two well-known BNP priors, the single-p dependent Dirichlet process and the hierarchical Dirichlet process, in order to address two different problems. First, we study the impact of age, gender and environment on the runners’ performance. We derive a fair grading method that allows direct comparison of runners regardless of their age and gender. Unlike current grading systems, our approach is based not only on top world records, but on the performances of all runners. The presented methodology for comparison of densities can be adopted in many other applications straightforwardly, providing an interesting perspective to build dependent Dirichlet processes. Second, we analyze the running patterns of the marathoners in time, obtaining information that can be valuable for training purposes. We also show that these running patterns can be used to predict finishing time given intermediate interval measurements. We apply our models to New York City, Boston and London marathons. PMID:26821155
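
    The paper's single-p dependent and hierarchical Dirichlet processes both build on the basic stick-breaking construction of a DP draw; a truncated version of that building block is sketched here, with an invented Gaussian base measure standing in for finishing times:

        import numpy as np

        rng = np.random.default_rng(0)

        def dp_sample(alpha, base_sampler, n_atoms=500):
            """Truncated stick-breaking draw from DP(alpha, G0): returns the atoms
            and weights of one random discrete distribution."""
            betas = rng.beta(1.0, alpha, size=n_atoms)
            weights = betas * np.concatenate([[1.0], np.cumprod(1 - betas)[:-1]])
            atoms = base_sampler(n_atoms)
            return atoms, weights

        # Base measure G0: hypothetical finishing-time distribution (minutes).
        atoms, w = dp_sample(alpha=5.0, base_sampler=lambda n: rng.normal(240, 40, n))
        draws = rng.choice(atoms, p=w / w.sum(), size=10_000)   # samples from the draw
        print(f"mean {draws.mean():.0f} min, first 10 stick weights: {w[:10].round(3)}")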

  18. p-Nitrophenol degradation by electro-Fenton process: Pathway, kinetic model and optimization using central composite design.

    PubMed

    Meijide, J; Rosales, E; Pazos, M; Sanromán, M A

    2017-10-01

    The chemical process scale-up, from lab studies to industrial production, is challenging and requires deep knowledge of the kinetic model and the reactions that take place in the system. This knowledge is also useful for reactor design and the determination of the optimal operational conditions. In this study, a model substituted phenol, p-nitrophenol, was degraded by the electro-Fenton process, and the reaction products formed during the treatment were recorded. The kinetic model was developed using Matlab software and was based on the main reactions that occur up to total mineralization, which allowed the degradation pathway under this advanced oxidation process to be predicted. The predicted concentration profiles of p-nitrophenol, its intermediates and by-products in the electro-Fenton process were validated with experimental assays, and the results were consistent. Finally, based on the developed kinetic model, the degradation process was optimized using a central composite design taking as key parameters the ferrous ion concentration and the current density. Copyright © 2017 Elsevier Ltd. All rights reserved.
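
    A kinetic model of the kind described, a lumped chain of first-order steps from p-nitrophenol through intermediates to mineralization, can be sketched as an ODE system. The rate constants and the lumping below are assumed for illustration; the paper's model and its Matlab implementation are not reproduced:

        import numpy as np
        from scipy.integrate import solve_ivp

        # Hypothetical lumped scheme: PNP -> aromatic intermediates ->
        # short-chain acids -> mineralized (CO2).  k values are assumed.
        k1, k2, k3 = 0.15, 0.08, 0.03          # 1/min

        def rhs(t, c):
            pnp, arom, acids, co2 = c
            return [-k1 * pnp,
                    k1 * pnp - k2 * arom,
                    k2 * arom - k3 * acids,
                    k3 * acids]

        sol = solve_ivp(rhs, (0, 120), [1.0, 0.0, 0.0, 0.0], dense_output=True)
        t = np.linspace(0, 120, 7)
        print(np.round(sol.sol(t).T, 3))       # concentration profiles over time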

  19. Estimation of design space for an extrusion-spheronization process using response surface methodology and artificial neural network modelling.

    PubMed

    Sovány, Tamás; Tislér, Zsófia; Kristó, Katalin; Kelemen, András; Regdon, Géza

    2016-09-01

    The application of the Quality by Design principles is one of the key issues of recent pharmaceutical developments. In the past decade a lot of knowledge has been collected about the practical realization of the concept, but there are still many unanswered questions. The key requirement of the concept is the mathematical description of the effect of the critical factors and their interactions on the critical quality attributes (CQAs) of the product. The process design space (PDS) is usually determined by the use of design of experiment (DoE) based response surface methodologies (RSM), but inaccuracies in the applied polynomial models often result in over- or underestimation of the real trends and changes, making the calculations uncertain, especially in the edge regions of the PDS. Completing RSM with artificial neural network (ANN) based models is therefore a commonly used method to reduce the uncertainties. Nevertheless, since different studies focus on the use of a given DoE, there is a lack of comparative studies on different experimental layouts. Therefore, the aim of the present study was to investigate the effect of different DoE layouts (2-level full factorial, central composite, Box-Behnken, 3-level fractional and 3-level full factorial designs) on model predictability and to compare model sensitivities according to the organization of the experimental data set. It was revealed that the size of the design space could differ by more than 40% when calculated with different polynomial models, which was associated with a considerable shift in its position when higher-level layouts were applied. The shift was more considerable when the calculation was based on RSM. The model predictability was also better with ANN-based models. Nevertheless, both modelling methods exhibit considerable sensitivity to the organization of the experimental data set, and the use of design layouts in which the extreme values of the factors are more represented is recommended.
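
    The RSM-versus-ANN comparison at the heart of the study can be mimicked generically: fit a second-order polynomial (the usual RSM form) and a small multilayer perceptron to the same designed data and compare cross-validated fit. The sketch below uses synthetic data, not the authors' pellet formulations:

        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.model_selection import cross_val_score
        from sklearn.neural_network import MLPRegressor
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import PolynomialFeatures

        rng = np.random.default_rng(0)
        X = rng.uniform(-1, 1, size=(45, 3))                 # coded factor settings
        y = (1 + X[:, 0] - 0.5 * X[:, 1] ** 2 + 0.8 * X[:, 0] * X[:, 2]
             + rng.normal(scale=0.1, size=45))               # synthetic CQA response

        rsm = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
        ann = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)

        for name, model in [("RSM", rsm), ("ANN", ann)]:
            r2 = cross_val_score(model, X, y, cv=5, scoring="r2")
            print(f"{name}: mean cross-validated R^2 = {r2.mean():.3f}")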

  20. NNARX model structure for the purposes of controller design and optimization of heat exchanger process control training system operation

    NASA Astrophysics Data System (ADS)

    Mulyana, Tatang

    2017-04-01

    This paper presents the performance of a Neural Network Autoregressive with Exogenous Input (NNARX) model structure and evaluates the training data that provide a robust model on a fresh data set, using a back-propagation neural network known as the multilayer perceptron (MLP). The plant under test is a heat exchanger process control training system, the QAD Model BDT 921. Real input-output data have been collected and used to identify the plant. The model was estimated by the prediction error method, with the Levenberg-Marquardt algorithm used to train the neural network. It is expected that training data covering the full operating condition will be the optimum training data. The model was validated by residual analysis and model fit. The simulation results show that the identification procedure is able to identify a good model of the plant, which can be used to design the plant controller and improve its performance.
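
    An NNARX structure regresses the current output on lagged outputs and inputs through a neural network. The sketch below follows that recipe on a synthetic plant; note that scikit-learn's MLP trains with Adam or L-BFGS rather than the Levenberg-Marquardt algorithm used in the paper, and all model orders and data are assumed:

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)

        # Synthetic SISO plant standing in for the heat-exchanger rig:
        # y(t) depends on past outputs and past inputs, plus noise.
        N = 1000
        u = rng.uniform(-1, 1, N)
        y = np.zeros(N)
        for t in range(2, N):
            y[t] = 0.7 * y[t-1] - 0.1 * y[t-2] + 0.5 * u[t-1] + 0.05 * rng.normal()

        # NNARX regressors: [y(t-1), y(t-2), u(t-1), u(t-2)] -> y(t)
        lag = 2
        Phi = np.array([[y[t-1], y[t-2], u[t-1], u[t-2]] for t in range(lag, N)])
        target = y[lag:]

        net = MLPRegressor(hidden_layer_sizes=(10,), max_iter=3000, random_state=0)
        net.fit(Phi[:800], target[:800])
        print("one-step-ahead R^2 on fresh data:", net.score(Phi[800:], target[800:]))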

  1. Model-based control structure design of a full-scale WWTP under the retrofitting process.

    PubMed

    Machado, V C; Lafuente, J; Baeza, J A

    2015-01-01

    The anoxic-oxic (A/O) municipal wastewater treatment plant (WWTP) of Manresa (Catalonia, Spain) was studied for a possible conversion to an anaerobic/anoxic/oxic (A2/O) configuration to promote enhanced biological phosphorus removal. The control structure had to be redesigned to satisfy the new need to control phosphorus concentration, besides ammonium and nitrate concentrations (the main pollutant concentrations). Accordingly, decentralized control structures with proportional-integral-derivative (PID) controllers and centralized control structures with model-predictive controllers (MPC) were designed and tested. All the designed control structures had their performance systematically tested regarding effluent quality and operating costs. The centralized control structure, A2/O-3-MPC, achieved the lowest operating costs with the best effluent quality using the A2/O plant configuration for the Manresa WWTP. The controlled variables used in this control structure were ammonium in the effluent, nitrate at the end of the anoxic zone and phosphate at the end of the anaerobic zone, while the manipulated variables were the internal and external recycle flow rates and the dissolved oxygen setpoint in the aerobic reactors.
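
    For the decentralized option, each loop is an ordinary PID controller. A minimal discrete-time sketch, with illustrative gains and a toy first-order process rather than the Manresa plant model, looks like this:

        # Discrete PID loop tracking a setpoint (e.g. an ammonium target)
        # through a toy first-order process.  All constants are illustrative.
        dt, T = 0.1, 60.0
        kp, ki, kd = 2.0, 0.4, 0.1
        setpoint, y, integral, prev_err = 1.0, 0.0, 0.0, 0.0

        t = 0.0
        while t < T:
            err = setpoint - y
            integral += err * dt
            deriv = (err - prev_err) / dt
            u = kp * err + ki * integral + kd * deriv   # e.g. aeration flow
            y += dt * (-0.2 * y + 0.2 * u)              # first-order plant response
            prev_err = err
            t += dt

        print(f"final output {y:.3f} (setpoint {setpoint})")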

  2. Model Development and Process Analysis for Lean Cellular Design Planning in Aerospace Assembly and Manufacturing

    NASA Astrophysics Data System (ADS)

    Hilburn, Monty D.

    Successful lean manufacturing and cellular manufacturing execution relies upon a foundation of leadership commitment and strategic planning built upon solid data and robust analysis. The problem for this study was to create and employ a simple lean transformation planning model and review process that could be used to identify the functional support staff resources required to plan and execute lean manufacturing cells within aerospace assembly and manufacturing sites. The lean planning model was developed using available literature for lean manufacturing kaizen best practices and validated through a Delphi panel of lean experts. The resulting model and a standardized review process were used to assess the state of lean transformation planning at five sites of an international aerospace manufacturing and assembly company. The results of the three-day on-site reviews were compared with baseline plans collected from each of the five sites, with focus on three critical areas of lean planning: the number and type of manufacturing cells identified; the number, type, and duration of planned lean and continuous kaizen events; and the quantity and type of functional staffing resources planned to support the kaizen schedule. Summarized data from the baseline and on-site reviews were analyzed with descriptive statistics. ANOVAs and paired t-tests at the 95% significance level were conducted on the means of the data sets to determine whether null hypotheses related to cell, kaizen event, and support resources could be rejected. The research found significant differences between lean transformation plans developed by site leadership and plans developed utilizing the structured on-site review process and lean transformation planning model. The null hypothesis that there was no difference between the means of pre-review and on-site cell counts was rejected, as was the null hypothesis that there was no significant difference in kaizen event plans. These

  3. NASA Collaborative Design Processes

    NASA Technical Reports Server (NTRS)

    Jones, Davey

    2017-01-01

    This is Block 1, the first evolution of the world's most powerful and versatile rocket, the Space Launch System, built to return humans to the area around the Moon. Eventually, larger and even more powerful and capable configurations will take astronauts and cargo to Mars. On the sides of the rocket are the twin solid rocket boosters that provide more than 75 percent of thrust during liftoff and burn for about two minutes, after which they are jettisoned, lightening the load for the rest of the space flight. Four RS-25 main engines provide thrust for the first stage of the rocket; these are the world's most reliable rocket engines. The core stage is the main body of the rocket and houses the fuel for the RS-25 engines, liquid hydrogen and liquid oxygen, and the avionics, or "brain," of the rocket. The core stage is all new and being manufactured at NASA's "rocket factory," the Michoud Assembly Facility near New Orleans. The Launch Vehicle Stage Adapter, or LVSA, connects the core stage to the Interim Cryogenic Propulsion Stage. The Interim Cryogenic Propulsion Stage, or ICPS, uses one RL-10 rocket engine and will propel the Orion spacecraft on its deep-space journey after first-stage separation. Finally, the Orion human-rated spacecraft sits atop the massive Saturn V-sized launch vehicle. Managed out of Johnson Space Center in Houston, Orion is the first spacecraft in history capable of taking humans to multiple destinations within deep space. Each element of the SLS utilizes collaborative design processes to achieve the incredible goal of sending humans into deep space. Early phases are focused on feasibility and requirements development; later phases are focused on detailed design, testing, and operations. There are four basic phases typically found in each stage of development.

  4. Application of Design of Experiments and Surrogate Modeling within the NASA Advanced Concepts Office, Earth-to-Orbit Design Process

    NASA Technical Reports Server (NTRS)

    Zwack, Matthew R.; Dees, Patrick D.; Holt, James B.

    2016-01-01

    Decisions made during early conceptual design can have a profound impact on life-cycle cost (LCC); it is widely accepted that nearly 80% of LCC is committed at this stage, so decisions made during early design must be well informed. The Advanced Concepts Office (ACO) at Marshall Space Flight Center aids decision making for launch vehicles by providing rapid-turnaround pre-Phase A and Phase A studies, giving the customer preliminary vehicle sizing information, vehicle feasibility, and expected performance.

  5. Analysis and simulation of industrial distillation processes using a graphical system design model

    NASA Astrophysics Data System (ADS)

    Boca, Maria Loredana; Dobra, Remus; Dragos, Pasculescu; Ahmad, Mohammad Ayaz

    2016-12-01

    The separation column used for the experiments can be configured in two ways: first, as two columns of different diameters, one placed in the extension of the other, and second, as a single column with a set diameter [1], [2]. The column separates the carbon isotopes based on the cryogenic distillation of pure carbon monoxide, which is fed at a constant flow rate as a gas through the feeding system [1], [2]. Based on numerical control systems used in virtual instrumentation, simulations of the distillation process were carried out in order to obtain the isotope 13C at high concentrations. It is proposed that this installation be controlled, and data acquired, using a data acquisition tool and professional software that processes information from the isotopic column with a dedicated logical algorithm. The classical isotopic column will be controlled automatically, and information about the main parameters will be monitored and properly displayed by one program. Taking into consideration the very low operating temperature, an efficient thermally insulating vacuum jacket is necessary. Since the "elementary separation ratio" [2] is very close to unity, a permanent countercurrent of the liquid and gaseous phases of the carbon monoxide is created by the main elements of the equipment, the boiler at the bottom of the column and the condenser at the top, in order to raise the 13C isotope concentration up to the desired level.

  6. A two-step approach for fluidized bed granulation in pharmaceutical processing: Assessing different models for design and control

    PubMed Central

    Ming, Liangshan; Li, Zhe; Wu, Fei; Du, Ruofei; Feng, Yi

    2017-01-01

    Various modeling techniques were used to understand fluidized bed granulation using a two-step approach. First, a Plackett-Burman design (PBD) was used to identify the high-risk factors. Then, a Box-Behnken design (BBD) was used to analyze and optimize those high-risk factors. The relationship between the high-risk input variables (inlet air temperature X1, binder solution rate X3, and binder-to-powder ratio X5) and the quality attributes (flowability Y1, temperature Y2, moisture content Y3, aggregation index Y4, and compactability Y5) of the process was investigated using a response surface model (RSM), the partial least squares method (PLS) and a multilayer perceptron artificial neural network (MLP). The morphology of the granules was also investigated using a scanning electron microscope. The results showed that X1, X3, and X5 significantly affected the properties of the granules. The RSM, PLS and MLP models were found to be useful statistical analysis tools for a better mechanistic understanding of granulation. The statistical analysis showed that the RSM model had a better ability to fit the quality attributes of the granules than the PLS and MLP models. Understanding the effect of process parameters on granule properties provides the basis for modulating the granulation parameters and optimizing product performance at the early development stage of pharmaceutical products. PMID:28662115

  7. Exploring business process modelling paradigms and design-time to run-time transitions

    NASA Astrophysics Data System (ADS)

    Caron, Filip; Vanthienen, Jan

    2016-09-01

    The business process management literature describes a multitude of approaches (e.g. imperative, declarative or event-driven) that each result in a different mix of process flexibility, compliance, effectiveness and efficiency. Although the use of a single approach over the process lifecycle is often assumed, transitions between approaches at different phases in the process lifecycle may also be considered. This article explores several business process strategies by analysing the approaches at different phases in the process lifecycle as well as the various transitions.

  8. Introducing the "Decider" Design Process

    ERIC Educational Resources Information Center

    Prasa, Anthony R., Jr.; Del Guercio, Ryan

    2016-01-01

    Engineers are faced with solving important problems every day and must follow a step-by-step design process to arrive at solutions. Students who are taught an effective design process to apply to engineering projects begin to see problems as an engineer would, consider all ideas, and arrive at the best solution. Using an effective design process…

  10. A Process for Design Engineering

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2004-01-01

    The American Institute of Aeronautics and Astronautics Design Engineering Technical Committee has developed a draft Design Engineering Process with the participation of the technical community. This paper reviews similar engineering activities, lays out common terms for the life cycle and proposes a Design Engineering Process.

  11. Rheological characterization and thermal modeling of polyolefins for process design and tailored interfaces

    NASA Astrophysics Data System (ADS)

    Jordan, Alex M.; Kim, Kyungtae; Bates, Frank S.; Macosko, Christopher W.; Jaffer, Shaffiq; Lhost, Olivier

    2017-05-01

    Although polyethylene (PE) and polypropylene (PP) are chemically similar, it has long been known that they are immiscible and suffer poor interfacial adhesion when processed as layered films or blends. In this paper we present an examination of the effect that processing conditions, such as extrusion residence time and post-extrusion take-up, have on the interfacial adhesion between PE and PP. We show that transient heat transfer analysis and rheological measurement can be used to determine processing conditions that maximize adhesion between immiscible polymer pairs.

  12. Modelling health care processes for eliciting user requirements: a way to link a quality paradigm and clinical information system design.

    PubMed

    Staccini, P; Joubert, M; Quaranta, J F; Fieschi, D; Fieschi, M

    2000-01-01

    Hospital information systems have to support quality improvement objectives. The design issues of a health care information system can be classified into three categories: 1) time-oriented and event-labelled storage of patient data; 2) contextual support of decision-making; 3) capabilities for modular upgrading. The elicitation of requirements has to meet users' needs in relation to both the quality (efficacy, safety) and the monitoring of all health care activities (traceability). Information analysts need methods to conceptualize clinical information systems that provide actors with individual benefits and guide behavioural changes. A methodology is proposed to elicit and structure users' requirements using a process-oriented analysis, and it is applied to the field of blood transfusion. An object-oriented data model of a process has been defined in order to identify its main components: activity, sub-process, resources, constraints, guidelines, parameters and indicators. Although some aspects of activity, such as "where", "what else", and "why", are poorly represented by the data model alone, this method of requirement elicitation fits the dynamics of data input for the process to be traced. A hierarchical representation of hospital activities has to be found for this approach to be generalised within the organisation, for the processes to be interrelated, and for their characteristics to be shared.
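
    The object-oriented process model enumerated here (activity, sub-process, resources, constraints, guidelines, parameters, indicators) maps naturally onto a small class sketch. The field names below follow the abstract; the class layout and the blood-transfusion example values are invented for illustration and are not the paper's schema:

        from dataclasses import dataclass, field
        from typing import List

        @dataclass
        class Activity:
            name: str
            resources: List[str] = field(default_factory=list)
            constraints: List[str] = field(default_factory=list)
            guidelines: List[str] = field(default_factory=list)
            parameters: dict = field(default_factory=dict)
            indicators: dict = field(default_factory=dict)

        @dataclass
        class Process:
            name: str
            activities: List[Activity] = field(default_factory=list)
            subprocesses: List["Process"] = field(default_factory=list)

        transfusion = Process("blood transfusion", activities=[
            Activity("verify compatibility",
                     resources=["nurse", "lab record"],
                     constraints=["two-person check"],
                     indicators={"traceability": "batch id recorded"}),
        ])
        print(transfusion)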

  13. The Social Information Processing Model of Task Design: A Review of the Literature.

    DTIC Science & Technology

    1983-02-01

    …Oldham, 1980). While authors in the task design field have generally avoided stating that measures are of objective jobs, Hackman, Oldham, Jansen…

  14. DESIGNING PROCESSES FOR ENVIRONMENTAL PROBLEMS

    EPA Science Inventory

    Designing for the environment requires consideration of environmental impacts. The Generalized WAR Algorithm is the methodology that allows the user to evaluate the potential environmental impact of the design of a chemical process. In this methodology, chemicals are assigned val...

  16. Understanding the Process Model of Leadership: Follower Attribute Design and Assessment

    ERIC Educational Resources Information Center

    Antelo, Absael; Henderson, Richard L.; St. Clair, Norman

    2010-01-01

    Early leadership studies produced significant research findings that have helped differentiate between leader and follower personal attributes and their consequent behaviors (SEDL, 1992), but little attention was given to the follower's contribution to the leadership process. This study represents a continuation of research by Henderson, Antelo, &…

  18. Teaching Process Design through Integrated Process Synthesis

    ERIC Educational Resources Information Center

    Metzger, Matthew J.; Glasser, Benjamin J.; Patel, Bilal; Hildebrandt, Diane; Glasser, David

    2012-01-01

    The design course is an integral part of chemical engineering education. A novel approach to the design course was recently introduced at the University of the Witwatersrand, Johannesburg, South Africa. The course aimed to introduce students to systematic tools and techniques for setting and evaluating performance targets for processes, as well as…

  20. Book Processing Facility Design.

    ERIC Educational Resources Information Center

    Sheahan (Drake)-Stewart Dougall, Marketing and Physical Distribution Consultants, New York, NY.

    The Association of New York Libraries for Technical Services (ANYLTS) is established to develop and run a centralized book processing facility for the public library systems in New York State. ANYLTS plans to receive book orders from the 22 library systems, transmit orders to publishers, receive the volumes from the publishers, print and attach…

  1. Model parameters conditioning on regional hydrologic signatures for process-based design flood estimation in ungauged basins.

    NASA Astrophysics Data System (ADS)

    Biondi, Daniela; De Luca, Davide Luciano

    2015-04-01

    The use of rainfall-runoff models represents an alternative to statistical approaches (such as at-site or regional flood frequency analysis) for design flood estimation, and constitutes an answer to the increasing need for synthetic design hydrographs (SDHs) associated with a specific return period. However, the lack of streamflow observations and the consequent high uncertainty associated with parameter estimation usually pose serious limitations to the use of process-based approaches in ungauged catchments, which in contrast represent the majority in practical applications. This work presents the application of a Bayesian procedure that, for a predefined rainfall-runoff model, allows for the assessment of the posterior parameter distribution, using the limited and uncertain information available for the response of an ungauged catchment (Bulygina et al. 2009; 2011). The use of regional estimates of river flow statistics, interpreted as hydrological signatures that measure theoretically relevant system process behaviours (Gupta et al. 2008), within this framework represents a valuable option and has shown significant developments in recent literature to constrain the plausible model response and to reduce the uncertainty in ungauged basins. In this study we rely on the first three L-moments of annual streamflow maxima, for which regressions are available from previous studies (Biondi et al. 2012; Laio et al. 2011). The methodology was carried out for a catchment located in southern Italy, and used within a Monte Carlo scheme (MCs) considering both event-based and continuous simulation approaches for design flood estimation. The applied procedure offers promising perspectives to perform model calibration and uncertainty analysis in ungauged basins; moreover, in the context of design flood estimation, process-based methods coupled with the MCs approach have the advantage of providing uncertainty analysis of the simulated floods, which represents an asset in risk-based decision
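
    The conditioning step can be illustrated in miniature with signature-based rejection sampling: draw candidate parameters, run the model, and keep those whose simulated L-moments match the regional estimates. Everything below (the Gumbel-like toy model, the signature values, the tolerances) is assumed for illustration and does not reproduce the authors' Bayesian scheme:

        import numpy as np

        rng = np.random.default_rng(0)

        def l_moments(x):
            """First three sample L-moments via the order-statistics estimator."""
            x = np.sort(x)
            n = len(x)
            b0 = x.mean()
            b1 = sum(i * x[i] for i in range(n)) / (n * (n - 1))
            b2 = sum(i * (i - 1) * x[i] for i in range(n)) / (n * (n - 1) * (n - 2))
            return b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0

        # Regionalized signature estimates for the ungauged basin (assumed values).
        target = np.array([100.0, 25.0, 4.0])
        tol = 0.2

        def simulate_annual_maxima(theta, n=50):
            # Stand-in for the rainfall-runoff model driven by stochastic rainfall:
            # theta parameterizes a Gumbel-like response, purely for illustration.
            loc, scale = theta
            return loc + scale * (-np.log(-np.log(rng.uniform(size=n))))

        accepted = []
        for _ in range(4000):
            theta = (rng.uniform(50, 150), rng.uniform(5, 60))
            sig = np.array(l_moments(simulate_annual_maxima(theta)))
            if np.all(np.abs(sig - target) / target < tol):
                accepted.append(theta)

        print(len(accepted), "parameter sets consistent with the regional signatures")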

  2. Modeling of the flow continuum and optimal design of control-oriented injection systems in liquid composite molding processes

    NASA Astrophysics Data System (ADS)

    Gokce, Ali

    Several methodologies are presented in this dissertation that aim to ensure successful filling of the mold cavity consistently, during the mold filling stage of Liquid Composite Molding (LCM) processes such as Resin Transfer Molding (RTM), Vacuum Assisted Resin Transfer Molding (VARTM) and Seemann Composites Resin Infusion Molding (SCRIMP). Key parameters that affect the resin flow in the mold cavity can be divided into two main groups as continuum-related parameters and injection-related parameters. Flow continuum, which consists of all the spaces resin can reach in the mold cavity, has two major components: the porous medium, which is made up of the fiber reinforcements, and the flow channels that are introduced into the flow continuum unintentionally and offer an easy flow path to the resin. The properties that characterize the porous medium and the unintentional flow channels are continuum-related parameters. The injection-related parameters include resin injection locations (gates), resin injection conditions and air drainage locations (vents). Modeling the flow continuum is crucial in predicting the resin flow in the mold cavity. In this study, permeability, the key property of the porous medium, is predicted using the Method of Cells, a proven method to predict macroscopic properties of heterogeneous materials. Unintentional flow channels, which are also called racetracking channels, are modeled using a probabilistic approach. Injection-related parameters are the key tools to influence the resin flow in the mold cavity. In this study, Branch and Bound Search is modified for single gate optimization. Due to its pertinence to injection system design, the parameters that govern gate effectiveness in steering the resin advance are studied. A combinatorial search algorithm is proposed for vent optimization. Vent optimization and gate optimization algorithms are integrated for simultaneous gate and vent optimization. Overall, these methodologies reduce the cycle

  3. Reengineering the Project Design Process

    NASA Technical Reports Server (NTRS)

    Casani, E.; Metzger, R.

    1994-01-01

    In response to NASA's goal of working faster, better and cheaper, JPL has developed extensive plans to minimize cost, maximize customer and employee satisfaction, and implement small- and moderate-size missions. These plans include improved management structures and processes, enhanced technical design processes, the incorporation of new technology, and the development of more economical space- and ground-system designs. The Laboratory's new Flight Projects Implementation Office has been chartered to oversee these innovations and the reengineering of JPL's project design process, including establishment of the Project Design Center and the Flight System Testbed. Reengineering at JPL implies a cultural change whereby the character of its design process will change from sequential to concurrent and from hierarchical to parallel. The Project Design Center will support missions offering high science return, design to cost, demonstrations of new technology, and rapid development. Its computer-supported environment will foster high-fidelity project life-cycle development and cost estimating.

  4. A Team-Based Process for Designing Comprehensive, Integrated, Three-Tiered (CI3T) Models of Prevention: How Does My School-Site Leadership Team Design a CI3T Model?

    ERIC Educational Resources Information Center

    Lane, Kathleen Lynne; Oakes, Wendy Peia; Jenkins, Abbie; Menzies, Holly Mariah; Kalberg, Jemma Robertson

    2014-01-01

    Comprehensive, integrated, three-tiered models are context specific and developed by school-site teams according to the core values held by the school community. In this article, the authors provide a step-by-step, team-based process for designing comprehensive, integrated, three-tiered models of prevention that integrate academic, behavioral, and…

  5. Computational design of the basic dynamical processes of the UCLA general circulation model

    NASA Technical Reports Server (NTRS)

    Arakawa, A.; Lamb, V. R.

    1977-01-01

    The 12-layer UCLA general circulation model encompassing troposphere and stratosphere (and superjacent 'sponge layer') is described. Prognostic variables are: surface pressure, horizontal velocity, temperature, water vapor and ozone in each layer, planetary boundary layer (PBL) depth, temperature, moisture and momentum discontinuities at PBL top, ground temperature and water storage, and mass of snow on ground. The selection of space finite-difference schemes for homogeneous incompressible flow, with/without a free surface, nonlinear two-dimensional nondivergent flow, enstrophy-conserving schemes, momentum advection schemes, vertical and horizontal difference schemes, and time differencing schemes is discussed.

  6. Process of inorganic nitrogen transformation and design of kinetics model in the biological aerated filter reactor.

    PubMed

    Yan, Gang; Xu, Xia; Yao, Lirong; Lu, Liqiao; Zhao, Tingting; Zhang, Wenyi

    2011-04-01

    A biological aerated filter (BAF), one of the plug-flow reactors, was divided into four sampling sections to understand the characteristics of inorganic nitrogen transformation during the reaction process, and the different transformation characteristics caused by different NH3-N loadings and by the biomass quantities and activities in each section were then obtained. The results showed that the total transformation ratio in the nitrifying reactor was more than 90% in the absence of any organic carbon source; at the same time, more than 65% of the influent NH3-N was nitrified within the lower 70 cm of the filter under the following conditions: influent flow of 9-19 L/h, gas-to-water ratio of 4-5:1, dissolved oxygen of 3.0-5.8 mg/L and NH3-N load of 0.28-0.48 kg NH3-N/(m3 d). On the basis of the Eckenfelder model, the kinetic equation of NH3-N transformation along the reactor was S_e = S_0 exp(-0.0134 D/L^1.2612). Copyright © 2011 Elsevier Ltd. All rights reserved.
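
    A minimal sketch of how the fitted Eckenfelder-type profile above can be evaluated is given below. Interpreting D as filter depth and L as hydraulic loading is an assumption on our part; the abstract does not spell out the symbols or their units, so all input numbers are purely illustrative.

```python
import numpy as np

# Fitted profile reported in the abstract: S_e = S_0 * exp(-0.0134 * D / L**1.2612)

def effluent_nh3n(s0, depth, hydraulic_load, k=0.0134, n=1.2612):
    """Predicted NH3-N concentration remaining after filter depth `depth`."""
    return s0 * np.exp(-k * depth / hydraulic_load**n)

s0 = 40.0                          # influent NH3-N, mg/L (illustrative)
depths = np.linspace(0, 140, 8)    # filter heights, cm (illustrative)
print(effluent_nh3n(s0, depths, hydraulic_load=2.0))
```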

  7. Development of a Model for Measuring Scientific Processing Skills Based on Brain-Imaging Technology: Focused on the Experimental Design Process

    ERIC Educational Resources Information Center

    Lee, Il-Sun; Byeon, Jung-Ho; Kim, Young-shin; Kwon, Yong-Ju

    2014-01-01

    The purpose of this study was to develop a model for measuring experimental design ability based on functional magnetic resonance imaging (fMRI) during biological inquiry. More specifically, the researchers developed an experimental design task that measures experimental design ability. Using the developed experimental design task, they measured…

  8. Biological neural networks as model systems for designing future parallel processing computers

    NASA Technical Reports Server (NTRS)

    Ross, Muriel D.

    1991-01-01

    One of the more interesting debates of the present day centers on whether human intelligence can be simulated by computer. The author works under the premise that neurons individually are not smart at all. Rather, they are physical units which are impinged upon continuously by other matter that influences the direction of voltage shifts across the units' membranes. It is only through the action of a great many neurons, billions in the case of the human nervous system, that intelligent behavior emerges. What is required to understand even the simplest neural system is painstaking analysis, bit by bit, of the architecture and the physiological functioning of its various parts. The biological neural networks studied, the vestibular utricular and saccular maculas of the inner ear, are among the simplest of the mammalian neural networks to understand and model. While there is still a long way to go to understand even this most simple neural network in sufficient detail for extrapolation to computers and robots, a start was made. Moreover, the insights obtained and the technologies developed help advance the understanding of the more complex neural networks that underlie human intelligence.

  9. Fully Integrating the Design Process

    SciTech Connect

    T.A. Bjornard; R.S. Bean

    2008-03-01

    The basic approach to designing nuclear facilities in the United States does not currently reflect the routine consideration of proliferation resistance and international safeguards. The fully integrated design process is an approach for bringing consideration of international safeguards and proliferation resistance, together with state safeguards and security, fully into the design process from the very beginning, while integrating them sensibly and synergistically with the other project functions. In view of the recently established GNEP principles agreed to by the United States and at least eighteen other countries, this paper explores such an integrated approach, and its potential to help fulfill the new internationally driven design requirements with improved efficiencies and reduced costs.

  10. Modeling Temporal Processes in Early Spacecraft Design: Application of Discrete-Event Simulations for Darpa's F6 Program

    NASA Technical Reports Server (NTRS)

    Dubos, Gregory F.; Cornford, Steven

    2012-01-01

    While the ability to model the state of a space system over time is essential during spacecraft operations, the use of time-based simulations remains rare in preliminary design. The absence of the time dimension in most traditional early design tools can however become a hurdle when designing complex systems whose development and operations can be disrupted by various events, such as delays or failures. As the value delivered by a space system is highly affected by such events, exploring the trade space for designs that yield the maximum value calls for the explicit modeling of time. This paper discusses the use of discrete-event models to simulate the spacecraft development schedule as well as operational scenarios and on-orbit resources in the presence of uncertainty. It illustrates how such simulations can be utilized to support trade studies, through the example of a tool developed for DARPA's F6 program to assist the design of "fractionated spacecraft".
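
    To make the idea concrete, here is a minimal Monte Carlo sketch of a discrete-event schedule simulation in the spirit described above; the task list, duration figures, and delay model are illustrative assumptions, not inputs of the actual F6 tool.

```python
import random

TASKS = [                       # (name, nominal months, delay probability)
    ("design", 12, 0.3),
    ("integration", 8, 0.4),
    ("launch_campaign", 3, 0.2),
]

def simulate_development(rng):
    """Advance simulated time task by task, injecting random disruption delays."""
    t = 0.0
    for name, nominal, p_delay in TASKS:
        t += nominal
        if rng.random() < p_delay:           # disruption event occurs
            t += rng.uniform(1, 6)           # delay of 1-6 months
    return t

rng = random.Random(42)
samples = [simulate_development(rng) for _ in range(10_000)]
print(f"mean completion: {sum(samples) / len(samples):.1f} months")
print(f"95th percentile: {sorted(samples)[int(0.95 * len(samples))]:.1f} months")
```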

  11. Reengineering the project design process

    NASA Astrophysics Data System (ADS)

    Kane Casani, E.; Metzger, Robert M.

    1995-01-01

    In response to the National Aeronautics and Space Administration's goal of working faster, better, and cheaper, the Jet Propulsion Laboratory (JPL) has developed extensive plans to minimize cost, maximize customer and employee satisfaction, and implement small- and moderate-size missions. These plans include improved management structures and processes, enhanced technical design processes, the incorporation of new technology, and the development of more economical space- and ground-system designs. The Laboratory's new Flight Projects Implementation Development Office has been chartered to oversee these innovations and the reengineering of JPL's project design process, including establishment of the Project Design Center (PDC) and the Flight System Testbed (FST). Reengineering at JPL implies a cultural change whereby the character of the Laboratory's design process will change from sequential to concurrent and from hierarchical to parallel. The Project Design Center will support missions offering high science return, design to cost, demonstrations of new technology, and rapid development. Its computer-supported environment will foster high-fidelity project life-cycle development and more accurate cost estimating. These improvements signal JPL's commitment to meeting the challenges of space exploration in the next century.

  12. Process simulation and design '94

    SciTech Connect

    Not Available

    1994-06-01

    This first-of-a-kind report describes today's process simulation and design technology for specific applications. It includes process names, diagrams, applications, descriptions, objectives, economics, installations, licensors, and a complete list of process submissions. Processes include: alkylation, aromatics extraction, catalytic reforming, cogeneration, dehydration, delayed coking, distillation, energy integration, catalytic cracking, gas sweetening, glycol/methanol injection, hydrocracking, NGL recovery and stabilization, solvent dewaxing, visbreaking. Equipment simulations include: amine plant, ammonia plant, heat exchangers, cooling water network, crude preheat train, crude unit, ethylene furnace, nitrogen rejection unit, refinery, sulfur plant, and VCM furnace. By-product processes include: olefins, polyethylene terephthalate, and styrene.

  13. Ligand modeling and design

    SciTech Connect

    Hay, B.P.

    1997-10-01

    The purpose of this work is to develop and implement a molecular design basis for selecting organic ligands that would be used in the cost-effective removal of specific radionuclides from nuclear waste streams. Organic ligands with metal ion specificity are critical components in the development of solvent extraction and ion exchange processes that are highly selective for targeted radionuclides. The traditional approach to the development of such ligands involves lengthy programs of organic synthesis and testing, which in the absence of reliable methods for screening compounds before synthesis, results in wasted research effort. The author's approach breaks down and simplifies this costly process with the aid of computer-based molecular modeling techniques. Commercial software for organic molecular modeling is being configured to examine the interactions between organic ligands and metal ions, yielding an inexpensive, commercially or readily available computational tool that can be used to predict the structures and energies of ligand-metal complexes. Users will be able to correlate the large body of existing experimental data on structure, solution binding affinity, and metal ion selectivity to develop structural design criteria. These criteria will provide a basis for selecting ligands that can be implemented in separations technologies through collaboration with other DOE national laboratories and private industry. The initial focus will be to select ether-based ligands that can be applied to the recovery and concentration of the alkali and alkaline earth metal ions including cesium, strontium, and radium.

  14. Volition Support Design Model

    ERIC Educational Resources Information Center

    Kim, ChanMin

    2013-01-01

    The purpose of this paper is to introduce a design model for supporting student volition. First, the construct of volition is explained and the importance of volition is further described in the context of goal attainment. Next, the theoretical basis of the model is described. Last, implications of the model are discussed for the design of…

  15. The Process Design Courses at Pennsylvania: Impact of Process Simulators.

    ERIC Educational Resources Information Center

    Seider, Warren D.

    1984-01-01

    Describes the use and impact of process design simulators in process design courses. Discusses topics covered, texts used, computer design simulations, and how they are integrated into the process survey course as well as in plant design projects. (JM)

  16. Review on Biomass Torrefaction Process and Product Properties and Design of Moving Bed Torrefaction System Model Development

    SciTech Connect

    Jaya Shankar Tumuluru; Christopher T. Wright; Shahab Sokhansanj

    2011-08-01

    Torrefaction is currently developing as an important preprocessing step to improve the quality of biomass in terms of physical properties and proximate and ultimate composition. Torrefaction is a slow heating of biomass in an inert or reduced environment to a maximum temperature of 300 °C. Torrefaction can also be defined as a group of products resulting from the partially controlled and isothermal pyrolysis of biomass occurring in a temperature range of 200-230 °C and 270-280 °C. Thus, the process can also be called mild pyrolysis, as it occurs in the lower temperature range of the pyrolysis process. At the end of the torrefaction process, a solid uniform product with lower moisture content and higher energy content than raw biomass is produced. Most of the smoke-producing compounds and other volatiles are removed during torrefaction, which produces a final product with a lower mass but a higher heating value. There is a lack of literature on the design aspects of torrefaction reactors and of a design sheet for estimating the dimensions of the torrefier based on capacity. This study includes (a) a detailed review of the torrefaction of biomass in terms of understanding the process, product properties, off-gas compositions, and methods used, and (b) the design of a moving bed torrefier, taking into account the basic fundamental heat and mass transfer calculations. Specific objectives include calculating dimensions such as the diameter and height of the moving packed bed torrefier for capacities ranging from 25-1000 kg/hr and designing the heat loads and gas flow rates.
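
    A rough, hedged illustration of the capacity-to-dimensions step is sketched below from a simple mass balance; the residence time, bulk density, and height-to-diameter ratio are invented placeholders, whereas the report itself derives dimensions from detailed heat and mass transfer calculations.

```python
import math

def torrefier_dimensions(capacity_kg_h, residence_min=30.0,
                         bulk_density=250.0, aspect_ratio=3.0):
    """Return (diameter_m, height_m) for a cylindrical moving packed bed.
    residence_min, bulk_density (kg/m^3) and aspect_ratio are assumptions."""
    holdup_kg = capacity_kg_h * residence_min / 60.0   # biomass held in the bed
    volume_m3 = holdup_kg / bulk_density               # required bed volume
    # V = (pi/4) * D^2 * H with H = aspect_ratio * D  =>  solve for D
    diameter = (4.0 * volume_m3 / (math.pi * aspect_ratio)) ** (1.0 / 3.0)
    return diameter, aspect_ratio * diameter

for capacity in (25, 100, 500, 1000):                  # kg/hr, per the review
    d, h = torrefier_dimensions(capacity)
    print(f"{capacity:5d} kg/h -> D = {d:.2f} m, H = {h:.2f} m")
```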

  1. Human Integration Design Processes (HIDP)

    NASA Technical Reports Server (NTRS)

    Boyer, Jennifer

    2014-01-01

    The purpose of the Human Integration Design Processes (HIDP) document is to provide human-systems integration design processes, including methodologies and best practices that NASA has used to meet human systems and human rating requirements for developing crewed spacecraft. HIDP content is framed around human-centered design methodologies and processes in support of human-system integration requirements and human rating. NASA-STD-3001, Space Flight Human-System Standard, is a two-volume set of National Aeronautics and Space Administration (NASA) Agency-level standards established by the Office of the Chief Health and Medical Officer, directed at minimizing health and performance risks for flight crews in human space flight programs. Volume 1 of NASA-STD-3001, Crew Health, sets standards for fitness for duty, space flight permissible exposure limits, permissible outcome limits, levels of medical care, medical diagnosis, intervention, treatment and care, and countermeasures. Volume 2 of NASA-STD-3001, Human Factors, Habitability, and Environmental Health, focuses on human physical and cognitive capabilities and limitations and defines standards for spacecraft (including orbiters, habitats, and suits), internal environments, facilities, payloads, and related equipment, hardware, and software with which the crew interfaces during space operations. The NASA Procedural Requirements (NPR) 8705.2B, Human-Rating Requirements for Space Systems, specifies the Agency's human-rating processes, procedures, and requirements. The HIDP was written to share NASA's knowledge of processes directed toward achieving human certification of a spacecraft through implementation of human-systems integration requirements. Although the HIDP speaks directly to implementation of NASA-STD-3001 and NPR 8705.2B requirements, the human-centered design, evaluation, and design processes described in this document can be applied to any set of human-systems requirements and are independent of reference

  2. Applying 3-PG, a simple process-based model designed to produce practical results, to data from loblolly pine experiments

    Treesearch

    Joe J. Landsberg; Kurt H. Johnsen; Timothy J. Albaugh; H. Lee Allen; Steven E. McKeand

    2001-01-01

    3-PG is a simple process-based model that requires few parameter values and only readily available input data. We tested the structure of the model by calibrating it against loblolly pine data from the control treatment of the SETRES experiment in Scotland County, NC, then altered the fertility rating to simulate the effects of fertilization. There was excellent...

  3. DESIGNING ENVIRONMENTALLY FRIENDLY CHEMICAL PROCESSES

    EPA Science Inventory

    The design of a chemical process involves many aspects: from profitability, flexibility and reliability to safety to the environment. While each of these is important, in this work, the focus will be on profitability and the environment. Key to the study of these aspects is the ...

  4. GREENSCOPE: Sustainable Process Modeling

    EPA Science Inventory

    EPA researchers are responding to environmental problems by incorporating sustainability into process design and evaluation. EPA researchers are also developing a tool that allows users to assess modifications to existing and new chemical processes to determine whether changes in...

  5. ESS Cryogenic System Process Design

    NASA Astrophysics Data System (ADS)

    Arnold, P.; Hees, W.; Jurns, J.; Su, X. T.; Wang, X. L.; Weisend, J. G., II

    2015-12-01

    The European Spallation Source (ESS) is a neutron-scattering facility funded and supported in collaboration with 17 European countries in Lund, Sweden. Cryogenic cooling at ESS is vital particularly for the linear accelerator, the hydrogen target moderators, a test stand for cryomodules, the neutron instruments and their sample environments. The paper will focus on specific process design criteria, design decisions and their motivations for the helium cryoplants and auxiliary equipment. Key issues for all plants and their process concepts are energy efficiency, reliability, smooth turn-down behaviour and flexibility. The accelerator cryoplant (ACCP) and the target moderator cryoplant (TMCP) in particular need to be prepared for a range of refrigeration capacities due to the intrinsic uncertainties regarding heat load definitions. Furthermore the paper addresses questions regarding process arrangement, 2 K cooling methodology, LN2 precooling, helium storage, helium purification and heat recovery.

  6. Interface design in the process industries

    NASA Technical Reports Server (NTRS)

    Beaverstock, M. C.; Stassen, H. G.; Williamson, R. A.

    1977-01-01

    Every operator runs his plant in accord with his own mental model of the process. In this sense, one characteristic of an ideal man-machine interface is that it be in harmony with that model. With this theme in mind, the paper first reviews the functions of the process operator and compares them with those of human operators involved in control situations previously studied outside the industrial environment (pilots, air traffic controllers, helmsmen, etc.). A brief history of the operator interface in the process industry and the traditional methodology employed in its design is then presented. Finally, a much more fundamental approach utilizing a model definition of the human operator's behavior is presented.

  7. Numerical simulations supporting the process design of ring rolling processes

    NASA Astrophysics Data System (ADS)

    Jenkouk, V.; Hirt, G.; Seitz, J.

    2013-05-01

    In conventional Finite Element Analysis (FEA) of radial-axial ring rolling (RAR) the motions of all tools are usually defined prior to simulation in the preprocessing step. However, the real process holds up to 8 degrees of freedom (DOF) that are controlled by industrial control systems according to actual sensor values and preselected control strategies. Since the histories of the motions are unknown before the experiment and depend on sensor data, conventional FEA cannot represent the process before the experiment. In order to enable the use of FEA in the process design stage, this approach integrates the industrially applied control algorithms of the real process, including all relevant sensors and actuators, into the FE model of ring rolling. Additionally, FEA is used to support the process design of a novel process, axial profiling, in which a profiled roll is used for rolling axially profiled rings. Using this approach, suitable control strategies can be tested in a virtual environment before processing.

  8. Feasibility study of an Integrated Program for Aerospace vehicle Design (IPAD). Volume 2: The design process

    NASA Technical Reports Server (NTRS)

    Gillette, W. B.; Turner, M. J.; Southall, J. W.; Whitener, P. C.; Kowalik, J. S.

    1973-01-01

    The extent to which IPAD is to support the design process is identified. Case studies of representative aerospace products were developed as models to characterize the design process and to provide design requirements for the IPAD computing system.

  9. Distributed Computing for Signal Processing: Modeling of Asynchronous Parallel Computation. Appendix G. On the Design and Modeling of Special Purpose Parallel Processing Systems.

    DTIC Science & Technology

    1985-05-01

    [Indexed text preserves only reference-list fragments:] Control Data Corp., Cyber-Ikon Image Processing System Concepts, Digital Systems Division, Control Data Corp., Minneapolis, MN, Jan. 1977; [CDC77b] Control Data Corp., Cyber-Ikon Flexible Processor Programming Textbook, Digital Systems Division, Control Data Corp., Minneapolis, MN, Nov. 1977.

  10. Optimal design activated sludge process by means of multi-objective optimization: case study in Benchmark Simulation Model 1 (BSM1).

    PubMed

    Chen, Wenliang; Yao, Chonghua; Lu, Xiwu

    2014-01-01

    Optimal design of the activated sludge process (ASP) using multi-objective optimization was studied, with a benchmark process in Benchmark Simulation Model 1 (BSM1) taken as the target process. The objectives of the study were four indexes: percentage of effluent violation (PEV), overall cost index (OCI), total volume and total suspended solids, making up four cases for comparative analysis. Models were solved by the non-dominated sorting genetic algorithm in MATLAB. Results show that ineffective solutions can be rejected by adding constraints, and that newly added objectives can affect the relationship between the existing objectives; taking Pareto solutions as process parameters, the performance indexes PEV and OCI can be improved beyond what the default process parameters of BSM1 achieve, especially for N removal and resistance against dynamic NH4+-N in the influent. The results indicate that multi-objective optimization is a useful method for the optimal design of ASPs.
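
    The core operation of the non-dominated sorting mentioned above is extracting the Pareto front from a set of objective vectors. A minimal sketch follows; the random objective values stand in for (PEV, OCI) pairs and are purely illustrative.

```python
import numpy as np

def pareto_front(objs):
    """Return indices of non-dominated rows of `objs` (all objectives minimized)."""
    front = []
    for i, candidate in enumerate(objs):
        # candidate is dominated if some other point is <= everywhere
        # and strictly < in at least one objective
        dominated = any(
            np.all(other <= candidate) and np.any(other < candidate)
            for j, other in enumerate(objs) if j != i
        )
        if not dominated:
            front.append(i)
    return front

rng = np.random.default_rng(0)
objectives = rng.random((50, 2))        # 50 candidate designs, 2 objectives
print("Pareto-optimal designs:", pareto_front(objectives))
```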

  11. Designing cyclic universe models.

    PubMed

    Khoury, Justin; Steinhardt, Paul J; Turok, Neil

    2004-01-23

    The phenomenological constraints on the scalar field potential in cyclic models of the Universe are presented. We show that cyclic models require a comparable degree of tuning to that needed for inflationary models. The constraints are reduced to a set of simple design rules including "fast-roll" parameters analogous to the "slow-roll" parameters in inflation.

  12. Planar Inlet Design and Analysis Process (PINDAP)

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Gruber, Christopher R.

    2005-01-01

    The Planar Inlet Design and Analysis Process (PINDAP) is a collection of software tools that allow the efficient aerodynamic design and analysis of planar (two-dimensional and axisymmetric) inlets. The aerodynamic analysis is performed using the Wind-US computational fluid dynamics (CFD) program. A major element in PINDAP is a Fortran 90 code named PINDAP that can establish the parametric design of the inlet and efficiently model the geometry and generate the grid for CFD analysis with design changes to those parameters. The use of PINDAP is demonstrated for subsonic, supersonic, and hypersonic inlets.

  13. Molecular modeling of directed self-assembly of block copolymers: Fundamental studies of processing conditions and evolutionary pattern design

    NASA Astrophysics Data System (ADS)

    Khaira, Gurdaman Singh

    Rapid progress in the semi-conductor industry has pushed for smaller feature sizes on integrated electronic circuits. Current photo-lithographic techniques for nanofabrication have reached their technical limit and are problematic when printing features small enough to meet future industrial requirements. "Bottom-up" techniques, such as the directed self-assembly (DSA) of block copolymers (BCP), are the primary contenders to complement current "top-down" photo-lithographic ones. For industrial requirements, the defect density from DSA needs to be less than 1 defect per 10 cm by 10 cm. Knowledge of both material synthesis and the thermodynamics of the self-assembly process is required before optimal operating conditions can be found to produce results adequate for industry. The work presented in this thesis is divided into three chapters, each discussing various aspects of DSA as studied via a molecular model that contains the essential physics of BCP self-assembly. Though there are various types of guiding fields that can be used to direct BCPs over large wafer areas with minimum defects, this study focuses only on chemically patterned substrates. The first chapter addresses optimal pattern design by describing a framework where molecular simulations of various complexities are coupled with an advanced optimization technique to find a pattern that directs a target morphology. It demonstrates the first-ever study in which BCP self-assembly on a patterned substrate is optimized using a three-dimensional description of the block copolymers. For problems pertaining to DSA, the methodology is shown to converge much faster than the traditional random search approach. The second chapter discusses the metrology of BCP thin films using TEM tomography and X-ray scattering techniques, such as CDSAXS and GISAXS. X-ray scattering has the advantage of being able to quickly probe the average structure of BCP morphologies over large wafer areas; however, deducing the BCP morphology

  14. Engineering design: A cognitive process approach

    NASA Astrophysics Data System (ADS)

    Strimel, Greg Joseph

    The intent of this dissertation was to identify the cognitive processes used by advanced pre-engineering students to solve complex engineering design problems. Students in technology and engineering education classrooms are often taught to use an ideal engineering design process that has been generated mostly by educators and curriculum developers. However, the review of literature showed that it is unclear as to how advanced pre-engineering students cognitively navigate solving a complex and multifaceted problem from beginning to end. Additionally, it was unclear how a student thinks and acts throughout their design process and how this affects the viability of their solution. Therefore, Research Objective 1 was to identify the fundamental cognitive processes students use to design, construct, and evaluate operational solutions to engineering design problems. Research Objective 2 was to determine identifiers within student cognitive processes for monitoring aptitude to successfully design, construct, and evaluate technological solutions. Lastly, Research Objective 3 was to create a conceptual technological and engineering problem-solving model integrating student cognitive processes for the improved development of problem-solving abilities. The methodology of this study included multiple forms of data collection. The participants were first given a survey to determine their prior experience with engineering and to provide a description of the subjects being studied. The participants were then presented an engineering design challenge to solve individually. While they completed the challenge, the participants verbalized their thoughts using an established "think aloud" method. These verbalizations were captured along with participant observational recordings using point-of-view camera technology. Additionally, the participant design journals, design artifacts, solution effectiveness data, and teacher evaluations were collected for analysis to help achieve the

  15. Automation of Design Engineering Processes

    NASA Technical Reports Server (NTRS)

    Torrey, Glenn; Sawasky, Gerald; Courey, Karim

    2004-01-01

    A method, and a computer program that helps to implement the method, have been developed to automate and systematize the retention and retrieval of all the written records generated during the process of designing a complex engineering system. It cannot be emphasized strongly enough that "all the written records" as used here is meant to be taken literally: it signifies not only final drawings and final engineering calculations but also such ancillary documents as minutes of meetings, memoranda, requests for design changes, approval and review documents, and reports of tests. One important purpose served by the method is to make the records readily available to all involved users via their computer workstations from one computer archive while eliminating the need for voluminous paper files stored in different places. Another important purpose served by the method is to facilitate the work of engineers who are charged with sustaining the system and were not involved in the original design decisions. The method helps the sustaining engineers to retrieve information that enables them to retrace the reasoning that led to the original design decisions, thereby helping them to understand the system better and to make informed engineering choices pertaining to maintenance and/or modifications of the system. The software used to implement the method is written in Microsoft Access. All of the documents pertaining to the design of a given system are stored in one relational database in such a manner that they can be related to each other via a single tracking number.

  16. Science Process Evaluation Model. Monograph.

    ERIC Educational Resources Information Center

    Small, Larry

    The goal of this monograph is to explain the evaluation program designed by Schaumburg Community Consolidated District 54, Schaumburg, Illinois. It discusses the process used in the development of the model, the product, the implications for classroom teachers, and the effects of using an evaluation to assess science process skills. The process…

  17. Investigation on the Flexural Creep Stiffness Behavior of PC-ABS Material Processed by Fused Deposition Modeling Using Response Surface Definitive Screening Design

    NASA Astrophysics Data System (ADS)

    Mohamed, Omar Ahmed; Masood, Syed Hasan; Bhowmik, Jahar Lal

    2017-03-01

    The resistance of polymeric materials to time-dependent plastic deformation is an important requirement of the fused deposition modeling (FDM) design process, its processed products, and their application for long-term loading, durability, and reliability. The creep performance of the material and part processed by FDM is the fundamental criterion for many applications with strict dimensional stability requirements, including medical implants, electrical and electronic products, and various automotive applications. Herein, the effect of FDM fabrication conditions on the flexural creep stiffness behavior of polycarbonate-acrylonitrile-butadiene-styrene processed parts was investigated. A relatively new class of experimental design called "definitive screening design" was adopted for this investigation. The effects of process variables on flexural creep stiffness behavior were monitored, and the best-suited quadratic polynomial model, with a high coefficient of determination (R²), was developed. This study highlights the value of response surface definitive screening design in optimizing properties for products and materials, and it demonstrates its role and potential application in material processing and additive manufacturing.

  18. Modelling health care processes for eliciting user requirements: a way to link a quality paradigm and clinical information system design.

    PubMed

    Staccini, P; Joubert, M; Quaranta, J F; Fieschi, D; Fieschi, M

    2001-12-01

    Healthcare institutions are looking at ways to increase their efficiency by reducing costs while providing care services with a high level of safety. Thus, hospital information systems have to support quality improvement objectives. The elicitation of the requirements has to meet users' needs in relation to both the quality (efficacy, safety) and the monitoring of all health care activities (traceability). Information analysts need methods to conceptualise clinical information systems that provide actors with individual benefits and guide behavioural changes. A methodology is proposed to elicit and structure users' requirements using a process-oriented analysis, and it is applied to the blood transfusion process. An object-oriented data model of a process has been defined in order to organise the data dictionary. Although some aspects of activity, such as 'where', 'what else', and 'why' are poorly represented by the data model alone, this method of requirement elicitation fits the dynamic of data input for the process to be traced. A hierarchical representation of hospital activities has to be found for the processes to be interrelated, and for their characteristics to be shared, in order to avoid data redundancy and to fit the gathering of data with the provision of care.

  1. Hotzone design and optimization for 2-in. AlN PVT growth process through global heat transfer modeling and simulations

    NASA Astrophysics Data System (ADS)

    Wang, Z. H.; Deng, X. L.; Cao, K.; Wang, J.; Wu, L.

    2017-09-01

    A tungsten based reactor to grow 2-in. PVT AlN crystals by induction heating was designed. In order to investigate the effect of the hotzone structure layout on the temperature distribution in the growth chamber, a series of global quasi-steady numerical simulations with and without gas convection was performed using the FEMAG software. Simulation results show that the temperature gradient between the AlN powder sources and the deposition interface is influenced profoundly by the size of the induction heater and the crucible thickness. The tungsten heat shields also have obvious effects on the global temperature distribution and heater power consumption during the growth process. However, the number of tungsten shield layers plays a trivial role in the temperature gradient between the AlN powder sources and the crucible top. Global heat transfer simulations show that the designed hotzone can provide an optimized and flexible environment for 2-in. AlN PVT growth.

  2. Utilisation of Modeling, Stress Analysis, Kinematics Optimisation, and Hypothetical Estimation of Lifetime in the Design Process of Mobile Working Machines

    NASA Astrophysics Data System (ADS)

    Izrael, Gregor; Bukoveczky, Juraj; Gulan, Ladislav

    2011-12-01

    The contribution deals with several methods used in the construction process, such as model creation, verification of the technical parameters of the machine, and life estimation of selected modules. The life cycle of mobile working machines, and of their load-carrying modules, is determined by investigating and subsequently processing results gained from service measurements. Machine life claimed by a producer is only relative, because the life of these machines depends not only on how a particular machine is worked but also on the state of the material handled by the machine and, to a great extent, on the operators, their observance of safety regulations, and the prescribed working conditions.

  3. Interactive graphics, the design process, and education

    SciTech Connect

    Norton, F.J.

    1980-09-01

    The field of design and drafting is changing continuously - its parameters are ever shifting and its applications are increasing. The use of Computer-Aided Design (CAD) and Computer-Aided Manufacturing (CAM) is becoming increasingly common in industry. However, instruction in CAD and CAM has in general not been incorporated into university curricula. This paper addresses the need for increased instruction in interactive graphics at the student level, and particularly in conjunction with the design process used by engineers, designers, and drafters. The development of three-dimensional graphical models using CAD is seen as a vital part of product development. Applications to printed circuit design and numerical control (NC) operations are discussed. Effective educational programs in the use of CAD must relate to designers, users, and managers and may be developed either by industry or academia. Possible approaches to new programs include coursework, projects involving CAD, and special collaborative efforts between industry and academic institutions. 1 figure.

  4. Parametric Design within an Atomic Design Process (ADP) applied to Spacecraft Design

    NASA Astrophysics Data System (ADS)

    Ramos Alarcon, Rafael

    This thesis describes research investigating the development of a model for the initial design of complex systems, with application to spacecraft design. The design model is called an atomic design process (ADP) and contains four fundamental stages (specifications, configurations, trade studies and drivers) that constitute the minimum steps of an iterative process that helps designers find a feasible solution. Representative design models from the aerospace industry are reviewed and are compared with the proposed model. The design model's relevance, adaptability and scalability features are evaluated through a focused design task exercise with two undergraduate teams and a long-term design exercise performed by a spacecraft payload team. The implementation of the design model is explained in the context in which the model has been researched. This context includes the organization (a student-run research laboratory at the University of Michigan), its culture (academically oriented), members that have used the design model and the description of the information technology elements meant to provide support while using the model. This support includes a custom-built information management system that consolidates relevant information that is currently being used in the organization. The information is divided in three domains: personnel development history, technical knowledge base and laboratory operations. The focused study with teams making use of the design model to complete an engineering design exercise consists of the conceptual design of an autonomous system, including a carrier and a deployable lander that form the payload of a rocket with an altitude range of over 1000 meters. Detailed results from each of the stages of the design process while implementing the model are presented, and an increase in awareness of good design practices in the teams while using the model are explained. A long-term investigation using the design model consisting of the

  5. Conceptual models of information processing

    NASA Technical Reports Server (NTRS)

    Stewart, L. J.

    1983-01-01

    The conceptual information processing issues are examined. Human information processing is defined as an active cognitive process that is analogous to a system. It is the flow and transformation of information within a human. The human is viewed as an active information seeker who is constantly receiving, processing, and acting upon the surrounding environmental stimuli. Human information processing models are conceptual representations of cognitive behaviors. Models of information processing are useful in representing the different theoretical positions and in attempting to define the limits and capabilities of human memory. It is concluded that an understanding of conceptual human information processing models and their applications to systems design leads to a better human factors approach.

  6. Modeling nuclear processes by Simulink

    SciTech Connect

    Rashid, Nahrul Khair Alang Md

    2015-04-29

    Modelling and simulation are essential parts in the study of dynamic systems behaviours. In nuclear engineering, modelling and simulation are important to assess the expected results of an experiment before the actual experiment is conducted or in the design of nuclear facilities. In education, modelling can give insight into the dynamic of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve the equations using analytical or numerical solutions consume time and distract attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox software that is widely used in control engineering, as a modelling platform for the study of nuclear processes including nuclear reactor behaviours. Starting from the describing equations, Simulink models for heat transfer, radionuclide decay process, delayed neutrons effect, reactor point kinetic equations with delayed neutron groups, and the effect of temperature feedback are used as examples.
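
    As a flavor of what such a block model computes, the sketch below integrates a two-member radionuclide decay chain, one of the example processes named above, with a plain explicit Euler loop in Python rather than Simulink; the half-lives and initial inventories are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Decay chain: dN1/dt = -l1*N1,  dN2/dt = l1*N1 - l2*N2
l1, l2 = np.log(2) / 8.02, np.log(2) / 2.3    # decay constants, 1/day (illustrative)
n1, n2 = 1.0e6, 0.0                            # initial atom counts (illustrative)
dt, t_end = 0.01, 30.0                         # time step and horizon, days

for _ in range(int(t_end / dt)):
    dn1 = -l1 * n1
    dn2 = l1 * n1 - l2 * n2
    n1 += dt * dn1                             # explicit Euler update
    n2 += dt * dn2

print(f"after {t_end:.0f} d: N1 = {n1:.3e}, N2 = {n2:.3e}")
```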

  7. NANEX: Process design and optimization.

    PubMed

    Baumgartner, Ramona; Matić, Josip; Schrank, Simone; Laske, Stephan; Khinast, Johannes; Roblegg, Eva

    2016-06-15

    Previously, we introduced a one-step nano-extrusion (NANEX) process for transferring aqueous nano-suspensions into solid formulations directly in the liquid phase. Nano-suspensions were fed into molten polymers via a side-feeding device and excess water was eliminated via devolatilization. However, the drug content in nano-suspensions is restricted to 30% (w/w), and obtaining sufficiently high drug loadings in the final formulation requires the processing of large amounts of water and thus a fundamental process understanding. To this end, we investigated four polymers with different physicochemical characteristics (Kollidon(®) VA64, Eudragit(®) E PO, HPMCAS and PEG 20000) in terms of their maximum water uptake/removal capacity. Process parameters such as throughput and screw speed were adapted, and their effects on the mean residence time and filling degree were studied. Additionally, one-dimensional discretization modeling was performed to examine the complex interactions between the screw geometry and the process parameters during water addition/removal. It was established that polymers with a certain water miscibility/solubility can be manufactured via NANEX. Long residence times of the molten polymer in the extruder and low filling degrees in the degassing zone favored the addition/removal of significant amounts of water. The residual moisture content in the final extrudates was comparable to that of extrudates manufactured without water. Copyright © 2016 Elsevier B.V. All rights reserved.

  8. Exploring the linkage between cell culture process parameters and downstream processing utilizing a plackett-burman design for a model monoclonal antibody.

    PubMed

    Agarabi, Cyrus D; Chavez, Brittany K; Lute, Scott C; Read, Erik K; Rogstad, Sarah; Awotwe-Otoo, David; Brown, Matthew R; Boyne, Michael T; Brorson, Kurt A

    2017-01-01

    Linkage of upstream cell culture with downstream processing and purification is an aspect of Quality by Design crucial for efficient and consistent production of high quality biopharmaceutical proteins. In a previous Plackett-Burman screening study of parallel bioreactor cultures, we evaluated the main effects of 11 process variables, such as agitation, sparge rate, feeding regimens, dissolved oxygen set point, inoculation density, supplement addition, temperature, and pH shifts. In this follow-up study, we observed linkages between cell culture process parameters and downstream capture chromatography performance and subsequent antibody attributes. In-depth analysis of the capture chromatography purification of harvested cell culture fluid yielded significant effects of upstream process parameters on host cell protein abundance and behavior. A variety of methods were used to characterize the antibody both after purification and after buffer formulation. This analysis provided insight into the significant impacts of upstream process parameters on aggregate formation, impurities, and protein structure. This report highlights the utility of linkage studies in identifying how changes in upstream parameters can impact downstream critical quality attributes. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 33:163-170, 2017.
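
    For readers unfamiliar with the screening design named above, the sketch below constructs the classical 12-run Plackett-Burman matrix that accommodates up to 11 two-level factors from its standard cyclic generator row; +1/-1 encode high/low settings, and mapping the columns onto the 11 process variables would be an arbitrary choice.

```python
import numpy as np

# Classical first row for the 12-run design (Plackett & Burman, 1946)
GENERATOR = [+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1]

def pb12():
    """12-run Plackett-Burman design: 11 cyclic shifts plus an all-minus run."""
    rows = [np.roll(GENERATOR, shift) for shift in range(11)]
    rows.append(-np.ones(11, dtype=int))
    return np.array(rows, dtype=int)

design = pb12()
print(design.shape)    # (12, 11): 12 runs, 11 factor columns
print(design)
```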

  9. Application of central composite design and artificial neural network in modeling of reactive blue 21 dye removal by photo-ozonation process.

    PubMed

    Mehrizad, Ali; Gharbani, Parvin

    2016-01-01

    The present study deals with the use of central composite design (CCD) and artificial neural networks (ANN) in the modeling and optimization of reactive blue 21 (RB21) removal from aqueous media under a photo-ozonation process. Four effective operational parameters (initial concentration of RB21, O3 concentration, UV light intensity and reaction time) were chosen, and the experiments were designed by CCD based on response surface methodology (RSM). The results obtained from the CCD model were used in modeling the process by ANN. Under the optimum conditions (O3 concentration of 3.95 mg/L, UV intensity of 20.5 W/m², reaction time of 7.77 min and initial dye concentration of 40.21 mg/L), RB21 removal efficiency reached up to 98.88%. An ANN topology with three layers, consisting of four input neurons, 14 hidden neurons and one output neuron, was designed. The relative significance of each major factor was calculated based on the connection weights of the ANN model. Dye and ozone concentrations were the most important variables in the photo-ozonation of RB21, followed by reaction time and UV light intensity. The comparison of the values predicted by CCD and ANN with experimental results showed that both methods were highly efficient in modeling the process.
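
    The coded run matrix behind a four-factor CCD like the one above is easy to reproduce; the sketch below generates the 2^4 factorial corners, the 2k axial points at ±α with α chosen for rotatability, and a single center point (replicated center runs, which a real design would include, are omitted for brevity).

```python
import itertools
import numpy as np

k = 4
alpha = (2 ** k) ** 0.25                       # rotatable design: alpha = 2.0 for k = 4

# 2^k factorial corner points at coded levels -1/+1
corners = np.array(list(itertools.product([-1, 1], repeat=k)))
# 2k axial (star) points at +/- alpha on each factor axis
axial = np.vstack([v * alpha * np.eye(k)[i] for i in range(k) for v in (-1, 1)])
center = np.zeros((1, k))                      # single center run

ccd = np.vstack([corners, axial, center])
print(ccd.shape)   # (25, 4): 16 corner + 8 axial + 1 center run
```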

  10. Conceptual Chemical Process Design for Sustainability.

    EPA Science Inventory

    This chapter examines the sustainable design of chemical processes, with a focus on conceptual design, hierarchical and short-cut methods, and analyses of process sustainability for alternatives. The chapter describes a methodology for incorporating process sustainability analyse...

  11. Kinetic Modeling of Microbiological Processes

    SciTech Connect

    Liu, Chongxuan; Fang, Yilin

    2012-08-26

    Kinetic description of microbiological processes is vital for the design and control of microbe-based biotechnologies such as waste water treatment, petroleum oil recovery, and contaminant attenuation and remediation. Various models have been proposed to describe microbiological processes. This editorial article discusses the advantages and limitations of these modeling approaches, including traditional Monod-type models and their derivatives, and recently developed constraint-based approaches. The article also offers future directions for modeling research best suited to petroleum and environmental biotechnologies.
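
    As a reference point for the Monod-type class mentioned above, here is a minimal sketch of batch Monod growth kinetics integrated with an explicit Euler loop; all parameter values are illustrative placeholders.

```python
# Monod kinetics: mu = mu_max * S / (Ks + S); biomass X grows, substrate S
# is consumed with yield Y (g biomass per g substrate).

mu_max, Ks, Y = 0.4, 2.0, 0.5      # 1/h, mg/L, dimensionless (illustrative)
X, S = 0.05, 50.0                   # initial biomass and substrate, mg/L
dt = 0.01                           # time step, h

for step in range(int(48 / dt)):    # simulate 48 hours
    mu = mu_max * S / (Ks + S)      # specific growth rate
    dX = mu * X
    X += dt * dX
    S = max(S - dt * dX / Y, 0.0)   # substrate cannot go negative

print(f"X = {X:.2f} mg/L, S = {S:.2f} mg/L after 48 h")
```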

  12. Gaussian Process Morphable Models.

    PubMed

    Luthi, Marcel; Gerig, Thomas; Jud, Christoph; Vetter, Thomas

    2017-08-14

    Models of shape variations have become a central component for the automated analysis of images. An important class of shape models are point distribution models (PDMs). These models represent a class of shapes as a normal distribution of point variations, whose parameters are estimated from example shapes. Principal component analysis (PCA) is applied to obtain a low-dimensional representation of the shape variation in terms of the leading principal components. In this paper, we propose a generalization of PDMs, which we refer to as Gaussian Process Morphable Models (GPMMs). We model the shape variations with a Gaussian process, which we represent using the leading components of its Karhunen-Loève expansion. To compute the expansion, we make use of an approximation scheme based on the Nyström method. The resulting model can be seen as a continuous analog of a standard PDM. However, while for PDMs the shape variation is restricted to the linear span of the example data, with GPMMs we can define the shape variation using any Gaussian process. For example, we can build shape models that correspond to classical spline models and thus do not require any example data. Furthermore, Gaussian processes make it possible to combine different models. For example, a PDM can be extended with a spline model, to obtain a model that incorporates learned shape characteristics but is flexible enough to explain shapes that cannot be represented by the PDM.
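
    A minimal numerical sketch of the paper's central construction, sampling shape deformations from a truncated Karhunen-Loève expansion of a Gaussian process, is given below; the 1-D point set and squared-exponential kernel are simplifying assumptions (the paper works with surfaces and uses a Nyström approximation for scale).

```python
import numpy as np

pts = np.linspace(0.0, 1.0, 100)                       # reference shape points (1-D)
# Squared-exponential covariance kernel over the points
K = np.exp(-0.5 * (pts[:, None] - pts[None, :]) ** 2 / 0.1 ** 2)

eigvals, eigvecs = np.linalg.eigh(K)                   # returned in ascending order
idx = np.argsort(eigvals)[::-1][:10]                   # keep 10 leading components
lam, phi = eigvals[idx], eigvecs[:, idx]

rng = np.random.default_rng(1)
alpha = rng.standard_normal(10)                        # KL coefficients ~ N(0, 1)
deformation = phi @ (np.sqrt(lam) * alpha)             # one sampled deformation field
print(deformation.shape)                               # (100,)
```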

  13. Creativity Processes of Students in the Design Studio

    ERIC Educational Resources Information Center

    Huber, Amy Mattingly; Leigh, Katharine E.; Tremblay, Kenneth R., Jr.

    2012-01-01

    The creative process is a multifaceted and dynamic path of thinking required to execute a project in design-based disciplines. The goal of this research was to test a model outlining the creative design process by investigating student experiences in a design project assignment. The study used an exploratory design to collect data from student…

  17. Hardware-software-co-design of parallel and distributed systems using a behavioural programming and multi-process model with high-level synthesis

    NASA Astrophysics Data System (ADS)

    Bosse, Stefan

    2011-05-01

    A new design methodology for parallel and distributed embedded systems is presented using the behavioural hardware compiler ConPro providing an imperative programming model based on concurrently communicating sequential processes (CSP) with an extensive set of interprocess-communication primitives and guarded atomic actions. The programming language and the compiler-based synthesis process enable the design of constrained power- and resource-aware embedded systems with pure Register-Transfer-Logic (RTL) efficiently mapped to FPGA and ASIC technologies. Concurrency is modelled explicitly on control- and datapath level. Additionally, concurrency on data-path level can be automatically explored and optimized by different schedulers. The CSP programming model can be synthesized to hardware (SoC) and software (C, ML) models and targets. A common source for both hardware and software implementation with identical functional behaviour is used. Processes and objects of the entire design can be distributed on different hardware and software platforms, for example, several FPGA components and software executed on several microprocessors, providing a parallel and distributed system. Intersystem-, interprocess-, and object communication is automatically implemented with serial links, not visible on programming level. The presented design methodology has the benefit of high modularity, freedom of choice of target technologies, and system architecture. Algorithms can be well matched to and distributed on different suitable execution platforms and implementation technologies, using a unique programming model, providing a balance of concurrency and resource complexity. An extended case study of a communication protocol used in high-density sensor-actuator networks demonstrates and compares the design of hardware and software targets. The communication protocol is suited for high-density intra- and interchip networks.
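
    A minimal software analogue of the CSP programming model described above, using Python threads and a bounded queue as the channel; this illustrates the communicating-processes idea only and is unrelated to ConPro's actual synthesis flow:

        import threading, queue

        channel = queue.Queue(maxsize=1)    # bounded channel, rendezvous-like

        def producer():
            for i in range(5):
                channel.put(i)              # blocks until the consumer is ready
            channel.put(None)               # end-of-stream marker

        def consumer():
            while (msg := channel.get()) is not None:
                print("received", msg)

        t1 = threading.Thread(target=producer)
        t2 = threading.Thread(target=consumer)
        t1.start(); t2.start(); t1.join(); t2.join()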

  18. Modeling of pulverized coal combustion processes in a vortex furnace of improved design. Part 1: Flow aerodynamics in a vortex furnace

    NASA Astrophysics Data System (ADS)

    Krasinsky, D. V.; Salomatov, V. V.; Anufriev, I. S.; Sharypov, O. V.; Shadrin, E. Yu.; Anikin, Yu. A.

    2015-02-01

    Some results of the complex experimental and numerical study of aerodynamics and transfer processes in a vortex furnace, whose design was improved via the distributed tangential injection of fuel-air flows through the upper and lower burners, were presented. The experimental study of the aerodynamic characteristics of a spatial turbulent flow was performed on the isothermal laboratory model (at a scale of 1 : 20) of an improved vortex furnace using a laser Doppler measurement system. The comparison of experimental data with the results of the numerical modeling of an isothermal flow for the same laboratory furnace model demonstrated their agreement to be acceptable for engineering practice.

  19. The watershed-scale optimized and rearranged landscape design (WORLD) model and local biomass processing depots for sustainable biofuel production: Integrated life cycle assessments

    SciTech Connect

    Eranki, Pragnya L.; Manowitz, David H.; Bals, Bryan D.; Izaurralde, Roberto C.; Kim, Seungdo; Dale, Bruce E.

    2013-07-23

    An array of feedstocks is being evaluated as potential raw material for cellulosic biofuel production. Thorough assessments are required in regional landscape settings before these feedstocks can be cultivated and sustainable management practices can be implemented. On the processing side, a potential solution to the logistical challenges of large biorefineries is provided by a network of distributed processing facilities called local biomass processing depots. A large-scale cellulosic ethanol industry is likely to emerge soon in the United States. We have the opportunity to influence the sustainability of this emerging industry. The watershed-scale optimized and rearranged landscape design (WORLD) model estimates land allocations for different cellulosic feedstocks at biorefinery scale without displacing current animal nutrition requirements. This model also incorporates a network of the aforementioned depots. An integrated life cycle assessment is then conducted over the unified system of optimized feedstock production, processing, and associated transport operations to evaluate net energy yields (NEYs) and environmental impacts.

  20. SU-E-T-760: Tolerance Design for Site-Specific Range in Proton Patient QA Process Using the Six Sigma Model

    SciTech Connect

    Lah, J; Shin, D; Kim, G

    2015-06-15

    Purpose: To show how tolerance design and tolerancing approaches can be used to predict and improve the site-specific range in the patient QA process when implementing Six Sigma. Methods: In this study, patient QA plans were selected according to 6 site-treatment groups: head & neck (94 cases), spine (76 cases), lung (89 cases), liver (53 cases), pancreas (55 cases), and prostate (121 cases), treated between 2007 and 2013. We evaluated a Six Sigma model that determines allowable deviations in design parameters and process variables in patient-specific QA; where possible, tolerances may be loosened and then customized if necessary to meet the functional requirements. The Six Sigma problem-solving methodology is known as the DMAIC phases, which stand for: Define a problem or improvement opportunity, Measure process performance, Analyze the process to determine the root causes of poor performance, Improve the process by fixing root causes, and Control the improved process to hold the gains. Results: The process capability for patient-specific range QA is 0.65 under a tolerance criterion of only ±1 mm. Our results suggested tolerance levels of ±2-3 mm for prostate and liver cases and ±5 mm for lung cases. We found that customized tolerances between calculated and measured ranges reduced patient QA plan failures, and almost all sites had failure rates of less than 1%. The average QA time also improved from 2 hr to less than 1 hr, including the planning and converting process, depth-dose measurement, and evaluation. Conclusion: The objective of tolerance design is to achieve optimization beyond that obtained through QA process improvement and statistical analysis, detailing what is needed to implement a Six Sigma-capable design.
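
    A back-of-the-envelope version of the process-capability arithmetic behind these results, assuming a two-sided range tolerance; the 0.51 mm sigma below is chosen only so the numbers reproduce the reported Cp of about 0.65 at ±1 mm:

        # Process capability index for a two-sided tolerance.
        def cp(usl_mm, lsl_mm, sigma_mm):
            return (usl_mm - lsl_mm) / (6.0 * sigma_mm)

        print(round(cp(+1.0, -1.0, 0.51), 2))   # ~0.65, as in the abstract
        print(round(cp(+3.0, -3.0, 0.51), 2))   # ~1.96 after loosening to +/-3 mm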

  1. Radiology interpretation process modeling.

    PubMed

    Noumeir, Rita

    2006-04-01

    Information and communication technology in healthcare promises optimized patient care while ensuring efficiency and cost-effectiveness. However, the promised results are not yet achieved; the healthcare process requires analysis and radical redesign to achieve improvements in care quality and productivity. Healthcare process reengineering is thus necessary and involves modeling its workflow. Even though the healthcare process is very large and not very well modeled yet, its sub-processes can be modeled individually, providing fundamental pieces of the whole model. In this paper, we are interested in modeling the radiology interpretation process that results in generating a diagnostic radiology report. This radiology report is an important clinical element of the patient healthcare record and assists in healthcare decisions. We present the radiology interpretation process by identifying its boundaries and by positioning it on the large healthcare process map. Moreover, we discuss an information data model and identify roles, tasks and several information flows. Furthermore, we describe standard frameworks to enable radiology interpretation workflow implementations between heterogeneous systems.

  2. Models of the Reading Process.

    PubMed

    Rayner, Keith; Reichle, Erik D

    2010-11-01

    Reading is a complex skill involving the orchestration of a number of components. Researchers often talk about a "model of reading" when talking about only one aspect of the reading process (for example, models of word identification are often referred to as "models of reading"). Here, we review prominent models that are designed to account for (1) word identification, (2) syntactic parsing, (3) discourse representations, and (4) how certain aspects of language processing (e.g., word identification), in conjunction with other constraints (e.g., limited visual acuity, saccadic error, etc.), guide readers' eyes. Unfortunately, it is the case that these various models addressing specific aspects of the reading process seldom make contact with models dealing with other aspects of reading. Thus, for example, the models of word identification seldom make contact with models of eye movement control, and vice versa. While this may be unfortunate in some ways, it is quite understandable in other ways because reading itself is a very complex process. We discuss prototypical models of aspects of the reading process in the order mentioned above. We do not review all possible models, but rather focus on those we view as being representative and most highly recognized.

  3. Design of intelligent controllers for exothermal processes

    NASA Astrophysics Data System (ADS)

    Nagarajan, Ramachandran; Yaacob, Sazali

    2001-10-01

    Chemical industries such as resin or soap manufacturing industries have reaction systems that work with at least two chemicals. Mixing of chemicals even at room temperature can initiate an exothermic reaction, which produces a sudden increase of heat energy within the mixture. The quantity of heat and the dynamics of heat generation are unknown, unpredictable, and time varying. Proper control of heat has to be accomplished in order to achieve a high-quality product; uncontrolled or poorly controlled heat yields an unusable product, and the process may damage materials and systems and even harm human beings. Controlling heat due to exothermic reaction cannot be achieved using conventional control methods such as PID control or identification and control, because all of the conventional methods require at least an approximate mathematical model of the exothermic process, and modeling an exothermal process is yet to be properly conceived. This paper discusses a design methodology for controlling such a process. A pilot plant of a reaction system has been constructed and utilized for designing and incorporating the proposed fuzzy-logic-based intelligent controller. Both the conventional and then an adaptive form of fuzzy logic control were used in testing the performance. The test results confirm the effectiveness of the controllers in controlling exothermic heat.

  4. Design Thinking in Elementary Students' Collaborative Lamp Designing Process

    ERIC Educational Resources Information Center

    Kangas, Kaiju; Seitamaa-Hakkarainen, Pirita; Hakkarainen, Kai

    2013-01-01

    Design and Technology education is potentially a rich environment for successful learning, if the management of the whole design process is emphasised, and students' design thinking is promoted. The aim of the present study was to unfold the collaborative design process of one team of elementary students, in order to understand their multimodal…

  5. Biosphere Process Model Report

    SciTech Connect

    J. Schmitt

    2000-05-25

    To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor.

  6. ESS Accelerator Cryoplant Process Design

    NASA Astrophysics Data System (ADS)

    Wang, X. L.; Arnold, P.; Hees, W.; Hildenbeutel, J.; Weisend, J. G., II

    2015-12-01

    The European Spallation Source (ESS) is a neutron-scattering facility being built with extensive international collaboration in Lund, Sweden. The ESS accelerator will deliver protons with 5 MW of power to the target at 2.0 GeV, with a nominal current of 62.5 mA. The superconducting part of the accelerator is about 300 meters long and contains 43 cryomodules. The ESS accelerator cryoplant (ACCP) will provide the cooling for the cryomodules and the cryogenic distribution system that delivers the helium to the cryomodules. The ACCP will cover three cryogenic circuits: bath cooling for the cavities at 2 K, the thermal shields at around 40 K, and the power couplers' thermalisation with 4.5 K forced helium cooling. The open competitive bid for the ACCP took place in 2014, with Linde Kryotechnik AG being selected as the vendor. This paper summarizes the progress in the ACCP development and engineering. The current status is presented, including final cooling requirements, preliminary process design, system configuration, machine concept and layout, main parameters and features, the solution for the acceptance tests, and exergy analysis and efficiency.

  7. Optimal design of solidification processes

    NASA Technical Reports Server (NTRS)

    Dantzig, Jonathan A.; Tortorelli, Daniel A.

    1991-01-01

    An optimal design algorithm is presented for the analysis of general solidification processes, and is demonstrated for the growth of GaAs crystals in a Bridgman furnace. The system is optimal in the sense that the prespecified temperature distribution in the solidifying materials is obtained to maximize product quality. The optimization uses traditional numerical programming techniques which require the evaluation of cost and constraint functions and their sensitivities. The finite element method is incorporated to analyze the crystal solidification problem, evaluate the cost and constraint functions, and compute the sensitivities. These techniques are demonstrated in the crystal growth application by determining an optimal furnace wall temperature distribution to obtain the desired temperature profile in the crystal, and hence to maximize the crystal's quality. Several numerical optimization algorithms are studied to determine the proper convergence criteria, effective 1-D search strategies, appropriate forms of the cost and constraint functions, etc. In particular, we incorporate the conjugate gradient and quasi-Newton methods for unconstrained problems. The efficiency and effectiveness of each algorithm is presented in the example problem.
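
    A compact sketch of the inverse-design loop described above, assuming a fictitious linear thermal model in place of the finite element analysis, and using a quasi-Newton optimizer (BFGS) of the kind the abstract mentions:

        import numpy as np
        from scipy.optimize import minimize

        # Choose furnace wall temperatures so a toy thermal model matches a
        # prescribed crystal temperature profile; the real study used FEM.
        n = 20
        A = np.exp(-0.3 * np.abs(np.subtract.outer(np.arange(n), np.arange(n))))
        target = np.linspace(900.0, 1100.0, n)       # desired profile, K

        cost = lambda t_wall: np.sum((A @ t_wall - target)**2)
        res = minimize(cost, x0=np.full(n, 1000.0), method="BFGS")
        print(res.success, round(res.fun, 6))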

  8. Using an Analogical Thinking Model as an Instructional Tool to Improve Student Cognitive Ability in Architecture Design Learning Process

    ERIC Educational Resources Information Center

    Wu, Yun-Wu; Weng, Kuo-Hua

    2013-01-01

    Lack of creativity is a problem often plaguing students from design-related departments. Therefore, this study is intended to incorporate analogical thinking in the education of architecture design to enhance students' learning and their future career performance. First, this study explores the three aspects of architecture design curricula,…

  10. Computer-aided software development process design

    NASA Technical Reports Server (NTRS)

    Lin, Chi Y.; Levary, Reuven R.

    1989-01-01

    The authors describe an intelligent tool designed to aid managers of software development projects in planning, managing, and controlling the development process of medium- to large-scale software projects. Its purpose is to reduce uncertainties in the budget, personnel, and schedule planning of software development projects. It is based on a dynamic model for the software development and maintenance life-cycle process. This dynamic process is composed of a number of time-varying, interacting developmental phases, each characterized by its intended functions and requirements. System dynamics is used as a modeling methodology. The resulting Software LIfe-Cycle Simulator (SLICS) and the hybrid expert simulation system of which it is a subsystem are described.

  12. A systems-based approach for integrated design of materials, products and design process chains

    NASA Astrophysics Data System (ADS)

    Panchal, Jitesh H.; Choi, Hae-Jin; Allen, Janet K.; McDowell, David L.; Mistree, Farrokh

    2007-12-01

    The concurrent design of materials and products provides designers with flexibility to achieve design objectives that were not previously accessible. However, the improved flexibility comes at a cost of increased complexity of the design process chains and the materials simulation models used for executing the design chains. Efforts to reduce the complexity generally result in increased uncertainty. We contend that a systems based approach is essential for managing both the complexity and the uncertainty in design process chains and simulation models in concurrent material and product design. Our approach is based on simplifying the design process chains systematically such that the resulting uncertainty does not significantly affect the overall system performance. Similarly, instead of striving for accurate models for multiscale systems (that are inherently complex), we rely on making design decisions that are robust to uncertainties in the models. Accordingly, we pursue hierarchical modeling in the context of design of multiscale systems. In this paper our focus is on design process chains. We present a systems based approach, premised on the assumption that complex systems can be designed efficiently by managing the complexity of design process chains. The approach relies on (a) the use of reusable interaction patterns to model design process chains, and (b) consideration of design process decisions using value-of-information based metrics. The approach is illustrated using a Multifunctional Energetic Structural Material (MESM) design example. Energetic materials store considerable energy which can be released through shock-induced detonation; conventionally, they are not engineered for strength properties. The design objectives for the MESM in this paper include both sufficient strength and energy release characteristics. The design is carried out by using models at different length and time scales that simulate different aspects of the system. Finally, by

  13. Microwave sintering process model.

    PubMed

    Peng, Hu; Tinga, W R; Sundararaj, U; Eadie, R L

    2003-01-01

    In order to simulate and optimize the microwave sintering process for silicon nitride and tungsten carbide/cobalt toolbits, a microwave sintering process model has been built. A cylindrical sintering furnace was used containing a heat-insulating layer, a susceptor layer, and an alumina tube containing the green toolbit parts between parallel, electrically conductive graphite plates. Dielectric and absorption properties of the silicon nitride green parts, the tungsten carbide/cobalt green parts, and an oxidizable susceptor material were measured using perturbation and waveguide transmission methods. Microwave absorption data were measured over a temperature range from 20 degrees C to 800 degrees C. These data were then used in the microwave process model, which assumed plane wave propagation along the radial direction and included the microwave reflection at each interface between the materials and the microwave absorption in the bulk materials. Heat transfer between the components inside the cylindrical sintering furnace was also included in the model. The simulated heating process data for both silicon nitride and tungsten carbide/cobalt samples closely follow the experimental data. By varying the physical parameters of the sintering furnace model, such as the thickness of the susceptor layer, the thickness of the alumina tube wall, the sample load volume, and the graphite plate mass, the model predicts their effects, which is helpful in optimizing those parameters in the industrial sintering process.
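
    A hedged 1-D sketch of the plane-wave bookkeeping such a model performs along the radial direction; the reflection coefficients, absorption coefficients, and layer thicknesses below are made up for illustration:

        import math

        # (reflectance at entry, absorption coefficient 1/m, thickness m)
        layers = [
            (0.05, 5.0, 0.010),    # alumina tube wall (illustrative values)
            (0.20, 60.0, 0.005),   # susceptor layer
            (0.10, 30.0, 0.020),   # green part
        ]

        P = 1.0                    # incident power, normalized
        for R, alpha, d in layers:
            P *= (1.0 - R)                           # transmitted past interface
            absorbed = P * (1.0 - math.exp(-alpha * d))
            P -= absorbed                            # remaining for next layer
            print(f"absorbed {absorbed:.3f}, remaining {P:.3f}")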

  14. Foam process models.

    SciTech Connect

    Moffat, Harry K.; Noble, David R.; Baer, Thomas A.; Adolf, Douglas Brian; Rao, Rekha Ranjana; Mondy, Lisa Ann

    2008-09-01

    In this report, we summarize our work on developing a production-level foam processing computational model suitable for predicting the self-expansion of foam in complex geometries. The model is based on a finite element representation of the equations of motion, with the movement of the free surface represented using the level set method, and has been implemented in SIERRA/ARIA. An empirically based time- and temperature-dependent density model is used to encapsulate the complex physics of foam nucleation and growth in a numerically tractable model. The change in density with time is at the heart of the foam self-expansion, as it creates the motion of the foam. This continuum-level model uses a homogenized description of foam, which does not include the gas explicitly. Results from the model are compared to temperature-instrumented flow visualization experiments giving the location of the foam front as a function of time for our EFAR model system.

  15. "From the Formal to the Innovative": The Use of Case Studies and Sustainable Projects in Developing a Design Process Model for Educating Product/Industrial Designers

    ERIC Educational Resources Information Center

    Oakes, G. L.; Felton, A. J.; Garner, K. B.

    2006-01-01

    The BSc in computer aided product design (CAPD) course at the University of Wolverhampton was conceived as a collaborative venture in 1989 between the School of Engineering and the School of Art and Design. The award was at the forefront of forging interdisciplinary collaboration at undergraduate level in the field of product design. It has…

  16. "From the Formal to the Innovative": The Use of Case Studies and Sustainable Projects in Developing a Design Process Model for Educating Product/Industrial Designers

    ERIC Educational Resources Information Center

    Oakes, G. L.; Felton, A. J.; Garner, K. B.

    2006-01-01

    The BSc in computer aided product design (CAPD) course at the University of Wolverhampton was conceived as a collaborative venture in 1989 between the School of Engineering and the School of Art and Design. The award was at the forefront of forging interdisciplinary collaboration at undergraduate level in the field of product design. It has…

  17. Balloon Thermal Model Design Parameters and Sensitivities

    NASA Technical Reports Server (NTRS)

    Ferguson, Douglas

    2017-01-01

    This presentation describes the thought process for determining balloon thermal model design parameters, including environmental parameters taken from NASA's top-of-atmosphere (TOA) database, and shows the sensitivity of an example model's key temperature results to those input parameters.

  18. Design of forging process variables under uncertainties

    NASA Astrophysics Data System (ADS)

    Repalle, Jalaja; Grandhi, Ramana V.

    2005-02-01

    Forging is a complex nonlinear process that is vulnerable to various manufacturing anomalies, such as variations in billet geometry, billet/die temperatures, material properties, and workpiece and forging equipment positional errors. A combination of these uncertainties could induce heavy manufacturing losses through premature die failure, final part geometric distortion, and reduced productivity. Identifying, quantifying, and controlling the uncertainties will reduce variability risk in a manufacturing environment, which will minimize the overall production cost. In this article, various uncertainties that affect the forging process are identified, and their cumulative effect on the forging tool life is evaluated. Because the forging process simulation is time-consuming, a response surface model is used to reduce computation time by establishing a relationship between the process performance and the critical process variables. A robust design methodology is developed by incorporating reliability-based optimization techniques to obtain sound forging components. A case study of an automotive-component forging-process design is presented to demonstrate the applicability of the method.
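
    A schematic of the surrogate-plus-sampling workflow the article describes, with synthetic data standing in for the expensive forging simulations; the quadratic response surface and all numbers are illustrative assumptions:

        import numpy as np

        rng = np.random.default_rng(0)
        # 30 "simulations": die life as a function of temperature and speed.
        X = rng.uniform([950.0, 0.5], [1150.0, 2.0], size=(30, 2))
        T, v = X[:, 0], X[:, 1]
        life = 1e5 - 40*(T - 1050)**2 - 8e3*(v - 1.0)**2 + rng.normal(0, 500, 30)

        # Fit a quadratic response surface by least squares.
        F = np.column_stack([np.ones_like(T), T, v, T*v, T**2, v**2])
        beta, *_ = np.linalg.lstsq(F, life, rcond=None)

        # Monte Carlo the cheap surrogate over input uncertainty.
        Ts = rng.normal(1050.0, 10.0, 10000)
        vs = rng.normal(1.0, 0.05, 10000)
        Fs = np.column_stack([np.ones_like(Ts), Ts, vs, Ts*vs, Ts**2, vs**2])
        print("P(die life < 90k cycles) ~", np.mean(Fs @ beta < 9e4))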

  19. Hafnium transistor process design for neural interfacing.

    PubMed

    Parent, David W; Basham, Eric J

    2009-01-01

    A design methodology is presented that uses 1-D process simulations of Metal Insulator Semiconductor (MIS) structures to design the threshold voltage of hafnium oxide based transistors used for neural recording. The methodology comprises 1-D analytical equations for threshold voltage specification and doping profiles, and 1-D MIS Technology Computer Aided Design (TCAD) to design a process that implements a specific threshold voltage, which minimized simulation time. The process was then verified with a 2-D process/electrical TCAD simulation. Hafnium oxide films (HfO) were grown and characterized for dielectric constant and fixed oxide charge at various annealing temperatures, two important design variables in threshold voltage design.
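
    For orientation, the textbook long-channel threshold-voltage relation that this kind of 1-D design iteration starts from; the high-k dielectric constant, flat-band voltage, and fixed charge below are assumed values, not the paper's measured data:

        import math

        q, eps0 = 1.602e-19, 8.854e-12                 # C, F/m
        kT_q, n_i, eps_si = 0.0259, 1.0e16, 11.7 * 8.854e-12   # V, 1/m^3, F/m
        N_a, t_ox, k_ox = 5e23, 10e-9, 20.0            # 1/m^3, m, high-k (assumed)
        V_fb, Q_f = -0.9, 5e-4                         # V, C/m^2 (assumed)

        phi_f = kT_q * math.log(N_a / n_i)             # Fermi potential
        C_ox = k_ox * eps0 / t_ox                      # gate capacitance per area
        Q_dep = math.sqrt(2 * q * eps_si * N_a * 2 * phi_f)
        V_th = V_fb + 2 * phi_f + Q_dep / C_ox - Q_f / C_ox
        print(f"V_th ~ {V_th:.2f} V")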

  20. Design Expert's Participation in Elementary Students' Collaborative Design Process

    ERIC Educational Resources Information Center

    Kangas, Kaiju; Seitamaa-Hakkarainen, Pirita; Hakkarainen, Kai

    2013-01-01

    The main goal of the present study was to provide insights into how disciplinary expertise might be infused into Design and Technology classrooms and how authentic processes based on professional design practices might be constructed. We describe elementary students' collaborative lamp designing process, where the leadership was provided by a…

  1. An Integrative Model for Teaching Research Design.

    ERIC Educational Resources Information Center

    Packard, Richard D.; Dereshiwsky, Mary I.

    This paper presents a model which illustrates the cyclical and interactive nature of the basic elements of the research design process. Rather than presenting each research design component in isolation, the model emphasizes their interrelationships. A brief discussion is presented on each of the following components of the model: (1) the "words"…

  2. Gaps in the Design Process

    SciTech Connect

    Veers, Paul

    2016-10-04

    The design of offshore wind plants is a relatively new field. The move into U.S. waters will bring unique environmental conditions, as well as expectations from the authorities responsible for managing the development. Wind turbine designers are required to check their assumed design conditions against the site conditions of the plant. There are still some outstanding issues concerning how we can assure that the designs of both the turbine and the foundation are appropriate for the site and will have an acceptable level of risk associated with the particular installation.

  3. Launch Vehicle Design Process Characterization Enables Design/Project Tool

    NASA Technical Reports Server (NTRS)

    Blair, J. C.; Ryan, R. S.; Schutzenhofer, L. A.; Robinson, Nancy (Technical Monitor)

    2001-01-01

    The objectives of the project described in this viewgraph presentation included the following: (1) Provide an overview characterization of the launch vehicle design process; and (2) Delineate a design/project tool to identify, document, and track pertinent data.

  4. Modeling Production Plant Forming Processes

    SciTech Connect

    Rhee, M; Becker, R; Couch, R; Li, M

    2004-09-22

    Engineering has simulation tools and experience in modeling forming processes. Y-12 personnel have expressed interest in validating our tools and experience against their manufacturing process activities such as rolling, casting, and forging. We have demonstrated numerical capabilities in a collaborative DOE/OIT project with ALCOA that is nearing successful completion. The goal was to use ALE3D to model Alcoa's slab rolling process in order to demonstrate a computational tool that would allow Alcoa to define a rolling schedule that would minimize the probability of ingot fracture, thus reducing waste and energy consumption. It is intended to lead to long-term collaboration with Y-12 and perhaps involvement with other components of the weapons production complex. Using simulations to aid in the design of forming processes can decrease time to production, reduce forming trials and associated expenses, and guide development of products with greater uniformity and less scrap.

  5. Graphic Design in Libraries: A Conceptual Process

    ERIC Educational Resources Information Center

    Ruiz, Miguel

    2014-01-01

    Providing successful library services requires efficient and effective communication with users; therefore, it is important that content creators who develop visual materials understand key components of design and, specifically, develop a holistic graphic design process. Graphic design, as a form of visual communication, is the process of…

  6. Reengineering the JPL Spacecraft Design Process

    NASA Technical Reports Server (NTRS)

    Briggs, C.

    1995-01-01

    This presentation describes the factors that have emerged in the evolved process of reengineering the unmanned spacecraft design process at the Jet Propulsion Laboratory in Pasadena, California. Topics discussed include: New facilities, new design factors, new system-level tools, complex performance objectives, changing behaviors, design integration, leadership styles, and optimization.

  8. The 9-Step Problem Design Process for Problem-Based Learning: Application of the 3C3R Model

    ERIC Educational Resources Information Center

    Hung, Woei

    2009-01-01

    The design of problems is crucial for the effectiveness of problem-based learning (PBL). Research has shown that PBL problems have not always been effective. Ineffective PBL problems could affect whether students acquire sufficient domain knowledge, activate appropriate prior knowledge, and properly direct their own learning. This paper builds on…

  9. Launch Vehicle Design Process: Characterization, Technical Integration, and Lessons Learned

    NASA Technical Reports Server (NTRS)

    Blair, J. C.; Ryan, R. S.; Schutzenhofer, L. A.; Humphries, W. R.

    2001-01-01

    Engineering design is a challenging activity for any product. Since launch vehicles are highly complex and interconnected and have extreme energy densities, their design represents a challenge of the highest order. The purpose of this document is to delineate and clarify the design process associated with the launch vehicle for space flight transportation. The goal is to define and characterize a baseline for the space transportation design process. This baseline can be used as a basis for improving effectiveness and efficiency of the design process. The baseline characterization is achieved via compartmentalization and technical integration of subsystems, design functions, and discipline functions. First, a global design process overview is provided in order to show responsibility, interactions, and connectivity of overall aspects of the design process. Then design essentials are delineated in order to emphasize necessary features of the design process that are sometimes overlooked. Finally the design process characterization is presented. This is accomplished by considering project technical framework, technical integration, process description (technical integration model, subsystem tree, design/discipline planes, decision gates, and tasks), and the design sequence. Also included in the document are a snapshot relating to process improvements, illustrations of the process, a survey of recommendations from experienced practitioners in aerospace, lessons learned, references, and a bibliography.

  10. NASA System Engineering Design Process

    NASA Technical Reports Server (NTRS)

    Roman, Jose

    2011-01-01

    This slide presentation reviews NASA's use of systems engineering for the complete life cycle of a project. Systems engineering is a methodical, disciplined approach for the design, realization, technical management, operations, and retirement of a system. Each phase of a NASA project is terminated with a Key Decision Point (KDP), which is supported by major reviews.

  11. Three-dimensional model of the skull and the cranial bones reconstructed from CT scans designed for rapid prototyping process.

    PubMed

    Skrzat, Janusz; Spulber, Alexandru; Walocha, Jerzy

    This paper presents the effects of building mesh models of the human skull and the cranial bones from a series of CT scans. With the aid of computer software, 3D reconstructions of the whole skull and segmented cranial bones were performed and visualized by surface rendering techniques. The article briefly discusses clinical and educational applications of 3D cranial models created using stereolithographic reproduction.

  12. The Architectural and Interior Design Planning Process.

    ERIC Educational Resources Information Center

    Cohen, Elaine

    1994-01-01

    Explains the planning process in designing effective library facilities and discusses library building requirements that result from electronic information technologies. Highlights include historical structures; Americans with Disabilities Act; resource allocation; electrical power; interior spaces; lighting; design development; the roles of…

  14. Estimation of environment-related properties of chemicals for design of sustainable processes: development of group-contribution+ (GC+) property models and uncertainty analysis.

    PubMed

    Hukkerikar, Amol Shivajirao; Kalakul, Sawitree; Sarup, Bent; Young, Douglas M; Sin, Gürkan; Gani, Rafiqul

    2012-11-26

    The application of the developed property models for the estimation of environment-related properties, and of the uncertainties of the estimated property values, is highlighted through an illustrative example. The developed property models provide reliable estimates of environment-related properties needed to perform process synthesis, design, and analysis of sustainable chemical processes, and they allow one to evaluate the effect of uncertainties in estimated property values on the calculated performance of processes, giving useful insights into the quality and reliability of the design of sustainable processes.

  15. Biomimetic design processes in architecture: morphogenetic and evolutionary computational design.

    PubMed

    Menges, Achim

    2012-03-01

    Design computation has profound impact on architectural design methods. This paper explains how computational design enables the development of biomimetic design processes specific to architecture, and how they need to be significantly different from established biomimetic processes in engineering disciplines. The paper first explains the fundamental difference between computer-aided and computational design in architecture, as the understanding of this distinction is of critical importance for the research presented. Thereafter, the conceptual relation and possible transfer of principles from natural morphogenesis to design computation are introduced and the related developments of generative, feature-based, constraint-based, process-based and feedback-based computational design methods are presented. This morphogenetic design research is then related to exploratory evolutionary computation, followed by the presentation of two case studies focusing on the exemplary development of spatial envelope morphologies and urban block morphologies.

  16. A Model of Continuous Improvement in High Schools: A Process for Research, Innovation Design, Implementation, and Scale

    ERIC Educational Resources Information Center

    Cohen-Vogel, Lora; Cannata, Marisa; Rutledge, Stacey A.; Socol, Allison Rose

    2016-01-01

    This chapter describes a model for continuous improvement that guides the work of the National Center on Scaling Up Effective Schools, or NCSU. NCSU is a research and development center funded by the Institute for Education Sciences, the research arm of the United States Department of Education. At the core of the Center's work is an innovative…

  18. Forging process design for risk reduction

    NASA Astrophysics Data System (ADS)

    Mao, Yongning

    In this dissertation, forging process design has been investigated with a primary concern for risk reduction. Different forged components have been studied, especially those that could cause catastrophic loss if failure occurs. As an effective modeling methodology, finite element analysis is applied extensively in this work. Three examples, a titanium compressor disk, a superalloy turbine disk, and a titanium hip prosthesis, are discussed to demonstrate this approach. Discrete defects such as hard alpha anomalies are known to cause disastrous failure if they are present in stress-critical components. In this research, hard-alpha inclusion movement during forging of a titanium compressor disk is studied by finite element analysis. By combining the results from the finite element method (FEM), regression modeling, and Monte Carlo simulation, it is shown that changing the forging path is able to mitigate the failure risk of the components during service. The second example concerns a turbine disk made of superalloy IN 718. The effect of forging on microstructure is the main consideration in this study, since microstructure defines the as-forged disk properties. Considering specific forging conditions, the preform has its own effect on the microstructure. Through a sensitivity study it is found that forging temperature and speed have a significant influence on the microstructure. In order to choose processing parameters that optimize the microstructure, the dependence of microstructure on die speed and temperature is thoroughly studied using design of numerical experiments. For various desired goals, optimal solutions are determined. The narrow processing window of titanium alloys makes isothermal forging a preferred way to produce forged parts without forging defects. However, the cost of isothermal forging (dies at the same temperature as the workpiece) limits its wide application. In this research, it has been demonstrated that with proper process design, the die

  19. Rapid Prototyping in the Instructional Design Process.

    ERIC Educational Resources Information Center

    Nixon, Elizabeth Krick; Lee, Doris

    2001-01-01

    Discusses instructional design models and examines rapid prototyping, a model that combines computer design strategies, constructivist learning theory, and cognitive psychology. Highlights include limitations of linear models; instructional problems appropriate and those not appropriate for rapid prototyping; and rapid prototyping as a paradigm…

  20. XML-based product information processing method for product design

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen Yu

    2012-01-01

    Design knowledge of modern mechatronics products centers on information processing in knowledge-intensive engineering; thus product design innovation is essentially knowledge and information processing innovation. Based on an analysis of the role of mechatronics product design knowledge and its information management features, a unified model of an XML-based product information processing method is proposed. The information processing model of product design includes functional knowledge, structural knowledge, and their relationships. XML-based expressions of product function elements, product structure elements, and the mapping relationships between function and structure are proposed. The information processing of a parallel friction roller is given as an example, which demonstrates that this method is clearly helpful for knowledge-based design systems and product innovation.
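
    A hypothetical mini-instance of the unified XML model sketched above: function elements, structure elements, and the function-to-structure mapping. The element and attribute names are invented for illustration, not taken from the paper:

        import xml.etree.ElementTree as ET

        doc = ET.fromstring("""
        <product name="friction_roller">
          <functions><function id="F1">transmit torque</function></functions>
          <structures><structure id="S1">roller pair</structure></structures>
          <mappings><map function="F1" structure="S1"/></mappings>
        </product>""")

        # Resolve each mapping back to its function and structure elements.
        for m in doc.find("mappings"):
            f = doc.find(f".//function[@id='{m.get('function')}']")
            s = doc.find(f".//structure[@id='{m.get('structure')}']")
            print(f.text, "->", s.text)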

  2. Practicing universal design to actual hand tool design process.

    PubMed

    Lin, Kai-Chieh; Wu, Chih-Fu

    2015-09-01

    UD evaluation principles are difficult to implement in product design. This study proposes a methodology for implementing UD in the design process through user participation. The original UD principles and user experience are used to develop the evaluation items, and differences between product types were considered. Factor analysis and Quantification Theory Type I were used to eliminate evaluation items considered inappropriate and to examine the relationship between evaluation items and product design factors. Product design specifications were established for verification. The results showed that converting user evaluations into crucial design verification factors via a generalized evaluation scale based on product attributes, together with applying the design factors in product design, can improve users' UD evaluations. The design process of this study is expected to contribute to user-centered UD application.

  3. Modeling hyporheic zone processes

    USGS Publications Warehouse

    Runkel, Robert L.; McKnight, Diane M.; Rajaram, Harihar

    2003-01-01

    Stream biogeochemistry is influenced by the physical and chemical processes that occur in the surrounding watershed. These processes include the mass loading of solutes from terrestrial and atmospheric sources, the physical transport of solutes within the watershed, and the transformation of solutes due to biogeochemical reactions. Research over the last two decades has identified the hyporheic zone as an important part of the stream system in which these processes occur. The hyporheic zone may be loosely defined as the porous areas of the stream bed and stream bank in which stream water mixes with shallow groundwater. Exchange of water and solutes between the stream proper and the hyporheic zone has many biogeochemical implications, due to differences in the chemical composition of surface and groundwater. For example, surface waters are typically oxidized environments with relatively high dissolved oxygen concentrations. In contrast, reducing conditions are often present in groundwater systems leading to low dissolved oxygen concentrations. Further, microbial oxidation of organic materials in groundwater leads to supersaturated concentrations of dissolved carbon dioxide relative to the atmosphere. Differences in surface and groundwater pH and temperature are also common. The hyporheic zone is therefore a mixing zone in which there are gradients in the concentrations of dissolved gasses, the concentrations of oxidized and reduced species, pH, and temperature. These gradients lead to biogeochemical reactions that ultimately affect stream water quality. Due to the complexity of these natural systems, modeling techniques are frequently employed to quantify process dynamics.
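
    One widely used formalization of this stream-hyporheic exchange is the transient storage model (the basis of the first author's OTIS code); in LaTeX notation, for main-channel concentration C and storage-zone concentration C_S:

        \frac{\partial C}{\partial t} = -\frac{Q}{A}\frac{\partial C}{\partial x}
          + \frac{1}{A}\frac{\partial}{\partial x}\left( A D \frac{\partial C}{\partial x} \right)
          + \frac{q_L}{A}\left( C_L - C \right) + \alpha \left( C_S - C \right)

        \frac{d C_S}{d t} = \alpha \frac{A}{A_S}\left( C - C_S \right)

    Here Q is discharge, A and A_S are the channel and storage-zone cross-sectional areas, D is dispersion, q_L and C_L are the lateral inflow and its concentration, and alpha is the stream-storage exchange coefficient.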

  4. Advanced deformation process modeling

    SciTech Connect

    Kocks, U.F.; Embury, J.D.; Beaudoin, A.J.; Dawson, P.R.; MacEwen, S.R.; Mecking, H.J.

    1997-08-01

    Progress was made in achieving a comprehensive and coherent description of material behavior in deformation processing. The materials included were metals, alloys, intermetallic compounds, arbitrary lattice structure, and metal matrix composites. Aspects of behavior modeled included kinetics of flow and strain hardening, as well as recrystallization and the various anisotropies of strength and compliance. Highlights include a new prediction of the limiting strength of materials at high temperature, a new understanding of the generation of new grain boundaries during forming operations, and a quantitatively verified computer simulation of texture development and the resulting behavioral anisotropies.

  5. An Analysis of Algorithmic Processes and Instructional Design.

    ERIC Educational Resources Information Center

    Schmid, Richard F.; Gerlach, Vernon S.

    1986-01-01

    Describes algorithms and shows how they can be applied to the design of instructional systems by relating them to a standard information processing model. Two studies are briefly described which tested serial and parallel processing in learning and offered guidelines for designers. Future research needs are also discussed. (LRW)

  6. A design optimization process for Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Chamberlain, Robert G.; Fox, George; Duquette, William H.

    1990-01-01

    The Space Station Freedom Program is used to develop and implement a process for design optimization. Because the relative worth of arbitrary design concepts cannot be assessed directly, comparisons must be based on designs that provide the same performance from the point of view of station users; such designs can be compared in terms of life cycle cost. Since the technology required to produce a space station is widely dispersed, a decentralized optimization process is essential. A formulation of the optimization process is provided and the mathematical models designed to facilitate its implementation are described.

  7. Hydrocarbon Processing`s process design and optimization `96

    SciTech Connect

    1996-06-01

    This paper compiles information on hydrocarbon processes, describing the application, objective, economics, commercial installations, and licensor. Processes include: alkylation, ammonia, catalytic reformer, crude fractionator, crude unit, vacuum unit, dehydration, delayed coker, distillation, ethylene furnace, FCCU, polymerization, gas sweetening, hydrocracking, hydrogen, hydrotreating (naphtha, distillate, and resid desulfurization), natural gas processing, olefins, polyethylene terephthalate, refinery, styrene, sulfur recovery, and VCM furnace.

  8. Radiolysis Process Model

    SciTech Connect

    Buck, Edgar C.; Wittman, Richard S.; Skomurski, Frances N.; Cantrell, Kirk J.; McNamara, Bruce K.; Soderquist, Chuck Z.

    2012-07-17

    Assessing the performance of spent (used) nuclear fuel in a geological repository requires quantification of time-dependent phenomena that may influence its behavior on a time scale of up to millions of years. A high-level waste repository environment will be a dynamic redox system because of the time-dependent generation of radiolytic oxidants and reductants and the corrosion of Fe-bearing canister materials. One major difference between used fuel and natural analogues, including unirradiated UO2, is the intense radiolytic field. The radiation emitted by used fuel can produce radiolysis products in the presence of water vapor or a thin film of water (including OH• and H• radicals, O2-, eaq, H2O2, H2, and O2) that may increase the waste form degradation rate and change radionuclide behavior. H2O2 is the dominant oxidant for spent nuclear fuel in an O2-depleted water environment, and the most sensitive parameters have been identified with respect to predictions of a radiolysis model under typical conditions. Compared with the full model of about 100 reactions, it was found that only 30-40 of the reactions are required to determine [H2O2] to one part in 10^5 and to preserve most of the predictions for major species. This allows a systematic approach for model simplification and offers guidance in designing experiments for validation.
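
    A toy one-species version of the kind of kinetic balance such a model solves, with an assumed effective G-value and a lumped first-order consumption term standing in for the tens of coupled reactions of the real model:

        from scipy.integrate import solve_ivp

        G_h2o2 = 1.0e-9     # mol/J, assumed effective radiolytic yield
        dose_rate = 0.05    # Gy/s (J/kg/s), assumed
        k_loss = 1.0e-4     # 1/s, lumped consumption (assumed)

        def rhs(t, y):
            c = y[0]                                  # [H2O2], mol/kg
            return [G_h2o2 * dose_rate - k_loss * c]

        sol = solve_ivp(rhs, (0.0, 1.0e5), [0.0])
        print("steady state:", G_h2o2 * dose_rate / k_loss)   # production/loss
        print("simulated:   ", sol.y[0, -1])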

  9. Process Design Manual for Nitrogen Control.

    ERIC Educational Resources Information Center

    Parker, Denny S.; And Others

    This manual presents theoretical and process design criteria for the implementation of nitrogen control technology in municipal wastewater treatment facilities. Design concepts are emphasized through examination of data from full-scale and pilot installations. Design data are included on biological nitrification and denitrification, breakpoint…

  10. Model for vaccine design by prediction of B-epitopes of IEDB given perturbations in peptide sequence, in vivo process, experimental techniques, and source or host organisms.

    PubMed

    González-Díaz, Humberto; Pérez-Montoto, Lázaro G; Ubeira, Florencio M

    2014-01-01

    Perturbation methods add variation terms to a known experimental solution of one problem to approach a solution for a related problem without a known exact solution. One problem of this type in immunology is the prediction of the possible action of an epitope of one peptide after a perturbation or variation in the structure of a known peptide and/or other boundary conditions (host organism, biological process, and experimental assay). However, to the best of our knowledge, there are no reports of general-purpose perturbation models to solve this problem. In a recent work, we introduced a new quantitative structure-property relationship theory for the study of perturbations in complex biomolecular systems. In this work, we developed the first model able to classify more than 200,000 cases of perturbations with accuracy, sensitivity, and specificity >90% in both training and validation series. The perturbations include structural changes in >50,000 peptides determined in experimental assays with boundary conditions involving >500 source organisms, >50 host organisms, >10 biological processes, and >30 experimental techniques. The model may be useful for the prediction of new epitopes or the optimization of known peptides towards computational vaccine design.

  11. Modular process modeling for OPC

    NASA Astrophysics Data System (ADS)

    Keck, M. C.; Bodendorf, C.; Schmidtling, T.; Schlief, R.; Wildfeuer, R.; Zumpe, S.; Niehoff, M.

    2007-03-01

    Modular OPC modeling, describing mask, optics, resist, and etch processes separately, is an approach to keep OPC modeling efforts manageable. By exchanging single modules of a modular OPC model, a fast response to process changes during process development is possible. At the same time, efforts can be reduced, since only single modular process steps have to be re-characterized as input for OPC modeling as the process is adjusted and optimized. Commercially available OPC tools for full-chip processing typically make use of semi-empirical models. The goal of our work is to investigate to what extent these OPC tools can be applied for modeling single process steps as separate modules. For an advanced gate-level process, we analyze the modeling accuracy over different process conditions (focus and dose) when combining models for each process step - optics, resist, and etch - for differing single processes into a model describing the total process.
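
    The modular idea reduces to function composition: each process step is a separately calibrated model, and the full prediction chains them. The step models below are crude stand-ins for illustration, not a real OPC engine:

        # Each module is calibrated independently and can be swapped alone.
        def optics_model(mask_cd, dose=1.0, focus=0.0):
            return mask_cd * dose - 2.0 * focus**2    # aerial-image CD, nm

        def resist_model(image_cd):
            return image_cd - 1.5                     # resist bias, nm

        def etch_model(resist_cd):
            return resist_cd - 3.0                    # etch bias, nm

        def predicted_cd(mask_cd, **process):
            return etch_model(resist_model(optics_model(mask_cd, **process)))

        # A new etch chemistry means re-characterizing etch_model only.
        print(predicted_cd(45.0, dose=1.02, focus=0.05))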

  12. New Vistas in Chemical Product and Process Design.

    PubMed

    Zhang, Lei; Babi, Deenesh K; Gani, Rafiqul

    2016-06-07

    Design of chemicals-based products is broadly classified into those that are process centered and those that are product centered. In this article, the designs of both classes of products are reviewed from a process systems point of view; developments related to the design of the chemical product, its corresponding process, and its integration are highlighted. Although significant advances have been made in the development of systematic model-based techniques for process design (also for optimization, operation, and control), much work is needed to reach the same level for product design. Timeline diagrams illustrating key contributions in product design, process design, and integrated product-process design are presented. The search for novel, innovative, and sustainable solutions must be matched by consideration of issues related to the multidisciplinary nature of problems, the lack of data needed for model development, solution strategies that incorporate multiscale options, and reliability versus predictive power. The need for an integrated model-experiment-based design approach is discussed together with benefits of employing a systematic computer-aided framework with built-in design templates.

  13. IMPLEMENTING THE SAFEGUARDS-BY-DESIGN PROCESS

    SciTech Connect

    Whitaker, J Michael; McGinnis, Brent; Laughter, Mark D; Morgan, Jim; Bjornard, Trond; Bean, Robert; Durst, Phillip; Hockert, John; DeMuth, Scott; Lockwood, Dunbar

    2010-01-01

    The Safeguards-by-Design (SBD) approach incorporates safeguards into the design and construction of nuclear facilities at the very beginning of the design process. It is a systematic and structured approach for fully integrating international and national safeguards for material control and accountability (MC&A), physical protection, and other proliferation barriers into the design and construction process for nuclear facilities. Implementing SBD is primarily a project management or project coordination challenge. This paper focuses specifically on the design process: the planning, definition, organization, coordination, scheduling, and interaction of the safeguards experts and stakeholders as they participate in the design and construction of a nuclear facility. It delineates the steps in a nuclear facility design and construction project in order to provide the project context within which the safeguards design activities take place; describes the involvement of the safeguards experts in the design process and the nature of their analyses, interactions, and decisions; and describes the documents created and how they are used. This report highlights the project context of safeguards activities and identifies what the safeguards community (nuclear facility operator, designer/builder, state regulator, SSAC, and IAEA) must accomplish in order to implement SBD within the project.

  14. The Analytic Process Model for System Design and Measurement: A computer-Aided Tool for Analyzing Training Systems and other Human-Machine Systems

    DTIC Science & Technology

    1985-02-01

    Keywords: performance measurement; effectiveness measurement; system populations; Bradley Infantry Fighting Vehicle (BIFV); Analytic Process Model (APM). The analytic process model (APM) was developed from earlier models and applied in sample fashion to an existing system, the Bradley Infantry Fighting Vehicle (Carrier Team Subsystem).

  15. Design Process Improvement for Electric CAR Harness

    NASA Astrophysics Data System (ADS)

    Sawatdee, Thiwarat; Chutima, Parames

    2017-06-01

    In an automobile parts design company, customer satisfaction is one of the most important factors in product design. Therefore, the company focuses its product design process on the various requirements of customers, which results in a high number of design changes. The objective of this research is to improve the design process of the electric car harness, which affects production scheduling, by using Fault Tree Analysis (FTA) and Failure Mode and Effects Analysis (FMEA) as the main tools. FTA is employed for root cause analysis, and FMEA is used to rank Risk Priority Numbers (RPNs), which show the priority of the factors in the electric car harness that have the highest impact on its design. After the implementation, the improvements are significant, since the rate of design changes is reduced from 0.26% to 0.08%.
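
    For concreteness, a minimal sketch of the FMEA ranking step described here: each failure mode is scored for severity, occurrence, and detection (1-10), the Risk Priority Number is their product, and modes are ranked by RPN. The failure modes and scores below are invented, not taken from the study.

        # Hedged sketch of the FMEA ranking step; failure modes and the
        # severity/occurrence/detection scores (1-10 each) are invented.
        failure_modes = [
            ("late customer design-change request", 8, 7, 4),
            ("wiring-route interference with body parts", 7, 4, 5),
            ("incorrect connector specification", 6, 3, 3),
        ]
        # Risk Priority Number = severity * occurrence * detection
        ranked = sorted(failure_modes, key=lambda fm: fm[1] * fm[2] * fm[3], reverse=True)
        for name, sev, occ, det in ranked:
            print(f"RPN = {sev * occ * det:3d}  {name}")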

  16. Ligand modeling and design

    SciTech Connect

    Hay, B.

    1996-10-01

    The purpose of this work is to develop and implement a molecular design basis for selecting organic ligands that would be used in applications for the cost-effective removal of specific radionuclides from nuclear waste streams.

  17. Modeling pellet impact drilling process

    NASA Astrophysics Data System (ADS)

    Kovalyov, A. V.; Ryabchikov, S. Ya; Isaev, Ye D.; Ulyanova, O. S.

    2016-03-01

    The paper describes pellet impact drilling, which could be used to increase the drilling speed and the rate of penetration when drilling hard rocks. Pellet impact drilling implies rock destruction by metal pellets with high kinetic energy in the immediate vicinity of the earth formation encountered. The pellets are circulated in the bottom hole by a high velocity fluid jet, which is the principal component of the ejector pellet impact drill bit. The experiments conducted have allowed modeling the process of pellet impact drilling, which creates the scientific and methodological basis for the engineering design of drilling operations under different geo-technical conditions.
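
    As a back-of-envelope illustration of the pellet kinetic energy central to this process, the sketch below computes E = (1/2)mv^2 for a single steel pellet; the diameter and impact velocity are assumed values, not data from the paper.

        import math

        # Kinetic energy of one steel pellet, E = 0.5*m*v**2;
        # diameter and impact velocity are assumed, not from the paper.
        rho_steel = 7800.0            # kg/m^3
        d = 0.003                     # pellet diameter, m (assumed)
        v = 30.0                      # impact velocity, m/s (assumed)

        m = rho_steel * math.pi * d ** 3 / 6.0   # mass of a sphere
        E = 0.5 * m * v ** 2
        print(f"pellet mass = {m * 1000:.2f} g, impact energy = {E:.3f} J")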

  18. An Integrated Course and Design Project in Chemical Process Design.

    ERIC Educational Resources Information Center

    Rockstraw, David A.; And Others

    1997-01-01

    Describes a chemical engineering course curriculum on process design, analysis, and simulation. Includes information regarding the sequencing of engineering design classes and the location of the classes within the degree program at New Mexico State University. Details of course content are provided. (DDR)

  19. Design, processing, and testing of LSI arrays for space station

    NASA Technical Reports Server (NTRS)

    Ipri, A. C.

    1976-01-01

    The applicability of a particular process for the fabrication of large scale integrated circuits is described. Test arrays were designed, built, tested, and then utilized. A set of optimum dimensions for LSI arrays was generated. The arrays were applied to yield improvement through process innovation, and additional applications were suggested in the areas of yield prediction, yield modeling, and process reliability.

  20. Computational developments for simulation-based design: Multi-disciplinary flow/thermal/cure/stress modeling, analysis, and validation for processing of composites

    NASA Astrophysics Data System (ADS)

    Ngo, Nam Duc

    In the process modeling and manufacturing of large, geometrically complex structural components composed of fiber-reinforced composite materials by Resin Transfer Molding (RTM), a polymer resin is injected into a mold cavity filled with porous fibrous preforms. The overall success of the manufacturing process depends on the complete impregnation of the fiber preform by the polymer resin, prevention of polymer gelation during filling, and subsequent avoidance of dry spots. Since the RTM process involves the injection of a cold resin into a heated mold, the associated physics encompasses a moving boundary value problem in conjunction with the multi-disciplinary study of flow/thermal/cure and the subsequent prediction of residual stresses inside the mold cavity. Although experimental validations are indispensable, routine manufacture of large complex structural geometries can only be enhanced via computational simulations, thus eliminating costly trial runs and helping designers in the set-up of the manufacturing process. This thesis describes an in-depth study of the mathematical and computational developments towards formulating an effective simulation-based design methodology using the finite element method. The proposed methodology is well suited for applications to practical engineering structural components encountered in the manufacture of complex RTM type composites, and encompasses both thick and thin composites with the following distinguishing features: (i) an implicit pure finite element computational methodology, with illustrations first to isothermal situations, to overcome the deficiencies of traditional explicit type methods while permitting standard mesh generators to be employed in a straightforward manner; (ii) a methodology for predicting permeability of fiber preform microstructures in both virgin and manufactured states; (iii) extension of the implicit pure finite element methodology to non-isothermal situations with and without influence of
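
    A much-reduced illustration of the underlying physics (not the thesis's finite element formulation): for one-dimensional mold filling under constant injection pressure, Darcy's law gives a resin front position L(t) = sqrt(2*K*dP*t/(mu*phi)), which can be inverted for the fill time; all property values below are assumed.

        # 1D constant-pressure RTM filling via Darcy's law (illustrative only):
        # front position L(t) = sqrt(2*K*dP*t/(mu*phi)) -> fill time for L_f.
        K = 1e-10     # preform permeability, m^2 (assumed)
        mu = 0.1      # resin viscosity, Pa*s (assumed)
        phi = 0.5     # preform porosity (assumed)
        dP = 2e5      # injection pressure drop, Pa (assumed)
        L_f = 0.5     # flow length to fill, m (assumed)

        t_fill = mu * phi * L_f ** 2 / (2.0 * K * dP)
        print(f"estimated fill time: {t_fill:.0f} s")   # about 313 s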

  1. Launch Vehicle Design Process Description and Training Formulation

    NASA Technical Reports Server (NTRS)

    Atherton, James; Morris, Charles; Settle, Gray; Teal, Marion; Schuerer, Paul; Blair, James; Ryan, Robert; Schutzenhofer, Luke

    1999-01-01

    A primary NASA priority is to reduce the cost and improve the effectiveness of launching payloads into space. As a consequence, significant improvements are being sought in the effectiveness, cost, and schedule of the launch vehicle design process. In order to provide a basis for understanding and improving the current design process, a model has been developed for this complex, interactive process, as reported in the references. This model requires further expansion in some specific design functions. Also, a training course for less-experienced engineers is needed to provide understanding of the process, to provide guidance for its effective implementation, and to provide a basis for major improvements in launch vehicle design process technology. The objective of this activity is to expand the description of the design process to include all pertinent design functions, and to develop a detailed outline of a training course on the design process for launch vehicles for use in educating engineers whose experience with the process has been minimal. Building on a previously-developed partial design process description, parallel sections have been written for the Avionics Design Function, the Materials Design Function, and the Manufacturing Design Function. Upon inclusion of these results, the total process description will be released as a NASA TP. The design function sections herein include descriptions of the design function responsibilities, interfaces, interactive processes, decisions (gates), and tasks. Associated figures include design function planes, gates, and tasks, along with other pertinent graphics. Also included is an expanded discussion of how the design process is divided, or compartmentalized, into manageable parts to achieve efficient and effective design. A detailed outline for an intensive two-day course on the launch vehicle design process has been developed herein, and is available for further expansion. The course is in an interactive lecture

  2. On Design Mining: Coevolution and Surrogate Models.

    PubMed

    Preen, Richard J; Bull, Larry

    2017-01-01

    Design mining is the use of computational intelligence techniques to iteratively search and model the attribute space of physical objects evaluated directly through rapid prototyping to meet given objectives. It enables the exploitation of novel materials and processes without formal models or complex simulation. In this article, we focus upon the coevolutionary nature of the design process when it is decomposed into concurrent sub-design-threads due to the overall complexity of the task. Using an abstract, tunable model of coevolution, we consider strategies to sample subthread designs for whole-system testing and how best to construct and use surrogate models within the coevolutionary scenario. Drawing on our findings, we then describe the effective design of an array of six heterogeneous vertical-axis wind turbines.
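
    A toy sketch of surrogate-assisted search in this spirit (details invented): most candidates are scored by a cheap surrogate fitted to previously fabricated designs, and only the most promising candidate per generation receives an expensive "physical" evaluation, stood in for here by a hidden test function.

        import random

        def true_fitness(x):                     # stands in for a prototype test
            return -(x - 0.7) ** 2

        archive = [(x, true_fitness(x)) for x in (0.1, 0.5, 0.9)]

        def surrogate(x):                        # 1-nearest-neighbour prediction
            return min(archive, key=lambda p: abs(p[0] - x))[1]

        random.seed(0)
        for _ in range(20):
            candidates = [random.random() for _ in range(10)]
            best = max(candidates, key=surrogate)          # cheap screening
            archive.append((best, true_fitness(best)))     # one real test per generation
        print(max(archive, key=lambda p: p[1]))            # best design found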

  3. Diagnostic study, design and implementation of an integrated model of care in France: a bottom-up process with continuous leadership

    PubMed Central

    de Stampa, Matthieu; Vedel, Isabelle; Mauriat, Claire; Bagaragaza, Emmanuel; Routelous, Christelle; Bergman, Howard; Lapointe, Liette; Cassou, Bernard; Ankri, Joel; Henrard, Jean-Claude

    2010-01-01

    Background Sustaining integrated care is difficult, in large part because of problems encountered securing the participation of health care and social service professionals and, in particular, general practitioners (GPs). Purpose To present an innovative bottom-up and pragmatic strategy used to implement a new integrated care model in France for community-dwelling elderly people with complex needs. Results In the first step, a diagnostic study was conducted with face-to-face interviews to gather data on current practices from a sample of health and social stakeholders working with elderly people. In the second step, an integrated care model called Coordination Personnes Agées (COPA) was designed by the same major stakeholders in order to define its detailed characteristics based on the local context. In the third step, the model was implemented in two phases: adoption and maintenance. This strategy was carried out through continuous and flexible leadership throughout the process, initially with a mixed leadership (clinician and researcher) followed by a double one (clinician and managers of services) in the implementation phase. Conclusion The implementation of this bottom-up and pragmatic strategy relied on establishing a collaborative dynamic among health and social stakeholders. This enhanced their involvement throughout the implementation phase, particularly among the GPs, and allowed them to support the changes in practices and service arrangements. PMID:20216954

  4. Solid model design simplification

    SciTech Connect

    Ames, A.L.; Rivera, J.J.; Webb, A.J.; Hensinger, D.M.

    1997-12-01

    This paper documents an investigation of approaches to improving the quality of Pro/Engineer-created solid model data for use by downstream applications. The investigation identified a number of sources of problems caused by deficiencies in Pro/Engineer's geometric engine, and developed prototype software capable of detecting many of these problems and guiding users towards simplified, usable models. The prototype software was tested using Sandia production solid models, and provided significant leverage in attacking the simplification problem.

  5. Molecular Thermodynamics for Chemical Process Design

    ERIC Educational Resources Information Center

    Prausnitz, J. M.

    1976-01-01

    Discusses that aspect of thermodynamics which is particularly important in chemical process design: the calculation of the equilibrium properties of fluid mixtures, especially as required in phase-separation operations. (MLH)

  6. Chemical Process Design: An Integrated Teaching Approach.

    ERIC Educational Resources Information Center

    Debelak, Kenneth A.; Roth, John A.

    1982-01-01

    Reviews a one-semester senior plant design/laboratory course, focusing on course structure, student projects, laboratory assignments, and course evaluation. Includes discussion of laboratory exercises related to process waste water and sludge. (SK)

  7. Semantic Service Design for Collaborative Business Processes in Internetworked Enterprises

    NASA Astrophysics Data System (ADS)

    Bianchini, Devis; Cappiello, Cinzia; de Antonellis, Valeria; Pernici, Barbara

    Modern collaborating enterprises can be seen as borderless organizations whose processes are dynamically transformed and integrated with the ones of their partners (Internetworked Enterprises, IE), thus enabling the design of collaborative business processes. The adoption of Semantic Web and service-oriented technologies for implementing collaboration in such distributed and heterogeneous environments promises significant benefits. IE can model their own processes independently by using the Software as a Service paradigm (SaaS). Each enterprise maintains a catalog of available services, and these can be shared across IE and reused to build up complex collaborative processes. Moreover, each enterprise can adopt its own terminology and concepts to describe business processes and component services. This creates a requirement to manage semantic heterogeneity in process descriptions which are distributed across different enterprise systems. To enable effective service-based collaboration, IEs have to standardize their process descriptions and model them through component services using the same approach and principles. For enabling collaborative business processes across IE, services should be designed following a homogeneous approach, possibly maintaining a uniform level of granularity. In the paper we propose an ontology-based semantic modeling approach apt to enrich and reconcile semantics of process descriptions to facilitate process knowledge management and to enable semantic service design (by discovery, reuse and integration of process elements/constructs). The approach brings together Semantic Web technologies, techniques in process modeling, ontology building and semantic matching in order to provide a comprehensive semantic modeling framework.

  8. User-Centered Design (UCD) Process Description

    DTIC Science & Technology

    2014-12-01

    TECHNICAL REPORT 2061, December 2014: User-Centered Design (UCD) Process Description. Michael Cowen; Alan Lemon; Deborah... Cited references include http://www.dodccrp.org/events/15th_iccrts_2010/papers/033.pdf (accessed 11/25/2014) and A. G. Lemon and M. B. Cowen, 2012.

  9. The process road between requirements and design

    SciTech Connect

    Goedicke, M.; Nuseibeh, B.

    1996-12-31

    The software engineering literature contains many examples of methods, tools and techniques that claim to facilitate a variety of requirements engineering and design activities. Guidance on how these activities are related within a coherent software development process is much less apparent. A central problem that makes such guidance difficult to achieve is that requirements engineering addresses problem domains whereas design addresses solution domains. This is in the face of frequent changes in requirements contrasted with the need for stable design solutions.

  10. The Engineering Process in Construction & Design

    ERIC Educational Resources Information Center

    Stoner, Melissa A.; Stuby, Kristin T.; Szczepanski, Susan

    2013-01-01

    Recent research suggests that high-impact activities in science and math classes promote positive attitudinal shifts in students. By implementing high-impact activities, such as designing a school and a skate park, mathematical thinking can be linked to the engineering design process. This hands-on approach, when possible, to demonstrate or…

  11. Evaluation and Modeling of Vapor-Liquid Equilibrium and CO2 Absorption Enthalpies of Aqueous Designer Diamines for Post Combustion Capture Processes.

    PubMed

    Luo, Weiliang; Yang, Qi; Conway, William; Puxty, Graeme; Feron, Paul; Chen, Jian

    2017-06-20

    Novel absorbents with improved characteristics are required to reduce the existing cost and environmental barriers to deployment of large scale CO2 capture. Recently, bespoke absorbent molecules have been specifically designed for CO2 capture applications, and their fundamental properties and suitability for CO2 capture processes evaluated. From the study, two unique diamine molecules, 4-(2-hydroxyethylamino)piperidine (A4) and 1-(2-hydroxyethyl)-4-aminopiperidine (C4), were selected for further evaluation, including thermodynamic characterization. The solubilities of CO2 in the two diamine solutions at mass fractions of 15% and 30% were measured at different temperatures (313.15-393.15 K) and CO2 partial pressures (up to 400 kPa) in a thermostatic vapor-liquid equilibrium (VLE) stirred cell. The absorption enthalpies of the reactions between the diamines and CO2 were evaluated at different temperatures (313.15 and 333.15 K) using a CPA201 reaction calorimeter. The amine protonation constants and associated protonation enthalpies were determined by potentiometric titration. The interaction of CO2 with the diamine solutions was summarized, and a simple mathematical model was established that gives a good preliminary prediction of the VLE and thermodynamic properties. Based on the analyses in this work, the two designer diamines A4 and C4 showed superior performance compared to amines typically used for CO2 capture, and further research will be completed at larger scale.
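
    One of the thermodynamic relations implicit in such a characterization is the van't Hoff equation, which links equilibrium constants measured at two temperatures to a reaction enthalpy. The sketch below applies it with invented constants; it is not a reconstruction of the paper's model.

        import math

        # van't Hoff estimate of a reaction enthalpy from equilibrium constants
        # at two temperatures; K1 and K2 are hypothetical, not measured values.
        R = 8.314                     # J/(mol*K)
        T1, T2 = 313.15, 333.15       # K (the study's calorimetry temperatures)
        K1, K2 = 5.0e8, 1.2e8         # assumed protonation constants

        dH = -R * math.log(K2 / K1) / (1.0 / T2 - 1.0 / T1)
        print(f"estimated enthalpy: {dH / 1000:.1f} kJ/mol")   # about -62 kJ/mol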

  12. Design of Nanomaterial Synthesis by Aerosol Processes

    PubMed Central

    Buesser, Beat; Pratsinis, Sotiris E.

    2013-01-01

    Aerosol synthesis of materials is a vibrant field of particle technology and chemical reaction engineering. Examples include the manufacture of carbon blacks, fumed SiO2, pigmentary TiO2, ZnO vulcanizing catalysts, filamentary Ni, and optical fibers, materials that impact transportation, construction, pharmaceuticals, energy, and communications. Parallel to this, development of novel, scalable aerosol processes has enabled synthesis of new functional nanomaterials (e.g., catalysts, biomaterials, electroceramics) and devices (e.g., gas sensors). This review provides an access point for engineers to the multiscale design of aerosol reactors for the synthesis of nanomaterials using continuum, mesoscale, molecular dynamics, and quantum mechanics models spanning 10 and 15 orders of magnitude in length and time, respectively. Key design features are the rapid chemistry; the high particle concentrations but low volume fractions; the attainment of a self-preserving particle size distribution by coagulation; the ratio of the characteristic times of coagulation and sintering, which controls the extent of particle aggregation; and the narrowing of the aggregate primary particle size distribution by sintering. PMID:22468598
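
    A minimal numeric illustration of the review's key design ratio, the characteristic coagulation time versus the characteristic sintering time, using a monodisperse coagulation estimate tau_coag = 2/(beta*N); all values are assumed.

        # Characteristic-time comparison that controls particle morphology;
        # beta, N, and the sintering time are assumed, not values from the review.
        beta = 1e-15      # coagulation collision kernel, m^3/s (assumed)
        N = 1e18          # particle number concentration, 1/m^3 (assumed)
        tau_sint = 1e-4   # characteristic sintering time, s (assumed)

        tau_coag = 2.0 / (beta * N)   # time to halve N by coagulation
        if tau_sint < tau_coag:
            print("sintering outpaces collisions -> compact, near-spherical particles")
        else:
            print("collisions outpace sintering -> fractal aggregates")
        print(f"tau_coag = {tau_coag:.1e} s, tau_sint = {tau_sint:.1e} s")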

  13. Models Help Teach Undergraduate Design.

    ERIC Educational Resources Information Center

    Hills, Peter

    1984-01-01

    The design and construction of models forms the foundation of first-year design teaching (totaling 18 class hours) in the three-year mechanical engineering program at the Royal Military College of Science. Lists the aims of this approach, providing examples of the types of models produced by students while solving engineering problems. (JN)

  14. Process characterization and Design Space definition.

    PubMed

    Hakemeyer, Christian; McKnight, Nathan; St John, Rick; Meier, Steven; Trexler-Schmidt, Melody; Kelley, Brian; Zettl, Frank; Puskeiler, Robert; Kleinjans, Annika; Lim, Fred; Wurth, Christine

    2016-09-01

    Quality by design (QbD) is a global regulatory initiative with the goal of enhancing pharmaceutical development through the proactive design of pharmaceutical manufacturing processes and controls to consistently deliver the intended performance of the product. The principles of pharmaceutical development relevant to QbD are described in the ICH guidance documents (ICH Q8-11). An integrated set of risk assessments and their related elements developed at Roche/Genentech were designed to provide an overview of product and process knowledge for the production of a recombinant monoclonal antibody (MAb). This chapter describes the tools used for the characterization and validation of the MAb manufacturing process under the QbD paradigm. This comprises risk assessments for the identification of potential Critical Process Parameters (pCPPs), statistically designed experimental studies, as well as studies assessing the linkage of the unit operations. The outcome of the studies is the classification of process parameters according to their criticality and the definition of appropriate acceptable ranges of operation. The process and product knowledge gained in these studies can lead to the approval of a Design Space. Additionally, the information gained in these studies is used to define the 'impact' which the manufacturing process can have on the variability of the CQAs, which in turn is used to define the testing and monitoring strategy.
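
    A minimal sketch (with invented data) of the kind of two-level screening study used to classify parameters: main effects on a critical quality attribute are estimated from a 2^3 factorial design, and parameters with large effects are flagged as potentially critical.

        import itertools

        # Two-level factorial screening sketch (invented data): estimate each
        # parameter's main effect on a CQA; large effects suggest a pCPP.
        factors = ["pH", "temperature", "feed_rate"]
        runs = list(itertools.product([-1, +1], repeat=3))       # 2^3 design
        cqa = [97.1, 96.8, 95.2, 95.0, 97.3, 97.0, 91.8, 91.5]   # hypothetical responses

        for i, name in enumerate(factors):
            # main effect = mean(response at +1) - mean(response at -1)
            effect = sum(x[i] * y for x, y in zip(runs, cqa)) / (len(runs) / 2)
            label = "pCPP candidate" if abs(effect) > 1.0 else "non-key"
            print(f"{name:12s} main effect = {effect:+.2f} -> {label}")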

  15. HYNOL PROCESS ENGINEERING: PROCESS CONFIGURATION, SITE PLAN, AND EQUIPMENT DESIGN

    EPA Science Inventory

    The report describes the design of the hydropyrolysis reactor system of the Hynol process. (NOTE: A bench scale methanol production facility is being constructed to demonstrate the technical feasibility of producing methanol from biomass using the Hynol process. The plant is bein...

  16. Model-based software design

    NASA Technical Reports Server (NTRS)

    Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui; Yenne, Britt; Vansickle, Larry; Ballantyne, Michael

    1992-01-01

    Domain-specific knowledge is required to create specifications, generate code, and understand existing systems. Our approach to automating software design is based on instantiating an application domain model with industry-specific knowledge and then using that model to achieve the operational goals of specification elicitation and verification, reverse engineering, and code generation. Although many different specification models can be created from any particular domain model, each specification model is consistent and correct with respect to the domain model.

  1. Model based design introduction: modeling game controllers to microprocessor architectures

    NASA Astrophysics Data System (ADS)

    Jungwirth, Patrick; Badawy, Abdel-Hameed

    2017-04-01

    We present an introduction to model based design. Model based design is a visual representation, generally a block diagram, used to model and incrementally develop a complex system. It is a commonly used design methodology for digital signal processing, control systems, and embedded systems. Model based design's philosophy is to solve a problem a step at a time, and the approach can be compared to a series of steps that converge to a solution. A block diagram simulation tool allows a design to be simulated with real world measurement data. For example, if an analog control system is being upgraded to a digital control system, the analog sensor input signals can be recorded. The digital control algorithm can then be simulated with the real world sensor data, and the output from the simulated digital control system can be compared to the old analog based control system. Model based design can be compared to Agile software development. The Agile goal is to develop working software in incremental steps, with progress measured in completed and tested code units; in model based design, progress is measured in completed and tested blocks. We present a concept for a video game controller and then use model based design to iterate the design towards a working system. We also describe a model based design effort to develop an OS Friendly Microprocessor Architecture based on the RISC-V.
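
    A small sketch of the replay idea in the control-system example (all signals and constants invented): recorded "analog" sensor samples are fed through a candidate digital filter, whose output could then be compared with the legacy analog stage.

        import math

        # Replay recorded sensor samples through a candidate digital filter
        # (first-order low-pass); the signal and constants are invented.
        dt, tau = 0.001, 0.05            # sample period, filter time constant (s)
        alpha = dt / (tau + dt)          # discrete smoothing coefficient

        recorded = [math.sin(0.5 * k * dt) + 0.1 * math.sin(40.0 * k * dt)
                    for k in range(1000)]    # slow signal + high-frequency ripple

        y, digital_out = 0.0, []
        for u in recorded:
            y += alpha * (u - y)         # y[k] = y[k-1] + alpha*(u[k] - y[k-1])
            digital_out.append(y)
        print(f"final input {recorded[-1]:+.3f} -> filtered {digital_out[-1]:+.3f}")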

  2. Chemical kinetics and oil shale process design

    SciTech Connect

    Burnham, A.K.

    1993-07-01

    Oil shale processes are reviewed with the goal of showing how chemical kinetics influences the design and operation of different processes for different types of oil shale. Reaction kinetics are presented for organic pyrolysis, carbon combustion, carbonate decomposition, and sulfur and nitrogen reactions.
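
    As a pocket example of the kinetics reviewed here, the sketch below integrates first-order organic pyrolysis dX/dt = k(1-X) with an Arrhenius rate constant; the pre-exponential factor, activation energy, and temperature are placeholders, not values from the report.

        import math

        # First-order Arrhenius pyrolysis conversion (placeholder parameters):
        # dX/dt = k*(1 - X) with k = A*exp(-E/(R*T)) at constant temperature.
        A = 1.0e13        # pre-exponential factor, 1/s (assumed)
        E = 220.0e3       # activation energy, J/mol (assumed)
        R = 8.314         # J/(mol*K)
        T = 773.15        # retort temperature, K (assumed isothermal)

        k = A * math.exp(-E / (R * T))
        for t in (60.0, 600.0, 3600.0):                 # seconds
            X = 1.0 - math.exp(-k * t)                  # closed-form conversion
            print(f"t = {t:6.0f} s : conversion X = {X:.3f}")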

  3. The Automation of Nowcast Model Assessment Processes

    DTIC Science & Technology

    This report documents the design and implementation of an automated process for generating domain-level error statistics for forecasts produced by the US Army Research Laboratory's nowcast model, Weather Running Estimate-Nowcast (WRE-N). These statistics can be used by modelers to improve the accuracy of the WRE-N model.
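
    A minimal sketch of the domain-level error statistics such an automated process could emit (bias and root-mean-square error over matched forecast/observation pairs); the numbers are invented.

        import math

        # Domain-level forecast error statistics (invented sample values).
        forecast = [21.3, 19.8, 22.5, 20.1, 18.9]
        observed = [20.9, 20.2, 22.0, 19.5, 19.4]

        errors = [f - o for f, o in zip(forecast, observed)]
        bias = sum(errors) / len(errors)
        rmse = math.sqrt(sum(e * e for e in errors) / len(errors))
        print(f"bias = {bias:+.2f}, RMSE = {rmse:.2f}")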

  4. Computer Applications in the Design Process.

    ERIC Educational Resources Information Center

    Winchip, Susan

    Computer Assisted Design (CAD) and Computer Assisted Manufacturing (CAM) are emerging technologies now being used in home economics and interior design applications. A microcomputer in a computer network system is capable of executing computer graphic functions such as three-dimensional modeling, as well as utilizing office automation packages to…

  5. Instructional Design and Directed Cognitive Processing.

    ERIC Educational Resources Information Center

    Bovy, Ruth Colvin

    This paper argues that the information processing model provides a promising basis on which to build a comprehensive theory of instruction. Characteristics of the major information processing constructs are outlined including attention, encoding and rehearsal, working memory, long term memory, retrieval, and metacognitive processes, and a unifying…

  6. Process-based design of dynamical biological systems

    PubMed Central

    Tanevski, Jovan; Todorovski, Ljupčo; Džeroski, Sašo

    2016-01-01

    The computational design of dynamical systems is an important emerging task in synthetic biology. Given desired properties of the behaviour of a dynamical system, the task of design is to build an in-silico model of a system whose simulated behaviour meets these properties. We introduce a new, process-based, design methodology for addressing this task. The new methodology combines a flexible process-based formalism for specifying the space of candidate designs with multi-objective optimization approaches for selecting the most appropriate among these candidates. We demonstrate that the methodology is general enough to both formulate and solve tasks of designing deterministic and stochastic systems, successfully reproducing plausible designs reported in previous studies and proposing new designs that meet the design criteria, but have not been previously considered. PMID:27686219

  7. Process-based design of dynamical biological systems

    NASA Astrophysics Data System (ADS)

    Tanevski, Jovan; Todorovski, Ljupčo; Džeroski, Sašo

    2016-09-01

    The computational design of dynamical systems is an important emerging task in synthetic biology. Given desired properties of the behaviour of a dynamical system, the task of design is to build an in-silico model of a system whose simulated behaviour meets these properties. We introduce a new, process-based, design methodology for addressing this task. The new methodology combines a flexible process-based formalism for specifying the space of candidate designs with multi-objective optimization approaches for selecting the most appropriate among these candidates. We demonstrate that the methodology is general enough to both formulate and solve tasks of designing deterministic and stochastic systems, successfully reproducing plausible designs reported in previous studies and proposing new designs that meet the design criteria, but have not been previously considered.
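
    A toy sketch of the multi-objective selection step: candidate designs are scored on two invented criteria to be minimized (deviation from desired behaviour, model complexity), and only the non-dominated ones are kept.

        # Keep Pareto non-dominated candidate designs on two invented
        # minimization criteria: (behaviour deviation, model complexity).
        candidates = {
            "design_A": (0.10, 7.0),
            "design_B": (0.25, 3.0),
            "design_C": (0.12, 9.0),   # dominated by design_A
            "design_D": (0.40, 2.0),
        }

        def dominates(a, b):
            # a dominates b: no worse on all criteria, different somewhere
            return all(x <= y for x, y in zip(a, b)) and a != b

        pareto = [name for name, score in candidates.items()
                  if not any(dominates(other, score)
                             for other_name, other in candidates.items()
                             if other_name != name)]
        print(pareto)   # ['design_A', 'design_B', 'design_D']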

  8. Optimal designs for copula models

    PubMed Central

    Perrone, E.; Müller, W.G.

    2016-01-01

    Copula modelling has in the past decade become a standard tool in many areas of applied statistics. However, a largely neglected aspect concerns the design of related experiments. Particular issues are whether the estimation of copula parameters can be enhanced by optimizing experimental conditions, and how robust the parameter estimates for the model are with respect to the type of copula employed. In this paper an equivalence theorem for (bivariate) copula models is provided that allows formulation of efficient design algorithms and quick checks of whether designs are optimal or at least efficient. Some examples illustrate that in practical situations considerable gains in design efficiency can be achieved. A natural comparison between different copula models with respect to design efficiency is provided as well. PMID:27453616

  9. Visual Modelling of Learning Processes

    ERIC Educational Resources Information Center

    Copperman, Elana; Beeri, Catriel; Ben-Zvi, Nava

    2007-01-01

    This paper introduces various visual models for the analysis and description of learning processes. The models analyse learning on two levels: the dynamic level (as a process over time) and the functional level. Two types of model for dynamic modelling are proposed: the session trace, which documents a specific learner in a particular learning…

  10. Validating instrument models through the calibration process

    NASA Astrophysics Data System (ADS)

    Bingham, G. E.; Tansock, J. J.

    2006-08-01

    The performance of modern IR instruments is becoming so good that meeting science requirements requires an accurate instrument model to be used throughout the design and development process. The huge cost overruns on recent major programs indicate that the design and cost models being used to predict performance have lagged behind anticipated performance. Tuning these models to accurately reflect the true performance of target instruments requires a modeling process that has been developed over several instruments and validated by careful calibration. The process of developing a series of Engineering Development Models is often used on longer duration programs to achieve this end. The accuracy of the models and their components has to be validated by a carefully planned calibration process, preferably considered in the instrument design. However, a good model does not satisfy all the requirements needed to bring acquisition programs under control: careful detail in the specification process and a similar, validated model on the government side are also required. This paper discusses the model development process and the calibration approaches used to verify and update the models of several new instruments, including the Geosynchronous Imaging Fourier Transform Spectrometer (GIFTS) and Far Infrared Spectroscopy of the Troposphere (FIRST).

  11. The Ecosystem Model: Designing Campus Environments.

    ERIC Educational Resources Information Center

    Western Interstate Commission for Higher Education, Boulder, CO.

    This document stresses the increasing awareness in higher education of the impact student/environment transactions have upon the quality of educational life and details a model and design process for creating a better fit between educational environments and students. The ecosystem model uses an interdisciplinary approach for the make-up of its…

  12. Rates of reaction and process design data for the Hydrocarb Process

    SciTech Connect

    Steinberg, M.; Kobayashi, Atsushi; Tung, Yuanki

    1992-08-01

    In support of studies for developing the coprocessing of fossil fuels with biomass by the Hydrocarb Process, experimental and process design data are reported. The experimental work includes the hydropyrolysis of biomass and the thermal decomposition of methane in a tubular reactor. The rates of reaction and conversion were obtained at temperature and pressure conditions pertaining to a Hydrocarb Process design. A Process Simulation Computer Model was used to design the process and obtain complete energy and mass balances. Multiple feedstocks, including biomass with natural gas and biomass with coal, were evaluated. Additional feedstocks, including green waste, sewage sludge, and digester gas, were also evaluated for a pilot plant unit.

  13. The concepts of energy, environment, and cost for process design

    SciTech Connect

    Abu-Khader, M.M.; Speight, J.G.

    2004-05-01

    The process industries (specifically, energy and chemicals) are characterized by a variety of reactors and reactions to bring about successful process operations. The design of energy-related and chemical processes and their evolution is a complex activity that determines the competitiveness of these industries, as well as their environmental impact. Thus, we have developed an Enviro-Energy Concept designed to facilitate sustainable industrial development. The Complete Onion Model represents a complete methodology for chemical process design and illustrates all of the requirements to achieve the best possible design within the accepted environmental standards. Currently, NOx emissions from industrial processes continue to receive maximum attention; therefore, the problem of NOx emissions from industrial sources such as power stations and nitric acid plants is considered. Selective Catalytic Reduction (SCR) is one of the most promising and effective commercial technologies and is considered the Best Available Control Technology (BACT) for NOx reduction. The solution to the NOx emissions problem is either to modify the chemical process design and/or to install an end-of-pipe technology. The degree of integration between the process design and the installed technology plays a critical role in the capital cost evaluation. Therefore, integrating process units and then optimizing the design has a vital effect on the total cost. Both the environmental regulations and the cost evaluation are the boundary constraints of the optimum solution.
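
    As a back-of-envelope illustration tied to SCR, the standard reaction 4NO + 4NH3 + O2 -> 4N2 + 6H2O implies roughly one mole of NH3 per mole of NO removed, which fixes the reagent feed for a target removal rate; the flue-gas figures below are invented.

        # Stoichiometric NH3 demand for SCR, 4NO + 4NH3 + O2 -> 4N2 + 6H2O,
        # i.e. ~1 mol NH3 per mol NO removed; flue-gas figures are invented.
        M_NO, M_NH3 = 30.01, 17.03     # molar masses, g/mol
        no_flow = 50.0                 # kg NO per hour at SCR inlet (assumed)
        removal = 0.90                 # target NOx reduction (assumed)

        nh3_flow = no_flow * removal * M_NH3 / M_NO
        print(f"NH3 feed ~ {nh3_flow:.1f} kg/h for {removal:.0%} removal")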

  14. Conceptual Chemical Process Design for Sustainability.

    EPA Pesticide Factsheets

    This chapter examines the sustainable design of chemical processes, with a focus on conceptual design, hierarchical and short-cut methods, and analyses of process sustainability for alternatives. The chapter describes a methodology for incorporating process sustainability analyses throughout the conceptual design. Hierarchical and short-cut decision-making methods will be used to approach sustainability. An example showing a sustainability-based evaluation of chlor-alkali production processes is presented with economic analysis and five pollutants described as emissions. These emissions are analyzed according to their human toxicity potential by ingestion using the Waste Reduction Algorithm and a method based on US Environmental Protection Agency reference doses, with the addition of biodegradation for suitable components. Among the emissions, mercury as an element will not biodegrade, and results show the importance of this pollutant to the potential toxicity results and therefore the sustainability of the process design. The dominance of mercury in determining the long-term toxicity results when energy use is included suggests that all process system evaluations should (re)consider the role of mercury and other non-/slow-degrading pollutants in sustainability analyses. The cycling of nondegrading pollutants through the biosphere suggests the need for a complete analysis based on the economic, environmental, and social aspects of sustainability. Chapter reviews

  15. Designing Competitive Service Models

    NASA Astrophysics Data System (ADS)

    Martinez, Veronica; Turner, Trevor

    The explosives developed in Europe in the late nineteenth and early twentieth ­century by the famous Swede and patron of the world peace prize, Alfred Nobel, were extremely durable and, apart from the introduction of the electric detonator, have remained in use with minor modifications for almost a century (Fig. 5.1a). In the 1970s a new invention started a process of change that has transformed the explosives business from being a supplier of products to a provider of a service. Survival very much depended on the agility of ICI Explosives UK, hereinafter referred to as "ICI Explosives," in adapting to the new competitive environment. Manufacturing excellence was not a solution. Innovative thinking was required to sustain the ­business as changes in technology reduced the complexity that had ­protected the business from serious competition for over a century.

  16. Knowledge modeling for software design

    NASA Technical Reports Server (NTRS)

    Shaw, Mildred L. G.; Gaines, Brian R.

    1992-01-01

    This paper develops a modeling framework for systems engineering that encompasses systems modeling, task modeling, and knowledge modeling, and allows knowledge engineering and software engineering to be seen as part of a unified developmental process. This framework is used to evaluate what novel contributions the 'knowledge engineering' paradigm has made and how these impact software engineering.

  1. Design of penicillin fermentation process simulation system

    NASA Astrophysics Data System (ADS)

    Qi, Xiaoyu; Yuan, Zhonghu; Qi, Xiaoxuan; Zhang, Wenqi

    2011-10-01

    Real-time monitoring of batch processes attracts increasing attention; it can ensure safety and provide products with consistent quality. The design of a simulation system for batch process fault diagnosis is therefore of great significance. In this paper, penicillin fermentation, a typical non-linear, dynamic, multi-stage batch production process, is taken as the research object. A visual human-machine interactive simulation software system based on the Windows operating system is developed. The simulation system can provide an effective platform for research on batch process fault diagnosis.
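
    A minimal kinetic sketch of the kind of dynamics such a simulator reproduces, using logistic biomass growth with Luedeking-Piret product formation; the parameters are invented, not taken from the paper.

        # Logistic growth + Luedeking-Piret product formation, explicit Euler;
        # all parameters are invented placeholders.
        mu_max, X_max = 0.11, 1.5      # max specific growth rate (1/h), biomass cap (g/L)
        alpha, beta = 0.6, 0.005       # growth- and non-growth-associated yields
        dt, t_end = 0.1, 150.0         # step and horizon, h

        X, P, t = 0.1, 0.0, 0.0        # biomass, penicillin, time
        while t < t_end:
            dX = mu_max * X * (1.0 - X / X_max)
            dP = alpha * dX + beta * X
            X, P, t = X + dX * dt, P + dP * dt, t + dt
        print(f"after {t_end:.0f} h: X = {X:.2f} g/L, P = {P:.2f} g/L")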

  2. An investigation of radiometer design using digital processing techniques

    NASA Technical Reports Server (NTRS)

    Lawrence, R. W.

    1981-01-01

    The use of digital signal processing techniques in Dicke switching radiometer design was investigated. The general approach was to develop an analytical model of the existing analog radiometer and identify factors which adversely affect its performance. A digital processor was then proposed to verify the feasibility of using digital techniques to minimize these adverse effects and improve radiometer performance. Analysis and preliminary test results comparing the digital and analog processing approaches in radiometer design are presented.
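
    A small sketch of the digital processing idea: a Dicke-switched input is demodulated synchronously by differencing the mean of samples taken during antenna half-cycles and reference half-cycles, which suppresses slow gain drift; all signal values are invented.

        import random

        # Synchronous demodulation of a Dicke-switched input: difference the
        # mean of antenna-phase and reference-phase samples (values invented).
        random.seed(1)
        n_cycles, n_half = 200, 32
        T_ant, T_ref = 1.02, 1.00          # arbitrary power units

        ant, ref = [], []
        for _ in range(n_cycles):
            ant += [random.gauss(T_ant, 0.05) for _ in range(n_half)]
            ref += [random.gauss(T_ref, 0.05) for _ in range(n_half)]

        delta = sum(ant) / len(ant) - sum(ref) / len(ref)
        print(f"recovered difference = {delta:.4f} (true value 0.02)")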

  3. Defining process design space for monoclonal antibody cell culture.

    PubMed

    Abu-Absi, Susan Fugett; Yang, LiYing; Thompson, Patrick; Jiang, Canping; Kandula, Sunitha; Schilling, Bernhard; Shukla, Abhinav A

    2010-08-15

    The concept of design space has been taking root as a foundation of in-process control strategies for biopharmaceutical manufacturing processes. During mapping of the process design space, the multidimensional combination of operational variables is studied to quantify the impact on process performance in terms of productivity and product quality. An efficient methodology to map the design space for a monoclonal antibody cell culture process is described. A failure modes and effects analysis (FMEA) was used as the basis for the process characterization exercise. This was followed by an integrated study of the inoculum stage of the process which includes progressive shake flask and seed bioreactor steps. The operating conditions for the seed bioreactor were studied in an integrated fashion with the production bioreactor using a two stage design of experiments (DOE) methodology to enable optimization of operating conditions. A two level Resolution IV design was followed by a central composite design (CCD). These experiments enabled identification of the edge of failure and classification of the operational parameters as non-key, key or critical. In addition, the models generated from the data provide further insight into balancing productivity of the cell culture process with product quality considerations. Finally, process and product-related impurity clearance was evaluated by studies linking the upstream process with downstream purification. Production bioreactor parameters that directly influence antibody charge variants and glycosylation in CHO systems were identified.

  4. A Design Methodology for Medical Processes.

    PubMed

    Ferrante, Simona; Bonacina, Stefano; Pozzi, Giuseppe; Pinciroli, Francesco; Marceglia, Sara

    2016-01-01

    Healthcare processes, especially those belonging to the clinical domain, are acknowledged as complex and characterized by the dynamic nature of the diagnosis, the variability of the decisions made by experts driven by their experiences, the local constraints, the patient's needs, the uncertainty of the patient's response, and the indeterminacy of patient's compliance to treatment. Also, the multiple actors involved in patient's care need clear and transparent communication to ensure care coordination. In this paper, we propose a methodology to model healthcare processes in order to break out complexity and provide transparency. The model is grounded on a set of requirements that make the healthcare domain unique with respect to other knowledge domains. The modeling methodology is based on three main phases: the study of the environmental context, the conceptual modeling, and the logical modeling. The proposed methodology was validated by applying it to the case study of the rehabilitation process of stroke patients in the specific setting of a specialized rehabilitation center. The resulting model was used to define the specifications of a software artifact for the digital administration and collection of assessment tests that was also implemented. Despite being only an example, our case study showed the ability of process modeling to answer the actual needs in healthcare practices. Independently from the medical domain in which the modeling effort is done, the proposed methodology is useful to create high-quality models, and to detect and take into account relevant and tricky situations that can occur during process execution.

  5. Ensuring competitive advantage with semantic design process management

    SciTech Connect

    Quazzani, A.; Bernard, A.; Bocquet, J.C.

    1996-12-31

    In the field of design assistance, it is important to improve the recording of design history and the management of the design process. We therefore propose a modelling approach for the design process that focuses on the representation of semantic actions. We have identified two types of actions: physical design actions focusing on the product (e.g., parameter creation, shaft dimensioning) and management actions that allow management of the process from the planning and control viewpoint (e.g., synchronization actions, resource allocation for a task). A taxonomy of these actions has been established according to several criteria (granularity, fields of action, etc.) selected in consideration of our process management interests. Linkage with objectives and rationale is also discussed.

  6. Application of quality by design concept to develop a dual gradient elution stability-indicating method for cloxacillin forced degradation studies using combined mixture-process variable models.

    PubMed

    Zhang, Xia; Hu, Changqin

    2017-09-08

    Penicillins are typical complex ionic samples which are likely to contain a large number of degradation-related impurities (DRIs) with different polarities and charge properties. It is often a challenge to develop selective and robust high performance liquid chromatography (HPLC) methods for the efficient separation of all DRIs. In this study, an analytical quality by design (AQbD) approach was proposed for stability-indicating method development for cloxacillin. The rules governing the structures, retention, and UV characteristics of penicillins and their impurities were summarized and served as useful prior knowledge. Through quality risk assessment and a screening design, 3 critical process parameters (CPPs) were defined, including 2 mixture variables (MVs) and 1 process variable (PV). A combined mixture-process variable (MPV) design was conducted to evaluate the 3 CPPs simultaneously, and response surface methodology (RSM) was used to find the optimal experimental parameters. A dual gradient elution was performed to change buffer pH, mobile-phase type, and strength simultaneously. The design spaces (DSs) were evaluated using Monte Carlo simulation to give the probability of meeting the specifications of the CQAs. A Plackett-Burman design was performed to test robustness around the working points and to decide the normal operating ranges (NORs). Finally, validation was performed following International Conference on Harmonisation (ICH) guidelines. To our knowledge, this is the first study to use an MPV design and dual gradient elution to develop HPLC methods and improve separations for complex ionic samples.
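
    A hedged sketch of the Monte Carlo design-space check described here: assumed variation in two method parameters is propagated through a hypothetical fitted response model to estimate the probability that a CQA meets its acceptance criterion; the model, working point, and criterion are all invented.

        import random

        # Monte Carlo check of a working point against a CQA specification.
        random.seed(0)

        def resolution(pH, gradient_time):   # hypothetical fitted response model
            return 2.0 + 0.2 * (pH - 6.0) + 0.02 * (gradient_time - 30.0)

        trials, ok = 20000, 0
        for _ in range(trials):
            pH = random.gauss(6.0, 0.1)       # assumed run-to-run variation
            gt = random.gauss(30.0, 1.0)
            if resolution(pH, gt) >= 1.95:    # acceptance criterion (assumed)
                ok += 1
        print(f"P(resolution >= 1.95) ~ {ok / trials:.3f}")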

  7. Data processing boards design for CBM experiment

    NASA Astrophysics Data System (ADS)

    Zabołotny, Wojciech M.; Kasprowicz, Grzegorz

    2014-11-01

    This paper presents a concept for the Data Processing Boards (DPBs) of the Compressed Baryonic Matter (CBM) experiment. It describes the evolution of the concepts leading from the functional requirements of the control and readout systems of the CBM experiment to the design of a prototype implementation of the DPB boards. The paper describes requirements at the board level and at the crate level. Finally, it discusses the prototype design prepared for testing and verification of the proposed solutions, and the selection of the final implementation.

  8. Automation of the aircraft design process

    NASA Technical Reports Server (NTRS)

    Heldenfels, R. R.

    1974-01-01

    The increasing use of the computer to automate the aerospace product development and engineering process is examined with emphasis on structural analysis and design. Examples of systems of computer programs in aerospace and other industries are reviewed and related to the characteristics of aircraft design in its conceptual, preliminary, and detailed phases. Problems with current procedures are identified, and potential improvements from optimum utilization of integrated disciplinary computer programs by a man/computer team are indicated.

  9. Design, control and in situ visualization of gas nitriding processes.

    PubMed

    Ratajski, Jerzy; Olik, Roman; Suszko, Tomasz; Dobrodziej, Jerzy; Michalski, Jerzy

    2010-01-01

    The article presents a complex system of design, in situ visualization and control of the commonly used surface treatment process: the gas nitriding process. In the computer design conception, analytical mathematical models and artificial intelligence methods were used. As a result, possibilities were obtained of the poly-optimization and poly-parametric simulations of the course of the process combined with a visualization of the value changes of the process parameters in the function of time, as well as possibilities to predict the properties of nitrided layers. For in situ visualization of the growth of the nitrided layer, computer procedures were developed which make use of the results of the correlations of direct and differential voltage and time runs of the process result sensor (magnetic sensor), with the proper layer growth stage. Computer procedures make it possible to combine, in the duration of the process, the registered voltage and time runs with the models of the process.

  10. Design, Control and in Situ Visualization of Gas Nitriding Processes

    PubMed Central

    Ratajski, Jerzy; Olik, Roman; Suszko, Tomasz; Dobrodziej, Jerzy; Michalski, Jerzy

    2010-01-01

    The article presents a complex system of design, in situ visualization and control of the commonly used surface treatment process: the gas nitriding process. In the computer design conception, analytical mathematical models and artificial intelligence methods were used. As a result, possibilities were obtained of the poly-optimization and poly-parametric simulations of the course of the process combined with a visualization of the value changes of the process parameters in the function of time, as well as possibilities to predict the properties of nitrided layers. For in situ visualization of the growth of the nitrided layer, computer procedures were developed which make use of the results of the correlations of direct and differential voltage and time runs of the process result sensor (magnetic sensor), with the proper layer growth stage. Computer procedures make it possible to combine, in the duration of the process, the registered voltage and time runs with the models of the process. PMID:22315536

  11. Conceptual design of industrial process displays.

    PubMed

    Pedersen, C R; Lind, M

    1999-11-01

    Today, process displays used in industry are often designed on the basis of piping and instrumentation diagrams without any method of ensuring that the needs of the operators are fulfilled. Therefore, a method for a systematic approach to the design of process displays is needed. This paper discusses aspects of process display design taking into account both the designer's and the operator's points of view. Three aspects are emphasized: the operator tasks, the display content and the display form. The distinction between these three aspects is the basis for proposing an outline for a display design method that matches the industrial practice of modular plant design and satisfies the needs of reusability of display design solutions. The main considerations in display design in the industry are to specify the operator's activities in detail, to extract the information the operators need from the plant design specification and documentation, and finally to present this information. The form of the display is selected from existing standardized display elements such as trend curves, mimic diagrams, ecological interfaces, etc. Further knowledge is required to invent new display elements. That is, knowledge about basic visual means of presenting information and how humans perceive and interpret these means and combinations. This knowledge is required in the systematic selection of graphical items for a given display content. The industrial part of the method is first illustrated in the paper by a simple example from a plant with batch processes. Later the method is applied to develop a supervisory display for a condenser system in a nuclear power plant. The differences between the continuous plant domain of power production and the batch processes from the example are analysed and broad categories of display types are proposed. The problems involved in specification and invention of a supervisory display are analysed and conclusions from these problems are made. It is

  12. Electromagnetic modeling in accelerator designs

    SciTech Connect

    Cooper, R.K.; Chan, K.C.D.

    1990-01-01

    Through the years, electromagnetic modeling using computers has proved to be a cost-effective tool for accelerator designs. Traditionally, electromagnetic modeling of accelerators has been limited to resonator and magnet designs in two dimensions. In recent years, with the availability of powerful computers, electromagnetic modeling of accelerators has advanced significantly. It is apparent that breakthroughs have been made during the last decade in two important areas: three-dimensional modeling and time-domain simulation. Success in both these areas has been made possible by the increasing size and speed of computers. In this paper, the advances in these two areas will be described.

  13. Process design for Al backside contacts

    SciTech Connect

    Chalfoun, L.L.; Kimerling, L.C.

    1995-08-01

    It is known that properly alloyed aluminum backside contacts can improve silicon solar cell efficiency. To use this knowledge to the fullest advantage, we have studied the gettering process that occurs during contact formation and the microstructure of the contact and backside junction region. With an understanding of the alloying step, optimized fabrication processes can be designed. To study gettering, single crystal silicon wafers were coated with aluminum on both sides and subjected to heat treatments. Results are described.

  14. Process of system design and analysis

    SciTech Connect

    Gardner, B.

    1995-09-01

    The design of an effective physical protection system includes the determination of the physical protection system objectives, the initial design of a physical protection system, the evaluation of the design, and, probably, a redesign or refinement of the system. To develop the objectives, the designer must begin by gathering information about facility operations and conditions, such as a comprehensive description of the facility, operating states, and the physical protection requirements. The designer then needs to define the threat. This involves considering factors about potential adversaries: class of adversary, adversary's capabilities, and range of adversary's tactics. Next, the designer should identify targets. Determination of whether or not nuclear materials are attractive targets is based mainly on the ease or difficulty of acquisition and desirability of the material. The designer now knows the objectives of the physical protection system, that is, "what to protect against whom." The next step is to design the system by determining how best to combine such elements as fences, vaults, sensors, procedures, communication devices, and protective force personnel to meet the objectives of the system. Once a physical protection system is designed, it must be analyzed and evaluated to ensure it meets the physical protection objectives. Evaluation must allow for features working together to assure protection rather than regarding each feature separately. Due to the complexity of protection systems, an evaluation usually requires modeling techniques. If any vulnerabilities are found, the initial system must be redesigned to correct the vulnerabilities and a reevaluation conducted.

  15. Modeling Languages Refine Vehicle Design

    NASA Technical Reports Server (NTRS)

    2009-01-01

    Cincinnati, Ohio's TechnoSoft Inc. is a leading provider of object-oriented modeling and simulation technology used for commercial and defense applications. With funding from Small Business Innovation Research (SBIR) contracts issued by Langley Research Center, the company continued development on its adaptive modeling language, or AML, originally created for the U.S. Air Force. TechnoSoft then created what is now known as its Integrated Design and Engineering Analysis Environment, or IDEA, which can be used to design a variety of vehicles and machinery. IDEA's customers include clients in green industries, such as designers for power plant exhaust filtration systems and wind turbines.

  16. Dynamic Process Simulation for Analysis and Design.

    ERIC Educational Resources Information Center

    Nuttall, Herbert E., Jr.; Himmelblau, David M.

    A computer program for the simulation of complex continuous processes in real time in an interactive mode is described. The program is user oriented, flexible, and provides both numerical and graphic output. The program has been used in classroom teaching and computer-aided design. Typical input and output are illustrated for a sample problem to…

  17. An Exploration of Design Students' Inspiration Process

    ERIC Educational Resources Information Center

    Dazkir, Sibel S.; Mower, Jennifer M.; Reddy-Best, Kelly L.; Pedersen, Elaine L.

    2013-01-01

    Our purpose was to explore how different sources of inspiration influenced two groups of students' inspiration process and their attitudes toward their design projects. Assigned sources of inspiration and instructor's assistance in the search for inspiration varied for two groups of students completing a small culture inspired product design…

  18. Flexible Processing and the Design of Grammar

    ERIC Educational Resources Information Center

    Sag, Ivan A.; Wasow, Thomas

    2015-01-01

    We explore the consequences of letting the incremental and integrative nature of language processing inform the design of competence grammar. What emerges is a view of grammar as a system of local monotonic constraints that provide a direct characterization of the signs (the form-meaning correspondences) of a given language. This…

  20. Molecular thermodynamics for chemical process design.

    PubMed

    Prausnitz, J M

    1979-08-24

    Chemical process design requires quantitative information on the equilibrium properties of a variety of fluid mixtures. Since the experimental effort needed to provide this information is often prohibitive in cost and time, chemical engineers must utilize rational estimation techniques based on limited experimental data. The basis for such techniques is molecular thermodynamics, a synthesis of classical and statistical thermodynamics, molecular physics, and physical chemistry.
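
    The abstract's point about rational estimation from limited data can be illustrated with the simplest such technique: an ideal-solution (Raoult's law) bubble-point estimate built on Antoine vapor pressures. A minimal sketch, not from the article; the Antoine constants are commonly tabulated values for benzene and toluene (mmHg, °C) and should be checked against a current source before any real use.

```python
import math

# Antoine equation: log10(Psat[mmHg]) = A - B / (C + T[degC])
# Commonly tabulated constants; verify before use.
ANTOINE = {
    "benzene": (6.90565, 1211.033, 220.790),
    "toluene": (6.95464, 1344.800, 219.482),
}

def psat_mmhg(component, t_celsius):
    """Pure-component vapor pressure from the Antoine equation."""
    a, b, c = ANTOINE[component]
    return 10.0 ** (a - b / (c + t_celsius))

def bubble_pressure(x_benzene, t_celsius):
    """Raoult's-law bubble pressure and vapor composition for an
    ideal benzene/toluene liquid at the given temperature."""
    p1 = psat_mmhg("benzene", t_celsius)
    p2 = psat_mmhg("toluene", t_celsius)
    p_total = x_benzene * p1 + (1.0 - x_benzene) * p2
    y_benzene = x_benzene * p1 / p_total
    return p_total, y_benzene

p, y = bubble_pressure(x_benzene=0.4, t_celsius=90.0)
print(f"bubble pressure = {p:.1f} mmHg, vapor mole fraction benzene = {y:.3f}")
```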

  2. Modeling of fluidized bed silicon deposition process

    NASA Technical Reports Server (NTRS)

    Kim, K.; Hsu, G.; Lutwack, R.; Praturi, A. K.

    1977-01-01

    The model is intended for use as a means of improving fluidized bed reactor design and for the formulation of the research program in support of the contracts of Silicon Material Task for the development of the fluidized bed silicon deposition process. A computer program derived from the simple modeling is also described. Results of some sample calculations using the computer program are shown.
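
    As an illustration of the kind of simple deposition model the record describes, the sketch below integrates a toy mass balance for silane consumption and silicon deposition in a well-mixed bed. All symbols and rate values are invented for illustration; this is not the JPL model.

```python
# Toy CSTR-style mass balance for silane decomposition in a fluidized bed:
#   dC/dt = (C_in - C)/tau - k*C        (gas-phase silane concentration)
#   dM/dt = k*C*V*w                     (silicon deposited on seed particles)
# All names and values are illustrative assumptions.
k, tau, C_in, V, w = 0.5, 2.0, 1.0, 1.0, 28.0e-3  # 1/s, s, mol/m3, m3, kg/mol
C, M, dt = 0.0, 0.0, 0.01
for _ in range(int(60 / dt)):          # one simulated minute, explicit Euler
    dC = (C_in - C) / tau - k * C
    C += dC * dt
    M += k * C * V * w * dt
print(f"steady silane conc ~ {C:.3f} mol/m3, Si deposited ~ {M*1e3:.1f} g")
```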

  3. Fuel ethanol production: process design trends and integration opportunities.

    PubMed

    Cardona, Carlos A; Sánchez, Oscar J

    2007-09-01

    Current fuel ethanol research and development deals with process engineering trends for improving biotechnological production of ethanol. In this work, the key role that process design plays during the development of cost-effective technologies is recognized through the analysis of major trends in process synthesis, modeling, simulation and optimization related to ethanol production. Main directions in techno-economical evaluation of fuel ethanol processes are described as well as some prospecting configurations. The most promising alternatives for compensating ethanol production costs by the generation of valuable co-products are analyzed. Opportunities for integration of fuel ethanol production processes and their implications are underlined. Main ways of process intensification through reaction-reaction, reaction-separation and separation-separation processes are analyzed in the case of bioethanol production. Some examples of energy integration during ethanol production are also highlighted. Finally, some concluding considerations on current and future research tendencies in fuel ethanol production regarding process design and integration are presented.

  4. Modeling Primary Atomization Processes

    DTIC Science & Technology

    2007-11-02

    I., "Generation of Ripples by Wind Blowing Over a Viscous Fluid", The Scientific Papers of Sir Geoffrey Ingram Taylor, 1963. 2. A. A. Amsden, P. J...92, 1983. 28. Jin, Xiaoshi, "Boundary Element Study on Particle Orientation Caused by the Fountain Flow in Injection Molding ", Polymer Engineering...HTPB, PE is a thermoplastic which is commonly produced via extrusion from a die in a continuous process. Hence, PE grains could be produced using

  5. Object Oriented Modeling and Design

    NASA Technical Reports Server (NTRS)

    Shaykhian, Gholam Ali

    2007-01-01

    The Object Oriented Modeling and Design seminar is intended for software professionals and students; it covers the concepts and a language-independent graphical notation that can be used to analyze problem requirements and design a solution to the problem. The seminar discusses the three kinds of object-oriented models: class, state, and interaction. The class model represents the static structure of a system, the state model describes the aspects of a system that change over time as well as control behavior, and the interaction model describes how objects collaborate to achieve overall results. Existing knowledge of object oriented programming may benefit the learning of modeling and good design. Specific expectations are: Create a class model, Read, recognize, and describe a class model, Describe association and link, Show abstract classes used with multiple inheritance, Explain metadata, reification and constraints, Group classes into a package, Read, recognize, and describe a state model, Explain states and transitions, Read, recognize, and describe interaction model, Explain Use cases and use case relationships, Show concurrency in activity diagram, Object interactions in sequence diagram.
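
    A minimal sketch of the three model kinds the seminar covers, using hypothetical Order/LineItem classes: the class model is the class declarations and their association, the state model is the OrderState enum and the submit() transition, and the interaction model is the collaborating objects at the bottom.

```python
from dataclasses import dataclass, field
from enum import Enum, auto

class OrderState(Enum):      # state model: the states an Order moves through
    OPEN = auto()
    SUBMITTED = auto()
    SHIPPED = auto()

@dataclass
class LineItem:              # class model: Order aggregates many LineItems
    sku: str
    quantity: int

@dataclass
class Order:
    items: list[LineItem] = field(default_factory=list)
    state: OrderState = OrderState.OPEN

    def submit(self):        # a transition in the state model
        assert self.state is OrderState.OPEN
        self.state = OrderState.SUBMITTED

# Interaction model: objects collaborating to achieve an overall result.
order = Order()
order.items.append(LineItem("widget-7", 2))
order.submit()
print(order.state)           # OrderState.SUBMITTED
```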

  6. Designing Instruction That Supports Cognitive Learning Processes

    PubMed Central

    Clark, Ruth; Harrelson, Gary L.

    2002-01-01

    Objective: To provide an overview of current cognitive learning processes, including a summary of research that supports the use of specific instructional methods to foster those processes. We have developed examples in athletic training education to help illustrate these methods where appropriate. Data Sources: Sources used to compile this information included knowledge base and oral and didactic presentations. Data Synthesis: Research in educational psychology within the past 15 years has provided many principles for designing instruction that mediates the cognitive processes of learning. These include attention, management of cognitive load, rehearsal in working memory, and retrieval of new knowledge from long-term memory. By organizing instruction in the context of tasks performed by athletic trainers, transfer of learning and learner motivation are enhanced. Conclusions/Recommendations: Scientific evidence supports instructional methods that can be incorporated into lesson design and improve learning by managing cognitive load in working memory, stimulating encoding into long-term memory, and supporting transfer of learning. PMID:12937537

  7. Product and Process Improvement Using Mixture-Process Variable Designs and Robust Optimization Techniques

    SciTech Connect

    Sahni, Narinder S.; Piepel, Gregory F.; Naes, Tormod

    2009-04-01

    The quality of an industrial product depends on the raw material proportions and the process variable levels, both of which need to be taken into account in designing a product. This article presents a case study from the food industry in which both kinds of variables were studied by combining a constrained mixture experiment design and a central composite process variable design. Based on the natural structure of the situation, a split-plot experiment was designed and models involving the raw material proportions and process variable levels (separately and combined) were fitted. Combined models were used to study: (i) the robustness of the process to variations in raw material proportions, and (ii) the robustness of the raw material recipes with respect to fluctuations in the process variable levels. Further, the expected variability in the robust settings was studied using the bootstrap.

  8. Design Concept Evaluation Using System Throughput Model

    SciTech Connect

    G. Sequeira; W. M. Nutt Ph.D

    2004-05-28

    The U.S. Department of Energy (DOE) Office of Civilian Radioactive Waste Management (OCRWM) is currently developing the technical bases to support the submittal of a license application for construction of a geologic repository at Yucca Mountain, Nevada to the U.S. Nuclear Regulatory Commission. The Office of Repository Development (ORD) is responsible for developing the design of the proposed repository surface facilities for the handling of spent nuclear fuel and high level nuclear waste. Preliminary design activities are underway to sufficiently develop the repository surface facilities design for inclusion in the license application. The design continues to evolve to meet mission needs and to satisfy both regulatory and program requirements. A system engineering approach is being used in the design process since the proposed repository facilities are dynamically linked by a series of sub-systems and complex operations. In addition, the proposed repository facility is a major system element of the overall waste management process being developed by the OCRWM. Such an approach includes iterative probabilistic dynamic simulation as an integral part of the design evolution process. A dynamic simulation tool helps to determine if: (1) the mission and design requirements are complete, robust, and well integrated; (2) the design solutions under development meet the design requirements and mission goals; (3) opportunities exist where the system can be improved and/or optimized; and (4) proposed changes to the mission, and design requirements have a positive or negative impact on overall system performance and if design changes may be necessary to satisfy these changes. This paper will discuss the type of simulation employed to model the waste handling operations. It will then discuss the process being used to develop the Yucca Mountain surface facilities model. The latest simulation model and the results of the simulation and how the data were used in the design
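
    The flavor of such a probabilistic discrete-event throughput simulation can be sketched in a few lines; the example below assumes the SimPy library and uses invented arrival and handling rates, not data from the Yucca Mountain model.

```python
import simpy

# Minimal discrete-event sketch of waste-handling throughput: casks arrive
# periodically and queue for a single handling bay. Rates and capacities
# are illustrative assumptions only.
CASK_ARRIVAL_HOURS = 10.0
UNLOAD_HOURS = 6.0
handled = 0

def cask(env, bay):
    global handled
    with bay.request() as slot:        # wait for a free handling bay
        yield slot
        yield env.timeout(UNLOAD_HOURS)
        handled += 1

def arrivals(env, bay):
    while True:
        yield env.timeout(CASK_ARRIVAL_HOURS)
        env.process(cask(env, bay))

env = simpy.Environment()
bay = simpy.Resource(env, capacity=1)
env.process(arrivals(env, bay))
env.run(until=24 * 365)                # one simulated year, in hours
print(f"casks handled in a year: {handled}")
```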

  9. Case study: Lockheed-Georgia Company integrated design process

    NASA Technical Reports Server (NTRS)

    Waldrop, C. T.

    1980-01-01

    A case study of the development of an Integrated Design Process is presented. The approach taken in preparing for the development of an integrated design process includes some of the IPAD approaches such as developing a Design Process Model, cataloging Technical Program Elements (TPE's), and examining data characteristics and interfaces between contiguous TPE's. The implementation plan is based on an incremental development of capabilities over a period of time with each step directed toward, and consistent with, the final architecture of a total integrated system. Because of time schedules and different computer hardware, this system will not be the same as the final IPAD release; however, many IPAD concepts will no doubt prove applicable as the best approach. Full advantage will be taken of the IPAD development experience. A scenario that could be typical for many companies, even outside the aerospace industry, in developing an integrated design process for an IPAD-type environment is represented.

  11. Protein designs in HP models

    NASA Astrophysics Data System (ADS)

    Gupta, Arvind; Khodabakhshi, Alireza Hadj; Maňuch, Ján; Rafiey, Arash; Stacho, Ladislav

    2009-07-01

    The inverse protein folding problem is that of designing an amino acid sequence which folds into a prescribed shape. This problem arises in drug design where a particular structure is necessary to ensure proper protein-protein interactions and could have applications in nanotechnology. A major challenge in designing proteins with native folds that attain a specific shape is to avoid proteins that have multiple native folds (unstable proteins). In this technical note we present our results on protein designs in the variant of the Hydrophobic-Polar (HP) model introduced by Dill [6] on the 2D square lattice. The HP model distinguishes only polar and hydrophobic monomers and only counts the number of hydrophobic contacts in the energy function. To achieve better stability of our designs we use the Hydrophobic-Polar-Cysteine (HPC) model, which distinguishes a third type of monomer, called "cysteine", and also incorporates the disulfide bridges (SS-bridges) into the energy function. We present stable designs on the 2D square lattice and the 3D hexagonal prism lattice in the HPC model.
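
    For concreteness, the standard HP-model energy that the HPC variant extends can be computed as follows; this sketch counts hydrophobic lattice contacts on the 2D square lattice and is not the authors' code.

```python
# Energy of a fold in Dill's HP model on the 2D square lattice: count
# hydrophobic (H) monomers that are lattice neighbours but not sequence
# neighbours; each such contact contributes -1.
def hp_energy(sequence, fold):
    """sequence: string over {'H','P'}; fold: list of (x, y) lattice sites,
    one per monomer, forming a self-avoiding walk."""
    coords = {site: i for i, site in enumerate(fold)}
    energy = 0
    for i, (x, y) in enumerate(fold):
        if sequence[i] != 'H':
            continue
        for nb in ((x + 1, y), (x, y + 1)):   # counts each pair exactly once
            j = coords.get(nb)
            if j is not None and sequence[j] == 'H' and abs(i - j) > 1:
                energy -= 1
    return energy

# A 2x2 square fold of HPPH: the two H ends touch, giving one H-H contact.
print(hp_energy("HPPH", [(0, 0), (1, 0), (1, 1), (0, 1)]))  # -1
```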

  12. A Design Methodology for Medical Processes

    PubMed Central

    Bonacina, Stefano; Pozzi, Giuseppe; Pinciroli, Francesco; Marceglia, Sara

    2016-01-01

    Background: Healthcare processes, especially those belonging to the clinical domain, are acknowledged as complex and characterized by the dynamic nature of the diagnosis, the variability of the decisions made by experts driven by their experiences, the local constraints, the patient’s needs, the uncertainty of the patient’s response, and the indeterminacy of patient’s compliance to treatment. Also, the multiple actors involved in patient’s care need clear and transparent communication to ensure care coordination. Objectives: In this paper, we propose a methodology to model healthcare processes in order to break out complexity and provide transparency. Methods: The model is grounded on a set of requirements that make the healthcare domain unique with respect to other knowledge domains. The modeling methodology is based on three main phases: the study of the environmental context, the conceptual modeling, and the logical modeling. Results: The proposed methodology was validated by applying it to the case study of the rehabilitation process of stroke patients in the specific setting of a specialized rehabilitation center. The resulting model was used to define the specifications of a software artifact for the digital administration and collection of assessment tests that was also implemented. Conclusions: Despite being only an example, our case study showed the ability of process modeling to answer the actual needs in healthcare practices. Independently from the medical domain in which the modeling effort is done, the proposed methodology is useful to create high-quality models, and to detect and take into account relevant and tricky situations that can occur during process execution. PMID:27081415

  13. DESIGNING ENVIRONMENTAL, ECONOMIC AND ENERGY EFFICIENT CHEMICAL PROCESSES

    EPA Science Inventory

    The design and improvement of chemical processes can be very challenging. The earlier energy conservation, process economics and environmental aspects are incorporated into the process development, the easier and less expensive it is to alter the process design. Process emissio...

  15. The design of a nanolithographic process

    NASA Astrophysics Data System (ADS)

    Johannes, Matthew Steven

    This research delineates the design of a nanolithographic process for nanometer scale surface patterning. The process involves the combination of serial atomic force microscope (AFM) based nanolithography with the parallel patterning capabilities of soft lithography. The union of these two techniques provides for a unique approach to nanoscale patterning that establishes a research knowledge base and tools for future research and prototyping. To successfully design this process a number of separate research investigations were undertaken. A custom 3-axis AFM with feedback control on three positioning axes of nanometer precision was designed in order to execute nanolithographic research. This AFM system integrates a computer aided design/computer aided manufacturing (CAD/CAM) environment to allow for the direct synthesis of nanostructures and patterns using a virtual design interface. This AFM instrument was leveraged primarily to study anodization nanolithography (ANL), a nanoscale patterning technique used to generate local surface oxide layers on metals and semiconductors. Defining research focused on the automated generation of complex oxide nanoscale patterns as directed by CAD/CAM design as well as the implementation of tip-sample current feedback control during ANL to increase oxide uniformity. Concurrently, research was conducted concerning soft lithography, primarily in microcontact printing (µCP), and pertinent experimental and analytic techniques and procedures were investigated. Due to the masking abilities of the resulting oxide patterns from ANL, the results of AFM based patterning experiments are coupled with micromachining techniques to create higher aspect ratio structures at the nanoscale. These relief structures are used as master pattern molds for polymeric stamp formation to reproduce the original in a parallel fashion using µCP stamp formation and patterning. This new method of master fabrication provides for a useful alternative to

  16. From Business Value Model to Coordination Process Model

    NASA Astrophysics Data System (ADS)

    Fatemi, Hassan; van Sinderen, Marten; Wieringa, Roel

    The increased complexity of business webs calls for modeling the collaboration of enterprises from different perspectives, in particular the business and process perspectives, and for mutually aligning these perspectives. Business value modeling and coordination process modeling both are necessary for a good e-business design, but these activities have different goals and use different concepts. Nevertheless, the resulting models should be consistent with each other because they refer to the same system from different perspectives. Hence, checking the consistency between these models or producing one based on the other would be of high value. In this paper we discuss the issue of achieving consistency in multi-level e-business design and give guidelines to produce consistent coordination process models from business value models in a stepwise manner.

  17. Software-Engineering Process Simulation (SEPS) model

    NASA Technical Reports Server (NTRS)

    Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.

    1992-01-01

    The Software Engineering Process Simulation (SEPS) model, developed at JPL, is described. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among various software life cycle development activities and management decision making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and perform postmortem assessments.

  18. Computing confidence intervals for point process models.

    PubMed

    Sarma, Sridevi V; Nguyen, David P; Czanner, Gabriela; Wirth, Sylvia; Wilson, Matthew A; Suzuki, Wendy; Brown, Emery N

    2011-11-01

    Characterizing neural spiking activity as a function of intrinsic and extrinsic factors is important in neuroscience. Point process models are valuable for capturing such information; however, the process of fully applying these models is not always obvious. A complete model application has four broad steps: specification of the model, estimation of model parameters given observed data, verification of the model using goodness of fit, and characterization of the model using confidence bounds. Of these steps, only the first three have been applied widely in the literature, suggesting the need to dedicate a discussion to how the time-rescaling theorem, in combination with parametric bootstrap sampling, can be generally used to compute confidence bounds of point process models. In our first example, we use a generalized linear model of spiking propensity to demonstrate that confidence bounds derived from bootstrap simulations are consistent with those computed from closed-form analytic solutions. In our second example, we consider an adaptive point process model of hippocampal place field plasticity for which no analytical confidence bounds can be derived. We demonstrate how to simulate bootstrap samples from adaptive point process models, how to use these samples to generate confidence bounds, and how to statistically test the hypothesis that neural representations at two time points are significantly different. These examples have been designed as useful guides for performing scientific inference based on point process models.
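
    A minimal stand-in for the workflow described above, reduced to the simplest possible point process: simulate a homogeneous Poisson spike train, estimate its rate, and derive a parametric bootstrap confidence interval. The paper's GLM and adaptive models are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)

# Parametric bootstrap CI for the rate of a homogeneous Poisson spike train:
# specify -> estimate -> bootstrap -> confidence bounds.
T = 100.0                                  # observation window (s)
true_rate = 5.0
n_spikes = rng.poisson(true_rate * T)      # simulated "observed" data
rate_hat = n_spikes / T                    # maximum-likelihood estimate

# Refit the estimator on data simulated from the fitted model.
boot = rng.poisson(rate_hat * T, size=2000) / T
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"rate = {rate_hat:.2f} Hz, 95% bootstrap CI = ({lo:.2f}, {hi:.2f})")
```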

  19. Chemical Process Modeling and Control.

    ERIC Educational Resources Information Center

    Bartusiak, R. Donald; Price, Randel M.

    1987-01-01

    Describes some of the features of Lehigh University's (Pennsylvania) process modeling and control program. Highlights the creation and operation of the Chemical Process Modeling and Control Center (PMC). Outlines the program's philosophy, faculty, technical program, current research projects, and facilities. (TW)

  20. Information Flow in the Launch Vehicle Design/Analysis Process

    NASA Technical Reports Server (NTRS)

    Humphries, W. R., Sr.; Holland, W.; Bishop, R.

    1999-01-01

    This paper describes the results of a team effort aimed at defining the information flow between disciplines at the Marshall Space Flight Center (MSFC) engaged in the design of space launch vehicles. The information flow is modeled at a first level and is described using three types of templates: an N x N diagram, discipline flow diagrams, and discipline task descriptions. It is intended to provide engineers with an understanding of the connections between what they do and where it fits in the overall design process of the project. It is also intended to provide design managers with a better understanding of information flow in the launch vehicle design cycle.

  1. Composting process design criteria. II. Detention time

    SciTech Connect

    Haug, R.T.

    1986-09-01

    Attention has always been directed to detention time as a criterion for the design and operation of composting systems. Perhaps this is a logical outgrowth of work on liquid phase systems, where detention time is a fundamental parameter of design. Unlike liquid phase systems, however, the interpretation of detention time and the actual values required for design have not been universally accepted in the case of composting. As a case in point, most compost systems incorporate facilities for curing the compost product. However, curing often is considered after the fact or as an add-on with little relationship to the first stage, high-rate phase, whether reactor (in-vessel), static pile, or windrow. Design criteria for curing and the relationships between the first-stage, high-rate and second-stage, curing phases of a composting system have been unclear. In Part 2 of this paper, the concepts of hydraulic retention time (HRT) and solids residence time (SRT) are applied to the composting process. Definitions and design criteria for each are proposed. Based on these criteria, the first and second stages can be designed and integrated into a complete composting system.
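
    The two quantities can be stated compactly; the sketch below uses the generic definitions of HRT and SRT with invented two-stage numbers, not the specific criteria proposed in the paper.

```python
# Hydraulic retention time (HRT) is volume over volumetric feed rate;
# solids residence time (SRT) is solids inventory over solids wasting rate.
# Generic definitions with illustrative numbers only.
def hrt_days(volume_m3, feed_m3_per_day):
    return volume_m3 / feed_m3_per_day

def srt_days(solids_in_system_kg, solids_wasted_kg_per_day):
    return solids_in_system_kg / solids_wasted_kg_per_day

# Illustrative two-stage system: high-rate reactor plus curing stage.
print(f"first-stage HRT: {hrt_days(400.0, 20.0):.1f} d")
print(f"curing HRT:      {hrt_days(1200.0, 20.0):.1f} d")
print(f"system SRT:      {srt_days(90000.0, 4500.0):.1f} d")
```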

  2. Non-Linear Instructional Design Model: Eternal, Synergistic Design and Development

    ERIC Educational Resources Information Center

    Crawford, Caroline

    2004-01-01

    Instructional design is at the heart of each educational endeavour. This process revolves around the steps through which the thoughtful productions of superior products are created. The ADDIE generic instructional design model emphasises five basic steps within the instructional design process: analyse, design, develop, implement and evaluate. The…

  3. Plant design: Integrating Plant and Equipment Models

    SciTech Connect

    Sloan, David; Fiveland, Woody; Zitney, S.E.; Osawe, Maxwell

    2007-08-01

    Like process plant engineers, power plant engineers must design generating units to operate efficiently, cleanly, and profitably despite fluctuating costs for raw materials and fuels. To do so, they increasingly create virtual plants to enable evaluation of design concepts without the expense of building pilot-scale or demonstration facilities. Existing computational models describe an entire plant either as a network of simplified equipment models or as a single, very detailed equipment model. The Advanced Process Engineering Co-Simulator (APECS) project sponsored by the U.S. Department of Energy's National Energy Technology Laboratory (NETL) seeks to bridge the gap between models by integrating plant modeling and equipment modeling software. The goal of the effort is to provide greater insight into the performance of proposed plant designs. The software integration was done using the process-industry standard CAPE-OPEN (Computer Aided Process Engineering–Open), or CO, interface. Several demonstration cases based on operating power plants confirm the viability of this co-simulation approach.

  4. Optimal Experimental Design for Model Discrimination

    PubMed Central

    Myung, Jay I.; Pitt, Mark A.

    2009-01-01

    Models of a psychological process can be difficult to discriminate experimentally because it is not easy to determine the values of the critical design variables (e.g., presentation schedule, stimulus structure) that will be most informative in differentiating them. Recent developments in sampling-based search methods in statistics make it possible to determine these values, and thereby identify an optimal experimental design. After describing the method, it is demonstrated in two content areas in cognitive psychology in which models are highly competitive: retention (i.e., forgetting) and categorization. The optimal design is compared with the quality of designs used in the literature. The findings demonstrate that design optimization has the potential to increase the informativeness of the experimental method. PMID:19618983
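
    A toy version of the idea for the retention example: scan candidate retention intervals and pick the one where a power-law and an exponential forgetting model disagree most. The parameter values and the simple prediction-gap criterion are illustrative assumptions, not the article's sampling-based method.

```python
import numpy as np

# Which retention interval best discriminates two forgetting models?
# Parameters are invented for illustration.
t = np.linspace(0.1, 20.0, 400)            # candidate retention intervals
power_model = 0.9 * (1.0 + t) ** -0.5      # p(recall) under a power law
expo_model = 0.9 * np.exp(-0.15 * t)       # p(recall) under an exponential

gap = np.abs(power_model - expo_model)     # where predictions differ most
best = t[np.argmax(gap)]
print(f"most diagnostic retention interval ~ {best:.1f} time units "
      f"(prediction gap {gap.max():.3f})")
```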

  5. Business process modeling in healthcare.

    PubMed

    Ruiz, Francisco; Garcia, Felix; Calahorra, Luis; Llorente, César; Gonçalves, Luis; Daniel, Christel; Blobel, Bernd

    2012-01-01

    The importance of the process point of view is not restricted to a specific enterprise sector. In the field of health, as a result of the nature of the service offered, health institutions' processes are also the basis for decision making which is focused on achieving their objective of providing quality medical assistance. In this chapter the application of business process modelling, using the Business Process Modelling Notation (BPMN) standard, is described. Main challenges of business process modelling in healthcare are the definition of healthcare processes, the multi-disciplinary nature of healthcare, the flexibility and variability of the activities involved in health care processes, the need of interoperability between multiple information systems, and the continuous updating of scientific knowledge in healthcare.

  6. Optimization of Forming Processes in Microstructure Sensitive Design

    NASA Astrophysics Data System (ADS)

    Garmestani, H.; Li, D. S.

    2004-06-01

    Optimization of the forming processes from initial microstructures of raw materials to desired microstructures of final products is an important topic in materials design. The processing path model proposed in this study gives an explicit mathematical description of how the microstructure evolves during thermomechanical processing. Based on a conservation principle in the orientation space (originally proposed by Bunge), this methodology is independent of the underlying deformation mechanisms. The evolution of the texture coefficients is modeled using a texture evolution matrix calculated from the experimental results. For the same material using the same processing method, the texture evolution matrix is the same. It does not change with the initial texture. This processing path model provides functions of processing paths and streamlines.
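
    The core computation implied here is a linear update of the texture coefficients by a fitted evolution matrix. In the sketch below the 3-by-3 matrix is invented purely to show the iteration along a processing path.

```python
import numpy as np

# Processing-path sketch: texture coefficients evolve linearly under a
# process-specific evolution matrix M fitted from experiment,
#   c_{n+1} = M c_n.
# This M and the initial coefficients are illustrative only.
M = np.array([[0.95, 0.02, 0.00],
              [0.03, 0.90, 0.05],
              [0.02, 0.08, 0.95]])
c = np.array([1.0, 0.0, 0.0])          # initial texture coefficients

path = [c]
for _ in range(10):                    # ten identical processing increments
    c = M @ c
    path.append(c)
print(np.round(path[-1], 3))           # texture after the processing path
```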

  7. Mimicry of natural material designs and processes

    NASA Astrophysics Data System (ADS)

    Bond, G. M.; Richman, R. H.; McNaughton, W. P.

    1995-06-01

    Biological structural materials, although composed of unremarkable substances synthesized at low temperatures, often exhibit superior mechanical properties. In particular, the quality in which nearly all biologically derived materials excel is toughness. The advantageous mechanical properties are attributable to the hierarchical, composite, structural arrangements common to biological systems. Materials scientists and engineers have increasingly recognized that biological designs or processing approaches applied to man-made materials (biomimesis) may offer improvements in performance over conventional designs and fabrication methods. In this survey, the structures and processing routes of marine shells, avian eggshells, wood, bone, and insect cuticle are briefly reviewed, and biomimesis research inspired by these materials is discussed. In addition, this paper describes and summarizes the applications of biomineralization, self-assembly, and templating with proteins to the fabrication of thin ceramic films and nanostructure devices.

  8. Mimicry of natural material designs and processes

    SciTech Connect

    Bond, G.M.; Richman, R.H.; McNaughton, W.P.

    1995-06-01

    Biological structural materials, although composed of unremarkable substances synthesized at low temperatures, often exhibit superior mechanical properties. In particular, the quality in which nearly all biologically derived materials excel is toughness. The advantageous mechanical properties are attributable to the hierarchical, composite, structural arrangements common to biological systems. Materials scientists and engineers have increasingly recognized that biological designs or processing approaches applied to man-made materials (biomimesis) may offer improvements in performance over conventional designs and fabrication methods. In this survey, the structures and processing routes of marine shells, avian eggshells, wood, bone, and insect cuticle are briefly reviewed, and biomimesis research inspired by these materials is discussed. In addition, this paper describes and summarizes the applications of biomineralization, self-assembly, and templating with proteins to the fabrication of thin ceramic films and nanostructure devices.

  9. Penetrator reliability investigation and design exploration : from conventional design processes to innovative uncertainty-capturing algorithms.

    SciTech Connect

    Martinez-Canales, Monica L.; Heaphy, Robert; Gramacy, Robert B.; Taddy, Matt; Chiesa, Michael L.; Thomas, Stephen W.; Swiler, Laura Painton; Hough, Patricia Diane; Lee, Herbert K. H.; Trucano, Timothy Guy; Gray, Genetha Anne

    2006-11-01

    This project focused on research and algorithmic development in optimization under uncertainty (OUU) problems driven by earth penetrator (EP) designs. While taking into account uncertainty, we addressed three challenges in current simulation-based engineering design and analysis processes. The first challenge required leveraging small local samples, already constructed by optimization algorithms, to build effective surrogate models. We used Gaussian Process (GP) models to construct these surrogates. We developed two OUU algorithms using 'local' GPs (OUU-LGP) and one OUU algorithm using 'global' GPs (OUU-GGP) that appear competitive or better than current methods. The second challenge was to develop a methodical design process based on multi-resolution, multi-fidelity models. We developed a Multi-Fidelity Bayesian Auto-regressive process (MF-BAP). The third challenge involved the development of tools that are computationally feasible and accessible. We created MATLAB® and initial DAKOTA implementations of our algorithms.
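
    The surrogate idea can be sketched briefly: fit a Gaussian Process to a few expensive simulation samples, then optimize the cheap surrogate mean. This assumes scikit-learn and SciPy and uses a stand-in objective, not an earth penetrator simulation or the report's OUU-LGP algorithm.

```python
import numpy as np
from scipy.optimize import minimize
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

def expensive_sim(x):                  # placeholder for the real simulator
    return (x - 0.3) ** 2 + 0.1 * np.sin(12 * x)

X = rng.uniform(0, 1, size=(8, 1))     # a small local sample of runs
y = expensive_sim(X).ravel()

# Fit a GP surrogate to the sample, then optimize its (cheap) mean.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2), alpha=1e-6)
gp.fit(X, y)

res = minimize(lambda x: gp.predict(np.atleast_2d(x))[0],
               x0=np.array([0.5]), bounds=[(0.0, 1.0)])
print(f"surrogate minimum near x = {res.x[0]:.3f}")
```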

  10. Generic Model Host System Design

    SciTech Connect

    Chu, Chungming; Wu, Juhao; Qiang, Ji; Shen, Guobao

    2012-06-22

    There are many simulation codes for accelerator modelling; each has particular strengths, but none covers every need. A platform which can host multiple modelling tools would be ideal for various purposes. The model platform, along with infrastructure support, can be used not only for online applications but also for offline purposes. A collaboration has been formed for the effort of providing such a platform. In order to achieve such a platform, a common set of physics data structures has to be defined. An Application Programming Interface (API) for physics applications should also be defined within a model data provider. A preliminary platform design and prototype are discussed.

  11. Optimal Experimental Design for Model Discrimination

    ERIC Educational Resources Information Center

    Myung, Jay I.; Pitt, Mark A.

    2009-01-01

    Models of a psychological process can be difficult to discriminate experimentally because it is not easy to determine the values of the critical design variables (e.g., presentation schedule, stimulus structure) that will be most informative in differentiating them. Recent developments in sampling-based search methods in statistics make it…

  12. The Processes Involved in Designing Software.

    DTIC Science & Technology

    1980-08-01

    body of relevant knowledge. There has been a limited amount of research on the process of design or on problems that are difficult enough to require the...refinement of those subproblems. Our results are therefore potentially limited to similar straightforward problems. In tasks for which the...They first break the problem into its major constituents, thus forming a solution model. During each iteration, subproblems from the previous cycle are

  13. Thinking and the Design Process. DIUL-RR-8414.

    ERIC Educational Resources Information Center

    Moulin, Bernard

    Designed to focus attention on the design process in such computer science activities as information systems design, database design, and expert systems design, this paper examines three main phases of the design process: understanding the context of the problem, identifying the problem, and finding a solution. The processes that these phases…

  14. Design of Training Systems Utility Assessment: The Training Process Flow and System Capabilities/Requirements and Resources Models Operating in the TRAPAC Environment.

    ERIC Educational Resources Information Center

    Duffy, Larry R.

    The report summarizes the results of a field test conducted for the purpose of determining the utility to Naval training of the Systems Capabilities/Requirements and Resources (SCRR) and the Training Process Flow (TPF) computer-based mathematical models. Basic descriptions of the SCRR and the TPF and their development are given. Training…

  15. Chip Design Process Optimization Based on Design Quality Assessment

    NASA Astrophysics Data System (ADS)

    Häusler, Stefan; Blaschke, Jana; Sebeke, Christian; Rosenstiel, Wolfgang; Hahn, Axel

    2010-06-01

    Nowadays, managing product development projects is increasingly challenging. Especially the IC design of ASICs with both analog and digital components (mixed-signal design) is becoming more and more complex, while the time-to-market window narrows at the same time. Still, high quality standards must be fulfilled. Projects and their status are becoming less transparent due to this complexity. This makes the planning and execution of projects rather difficult. Therefore, there is a need for efficient project control. A main challenge is the objective evaluation of the current development status. Are all requirements successfully verified? Are all intermediate goals achieved? Companies often develop special solutions that are not reusable in other projects. This makes the quality measurement process itself less efficient and produces too much overhead. The method proposed in this paper is a contribution to solve these issues. It is applied at a German design house for analog mixed-signal IC design. This paper presents the results of a case study and introduces an optimized project scheduling on the basis of quality assessment results.

  16. Design flow for implementing image processing in FPGAs

    NASA Astrophysics Data System (ADS)

    Trakalo, M.; Giles, G.

    2007-04-01

    A design flow for implementing a dynamic gamma algorithm in an FPGA is described. Real-time video processing makes enormous demands on processing resources. An FPGA solution offers some advantages over commercial video chip and DSP implementation alternatives. The traditional approach to FPGA development involves a system engineer designing, modeling and verifying an algorithm and writing a specification. A hardware engineer uses the specification as a basis for coding in VHDL and testing the algorithm in the FPGA with supporting electronics. This process is work intensive and the verification of the image processing algorithm executing on the FPGA does not occur until late in the program. The described design process allows the system engineer to design and verify a true VHDL version of the algorithm, executing in an FPGA. This process yields reduced risk and development time. The process is achieved by using Xilinx System Generator in conjunction with Simulink® from The MathWorks. System Generator is a tool that bridges the gap between the high level modeling environment and the digital world of the FPGA. System Generator is used to develop the dynamic gamma algorithm for the contrast enhancement of a candidate display product. The results of this effort are to increase the dynamic range of the displayed video, resulting in a more useful image for the user.
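
    Dynamic gamma itself is typically realized in hardware as a lookup table. The sketch below shows the 8-bit LUT a system engineer might model before implementation; the gamma value and bit width are illustrative assumptions, not the product's algorithm.

```python
import numpy as np

# Build an 8-bit gamma-correction lookup table and apply it per pixel,
# as a LUT-based hardware implementation would.
def gamma_lut(gamma, bits=8):
    levels = 2 ** bits
    x = np.arange(levels) / (levels - 1)           # normalized input levels
    return np.round((x ** gamma) * (levels - 1)).astype(np.uint8)

lut = gamma_lut(0.45)                  # gamma < 1 brightens dark content
frame = np.array([[0, 64], [128, 255]], dtype=np.uint8)
print(lut[frame])                      # LUT applied to every pixel
```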

  17. Manufacturing process design for multi commodities in agriculture

    NASA Astrophysics Data System (ADS)

    Prasetyawan, Yudha; Santosa, Andrian Henry

    2017-06-01

    High-potential commodities within particular agricultural sectors should be accompanied by the maximum benefit value that can be attained by both local farmers and business players. In several cases, the business players are small-medium enterprises (SMEs) which have limited resources to perform added-value processing of local commodities into potential products. The weaknesses of SMEs include manual production processes with low productivity, limited capacity to maintain prices, and unattractive packaging resulting from conventional production. An agricultural commodity is commonly turned into several products such as flour, chips, crackers, oil, juice, and other products. This research was initiated by collecting data through interviews, particularly to obtain the perspectives of SMEs as the business players. Subsequently, the information was processed with Quality Function Deployment (QFD) to determine the House of Quality from the first to the fourth level. A proposed design resulting from the QFD was produced and evaluated with the Technology Assessment Model (TAM), leading to a revised design. Finally, the revised design was analyzed from a financial perspective to obtain the cost structure of investment, operation, maintenance, and workers. The machine that performs the manufacturing process, as the result of the revised design, was prototyped and tested to determine the initial production process. The designed manufacturing process offers a Net Present Value (NPV) of IDR 337,897,651, compared with IDR 9,491,522 for the existing process with similar production input.

  18. Economic design of control charts considering process shift distributions

    NASA Astrophysics Data System (ADS)

    Vommi, Vijayababu; Kasarapu, Rukmini V.

    2014-09-01

    Process shift is an important input parameter in the economic design of control charts. Earlier control chart designs assumed that a given assignable cause produces a constant shift in the process mean. This assumption has been criticized by many researchers, since an assignable cause may not produce the same shift each time it occurs. To overcome this difficulty, the present work considers a distribution for the shift parameter instead of a single value for a given assignable cause. Duncan's economic design model for the x-bar chart has been extended to incorporate the distribution of the process shift parameter, and it is proposed to minimize the total expected loss-cost to obtain the control chart parameters. Further, three types of process shift distributions, namely positively skewed, uniform and negatively skewed, are considered, and situations where it is appropriate to use the suggested methodology are recommended.
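
    The effect the authors address can be seen numerically: for an x-bar chart, detection power depends on the shift size, so averaging power over a shift distribution differs from evaluating it at one representative shift. The chart parameters and triangular shift distribution below are illustrative assumptions, not Duncan's full loss-cost model.

```python
import numpy as np
from scipy.stats import norm

# Probability that an x-bar chart with +/- k-sigma limits and subgroup
# size n signals a mean shift of delta (in sigma units):
#   power(delta) = 1 - Phi(k - delta*sqrt(n)) + Phi(-k - delta*sqrt(n))
k, n = 3.0, 4
def power(delta):
    return (1 - norm.cdf(k - delta * np.sqrt(n))
            + norm.cdf(-k - delta * np.sqrt(n)))

rng = np.random.default_rng(7)
shifts = rng.triangular(0.5, 1.0, 3.0, size=100_000)  # skewed shift distribution
print(f"power at the mean shift {shifts.mean():.2f}: {power(shifts.mean()):.3f}")
print(f"expected power over the distribution:  {power(shifts).mean():.3f}")
```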

  19. Modelling of CWS combustion process

    NASA Astrophysics Data System (ADS)

    Rybenko, I. A.; Ermakova, L. A.

    2016-10-01

    The paper considers the combustion process of coal water slurry (CWS) drops. A physico-chemical process scheme consisting of several independent parallel-sequential stages is proposed. This scheme of the drop combustion process is supported by particle size distribution tests and stereomicroscopic analysis of the combustion products. The results of mathematical modelling and optimization of stationary regimes of CWS combustion are provided. During modelling, the problem of determining the possible equilibrium composition of products that can be obtained from CWS combustion at different temperatures is solved.

  20. DECISION SUPPORT SYSTEM TO ENHANCE AND ENCOURAGE SUSTAINABLE CHEMICAL PROCESS DESIGN

    EPA Science Inventory

    There is an opportunity to minimize the potential environmental impacts (PEIs) of industrial chemical processes by providing process designers with timely data and models elucidating environmentally favorable design options. The second generation of the Waste Reduction (WAR) algo...

  2. A FRAMEWORK TO DESIGN AND OPTIMIZE CHEMICAL FLOODING PROCESSES

    SciTech Connect

    Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori

    2005-07-01

    The goal of this proposed research is to provide an efficient and user friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user friendly interface to identify the variables that have the most impact on oil recovery using the concept of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objectives of Task 1 are to develop three primary modules representing reservoir, chemical, and well data. The modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables and to develop the response surface maps to identify the significant variables from each module. The objective of Task 3 is to develop the economic model designed specifically for the chemical processes targeted in this proposal and to interface the economic model with UTCHEM production output. Task 4 covers the validation of the framework and the performance of simulations of oil reservoirs to screen, design and optimize the chemical processes.

  3. A Framework to Design and Optimize Chemical Flooding Processes

    SciTech Connect

    Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori

    2006-08-31

    The goal of this proposed research is to provide an efficient and user friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user friendly interface to identify the variables that have the most impact on oil recovery using the concept of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objectives of Task 1 are to develop three primary modules representing reservoir, chemical, and well data. The modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables and to develop the response surface maps to identify the significant variables from each module. The objective of Task 3 is to develop the economic model designed specifically for the chemical processes targeted in this proposal and to interface the economic model with UTCHEM production output. Task 4 covers the validation of the framework and the performance of simulations of oil reservoirs to screen, design and optimize the chemical processes.

  4. A formulation of metamodel implementation processes for complex systems design

    NASA Astrophysics Data System (ADS)

    Daberkow, Debora Daniela

    Complex systems design poses an interesting as well as demanding information management problem for system level integration and design. The high interconnectivity of disciplines combined with the specific knowledge and expertise in each of these calls for a system level view that is broad, as in spanning across all disciplines, while at the same time detailed enough to do the disciplinary knowledge justice. The treatment of this requires highly evolved information management and decision approaches, which result in design methodologies that can handle this high degree of complexity. The solution is to create models within the design process, which predict meaningful metrics representative of the various disciplinary analyses that can be quickly evaluated and thus serve in system level decision making and optimization. Such models approximate the physics-based analysis codes used in each of the disciplines and are called metamodels since effectively, they model the (physics-based) models on which the disciplinary analysis codes are based. The thesis formulates a new metamodel implementation process to be used in complex systems design, utilizing a Gaussian Process prediction method. It is based on a Bayesian probability and inference approach and as such returns a variance prediction along with the most likely value, thus giving an estimate also for the confidence in the prediction. Within this thesis, the applicability and appropriateness at the theoretical as well as practical level are investigated, and proof-of-concept implementations at the disciplinary and system levels are provided.
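
    The essence of the Gaussian Process prediction used in the thesis, returning a variance alongside the most likely value, fits in a few lines of linear algebra. The kernel, hyperparameters and data below are illustrative, not the thesis's disciplinary metamodels.

```python
import numpy as np

# Bare-bones GP prediction with a squared-exponential kernel: posterior
# mean and variance at a new input, given a few noisy observations.
def k(a, b, ell=0.3):
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

X = np.array([0.1, 0.4, 0.7])          # training inputs (illustrative)
y = np.sin(2 * np.pi * X)              # training responses (illustrative)
noise = 1e-6

K = k(X, X) + noise * np.eye(len(X))
Kinv_y = np.linalg.solve(K, y)

x_star = np.array([0.55])
k_star = k(X, x_star)                  # covariances to the new input
mean = k_star.T @ Kinv_y               # posterior mean at x_star
var = k(x_star, x_star) - k_star.T @ np.linalg.solve(K, k_star)
print(f"prediction {mean[0]:.3f} +/- {np.sqrt(var[0, 0]):.3f} (1 sigma)")
```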

  6. A FRAMEWORK TO DESIGN AND OPTIMIZE CHEMICAL FLOODING PROCESSES

    SciTech Connect

    Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori

    2004-11-01

    The goal of this proposed research is to provide an efficient and user friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user friendly interface to identify the variables that have the most impact on oil recovery using the concept of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objectives of Task 1 are to develop three primary modules representing reservoir, chemical, and well data. The modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables and to develop the response surface maps to identify the significant variables from each module. The objective of Task 3 is to develop the economic model designed specifically for the chemical processes targeted in this proposal and to interface the economic model with UTCHEM production output. Task 4 covers the validation of the framework and the performance of simulations of oil reservoirs to screen, design and optimize the chemical processes.

  7. MODEL OF DIFFUSERS / PERMEATORS FOR HYDROGEN PROCESSING

    SciTech Connect

    Hang, T; William Jacobs, W

    2007-08-27

    Palladium-silver (Pd-Ag) diffusers are mainstays of hydrogen processing. Diffusers separate hydrogen from inert species such as nitrogen, argon, or helium. The tubing becomes permeable to hydrogen when heated to more than 250 °C and a differential pressure is created across the membrane. Hydrogen diffuses faster at higher temperatures. Experimental or experiential results have been the basis for determining or predicting a diffuser's performance. However, the process can be mathematically modeled, and comparison to experimental or other operating data can be used to improve the fit of the model. The goal is a reliable model-based diffuser system design, which will have impacts on tritium and hydrogen processing. A computer model has been developed to solve the differential equations for diffusion given the operating boundary conditions. The model was compared to operating data for a low-pressure diffuser system. The modeling approach and the results are presented in this paper.
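
    The paper's model is not reproduced here, but the basic permeation calculation lends itself to a short numerical sketch. Hydrogen flux through Pd-Ag membranes is commonly taken to follow Sieverts' law (proportional to the difference of the square roots of the partial pressures); the one-dimensional plug-flow picture and every parameter value below are illustrative assumptions:

        import numpy as np
        from scipy.integrate import solve_ivp

        PERM = 2.0e-8          # membrane permeability, mol/(m*s*Pa^0.5), assumed
        WALL = 2.0e-4          # membrane wall thickness, m
        CIRC = np.pi * 6.0e-3  # tube circumference, m
        P_TOT = 1.2e5          # feed-side total pressure, Pa
        P_PERM = 1.0e3         # permeate-side hydrogen pressure, Pa
        N_INERT = 1.0e-4       # inert (e.g. helium) molar flow, mol/s
        N_H2_IN = 5.0e-4       # hydrogen molar flow at the inlet, mol/s

        def dn_dz(z, y):
            """Hydrogen molar flow lost per unit tube length (Sieverts'-law flux)."""
            n_h2 = y[0]
            p_h2 = P_TOT * n_h2 / (n_h2 + N_INERT)   # feed-side H2 partial pressure
            flux = PERM / WALL * (np.sqrt(p_h2) - np.sqrt(P_PERM))
            return [-CIRC * max(flux, 0.0)]

        sol = solve_ivp(dn_dz, (0.0, 1.0), [N_H2_IN], max_step=0.01)
        print(f"hydrogen recovered over 1 m: {100 * (1 - sol.y[0, -1] / N_H2_IN):.1f}%")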

  8. Reliability Methods for Shield Design Process

    NASA Technical Reports Server (NTRS)

    Tripathi, R. K.; Wilson, J. W.

    2002-01-01

    Providing protection against the hazards of space radiation is a major challenge to the exploration and development of space. The great cost of added radiation shielding is a potential limiting factor in deep space operations. In this enabling technology, we have developed methods for optimized shield design over multi-segmented missions involving multiple work and living areas in the transport and duty phases of space missions. The total shield mass over all pieces of equipment and habitats is optimized subject to career dose and dose rate constraints. An important component of this technology is the estimation of the two most commonly identified uncertainties in radiation shield design: the shielding properties of the materials used and the understanding of the biological response of the astronaut to the radiation leaking through the materials into the living space. The largest uncertainty, of course, is in the biological response, especially to the high charge and energy (HZE) ions of the galactic cosmic rays. These uncertainties are blended with the optimization design procedure to formulate reliability-based methods for shield design processes. The details of the methods will be discussed.
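
    A minimal sketch of the optimization step described above, with a simple exponential attenuation model standing in for the physics-based transport codes (all numbers are illustrative assumptions): total shield mass across mission segments is minimized subject to a career dose constraint.

        import numpy as np
        from scipy.optimize import minimize

        D0 = np.array([20.0, 15.0, 30.0])    # unshielded dose rate per area, cSv/yr (assumed)
        LAM = np.array([25.0, 25.0, 25.0])   # attenuation lengths, g/cm^2 (assumed)
        FRAC = np.array([0.3, 0.5, 0.2])     # fraction of mission time in each area
        AREA = np.array([40.0, 60.0, 25.0])  # shielded surface area per segment, m^2
        LIMIT = 4.0                          # career dose constraint, cSv

        mass = lambda x: float(AREA @ x)                        # proportional to total shield mass
        dose = lambda x: float(FRAC @ (D0 * np.exp(-x / LAM)))  # mission dose for areal densities x

        res = minimize(mass, x0=np.full(3, 10.0), bounds=[(0.0, None)] * 3,
                       constraints=[{"type": "ineq", "fun": lambda x: LIMIT - dose(x)}])
        print(np.round(res.x, 1), round(dose(res.x), 2))  # optimal areal densities, g/cm^2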

  9. Social Models: Blueprints or Processes?

    ERIC Educational Resources Information Center

    Little, Graham R.

    1981-01-01

    Discusses the nature and implications of two different models for societal planning: (1) the problem-solving process approach based on Karl Popper; and (2) the goal-setting "blueprint" approach based on Karl Marx. (DC)

  11. Designing and encoding models for synthetic biology

    PubMed Central

    Endler, Lukas; Rodriguez, Nicolas; Juty, Nick; Chelliah, Vijayalakshmi; Laibe, Camille; Li, Chen; Le Novère, Nicolas

    2009-01-01

    A key component of any synthetic biology effort is the use of quantitative models. These models and their corresponding simulations allow optimization of a system design, as well as guiding their subsequent analysis. Once a domain mostly reserved for experts, dynamical modelling of gene regulatory and reaction networks has been an area of growth over the last decade. There has been a concomitant increase in the number of software tools and standards, thereby facilitating model exchange and reuse. We give here an overview of the model creation and analysis processes as well as some software tools in common use. Using markup language to encode the model and associated annotation, we describe the mining of components, their integration in relational models, formularization and parametrization. Evaluation of simulation results and validation of the model close the systems biology ‘loop’. PMID:19364720

  12. Saving Material with Systematic Process Designs

    NASA Astrophysics Data System (ADS)

    Kerausch, M.

    2011-08-01

    Global competition is forcing the stamping industry to further increase quality, to shorten time-to-market, and to reduce total cost. Continuous balancing between these classical time-cost-quality targets throughout the product development cycle is required to ensure future economic success. In today's industrial practice, die layout standards are typically assumed to implicitly ensure the balancing of company-specific time-cost-quality targets. Although die layout standards are a very successful approach, they have two methodical disadvantages. First, the capabilities for tool design have to be continuously adapted to technological innovations, e.g. to take advantage of the full forming capability of new materials. Secondly, the great variety of die design aspects has to be reduced to a generic rule or guideline, e.g. binder shape, draw-in conditions or the use of drawbeads. Therefore, it is important not to overlook cost or quality opportunities when applying die design standards. This paper describes a systematic workflow with a focus on minimizing material consumption. The starting point of the investigation is a full process plan for a typical structural part, for which all requirements defined by a predefined set of die design standards with industrial relevance are fulfilled. In a first step, binder and addendum geometry is systematically checked for material saving potentials. In a second step, blank shape and draw-in are adjusted to meet thinning, wrinkling and springback targets for a minimum blank solution. Finally, the identified die layout is validated with respect to production robustness versus splits, wrinkles and springback. For all three steps the applied methodology is based on finite element simulation combined with stochastic variation of input variables. With the proposed workflow, a well-balanced (time-cost-quality) production process assuring minimal material consumption can be achieved.

  13. Flexible processing and the design of grammar.

    PubMed

    Sag, Ivan A; Wasow, Thomas

    2015-02-01

    We explore the consequences of letting the incremental and integrative nature of language processing inform the design of competence grammar. What emerges is a view of grammar as a system of local monotonic constraints that provide a direct characterization of the signs (the form-meaning correspondences) of a given language. This "sign-based" conception of grammar has provided precise solutions to the key problems long thought to motivate movement-based analyses, has supported three decades of computational research developing large-scale grammar implementations, and is now beginning to play a role in computational psycholinguistics research that explores the use of underspecification in the incremental computation of partial meanings.

  14. Inventory Reduction Using Business Process Reengineering and Simulation Modeling.

    DTIC Science & Technology

    1996-12-01

    center is analyzed using simulation modeling and business process reengineering (BPR) concepts. The two simulation models were designed and evaluated by...reengineering and simulation modeling offer powerful tools to aid the manager in reducing cycle time and inventory levels.

  15. Bates solar industrial process-steam application: preliminary design review

    SciTech Connect

    Not Available

    1980-01-07

    The design is analyzed for a parabolic trough solar process heat system for a cardboard corrugation fabrication facility in Texas. The program is briefly reviewed, including an analysis of the plant and process. The performance modeling for the system is discussed, and the solar system structural design, collector subsystem, heat transport and distribution subsystem are analyzed. The selection of the heat transfer fluid, and ullage and fluid maintenance are discussed, and the master control system and data acquisition system are described. Testing of environmental degradation of materials is briefly discussed. A brief preliminary cost analysis is included. (LEW)

  16. Process models as tools in forestry research and management

    Treesearch

    Kurt Johnsen; Lisa Samuelson; Robert Teskey; Steve McNulty; Tom Fox

    2001-01-01

    Forest process models are mathematical representations of biological systems that incorporate our understanding of physiological and ecological mechanisms into predictive algorithms. These models were originally designed and used for research purposes, but are being developed for use in practical forest management. Process models designed for research...

  17. Conceptual Fuselage Design with Direct CAD Modeling

    NASA Astrophysics Data System (ADS)

    Anderson, Benjamin K.

    The use of automated technology is becoming increasingly prevalent. Throughout the aerospace industry, we see the use of automated systems in manufacturing, testing, and, progressively, in design. This thesis focuses on the idea of automated structural design that can be directly coupled with parametric Computer-Aided Drafting (CAD) and used to support aircraft conceptual design. This idea has been around for many years; however, with the advancement of CAD technology, it is becoming more realistic. Having the ability to input design parameters, analyze the structure, and produce a basic CAD model not only saves time in the design process but provides an excellent platform to communicate ideas. The user has the ability to change parameters and quickly determine the effect on the structure. Coupling this idea with automated parametric CAD provides visual verification and a platform to export into Finite Element Analysis (FEA) for further verification.

  18. Innovative machine designs for radiation processing

    NASA Astrophysics Data System (ADS)

    Vroom, David

    2007-12-01

    In the 1990s Raychem Corporation established a program to investigate the commercialization of several promising applications involving the combined use of its core competencies in materials science, radiation chemistry and e-beam radiation technology. The applications investigated included those that would extend Raychem's well known heat-recoverable polymer and wire and cable product lines, as well as new potential applications such as remediation of contaminated aqueous streams. A central part of the program was the development of new accelerator technology designed to improve quality, lower processing costs and efficiently process conformable materials such as liquids. A major emphasis with this new irradiation technology was to look at the accelerator and product handling systems as one integrated system, not as two complementary systems.

  19. Modeling of the interface formation during CuO deposition on Al(111) substrate: linking material design and elaboration process parameters through multi-levels approach

    NASA Astrophysics Data System (ADS)

    Guiltat, M.; Salles, N.; Brut, M.; Landa, G.; Richard, N.; Vizzini, S.; Hémeryck, A.

    2017-09-01

    In this paper, we use a multi-level modeling approach to describe the elaboration of directly integrated energetic materials. The deposition of copper oxide on an aluminum substrate is described. Atomic-scale calculations are first conducted to identify local mechanisms involved during the growth of CuO on Al(111). These atomic-scale data are then used to parameterize a macroscopic code, inspired by a kinetic Monte Carlo methodology, dedicated to simulating a vapor-deposition-like process. The objective is to establish the link between the microstructure of materials and the way they are achieved, i.e. the process parameters such as temperature and gas pressure. This work is conducted in the context of the integration of nano-structured energetic thermites used as micro energy sources in microelectronic devices. We show that the temperature of the deposition process appears to be the driving parameter for tailoring the thickness of interfacial layers.

  20. Advanced computational research in materials processing for design and manufacturing

    SciTech Connect

    Zacharia, T.

    1994-12-31

    The computational requirements for design and manufacture of automotive components have seen dramatic increases in the quest to produce automobiles with three times the mileage. Automotive component design systems are becoming increasingly reliant on structural analysis, requiring larger and more complex analyses: more three-dimensional analyses, larger model sizes, and routine consideration of transient and non-linear effects. Such analyses must be performed rapidly to minimize delays in the design and development process, which drives the need for parallel computing. This paper briefly describes advanced computational research in superplastic forming and automotive crashworthiness.

  1. Process Improvement Through Tool Integration in Aero-Mechanical Design

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2010-01-01

    Emerging capabilities in commercial design tools promise to significantly improve the multi-disciplinary and inter-disciplinary design and analysis coverage for aerospace mechanical engineers. This paper explores the analysis process for two example problems of a wing and flap mechanical drive system and an aircraft landing gear door panel. The examples begin with the design solid models and include various analysis disciplines such as structural stress and aerodynamic loads. Analytical methods include CFD, multi-body dynamics with flexible bodies and structural analysis. Elements of analysis data management, data visualization and collaboration are also included.

  3. Designer cell signal processing circuits for biotechnology.

    PubMed

    Bradley, Robert W; Wang, Baojun

    2015-12-25

    Microorganisms are able to respond effectively to diverse signals from their environment and internal metabolism owing to their inherent sophisticated information processing capacity. A central aim of synthetic biology is to control and reprogramme the signal processing pathways within living cells so as to realise repurposed, beneficial applications ranging from disease diagnosis and environmental sensing to chemical bioproduction. To date most examples of synthetic biological signal processing have been built based on digital information flow, though analogue computing is being developed to cope with more complex operations and larger sets of variables. Great progress has been made in expanding the categories of characterised biological components that can be used for cellular signal manipulation, thereby allowing synthetic biologists to more rationally programme increasingly complex behaviours into living cells. Here we present a current overview of the components and strategies that exist for designer cell signal processing and decision making, discuss how these have been implemented in prototype systems for therapeutic, environmental, and industrial biotechnological applications, and examine emerging challenges in this promising field.

  4. Designer cell signal processing circuits for biotechnology

    PubMed Central

    Bradley, Robert W.; Wang, Baojun

    2015-01-01

    Microorganisms are able to respond effectively to diverse signals from their environment and internal metabolism owing to their inherent sophisticated information processing capacity. A central aim of synthetic biology is to control and reprogramme the signal processing pathways within living cells so as to realise repurposed, beneficial applications ranging from disease diagnosis and environmental sensing to chemical bioproduction. To date most examples of synthetic biological signal processing have been built based on digital information flow, though analogue computing is being developed to cope with more complex operations and larger sets of variables. Great progress has been made in expanding the categories of characterised biological components that can be used for cellular signal manipulation, thereby allowing synthetic biologists to more rationally programme increasingly complex behaviours into living cells. Here we present a current overview of the components and strategies that exist for designer cell signal processing and decision making, discuss how these have been implemented in prototype systems for therapeutic, environmental, and industrial biotechnological applications, and examine emerging challenges in this promising field. PMID:25579192

  5. Command Process Modeling & Risk Analysis

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila

    2011-01-01

    Commanding errors may be caused by a variety of root causes. It is important to understand the relative significance of each of these causes in order to make institutional investment decisions. One of these causes is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within the command and control process. These models include simulation analysis and probabilistic risk assessment models.

  7. Prodrugs design based on inter- and intramolecular chemical processes.

    PubMed

    Karaman, Rafik

    2013-12-01

    This review provides the reader with a concise overview of the majority of prodrug approaches, with an emphasis on modern approaches to prodrug design. The chemical approach catalyzed by metabolic enzymes, which is considered the most widely used approach for minimizing undesirable drug physicochemical properties, is discussed. Part of this review sheds light on the use of molecular orbital methods such as DFT, semiempirical and ab initio for the design of novel prodrugs. This novel prodrug approach implies prodrug design based on enzyme models that were utilized for mimicking enzyme catalysis. The computational approach exploited for prodrug design involves molecular orbital and molecular mechanics (DFT, ab initio, and MM2) calculations and correlations between experimental and calculated values of intramolecular processes that were experimentally studied to assign the factors determining the reaction rates in certain processes, for a better understanding of how enzymes might exert their extraordinary catalysis.

  8. Molecular modelling and drug design.

    PubMed

    Meyer, E F; Swanson, S M; Williams, J A

    2000-03-01

    Drug design is a creative act of the same magnitude as composing, sculpting, or writing. The results can touch the lives of millions, but the creator is rarely one scientist and the rewards are distributed differently in the arts than in the sciences. The mechanisms of creativity are the same, i.e., incremental (plodding from darkness to dawn) or sudden (the "Eureka" effect) realization, but both are poorly understood. Creativity remains a human characteristic, but it is directly related to the tools available, especially computer software and hardware. While modelling software continues to mature, very little new has evolved in terms of hardware. Here, we discuss the history of molecular modelling and describe two novel modelling tools, a haptic device and a program, SCULPT, to generate solid molecular models at atomic resolution.

  9. Rapid Modeling, Assembly and Simulation in Design Optimization

    NASA Technical Reports Server (NTRS)

    Housner, Jerry

    1997-01-01

    A new capability for design is reviewed. This capability provides for rapid assembly of detailed finite element models early in the design process, where costs are most effectively impacted. This creates an engineering environment which enables comprehensive analysis and design optimization early in the design process. Graphical interactive computing makes it possible for the engineer to interact with the design while performing comprehensive design studies. This rapid assembly capability is enabled by the use of Interface Technology to couple independently created models, which can be archived and made accessible to the designer. Results are presented to demonstrate the capability.

  10. Optimal design of upstream processes in biotransformation technologies.

    PubMed

    Dheskali, Endrit; Michailidi, Katerina; de Castro, Aline Machado; Koutinas, Apostolis A; Kookos, Ioannis K

    2017-01-01

    In this work, a mathematical programming model for the optimal design of the bioreaction section of biotechnological processes is presented. Equations for the estimation of the equipment cost, derived from a recent publication by the US National Renewable Energy Laboratory (NREL), are also summarized. The cost-optimal design of process units and the optimal scheduling of their operation can be obtained using the proposed formulation, which has been implemented in software available from the journal web page or the corresponding author. The proposed optimization model can be used to quantify the effects of decisions taken at lab scale on the industrial-scale process economics. It is of paramount importance to note that this can be achieved at an early stage of the development of a biotechnological project. Two case studies are presented that demonstrate the usefulness and potential of the proposed methodology.
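
    As a sketch of the kind of cost-optimal sizing such a formulation performs (the power-law cost exponent, capital recovery factor, and process numbers are placeholders, not the paper's NREL-derived correlations, and a rigorous version would treat the reactor count as an integer decision variable inside the mathematical program):

        import numpy as np
        from scipy.optimize import minimize_scalar

        DEMAND = 5.0e5    # product demand, kg/yr (assumed)
        TITER = 50.0      # product titer at harvest, kg/m^3
        T_BATCH = 48.0    # batch cycle time, h
        HOURS = 8000.0    # operating hours per year
        BASE_COST, BASE_V, EXP = 5.0e5, 100.0, 0.6  # vessel cost, $ at reference volume
        CRF = 0.2         # capital recovery factor, 1/yr

        def annual_cost(v):
            """Annualized capital charge for enough v-m^3 reactors to meet demand."""
            n = np.ceil(DEMAND / (TITER * v * HOURS / T_BATCH))  # reactors required
            return CRF * n * BASE_COST * (v / BASE_V) ** EXP

        res = minimize_scalar(annual_cost, bounds=(1.0, 500.0), method="bounded")
        print(f"reactor volume ~{res.x:.0f} m^3, annual cost ~${annual_cost(res.x):,.0f}")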

  11. A Digital Methodology for the Design Process of Aerospace Assemblies with Sustainable Composite Processes & Manufacture

    NASA Astrophysics Data System (ADS)

    McEwan, W.; Butterfield, J.

    2011-05-01

    The well established benefits of composite materials are driving a significant shift in design and manufacture strategies for original equipment manufacturers (OEMs). Thermoplastic composites have advantages over the traditional thermosetting materials with regards to sustainability and environmental impact, features which are becoming increasingly pertinent in the aerospace arena. However, when sustainability and environmental impact are considered as design drivers, integrated methods for part design and product development must be developed so that any benefits of sustainable composite material systems can be assessed during the design process. These methods must include mechanisms to account for process-induced part variation and techniques related to re-forming, recycling and decommissioning, which are in their infancy. It is proposed in this paper that predictive techniques related to material specification, part processing and product cost of thermoplastic composite components be integrated within a Through Life Management (TLM) product development methodology as part of a larger strategy of product system modeling, to improve disciplinary concurrency and realistic part performance, and to place sustainability at the heart of the design process. This paper reports the enhancement of digital manufacturing tools as a means of drawing simulated part manufacturing scenarios, real-time costing mechanisms, and broader lifecycle performance data capture into the design cycle. The work demonstrates predictive processes for sustainable composite product manufacture and how a Product-Process-Resource (PPR) structure can be customised and enhanced to include design intent driven by 'real' part geometry and consequent assembly.

  12. Cost Models for MMC Manufacturing Processes

    NASA Technical Reports Server (NTRS)

    Elzey, Dana M.; Wadley, Haydn N. G.

    1996-01-01

    The quality cost modeling (QCM) tool is intended to be a relatively simple-to-use device for obtaining a first-order assessment of the quality-cost relationship for a given process-material combination. The QCM curve is a plot of cost versus quality (an index indicating microstructural quality), which is unique for a given process-material combination. The QCM curve indicates the tradeoff between cost and performance, thus enabling one to evaluate affordability. Additionally, the effect of changes in process design, raw materials, and process conditions on the cost-quality relationship can be evaluated. Such results might indicate the most efficient means to obtain improved quality at reduced cost by process design refinements, the implementation of sensors and models for closed loop process control, or improvement in the properties of raw materials being fed into the process. QCM also allows alternative processes for producing the same or similar material to be compared in terms of their potential for producing competitively priced, high quality material. Aside from demonstrating the usefulness of the QCM concept, this is one of the main foci of the present research program, namely to compare processes for making continuous fiber reinforced, metal matrix composites (MMC's). Two processes, low pressure plasma spray deposition and tape casting are considered for QCM development. This document consists of a detailed look at the design of the QCM approach, followed by discussion of the application of QCM to each of the selected MMC manufacturing processes along with results, comparison of processes, and finally, a summary of findings and recommendations.

  13. Virtual Welded - Joint Design Integrating Advanced Materials and Processing Technology

    SciTech Connect

    Yang, Zhishang; Ludewig, Howard W.; Babu, S. Suresh

    2005-06-30

    Virtual Welded-Joint Design, a systematic modeling approach, has been developed in this project to predict the relationship of welding process, microstructure, properties, residual stress, and ultimate weld fatigue strength. This systematic modeling approach was applied to the welding of high-strength steel. A special welding wire was developed in this project to introduce compressive residual stress at the weld toe. The results from both modeling and experiments demonstrated that more than 10x fatigue life improvement can be achieved in high-strength steel welds by the combination of compressive residual stress from the special welding wire and the desired weld bead shape from a unique welding process. The results indicate a technology breakthrough in the design of lightweight and high-fatigue-performance welded structures using high-strength steels.

  14. Process-induced bias: a study of resist design and process implications

    NASA Astrophysics Data System (ADS)

    Fonseca, Carlos; Scheer, Steven; Carcasi, Michael; Shibata, Tsuyoshi; Otsuka, Takahisa

    2008-03-01

    Critical dimension uniformity (CDU) has both across field and across wafer components. CD error generated by across wafer etching non-uniformity and other process variations can have a significant impact on CDU. To correct these across wafer variations, compensation by exposure dose and/or PEB temperature, have been proposed. These compensation strategies often focus on a specific structure without evaluating how process compensation impacts the CDU of all structures to be printed in a given design. In a previous study, the authors evaluated the relative merits of across wafer dose and PEB temperature compensation on the process induced CD bias and CDU. For the process studied, both metrics demonstrated that using PEB temperature to control across wafer CD variation was preferable to using dose compensation. The previous study was limited to a single resist and variations to track and scanner processing were kept to a minimum. Further examination of additional resist materials has indicated that significant variation in dose and PEB temperature induced CD biases exist from material to material. It is the goal of this work to understand how resist design, as well as track and scanner processing, impact process induced bias (PIB). This is accomplished by analyzing full resist models for a range of resists that exhibit different dose and PEB temperature PIB behavior. From these models, the primary resist design contributors to PIB are isolated. A sensitivity analysis of the primary resist design as well as track and scanner processing effects will also be simulated and presented.

  15. Implementation of the Business Process Modelling Notation (BPMN) in the modelling of anatomic pathology processes.

    PubMed

    Rojo, Marcial García; Rolón, Elvira; Calahorra, Luis; García, Felix Oscar; Sánchez, Rosario Paloma; Ruiz, Francisco; Ballester, Nieves; Armenteros, María; Rodríguez, Teresa; Espartero, Rafael Martín

    2008-07-15

    Process orientation is one of the essential elements of quality management systems, including those in use in healthcare. Business processes in hospitals are very complex and variable. BPMN (Business Process Modelling Notation) is a user-oriented language specifically designed for the modelling of business (organizational) processes. We are not aware of previous experiences with this notation for process modelling within Pathology, in Spain or elsewhere. We present our experience in the elaboration of the conceptual models of Pathology processes, as part of a global programmed surgical patient process, using BPMN. With the objective of analyzing the use of BPMN notation in real cases, a multidisciplinary work group was created, including software engineers from the Dept. of Technologies and Information Systems of the University of Castilla-La Mancha and health professionals and administrative staff from the Hospital General de Ciudad Real. The collaborative work was carried out in six phases: informative meetings, intensive training, process selection, definition of the work method, process description by hospital experts, and process modelling. The modelling of the processes of Anatomic Pathology is presented using BPMN. The presented subprocesses are those corresponding to the surgical pathology examination of samples coming from the operating theatre, including the planning and realization of frozen studies. The modelling of Anatomic Pathology subprocesses has allowed the creation of an understandable graphical model, where management and improvements are more easily implemented by health professionals.

  16. Implementation of the Business Process Modelling Notation (BPMN) in the modelling of anatomic pathology processes

    PubMed Central

    Rojo, Marcial García; Rolón, Elvira; Calahorra, Luis; García, Felix Óscar; Sánchez, Rosario Paloma; Ruiz, Francisco; Ballester, Nieves; Armenteros, María; Rodríguez, Teresa; Espartero, Rafael Martín

    2008-01-01

    Background: Process orientation is one of the essential elements of quality management systems, including those in use in healthcare. Business processes in hospitals are very complex and variable. BPMN (Business Process Modelling Notation) is a user-oriented language specifically designed for the modelling of business (organizational) processes. We are not aware of previous experiences with this notation for process modelling within Pathology, in Spain or elsewhere. We present our experience in the elaboration of the conceptual models of Pathology processes, as part of a global programmed surgical patient process, using BPMN. Methods: With the objective of analyzing the use of BPMN notation in real cases, a multidisciplinary work group was created, including software engineers from the Dept. of Technologies and Information Systems of the University of Castilla-La Mancha and health professionals and administrative staff from the Hospital General de Ciudad Real. The collaborative work was carried out in six phases: informative meetings, intensive training, process selection, definition of the work method, process description by hospital experts, and process modelling. Results: The modelling of the processes of Anatomic Pathology is presented using BPMN. The presented subprocesses are those corresponding to the surgical pathology examination of samples coming from the operating theatre, including the planning and realization of frozen studies. Conclusion: The modelling of Anatomic Pathology subprocesses has allowed the creation of an understandable graphical model, where management and improvements are more easily implemented by health professionals. PMID:18673511

  17. Optimal model-based design of the twin-column CaptureSMB process improves capacity utilization and productivity in protein A affinity capture.

    PubMed

    Baur, Daniel; Angarita, Monica; Müller-Späth, Thomas; Morbidelli, Massimo

    2016-01-01

    Multi-column chromatographic processes have recently been developed for protein A affinity chromatography to efficiently capture monoclonal antibodies from cell culture supernatant. In this work, the novel twin-column CaptureSMB process was compared to a batch capture process with dual loading flow rate to identify performance gains. As a case study, the isolation of a monoclonal antibody with the Amsphere JWT-203 protein A resin was investigated. Using model-based optimization, both processes were optimized and compared over a wide range of operating conditions. A trade-off between productivity and capacity utilization was found, and the resulting Pareto curves showed that CaptureSMB dominates batch operation, except at very low productivity values. With a feed titer of 1.2 mg mL(-1), CaptureSMB could reach a productivity of up to 19.5 mg mL(-1) h(-1) experimentally, while maintaining a relatively high capacity utilization of 63.8%. On the other hand, at the maximum capacity utilization of 95.5%, a productivity of 10.2 mg mL(-1) h(-1) could be reached. This corresponds to a performance improvement with respect to batch operation of about 25% in capacity utilization and 40% in productivity, for given yield and purity. CaptureSMB therefore offers greatly increased performance over batch capture.

  18. Neuroscientific model of motivational process.

    PubMed

    Kim, Sung-Il

    2013-01-01

    Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three sub processes: a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous sub processes, namely reward-driven approach, value-based decision-making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area) in which basic stimulus-action association is formed, and is classified as an automatic motivation to which relatively less attention is assigned. By contrast, value-based decision-making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating the value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, the goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring the performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to regulation of motivation. These three sub processes interact with each other by sending reward prediction error signals through the dopaminergic pathway from the striatum to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment.

  19. Neuroscientific Model of Motivational Process

    PubMed Central

    Kim, Sung-il

    2013-01-01

    Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three sub processes: a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous sub processes, namely reward-driven approach, value-based decision-making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area) in which basic stimulus-action association is formed, and is classified as an automatic motivation to which relatively less attention is assigned. By contrast, value-based decision-making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating the value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, the goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring the performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to regulation of motivation. These three sub processes interact with each other by sending reward prediction error signals through the dopaminergic pathway from the striatum to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment. PMID:23459598

  20. Using the ARCS Model To Design Multimedia College Engineering Courses.

    ERIC Educational Resources Information Center

    Shellnut, Bonnie; Savage, Timothy; Knowlton, Allie

    This paper describes how a Wayne State University (Michigan) multimedia design team is applying Keller's ARCS (Attention, Relevance, Confidence, and Satisfaction) Model of Motivational Design to the entire process of design, development, and evaluation of multimedia courseware. The ARCS Model has been applied to the prototype module and is being…

  1. Modeling of acetone biofiltration process

    SciTech Connect

    Hsiu-Mu Tang; Shyh-Jye Hwang; Wen-Chuan Wang

    1996-12-31

    The objective of this research was to investigate the kinetic behavior of the biofiltration process for the removal of acetone, which was used as a model compound for highly water-soluble gas pollutants. A mathematical model was developed by taking into account diffusion and biodegradation of acetone and oxygen in the biofilm, mass transfer resistance in the gas film, and the flow pattern of the bulk gas phase. The simulated results obtained by the proposed model indicated that mass transfer resistance in the gas phase was negligible for this biofiltration process. Analysis of the relative importance of the various rate steps indicated that the overall acetone removal process was primarily limited by the oxygen diffusion rate. 11 refs., 6 figs., 1 tab.

  2. Modeling Tool Advances Rotorcraft Design

    NASA Technical Reports Server (NTRS)

    2007-01-01

    Continuum Dynamics Inc. (CDI), founded in 1979, specializes in advanced engineering services, including fluid dynamic modeling and analysis for aeronautics research. The company has completed a number of SBIR research projects with NASA, including early rotorcraft work done through Langley Research Center, but more recently, out of Ames Research Center. NASA Small Business Innovation Research (SBIR) grants on helicopter wake modeling resulted in the Comprehensive Hierarchical Aeromechanics Rotorcraft Model (CHARM), a tool for studying helicopter and tiltrotor unsteady free wake modeling, including distributed and integrated loads, and performance prediction. Application of the software code in a blade redesign program for Carson Helicopters, of Perkasie, Pennsylvania, increased the payload and cruise speeds of its S-61 helicopter. Follow-on development resulted in a $24 million revenue increase for Sikorsky Aircraft Corporation, of Stratford, Connecticut, as part of the company's rotor design efforts. Now under continuous development for more than 25 years, CHARM models the complete aerodynamics and dynamics of rotorcraft in general flight conditions. CHARM has been used to model a broad spectrum of rotorcraft attributes, including performance, blade loading, blade-vortex interaction noise, air flow fields, and hub loads. The highly accurate software is currently in use by all major rotorcraft manufacturers, NASA, the U.S. Army, and the U.S. Navy.

  3. Quality-by-Design (QbD): An integrated process analytical technology (PAT) approach for a dynamic pharmaceutical co-precipitation process characterization and process design space development.

    PubMed

    Wu, Huiquan; White, Maury; Khan, Mansoor A

    2011-02-28

    The aim of this work was to develop an integrated process analytical technology (PAT) approach for dynamic pharmaceutical co-precipitation process characterization and design space development. A dynamic co-precipitation process, induced by gradually introducing water to the ternary system of naproxen-Eudragit L100-alcohol, was monitored in real time in situ via Lasentec FBRM and PVM. A 3D map of count-time-chord length revealed three distinguishable process stages: incubation, transition, and steady state. The effects of high-risk process variables (slurry temperature, stirring rate, and water addition rate) on both the derived co-precipitation process rates and the final chord length distribution were evaluated systematically using a 3^3 full factorial design. Critical process variables were identified via ANOVA for both the transition and steady states. General linear models (GLM) were then used for parameter estimation for each critical variable. Clear trends in the effects of each critical variable during transition and steady state were found by GLM and were interpreted using fundamental process principles and Nyvlt's transfer model. Neural network models were able to link process variables with response variables at transition and steady state with R(2) of 0.88-0.98. PVM images evidenced nucleation and crystal growth. Contour plots illustrated the design space via the ranges of the critical process variables. The study demonstrated the utility of an integrated PAT approach for QbD development. Published by Elsevier B.V.
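
    The GLM step can be illustrated compactly: over the coded levels of a 3^3 factorial, fit an intercept, main effects, two-way interactions, and quadratic terms, then flag variables whose coefficients are large. The response below is a synthetic stand-in, not the study's FBRM data:

        import numpy as np
        from itertools import product, combinations

        # Coded levels (-1, 0, +1) for slurry temperature, stirring rate,
        # and water addition rate: a 3^3 full factorial, 27 runs.
        X = np.array(list(product([-1.0, 0.0, 1.0], repeat=3)))
        rng = np.random.default_rng(1)
        y = (5 + 1.2 * X[:, 0] - 0.8 * X[:, 1] + 0.5 * X[:, 0] * X[:, 2]
             + rng.normal(0.0, 0.2, len(X)))          # synthetic response

        def model_matrix(X):
            """Intercept, main effects, two-way interactions, quadratics."""
            cols = [np.ones(len(X))] + [X[:, i] for i in range(3)]
            cols += [X[:, i] * X[:, j] for i, j in combinations(range(3), 2)]
            cols += [X[:, i] ** 2 for i in range(3)]
            return np.column_stack(cols)

        beta, *_ = np.linalg.lstsq(model_matrix(X), y, rcond=None)
        print(np.round(beta, 2))  # large-magnitude coefficients flag critical variables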

  4. Design Exploration of Engineered Materials, Products, and Associated Manufacturing Processes

    NASA Astrophysics Data System (ADS)

    Shukla, Rishabh; Kulkarni, Nagesh H.; Gautham, B. P.; Singh, Amarendra K.; Mistree, Farrokh; Allen, Janet K.; Panchal, Jitesh H.

    2015-01-01

    In the past few years, ICME-related research has been directed towards the study of multi-scale materials design. However, relatively little has been reported on model-based methods that are of relevance to industry for the realization of engineered materials, products, and associated industrial manufacturing processes. Computational models used in the realization of engineered materials and products are fraught with uncertainty, have different levels of fidelity, are incomplete and are even likely to be inaccurate. In light of this, we adopt a robust design strategy that facilitates the exploration of the solution space thereby providing decision support to a design engineer. In this paper, we describe a foundational construct embodied in our method for design exploration, namely, the compromise Decision Support Problem. We introduce a problem that we are using to establish the efficacy of our method. It involves the integrated design of steel and gears, traversing the chain of steel making, mill production, and evolution of the material during these processes, and linking this to the mechanical design and manufacture of the gear. We provide an overview of our method to determine the operating set points for the ladle, tundish and caster operations necessary to manufacture steel of a desired set of properties. Finally, we highlight the efficacy of our method.

  5. Design and processing of organic electroluminescent devices

    NASA Astrophysics Data System (ADS)

    Pardo-Guzman, Dino Alejandro

    2000-11-01

    The present dissertation compiles three aspects of my Ph.D. work on OLED device design, fabrication and characterization. The first chapter is a review of the concepts and theories describing the mechanisms of organic electroluminescence. The second chapter makes use of these concepts to articulate some basic principles for the design of efficient and stable OLEDs. The third chapter describes the main characterization and sample preparation techniques used throughout this dissertation. Chapter IV describes the processing of efficient organic electroluminescent (EL) devices with ITO/TPD/AlQ3/Mg:Ag structures. The screen printing technique of a hole transport polymeric blend was used in an unusual mode to render thin films on the order of 60-80 nm. EL devices were then fabricated on top of these screen-printed films to provide ~0.9% quantum efficiencies, comparable to spin coating with the same structures. Various polymer:TPD and solvent combinations were studied to find the paste with the best rheological properties. The same technique was also used to deposit a patterned MEH-PPV film. Chapter V describes my research work on the wetting of TPD on ITO substrates. The wetting was monitored by following its surface morphology evolution as a function of temperature. The effect of these surface changes was then correlated to the I-V-L characteristics of devices made with these TPD films. The surface roughness, measured with tapping AFM, showed island formation at temperatures as low as 50-60°C. I also investigated the effect of the purity of materials like AlQ3 on the device EL performance, as described in Chapter VI. In order to improve the purity of these environmentally degradable complexes, a new in situ purification technique was developed, with excellent enhancement of the EL cell properties. The in situ purification process was then used to purify/deposit organic dyes with improved film formation and EL characteristics.

  6. Using Mechanistic Understanding of Streambank Processes and a Deterministic Bank-Stability Model to Design and Evaluate a Reach-Scale Restoration Project

    NASA Astrophysics Data System (ADS)

    Simon, Andrew; Derrick, David; Bankhead, Natasha

    2010-05-01

    Sediment is one of the leading contributors to water-quality impairment in the United States, and streambank erosion has been found to be the dominant source of sediment in many disturbed watersheds. Goodwin Creek is a typical incised channel in northeastern Mississippi, USA (4.7 m deep) that yields about an order of magnitude more suspended sediment than stable, "reference" streams in the region. Periodic channel surveys, with dating of woody vegetation in an actively eroding meander, disclosed a migration rate of 0.5 m/y. Because of continued land loss by mass failure of the streambanks, a restoration project was designed to stabilize the banks. Bank retreat occurs through interactions between hydraulic forces acting at the bed and bank toe and gravitational forces acting on the in situ bank material. In fact, bank-toe protection, which inhibits steepening of the bank, has been found to be one of the most effective means of stabilizing the upper part of the bank. To provide a stable alternative, analysis of the restored configuration needed to mechanistically address both hydraulic erosion and geotechnical stability. This was accomplished using the Bank-Stability and Toe-Erosion Model (BSTEM). The proposed design was limited to 1:1 bank slopes due to the proximity of a road and included longitudinal stone-toe protection and bendway weirs to counter basal erosion by hydraulic shear. Worst-case conditions under the proposed design were simulated by modeling (1) typical, annual high flows (3 m deep) to evaluate the amount of bank-toe erosion that would occur, and (2) geotechnical stability where groundwater levels were high and flow had receded to low-flow conditions in the channel (drawdown case). Stone size was selected based on a 1-D hydraulic analysis such that the stone would not be mobilized at peak flows, where average boundary shear stresses can reach 60-80 N/m2. Calculations were made for a 3 m-deep flow at slopes between 0.002 and 0.003, resulting in recommended stone
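
    The geotechnical half of such an analysis reduces to a factor of safety, the ratio of resisting to driving forces on a candidate failure plane. The sketch below is a heavily simplified planar-failure calculation with illustrative soil parameters; it is not the BSTEM formulation, which handles layered banks, confining pressure from the flow, and measured pore-water distributions:

        import numpy as np

        def factor_of_safety(c_eff, phi_deg, gamma, height, beta_deg, water_frac=0.0):
            """Planar failure: (cohesion + friction) / gravitational driving force.
            c_eff in kPa, gamma in kN/m^3, height in m, angles in degrees."""
            beta = np.radians(beta_deg)
            length = height / np.sin(beta)                     # failure-plane length, m
            weight = 0.5 * gamma * height ** 2 / np.tan(beta)  # failure-block weight, kN/m
            # Simplified pore-water force for a water table at water_frac * height.
            u_force = 0.5 * 9.81 * (water_frac * height) ** 2 / np.sin(beta)
            resisting = c_eff * length + (weight * np.cos(beta) - u_force) * np.tan(np.radians(phi_deg))
            return resisting / (weight * np.sin(beta))

        # Goodwin Creek-like numbers: a 4.7 m bank, trial failure plane at 60 degrees.
        for wf in (0.0, 0.8):   # dry bank vs. high water table after drawdown
            print(wf, round(factor_of_safety(15.0, 30.0, 18.0, 4.7, 60.0, wf), 2))

    With these assumed parameters, the drawdown case drops the factor of safety below 1, which is exactly the worst-case condition the design simulations described above were meant to probe.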

  7. High Lifetime Solar Cell Processing and Design

    NASA Technical Reports Server (NTRS)

    Swanson, R. M.

    1985-01-01

    In order to maximize efficiency, a solar cell must: (1) absorb as much light as possible in electron-hole production, (2) transport as large a fraction as possible of the electrons to the n-type terminal and holes to the p-type terminal without their first recombining, and (3) produce as high a terminal voltage as possible. Step (1) is largely fixed by the spectrum of sunlight and the fundamental absorption characteristics of silicon, although some improvements are possible through texturizing-induced light trapping and back surface reflectors. Steps (2) and (3) are, however, dependent on the recombination mechanisms of the cell. The recombination, in turn, is strongly influenced by cell processing and design. Some of the lessons learned during the development of the point-contact cell are discussed. Cell dependence on recombination, surface recombination, and contact recombination is discussed. Results show the overwhelming influence of contact recombination on the operation of the cell when the other sources of recombination are reduced by careful processing.

  8. Generic process design and control strategies used to develop a dynamic model and training software for an IGCC plant with CO2 sequestration

    SciTech Connect

    Provost, G.; Stone, H.; McClintock, M.; Erbes, M.; Zitney, S.; Turton, R.; Phillips, J.; Quintrell, M.; Marasigan, J.

    2008-01-01

    To meet the growing demand for education and experience with the analysis, operation, and control of commercial-scale Integrated Gasification Combined Cycle (IGCC) plants, the Department of Energy's (DOE) National Energy Technology Laboratory (NETL) is leading a collaborative R&D project with participants from government, academia, and industry. One of the goals of this project is to develop a generic, full-scope, real-time IGCC dynamic plant simulator for use in establishing a world-class research and training center, as well as to promote and demonstrate the technology to power industry personnel. The NETL IGCC dynamic plant simulator will combine for the first time a process/gasification simulator and a power/combined-cycle simulator in a single dynamic simulation framework for use in training applications as well as engineering studies. As envisioned, the simulator will have the following features and capabilities: (1) a high-fidelity, real-time dynamic model of the process side (gasification and gas cleaning with CO2 capture) and the power-block side (combined cycle) for a generic IGCC plant fueled by coal and/or petroleum coke; (2) full-scope training simulator capabilities, including startup, shutdown, load following and shedding, response to fuel and ambient condition variations, control strategy analysis (turbine vs. gasifier lead, etc.), representative malfunctions/trips, alarms, scenarios, trending, snapshots, data historian, and trainee performance monitoring; and (3) the ability to enhance and modify the plant model to facilitate studies of changes in plant configuration and equipment and to support future R&D efforts. To support this effort, process descriptions and control strategies were developed for key sections of the plant as part of the detailed functional specification, which will form the basis of the simulator development. These plant sections include: Slurry Preparation, Air Separation Unit, Gasifiers, Syngas Scrubbers, Shift Reactors, Gas Cooling

  9. Clutter suppression interferometry system design and processing

    NASA Astrophysics Data System (ADS)

    Knight, Chad; Deming, Ross; Gunther, Jake

    2015-05-01

    Clutter suppression interferometry (CSI) has received extensive attention due to its multi-modal capability to detect slow-moving targets, and concurrently form high-resolution synthetic aperture radar (SAR) images from the same data. The ability to continuously augment SAR images with geo-located ground moving target indicators (GMTI) provides valuable real-time situational awareness that is important for many applications. CSI can be accomplished with minimal hardware and processing resources. This makes CSI a natural candidate for applications where size, weight and power (SWaP) are constrained, such as unmanned aerial vehicles (UAVs) and small satellites. This paper will discuss the theory for optimal CSI system configuration, focusing on a sparse, time-varying transmit and receive array manifold due to SWaP considerations. The underlying signal model will be presented and discussed, as well as the potential benefits that a sparse, time-varying transmit/receive manifold provides. The high-level processing objectives will be detailed and examined on simulated data. Then actual SAR data collected with the Space Dynamics Laboratory (SDL) FlexSAR radar system will be analyzed. Contrasting the simulated data with actual SAR data helps illustrate the challenges and limitations found in practice vs. theory. A novel approach incorporating sparse signal processing is discussed that has the potential to reduce false-alarm rates and improve detections.
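
    The core two-channel idea can be sketched in a few lines: once the trailing phase centre has been delayed so that it samples the same spatial positions as the leading one, stationary clutter is identical in both channels and subtracts away (DPCA), while a mover keeps a residue and a nonzero interferometric phase (ATI). The data below are synthetic and the geometry idealized; this is not the FlexSAR processing chain:

        import numpy as np

        rng = np.random.default_rng(2)
        pulses, bins = 256, 64

        # After time-alignment, both channels hold identical stationary clutter.
        clutter = rng.standard_normal((pulses, bins)) + 1j * rng.standard_normal((pulses, bins))
        chan_a = clutter.copy()
        chan_b = clutter.copy()

        # A slow mover in range bin 30: a Doppler tone plus an interferometric
        # phase offset between the two looks (proportional to radial velocity).
        tone = np.exp(2j * np.pi * 0.02 * np.arange(pulses))
        chan_a[:, 30] += tone
        chan_b[:, 30] += tone * np.exp(-1j * 0.8)

        dpca = chan_a - chan_b                                    # clutter cancels, mover remains
        ati = np.angle(np.sum(chan_a * np.conj(chan_b), axis=0))  # per-bin interferometric phase
        print(np.abs(dpca).mean(axis=0).argmax())                 # -> 30, the mover's range bin
        print(np.flatnonzero(np.abs(ati) > 0.01))                 # -> [30], only the mover has phase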

  10. Human visual performance model for crewstation design

    NASA Astrophysics Data System (ADS)

    Larimer, James O.; Prevost, Michael P.; Arditi, Aries R.; Azueta, Steven; Bergen, James R.; Lubin, Jeffrey

    1991-08-01

    In a cockpit, the crewstation of an airplane, the ability of the pilot to unambiguously perceive rapidly changing information both internal and external to the crewstation is critical. To assess the impact of crewstation design decisions on the pilot's ability to perceive information, the designer needs a means of evaluating the trade-offs that result from different designs. The Visibility Modeling Tool (VMT) provides the designer with a CAD tool for assessing these trade-offs. It combines the technologies of computer graphics, computational geometry, human performance modeling and equipment modeling into a computer-based interactive design tool. Through a simple interactive interface, a designer can manipulate design parameters such as the geometry of the cockpit, environmental factors such as ambient lighting, pilot parameters such as point of regard and adaptation state, and equipment parameters such as the location of displays, their size and the contrast of displayed symbology. VMT provides an end-to-end analysis that answers questions such as 'Will the pilot be able to read the display?' Performance data can be projected, in the form of 3D contours, into the crewstation graphic model, providing the designer with a footprint of the operator's visual capabilities, defining, for example, the regions in which fonts of a particular type, size and contrast can be read without error. Geometrical data such as the pilot's volume field of view, occlusions caused by facial geometry, helmet margins, and objects in the crewstation can also be projected into the crewstation graphic model with respect to the coordinates of the aviator's eyes and fixation point. The intersections of the projections with objects in the crewstation delineate the area of coverage, masking, or occlusion associated with the objects. Objects in the crewstation space can be projected onto models of the operator's retinas. These projections can be used to provide the designer with the

  11. Planning: The Participatory Process Model.

    ERIC Educational Resources Information Center

    McDowell, Elizabeth V.

    The participatory planning process model developed by Peirce Junior College is described in this paper. First, the rationale for shifting from a traditional authoritarian style of institutional leadership to a participatory style which encourages a broader concern for the institution and lessens morale problems is offered. The development of a new…

  12. Verifying and Validating Proposed Models for FSW Process Optimization

    NASA Technical Reports Server (NTRS)

    Schneider, Judith

    2008-01-01

    This slide presentation reviews Friction Stir Welding (FSW) and attempts to model the process in order to optimize and improve it. Studies are ongoing to validate and refine the model of metal flow in the FSW process. Slides show the conventional FSW process, several weld tool designs, and how the design interacts with the metal flow path. The two basic components of the weld tool are shown, along with geometries of the shoulder design. Modeling of the FSW process is reviewed. Other topics include (1) Microstructure features, (2) Flow Streamlines, (3) Steady-state Nature, and (4) Grain Refinement Mechanisms.

  13. Integrating Surface Modeling into the Engineering Design Graphics Curriculum

    ERIC Educational Resources Information Center

    Hartman, Nathan W.

    2006-01-01

    It has been suggested there is a knowledge base that surrounds the use of 3D modeling within the engineering design process and correspondingly within engineering design graphics education. While solid modeling receives a great deal of attention and discussion relative to curriculum efforts, and rightly so, surface modeling is an equally viable 3D…

  15. Process simulation and modeling for gas processing plant

    NASA Astrophysics Data System (ADS)

    Alhameli, Falah Obaid Kenish Mubarak

    Natural gas is one of the major energy sources and its demand is increasing rapidly due to its environmental and economic advantages over other fuels. Gas processing is an essential component of the natural gas system. In this work, a gas processing plant is introduced with the objective of meeting pipeline gas quality. It consists of separation, sweetening and dehydration units. The separation unit contains phase separators along with a stabilizer (conventional distillation column). The sweetening unit is an amine process with MDEA (Methyl DiEthanol Amine) solvent. The dehydration unit is glycol absorption with TEG (TriEthyleneGlycol) solvent. ProMax 3.2 was used to simulate the plant. A Box-Behnken design was applied to build a black-box model using design of experiments (DoE); Minitab 15 was used to generate and analyse the design. Ten variables, representing the gas feed conditions and unit parameters, were chosen for the model, giving 170 runs in total, which were successfully implemented and analysed. Models for the total energy of the plant and the water content of the product gas were obtained. A case study was conducted to investigate the impact of an H2S composition increase in the feed gas. The models were used for the case study with the objective of minimizing total energy subject to a constraint of 4 lb/MMscf on water content in the product gas; Lingo 13 was used for the optimization. It was observed that the feed pressure had the greatest influence among the parameters. Finally, some recommendations were made for future work.
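
    The Box-Behnken construction is easy to reproduce for a small case. Below is a pure-numpy sketch for three factors, where the design is simply all ±1 pairs plus center points; for ten factors the standard design uses an incomplete block structure (hence the paper's 170 runs rather than all pair combinations plus centers), which Minitab generates directly. Factor names and bounds here are hypothetical.

```python
import itertools
import numpy as np

def box_behnken_3(n_center=3):
    """Box-Behnken design for 3 factors in coded units (-1, 0, +1).
    For each pair of factors, run the four (+/-1, +/-1) combinations with
    the remaining factor at its center; then append center points."""
    runs = []
    for i, j in itertools.combinations(range(3), 2):
        for a, b in itertools.product((-1, 1), repeat=2):
            row = [0, 0, 0]
            row[i], row[j] = a, b
            runs.append(row)
    runs += [[0, 0, 0]] * n_center
    return np.array(runs, dtype=float)

design = box_behnken_3()
print(design.shape)   # (15, 3): 12 edge-midpoint runs + 3 center points

# Coded levels map linearly to engineering units, e.g. a feed pressure
# factor with hypothetical bounds of 40-80 bar:
p_lo, p_hi = 40.0, 80.0
pressure = (p_lo + p_hi) / 2 + design[:, 0] * (p_hi - p_lo) / 2
```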

  16. Universal Design: Process, Principles, and Applications

    ERIC Educational Resources Information Center

    Burgstahler, Sheryl

    2009-01-01

    Designing any product or environment involves the consideration of many factors, including aesthetics, engineering options, environmental issues, safety concerns, industry standards, and cost. Typically, designers focus their attention on the average user. In contrast, universal design (UD), according to the Center for Universal Design, "is…

  17. Coupled process modeling and waste package performance

    SciTech Connect

    McGrail, B.P.; Engel, D.W.

    1992-11-01

    The interaction of borosilicate waste glasses with water has been studied extensively and reasonably good models are available that describe the reaction kinetics and solution chemical effects. Unfortunately, these models have not been utilized in performance assessment analyses, except in estimating radionuclide solubilities at the waste form surface. A geochemical model has been incorporated in the AREST code to examine the coupled processes of glass dissolution and transport within the engineered barrier system. Our calculations show that the typical assumptions used in performance assessment analyses, such as fixed solubilities or a constant reaction rate at the waste form surface, do not always give conservative or realistic predictions of radionuclide release. Varying the transport properties of the waste package materials is shown to have counterintuitive effects on the release rates of some radionuclides. The use of noncoupled performance assessment models could lead a repository designer to an erroneous conclusion regarding the relative benefit of one waste package design or host rock setting over another.

  18. The Application of the NASA Advanced Concepts Office, Launch Vehicle Team Design Process and Tools for Modeling Small Responsive Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Threet, Grady E.; Waters, Eric D.; Creech, Dennis M.

    2012-01-01

    The Advanced Concepts Office (ACO) Launch Vehicle Team at the NASA Marshall Space Flight Center (MSFC) is recognized throughout NASA for launch vehicle conceptual definition and pre-phase A concept design evaluation. The Launch Vehicle Team has been instrumental in defining the vehicle trade space for many of NASA's high-level launch system studies, from the Exploration Systems Architecture Study (ESAS) through the Augustine Report, Constellation, and now the Space Launch System (SLS). The Launch Vehicle Team's approach to rapid turn-around and comparative analysis of multiple launch vehicle architectures has played a large role in narrowing the design options for future vehicle development. Recently the Launch Vehicle Team has been developing versions of their vetted tools used on large launch vehicles, repackaging the process and capability to apply to smaller, more responsive launch vehicles. Along this development path the LV Team has evaluated trajectory tools and assumptions against sounding rocket trajectories and air launch systems, begun altering subsystem mass estimating relationships to handle smaller vehicle components, and, as an additional development driver, begun an in-house small launch vehicle study. With the recent interest in small responsive launch systems and the known capability and response time of the ACO LV Team, ACO's launch vehicle assessment capability can be utilized to rapidly evaluate the vast and opportune trade space that small launch vehicles currently encompass. This would greatly benefit the customer by reducing that large trade space to a select few alternatives that best fit the customer's payload needs.

  19. Thermoplastic matrix composite processing model

    NASA Technical Reports Server (NTRS)

    Dara, P. H.; Loos, A. C.

    1985-01-01

    The effects of the processing parameters pressure, temperature, and time on the quality of continuous graphite fiber reinforced thermoplastic matrix composites were quantitatively assessed by defining the extent to which intimate contact and bond formation have occurred at successive ply interfaces. Two models are presented that predict the extents to which the ply interfaces have achieved intimate contact and cohesive strength. The models are based on experimental observation of compression molded laminates and neat resin conditions, respectively. The theory of autohesion (or self-diffusion) is identified as the mechanism by which the plies bond to one another. Theoretical predictions from reptation theory relating autohesive strength to contact time are used to explain the effects of the processing parameters on the observed experimental strengths. The application of a time-temperature relationship for autohesive strength predictions is evaluated. A viscoelastic compression molding model of a tow was developed to explain the phenomenon by which the prepreg ply interfaces develop intimate contact.
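
    Reptation theory predicts autohesive bond strength growing as the one-fourth power of contact time until the polymer's reptation time is reached, with the reptation time shifting with temperature. A minimal sketch of that scaling with an Arrhenius-shifted reptation time; all constants are hypothetical, not the paper's fitted values.

```python
import numpy as np

R = 8.314  # gas constant [J/(mol K)]

def reptation_time(T, tau_ref=100.0, T_ref=620.0, Ea=8.0e4):
    """Arrhenius shift of the reptation time [s] (hypothetical constants)."""
    return tau_ref * np.exp(Ea / R * (1.0 / T - 1.0 / T_ref))

def degree_of_autohesion(t, T):
    """D_h = (t / tau_rep)^(1/4), capped at 1 (fully healed interface).
    Predicted bond strength is D_h times the bulk cohesive strength."""
    return np.minimum((t / reptation_time(T)) ** 0.25, 1.0)

contact_time = np.array([1.0, 10.0, 100.0, 1000.0])   # [s]
print(degree_of_autohesion(contact_time, T=640.0))    # approaches 1 with time
```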

  20. Biogeochemical processes in model estuaries

    NASA Astrophysics Data System (ADS)

    Church, Thomas M.

    Sixty researchers met to evaluate the effects of global change on estuaries and to improve estuarine modeling at the Second International Symposium on the Biogeochemistry of Model Estuaries, held April 15-19, 1991, at Jekyll Island, Ga. The importance of successful sampling in evaluating chemical fluxes and establishing records of estuarine change was articulated, as was the need for tracer tools for improved modeling. The symposium was sponsored by the National Science Foundation, National Oceanic and Atmospheric Administration, and the Department of Energy.Participants discussed particles and sedimentology, trace elements and metals, organic chemistry, and nutrient cycling of estuarine processes. Four days of presentations were followed by a half-day of discussion on advances in these topics and the overall goal of assessing estuarine processes in global change. What follows is a synopsis of this discussion.

  1. Conceptual design of clean processes: Tools and methods

    SciTech Connect

    Hurme, M.

    1996-12-31

    Design tools available for implementing clean design in practice are discussed, together with their application areas and methods for comparing clean process alternatives. Environmental principles are becoming increasingly important throughout the whole life cycle of products, from design, manufacturing and marketing to disposal. The obstacle to implementing clean technology in design has been the necessity of applying it in all phases of design from the very beginning, since it concerns the major selections made in conceptual process design. Therefore both a modified design approach and new tools are needed to make the application of clean technology practical in process design. The first item, extended process design methodologies, has been presented by Hurme, Douglas, Rossiter and Klee, and Hilaly and Sikdar. The aim of this paper is to discuss the latter topic: the process design tools that assist in implementing clean principles in process design. 22 refs., 2 tabs.

  2. Quantitative Modeling of Earth Surface Processes

    NASA Astrophysics Data System (ADS)

    Pelletier, Jon D.

    This textbook describes some of the most effective and straightforward quantitative techniques for modeling Earth surface processes. By emphasizing a core set of equations and solution techniques, the book presents state-of-the-art models currently employed in Earth surface process research, as well as a set of simple but practical research tools. Detailed case studies demonstrate application of the methods to a wide variety of processes including hillslope, fluvial, aeolian, glacial, tectonic, and climatic systems. Exercises at the end of each chapter begin with simple calculations and then progress to more sophisticated problems that require computer programming. All the necessary computer codes are available online at www.cambridge.org/9780521855976. Assuming some knowledge of calculus and basic programming experience, this quantitative textbook is designed for advanced geomorphology courses and as a reference book for professional researchers in Earth and planetary science looking for a quantitative approach to Earth surface processes.
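
    One of the core equations in this area is the linear hillslope diffusion model, ∂z/∂t = D ∂²z/∂x², typically solved with simple finite differences. A minimal explicit sketch of a fault scarp degrading by diffusion; the grid, diffusivity, and step count are illustrative, not taken from the book.

```python
import numpy as np

D = 0.01                       # hillslope diffusivity [m^2/yr]
dx = 1.0                       # grid spacing [m]
dt = 0.4 * dx**2 / (2 * D)     # explicit time step satisfying stability

# Initial topography: a 1 m fault scarp (step) that diffusion degrades
z = np.where(np.arange(101) < 50, 1.0, 0.0)

for _ in range(5000):
    # centered second difference; end nodes act as fixed-elevation boundaries
    z[1:-1] += dt * D * (z[2:] - 2 * z[1:-1] + z[:-2]) / dx**2

# The scarp's midslope gradient relaxes with time, a classic relative-age tool
print("midslope gradient:", (z[49] - z[51]) / (2 * dx))
```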

  3. More details...
  4. Mechanical Design Support System Based on Thinking Process Development Diagram

    NASA Astrophysics Data System (ADS)

    Mase, Hisao; Kinukawa, Hiroshi; Morii, Hiroshi; Nakao, Masayuki; Hatamura, Yotaro

    This paper describes a system that directly supports the design process in the mechanical domain. The system is based on a thinking process development diagram that draws distinctions among requirements, tasks, solutions, and implementation, enabling designers to expand and deepen their design thinking. The system provides five main functions that designers require in each phase of the proposed design process: (1) thinking process description support, which enables designers to describe their thoughts; (2) creativity support by term association with thesauri; (3) timely display of design knowledge, including know-how obtained through earlier failures, general design theories, standard-parts data, and past designs; (4) design problem solving support using 46 kinds of thinking operations; and (5) proper technology transfer support, which accumulates not only design conclusions but also the design process. Though the system is applied to mechanical engineering as the first target domain, it can easily be extended to many other domains such as architecture and electrical engineering.

  5. Design of the HTGR for process heat applications

    SciTech Connect

    Vrable, D.L.; Quade, R.N.

    1980-05-01

    This paper discusses a design study of an advanced 842-MW(t) HTGR with a reactor outlet temperature of 850°C (1562°F), coupled with a chemical process whose product is hydrogen (or a mixture of hydrogen and carbon monoxide) generated by steam reforming of a light hydrocarbon mixture. The paper presents the plant layout and design for the major components of the primary and secondary heat transfer systems. Typical parametric system study results illustrate the capability of a computer code developed to model the plant performance and economics.

  6. Welding process modelling and control

    NASA Technical Reports Server (NTRS)

    Romine, Peter L.; Adenwala, Jinen A.

    1993-01-01

    The research and analysis performed, the software developed, and the hardware/software recommendations made during 1992 in developing the PC-based data acquisition system supporting Welding Process Modeling and Control are reported. The Metals Processing Branch of NASA Marshall Space Flight Center identified a need for a mobile data acquisition and analysis system customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC-based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument, strictly for data acquisition and analysis. Although the WMS supports many of the functions associated with process control, it is not intended to be used for welding process control.

  7. A symbolic methodology to improve disassembly process design.

    PubMed

    Rios, Pedro; Blyler, Leslie; Tieman, Lisa; Stuart, Julie Ann; Grant, Ed

    2003-12-01

    Millions of end-of-life electronic components are retired annually due to the proliferation of new models and their rapid obsolescence. The recovery of resources such as plastics from these goods requires their disassembly. The time required for each disassembly, and its associated cost, is determined by the operator's familiarity with the product design and its complexity. Since model proliferation complicates an operator's learning curve, it is worthwhile to investigate the benefits to be gained in a disassembly operator's preplanning process. Effective disassembly process design demands the application of green engineering principles, such as those developed by Anastas and Zimmerman (Environ. Sci. Technol. 2003, 37, 94A-101A), which include regard for product complexity, structural commonality, separation energy, material value, and waste prevention. This paper introduces the concept of design symbols to help the operator more efficiently survey product complexity with respect to the location and number of fasteners to remove a structure that is common to all electronics: the housing. With a sample of 71 different computers, printers, and monitors, we demonstrate that appropriate symbols reduce the total disassembly planning time by 13.2 min. Such an improvement could well make efficient the separation of plastic that would otherwise be destined for waste-to-energy or landfill. The symbolic methodology presented may also improve Design for Recycling and Design for Maintenance and Support.

  8. Design issues for population growth models

    PubMed Central

    López Fidalgo, J.; Ortiz Rodríguez, I.M.

    2010-01-01

    We briefly review and discuss design issues for population growth and decline models. We then use a flexible growth and decline model as an illustrative example and apply optimal design theory to find optimal sampling times for estimating model parameters, specific parameters, and interesting functions of the model parameters, with two real applications. Robustness properties of the optimal designs are investigated when nominal values or the model are mis-specified, and also under a different optimality criterion. To facilitate the use of optimal design ideas in practice, we also introduce a website for generating a variety of optimal designs for popular models from different disciplines. PMID:21647244
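
    Optimal sampling times of this kind are typically found by maximizing a function of the Fisher information matrix, e.g. its determinant for D-optimality. The sketch below uses plain logistic growth with finite-difference sensitivities; the paper's flexible growth-and-decline model is not reproduced, and all numbers are illustrative.

```python
import itertools
import numpy as np

def logistic(t, K, r, x0):
    return K / (1 + (K / x0 - 1) * np.exp(-r * t))

def fisher_information(times, theta, h=1e-6):
    """FIM for unit-variance observations: J^T J, where J holds the
    sensitivities d eta / d theta at the candidate sampling times."""
    J = np.empty((len(times), len(theta)))
    for k in range(len(theta)):
        tp, tm = list(theta), list(theta)
        tp[k] += h
        tm[k] -= h
        J[:, k] = (logistic(times, *tp) - logistic(times, *tm)) / (2 * h)
    return J.T @ J

theta = (100.0, 0.5, 5.0)                 # nominal (K, r, x0)
candidates = np.linspace(0.5, 20.0, 40)   # candidate sampling times

# Exhaustive D-optimal choice of 3 sampling times (one per parameter)
best = max(itertools.combinations(candidates, 3),
           key=lambda ts: np.linalg.det(fisher_information(np.array(ts), theta)))
print("D-optimal times:", np.round(best, 2))
```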

  9. Kinetic and Modeling Investigation to Provide Design Guidelines for the NREL Dilute-Acid Process Aimed at Total Hydrolysis/Fractionation of Lignocellulosic Biomass: July 1998

    SciTech Connect

    Lee, Y. Y.; Iyer, P.; Xiang, Q.; Hayes, J.

    2004-08-01

    Following up on previous work, the subcontractor investigated three aspects of using NREL "pretreatment" technology for total hydrolysis (cellulose as well as hemicellulose) of biomass. Whereas historic hydrolysis of biomass used either dilute-acid or concentrated-acid technology for hydrolysis of both hemicellulose and cellulose, NREL has been pursuing very dilute acid hydrolysis of hemicellulose followed by enzymatic hydrolysis of cellulose. NREL's countercurrent shrinking-bed reactor design for hemicellulose hydrolysis (pretreatment) has, however, shown promise for total hydrolysis. For the first task, the subcontractor developed a mathematical model of countercurrent shrinking-bed reactor operation and, using yellow poplar sawdust as a feedstock, analyzed the effects of initial solid feeding rate, temperature, acid concentration, acid flow rate, Peclet number (a measure of backmixing in liquid flow), and bed shrinking. For the second task, the subcontractor used laboratory trials, with yellow poplar sawdust and 0.07 wt% sulfuric acid at various temperatures, to verify the hydrolysis of cellulose to glucose (desired) and the decomposition of glucose (undesired), and to determine appropriate parameters for use in kinetic models. Unlike cellulose and hemicellulose, lignins, the third major component of biomass, are not carbohydrates that can be broken down into component sugars. They are, however, complex amorphous aromatic phenolic polymers that can likely be converted into low-molecular-weight compounds suitable for production of fuels and chemicals. Oxidative degradation is one pathway for such conversion, and hydrogen peroxide would be an attractive reagent for it, as it would leave no residuals. For the third task, the subcontractor reacted lignin with hydrogen peroxide under various conditions and analyzed the resulting product mix.
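
    The desired-versus-undesired kinetics of the second task are classically described by two first-order reactions in series (Saeman kinetics): cellulose → glucose → decomposition products. Whether the subcontractor used exactly this form is not stated in the record; the sketch below uses the standard closed-form series solution with hypothetical rate constants to locate the time of maximum glucose yield.

```python
import numpy as np

def glucose_yield(t, k1, k2, c0=1.0):
    """Series first-order kinetics: cellulose -k1-> glucose -k2-> products.
    Closed-form glucose concentration for k1 != k2."""
    return c0 * k1 / (k2 - k1) * (np.exp(-k1 * t) - np.exp(-k2 * t))

k1, k2 = 0.05, 0.02            # hypothetical rate constants [1/min]
t = np.linspace(0, 200, 2001)  # [min]
G = glucose_yield(t, k1, k2)

# Analytic optimum of the series reaction: t* = ln(k1/k2) / (k1 - k2)
t_star = np.log(k1 / k2) / (k1 - k2)
print(f"max yield {G.max():.3f} at t = {t[G.argmax()]:.1f} min "
      f"(analytic {t_star:.1f} min)")
```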

  10. Animal models and conserved processes

    PubMed Central

    2012-01-01

    Background The concept of conserved processes presents unique opportunities for using nonhuman animal models in biomedical research. However, the concept must be examined in the context that humans and nonhuman animals are evolved, complex, adaptive systems. Given that nonhuman animals are examples of living systems that are differently complex from humans, what does the existence of a conserved gene or process imply for inter-species extrapolation? Methods We surveyed the literature including philosophy of science, biological complexity, conserved processes, evolutionary biology, comparative medicine, anti-neoplastic agents, inhalational anesthetics, and drug development journals in order to determine the value of nonhuman animal models when studying conserved processes. Results Evolution through natural selection has employed components and processes both to produce the same outcomes among species and to generate different functions and traits. Many genes and processes are conserved, but new combinations of these processes or different regulation of the genes involved in these processes have resulted in unique organisms. Further, there is a hierarchy of organization in complex living systems. At some levels, the components are simple systems that can be analyzed by mathematics or the physical sciences, while at other levels the system cannot be fully analyzed by reducing it to a physical system. The study of complex living systems must alternate between focusing on the parts and examining the intact whole organism while taking into account the connections between the two. Systems biology aims for this holism. We examined the actions of inhalational anesthetic agents and anti-neoplastic agents in order to address what the characteristics of complex living systems imply for inter-species extrapolation of traits and responses related to conserved processes. Conclusion We conclude that even the presence of conserved processes is insufficient for inter-species extrapolation.

  11. Computational Modeling in Structural Materials Processing

    NASA Technical Reports Server (NTRS)

    Meyyappan, Meyya; Arnold, James O. (Technical Monitor)

    1997-01-01

    High temperature materials such as silicon carbide, a variety of nitrides, and ceramic matrix composites find use in aerospace, automotive, machine tool industries and in high speed civil transport applications. Chemical vapor deposition (CVD) is widely used in processing such structural materials. Variations of CVD include deposition on substrates, coating of fibers, inside cavities and on complex objects, and infiltration within preforms called chemical vapor infiltration (CVI). Our current knowledge of the process mechanisms, ability to optimize processes, and scale-up for large scale manufacturing is limited. In this regard, computational modeling of the processes is valuable since a validated model can be used as a design tool. The effort is similar to traditional chemically reacting flow modeling with emphasis on multicomponent diffusion, thermal diffusion, large sets of homogeneous reactions, and surface chemistry. In the case of CVI, models for pore infiltration are needed. In the present talk, examples of SiC, nitride, and boron deposition from the author's past work will be used to illustrate the utility of computational process modeling.

  13. Model for amorphous aggregation processes

    NASA Astrophysics Data System (ADS)

    Stranks, Samuel D.; Ecroyd, Heath; van Sluyter, Steven; Waters, Elizabeth J.; Carver, John A.; von Smekal, Lorenz

    2009-11-01

    The amorphous aggregation of proteins is associated with many phenomena, ranging from the formation of protein wine haze to the development of cataract in the eye lens and the precipitation of recombinant proteins during their expression and purification. While much literature exists describing models for linear protein aggregation, such as amyloid fibril formation, there are few reports of models which address amorphous aggregation. Here, we propose a model to describe the amorphous aggregation of proteins which is also more widely applicable to other situations where a similar process occurs, such as in the formation of colloids and nanoclusters. As first applications of the model, we have tested it against experimental turbidimetry data of three proteins relevant to the wine industry and biochemistry, namely, thaumatin, a thaumatin-like protein, and α-lactalbumin. The model is very robust and describes the experimental data to a high degree of accuracy. Details about the aggregation process, such as shape parameters of the aggregates and rate constants, can also be extracted.

  14. Face Processing: Models For Recognition

    NASA Astrophysics Data System (ADS)

    Turk, Matthew A.; Pentland, Alexander P.

    1990-03-01

    The human ability to process faces is remarkable. We can identify perhaps thousands of faces learned throughout our lifetime and read facial expressions to understand such subtle qualities as emotion. These skills are quite robust, despite sometimes large changes in the visual stimulus due to expression, aging, and distractions such as glasses or changes in hairstyle or facial hair. Computers which model and recognize faces will be useful in a variety of applications, including criminal identification, human-computer interface, and animation. We discuss models for representing faces and their applicability to the task of recognition, and present techniques for identifying faces and detecting eye blinks.
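
    The recognition models these authors are best known for are PCA-based ("eigenfaces"): faces are projected onto the leading principal components of a training set and identified by nearest neighbor in that low-dimensional space. A minimal numpy sketch in that spirit, using random stand-in data rather than real face images:

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in training data: 40 "face images" of 32x32 pixels, flattened
faces = rng.random((40, 32 * 32))

# PCA ("eigenfaces"): principal components of the mean-centered images
mean_face = faces.mean(axis=0)
_, _, Vt = np.linalg.svd(faces - mean_face, full_matrices=False)
eigenfaces = Vt[:10]                       # keep 10 components

def project(img):
    return eigenfaces @ (img - mean_face)  # weights in "face space"

# Recognition: nearest neighbor in the low-dimensional face space
gallery = np.array([project(f) for f in faces])

def identify(img):
    d = np.linalg.norm(gallery - project(img), axis=1)
    return int(d.argmin())

probe = faces[7] + 0.01 * rng.random(32 * 32)   # noisy view of face 7
print("identified as:", identify(probe))
```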

  15. Theoretical Models of Astrochemical Processes

    NASA Technical Reports Server (NTRS)

    Charnley, Steven

    2009-01-01

    Interstellar chemistry provides a natural laboratory for studying exotic species and processes at densities, temperatures, and reaction rates that are difficult or impractical to address in the laboratory. Thus, many chemical reactions considered too slow by the standards of terrestrial chemistry can be observed and modeled. Curious proposals concerning the nature and chemistry of complex interstellar organic molecules will be described. Catalytic reactions on grain surfaces can, in principle, lead to a large variety of species, and this has motivated many laboratory and theoretical studies. Gas phase processes may also build large species in molecular clouds. Future laboratory data and computational tools needed to construct accurate chemical models of various astronomical sources to be observed by Herschel and ALMA will be outlined.

  16. Modeling and simulation of plasma processing equipment

    NASA Astrophysics Data System (ADS)

    Kim, Heon Chang

    Currently, plasma processing technology is utilized in a wide range of applications, including advanced integrated circuit (IC) fabrication. Traditionally, plasma processing equipment has been empirically designed and optimized at great expense of development time and cost. This research proposes the development of a first-principles-based, multidimensional plasma process simulator with the aim of enhancing the equipment design procedure. The proposed simulator accounts for nonlinear interactions among various plasma chemistry and physics, neutral chemistry and transport, and dust transport phenomena. A three-moment modeling approach is employed that shows good predictive capabilities at reasonable computational expense. For numerical efficiency, various versions of explicit and implicit Essentially Non-Oscillatory (ENO) algorithms are employed. For the rapid evaluation of time-periodic steady-state solutions, a feedback control approach is employed. Two-dimensional simulation results of capacitively coupled rf plasmas show that ion bombardment uniformity can be improved through simulation-based design of the plasma process. Through self-consistent simulations of an rf triode, it is also shown that effects of secondary rf voltage and frequency on ion bombardment energy can be accurately captured. These results prove that scaling relations among important process variables can be identified through the three-moment modeling and simulation approach. Through coupling of the plasma model with a neutral chemistry and transport model, spatiotemporal distributions of both charged and uncharged species, including metastables, are predicted for an oxygen plasma. Furthermore, simulation results verify the existence of a double layer in this electronegative plasma. Through Lagrangian simulation of dust in a plasma reactor, it is shown that small particles accumulate near the center and the radial sheath boundary depending on their initial positions, while large

  17. Using instructional design process to improve design and development of Internet interventions.

    PubMed

    Hilgart, Michelle M; Ritterband, Lee M; Thorndike, Frances P; Kinzie, Mable B

    2012-06-28

    Given the wide reach and extensive capabilities of the Internet, it is increasingly being used to deliver comprehensive behavioral and mental health intervention and prevention programs. Their goals are to change user behavior, reduce unwanted complications or symptoms, and improve health status and health-related quality of life. Internet interventions have been found efficacious in addressing a wide range of behavioral and mental health problems, including insomnia, nicotine dependence, obesity, diabetes, depression, and anxiety. Despite the existence of many Internet-based interventions, there is little research to inform their design and development. A model for behavior change in Internet interventions has been published to help guide future Internet intervention development and to help predict and explain behavior changes and symptom improvement outcomes through the use of Internet interventions. An argument is made for grounding the development of Internet interventions within a scientific framework. To that end, the model highlights a multitude of design-related components, areas, and elements, including user characteristics, environment, intervention content, level of intervention support, and targeted outcomes. However, more discussion is needed regarding how the design of the program should be developed to address these issues. While there is little research on the design and development of Internet interventions, there is a rich, related literature in the field of instructional design (ID) that can be used to inform Internet intervention development. ID models are prescriptive models that describe a set of activities involved in the planning, implementation, and evaluation of instructional programs. Using ID process models has been shown to increase the effectiveness of learning programs in a broad range of contexts. ID models specify a systematic method for assessing the needs of learners (intervention users) to determine the gaps between current

  19. POLLUTION PREVENTION IN THE EARLY STAGES OF HIERARCHICAL PROCESS DESIGN

    EPA Science Inventory

    Hierarchical methods are often used in the conceptual stages of process design to synthesize and evaluate process alternatives. In this work, the methods of hierarchical process design will be focused on environmental aspects. In particular, the design methods will be coupled to ...

  21. Process modeling - Its history, current status, and future

    NASA Astrophysics Data System (ADS)

    Duttweiler, Russell E.; Griffith, Walter M.; Jain, Sulekh C.

    1991-04-01

    The development of process modeling is reviewed to examine the potential of process applications to prevent and solve problems in the aerospace industry. The business and global environments are assessed, and the traditional approach to product/process design is argued to be obsolete. A revised engineering process is described which involves planning and prediction before production by means of process simulation. Process simulation can permit simultaneous engineering of unit processes and complex processes, and examples are given of the cross-coupling of forging-process variance. The implementation of process modeling, CAE, and computer simulation is found to reduce the costs and time associated with technological development when incorporated judiciously.

  1. Developing a 3D Game Design Authoring Package to Assist Students' Visualization Process in Design Thinking

    ERIC Educational Resources Information Center

    Kuo, Ming-Shiou; Chuang, Tsung-Yen

    2013-01-01

    The teaching of 3D digital game design requires the development of students' meta-skills, from story creativity to 3D model construction, and even the visualization process in design thinking. The characteristics a good game designer should possess have been identified as including the ability to redesign things, creative thinking, and the ability to…

  3. A Generic Modeling Process to Support Functional Fault Model Development

    NASA Technical Reports Server (NTRS)

    Maul, William A.; Hemminger, Joseph A.; Oostdyk, Rebecca; Bis, Rachael A.

    2016-01-01

    Functional fault models (FFMs) are qualitative representations of a system's failure space that are used to provide a diagnostic of the modeled system. An FFM simulates the failure effect propagation paths within a system between failure modes and observation points. These models contain a significant amount of information about the system including the design, operation and off-nominal behavior. The development and verification of the models can be costly in both time and resources. In addition, models depicting similar components can be distinct, both in appearance and function, when created individually, because there are numerous ways of representing the failure space within each component. Generic application of FFMs has the advantages of software code reuse: reduction of time and resources in both development and verification, and a standard set of component models from which future system models can be generated with common appearance and diagnostic performance. This paper outlines the motivation to develop a generic modeling process for FFMs at the component level and the effort to implement that process through modeling conventions and a software tool. The implementation of this generic modeling process within a fault isolation demonstration for NASA's Advanced Ground System Maintenance (AGSM) Integrated Health Management (IHM) project is presented and the impact discussed.
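
    An FFM is essentially a directed graph from failure modes through intermediate effects to observation points, so both failure-effect propagation and the diagnostic ambiguity group for a triggered observation fall out of simple graph traversals. A toy sketch of both operations; the component names are hypothetical, not from the AGSM models.

```python
from collections import deque

# Toy functional fault model: directed failure-effect propagation graph
edges = {
    "valve_stuck":   ["low_flow"],
    "pump_degraded": ["low_flow", "high_current"],
    "low_flow":      ["tank_level_sensor"],
    "high_current":  ["bus_current_sensor"],
}

def effects(failure_mode):
    """All nodes reachable from a failure mode (its effect propagation path)."""
    seen, queue = set(), deque([failure_mode])
    while queue:
        for nxt in edges.get(queue.popleft(), []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# Diagnosis: which failure modes can explain a triggered observation point?
failure_modes = ["valve_stuck", "pump_degraded"]
triggered = "tank_level_sensor"
ambiguity_group = [f for f in failure_modes if triggered in effects(f)]
print(ambiguity_group)   # both modes reach the sensor -> ambiguous diagnosis
```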

  4. Hynol Process Engineering: Process Configuration, Site Plan, and Equipment Design

    DTIC Science & Technology

    1996-02-01

    …wood, and natural gas is used as a co-feedstock. Compared with other methanol production processes, direct emissions of carbon dioxide (CO2) can be substantially reduced by using the … gas provides for reduced CO2 emissions per unit of fossil fuel carbon processed compared with separate natural gas and biomass processes. In accordance…

  5. Sensory processing and world modeling for an active ranging device

    NASA Technical Reports Server (NTRS)

    Hong, Tsai-Hong; Wu, Angela Y.

    1991-01-01

    In this project, we studied world modeling and sensory processing for laser range data. World model data representation and operations were defined. Sensory processing algorithms for point processing and linear feature detection were designed and implemented. The interface between world modeling and sensory processing at the Servo and Primitive levels was investigated and implemented. At the Primitive level, linear feature detectors for edges were also implemented, analyzed, and compared. Existing world model representations are surveyed. Also presented is the design and implementation of the Y-frame model, a hierarchical world model. The interfaces between the world model module and the sensory processing module are discussed, as well as the linear feature detectors that were designed and implemented.

  6. Development of an Integrated Process, Modeling and Simulation Platform for Performance-Based Design of Low-Energy and High IEQ Buildings

    ERIC Educational Resources Information Center

    Chen, Yixing

    2013-01-01

    The objective of this study was to develop a "Virtual Design Studio (VDS)": a software platform for integrated, coordinated and optimized design of green building systems with low energy consumption, high indoor environmental quality (IEQ), and high level of sustainability. The VDS is intended to assist collaborating architects,…

  8. Model reduction in integrated controls-structures design

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.

    1993-01-01

    It is the objective of this paper to present a model reduction technique developed for the integrated controls-structures design of flexible structures. Integrated controls-structures design problems are typically posed as nonlinear mathematical programming problems, where the design variables consist of both structural and control parameters. In the solution process, both structural and control design variables are constantly changing; therefore, the dynamic characteristics of the structure are also changing. This presents a problem in obtaining a reduced-order model for active control design and analysis which will be valid for all design points within the design space. In other words, the frequency and number of the significant modes of the structure (modes that should be included) may vary considerably throughout the design process. This is also true as the locations and/or masses of the sensors and actuators change. Moreover, since the number of design evaluations in the integrated design process could easily run into thousands, any feasible order-reduction method should not require model reduction analysis at every design iteration. In this paper a novel and efficient technique for model reduction in the integrated controls-structures design process, which addresses these issues, is presented.
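
    The paper's contribution is a basis that stays valid as the design changes; that method is not reproduced here. The classical baseline it improves on is modal truncation: solve the generalized eigenproblem K φ = λ M φ, retain the lowest-frequency modes, and project the structural matrices onto that basis. A minimal sketch of that baseline for a spring-mass chain; all sizes and the retained-mode count are illustrative.

```python
import numpy as np
from scipy.linalg import eigh

n, k, m = 20, 1000.0, 1.0

# Stiffness (tridiagonal spring chain, fixed ends) and mass matrices
K = k * (2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1))
M = m * np.eye(n)

# Generalized eigenproblem K phi = lambda M phi, eigenvalues ascending
lam, Phi = eigh(K, M)

r = 4                      # retain the 4 lowest-frequency modes
Phi_r = Phi[:, :r]
K_r = Phi_r.T @ K @ Phi_r  # reduced (r x r) stiffness matrix
M_r = Phi_r.T @ M @ Phi_r  # reduced mass (identity for mass-normalized modes)
print("retained frequencies [rad/s]:", np.sqrt(lam[:r]).round(2))
```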

  9. Space bioreactor: Design/process flow

    NASA Technical Reports Server (NTRS)

    Cross, John H.

    1987-01-01

    The design of the space bioreactor stems from three considerations. First and foremost, it must sustain cells in microgravity. Closely related is the ability to take advantage of weightlessness and microgravity. Lastly, it should fit into a bioprocess. The design of the space bioreactor is described in view of these considerations. A flow chart of the bioreactor is presented and discussed.

  10. 77 FR 41248 - Disaster Designation Process

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-13

    … substantially affected by a natural disaster in a designated disaster county. Disaster designations have been … would be considered a disaster area. This rule also revises the definition of "natural disaster" to be … conditions from the definition of "natural disaster" could lead to potential program abuse and fraud. It …

  11. Lunar fiberglass: Properties and process design

    NASA Technical Reports Server (NTRS)

    Dalton, Robert; Nichols, Todd

    1987-01-01

    A Clemson University ceramic engineering design for a lunar fiberglass plant is presented. The properties of glass fibers and metal-matrix composites are examined. Lunar geology is also discussed. A raw material and site are selected based on this information. A detailed plant design is presented, and summer experiments to be carried out at Johnson Space Center are reviewed.

  12. Model-Based Design of Biochemical Microreactors.

    PubMed

    Elbinger, Tobias; Gahn, Markus; Neuss-Radu, Maria; Hante, Falk M; Voll, Lars M; Leugering, Günter; Knabner, Peter

    2016-01-01

    Mathematical modeling of biochemical pathways is an important resource in Synthetic Biology, as the predictive power of simulating synthetic pathways represents an important step in the design of synthetic metabolons. In this paper, we are concerned with the mathematical modeling, simulation, and optimization of metabolic processes in biochemical microreactors able to carry out enzymatic reactions and to exchange metabolites with their surrounding medium. The results of the reported modeling approach are incorporated in the design of the first microreactor prototypes that are under construction. These microreactors consist of compartments separated by membranes carrying specific transporters for the input of substrates and export of products. Inside the compartments of the reactor multienzyme complexes assembled on nano-beads by peptide adapters are used to carry out metabolic reactions. The spatially resolved mathematical model describing the ongoing processes consists of a system of diffusion equations together with boundary and initial conditions. The boundary conditions model the exchange of metabolites with the neighboring compartments and the reactions at the surface of the nano-beads carrying the multienzyme complexes. Efficient and accurate approaches for numerical simulation of the mathematical model and for optimal design of the microreactor are developed. As a proof-of-concept scenario, a synthetic pathway for the conversion of sucrose to glucose-6-phosphate (G6P) was chosen. In this context, the mathematical model is employed to compute the spatio-temporal distributions of the metabolite concentrations, as well as application relevant quantities like the outflow rate of G6P. These computations are performed for different scenarios, where the number of beads as well as their loading capacity are varied. The computed metabolite distributions show spatial patterns, which differ for different experimental arrangements. Furthermore, the total output of G6P

  14. Modeling climate related feedback processes

    SciTech Connect

    Elzen, M.G.J. den; Rotmans, J. )

    1993-11-01

    In order to assess their impact, the feedbacks which at present can be quantified reasonably are built into the Integrated Model to Assess the Greenhouse Effect (IMAGE). Unlike previous studies, this study describes the scenario- and time-dependent role of biogeochemical feedbacks. A number of simulation experiments are performed with IMAGE to project climate changes. Besides estimates of their absolute importance, the relative importance of individual biogeochemical feedbacks is considered by calculating the gain for each feedback process. This study focuses on feedback processes in the carbon cycle and the methane (semi-) cycle. Modeled feedbacks are then used to balance the past and present carbon budget. This results in substantially lower projections for atmospheric carbon dioxide than the Intergovernmental Panel on Climate Change (IPCC) estimates. The difference is approximately 18% from the 1990 level for the IPCC "Business-as-Usual" scenario. Furthermore, the IPCC's "best guess" value of the CO2 concentration in the year 2100 falls outside the uncertainty range estimated with our balanced modeling approach. For the IPCC "Business-as-Usual" scenario, the calculated total gain of the feedbacks within the carbon cycle appears to be negative, a result of the dominant role of the fertilization feedback. This study also shows that if temperature feedbacks on methane emissions from wetlands, rice paddies, and hydrates do materialize, methane concentrations might be increased by 30% by 2100. 70 refs., 17 figs., 7 tabs.
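
    The "gain" computed for each feedback is the standard linear feedback measure: with total gain g = Σ gᵢ, the equilibrium response equals the no-feedback response times 1/(1 − g), so a negative total gain (as found here for the carbon cycle) damps the response. A small illustration with made-up gain values, not the paper's:

```python
# Hypothetical gains for individual biogeochemical feedbacks (illustrative)
gains = {"CO2 fertilization": -0.25,   # negative gain damps the response
         "soil respiration":   0.10,
         "wetland methane":    0.08}

g_total = sum(gains.values())
amplification = 1.0 / (1.0 - g_total)   # linear feedback relation

dT0 = 2.5                               # no-feedback response [K]
print(f"total gain {g_total:+.2f}, amplification {amplification:.2f}")
print(f"response with feedbacks: {dT0 * amplification:.2f} K")
```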

  15. Multimedia Learning Design Pedagogy: A Hybrid Learning Model

    ERIC Educational Resources Information Center

    Tsoi, Mun Fie; Goh, Ngoh Khang; Chia, Lian Sai

    2005-01-01

    This paper provides insights on a hybrid learning model for multimedia learning design conceptualized from the Piagetian science learning cycle model and the Kolb's experiential learning model. This model represents learning as a cognitive process in a cycle of four phases, namely, Translating, Sculpting, Operationalizing, and Integrating and is…

  16. Model based control of polymer composite manufacturing processes

    NASA Astrophysics Data System (ADS)

    Potaraju, Sairam

    2000-10-01

    The objective of this research is to develop tools that help process engineers design, analyze and control polymeric composite manufacturing processes to achieve higher productivity and cost reduction. Current techniques for process design and control of composite manufacturing suffer from the paucity of good process models that can accurately represent these nonlinear systems. Existing models developed by researchers in the past are process- and operation-specific, so generating new simulation models is time consuming and requires significant effort. To address this issue, an Object Oriented Design (OOD) approach is used to develop a component-based model building framework. Process models for two commonly used industrial processes (injected pultrusion and autoclave curing) are developed using this framework to demonstrate its flexibility. Steady state and dynamic validation of this simulator is performed using a bench-scale injected pultrusion process. This simulator could not be implemented online for control due to computational constraints. Models that are fast enough for online implementation, with nearly the same degree of accuracy, are developed using a two-tier scheme. First, lower dimensional models that capture the essential resin flow, heat transfer, and cure kinetics important from a process monitoring and control standpoint are formulated. The second step is to reduce these low dimensional models to Reduced Order Models (ROM) suited for online model based estimation, control, and optimization. Model reduction is carried out using the Proper Orthogonal Decomposition (POD) technique in conjunction with a Galerkin formulation procedure. Subsequently, a nonlinear model-based estimation and inferential control scheme based on the ROM is implemented. In particular, this research work contributes in the following general areas: (1) design and implementation of versatile frameworks for modeling and simulation of manufacturing processes using object
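
    The POD step described here starts from the SVD of a snapshot matrix: the leading left singular vectors are the POD modes, and the energy captured by the first r modes guides the truncation before Galerkin projection. A minimal numpy sketch on synthetic snapshots, standing in for the full simulator's temperature/cure fields:

```python
import numpy as np

# Snapshot matrix: 200 spatial points x 50 time snapshots of a field
# (synthetic stand-in for profiles produced by the full process model)
x = np.linspace(0, 1, 200)[:, None]
t = np.linspace(0, 1, 50)[None, :]
snapshots = np.sin(np.pi * x) * np.exp(-t) + 0.1 * np.sin(3 * np.pi * x) * t

# POD modes = left singular vectors of the mean-centered snapshots
mean = snapshots.mean(axis=1, keepdims=True)
U, s, _ = np.linalg.svd(snapshots - mean, full_matrices=False)

energy = np.cumsum(s**2) / np.sum(s**2)
r = int(np.searchsorted(energy, 0.999)) + 1   # modes for 99.9% of the energy
print("modes retained:", r)

# Galerkin-style reduction: evolve r coefficients a(t) instead of the
# full field; reconstruct as mean + U[:, :r] @ a
a = U[:, :r].T @ (snapshots - mean)
reconstruction = mean + U[:, :r] @ a
print("max reconstruction error:", np.abs(reconstruction - snapshots).max())
```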

  17. Modeling Stem Cell Induction Processes

    PubMed Central

    Grácio, Filipe; Cabral, Joaquim; Tidor, Bruce

    2013-01-01

    Technology for converting human cells to pluripotent stem cells using induction processes has the potential to revolutionize regenerative medicine. However, the production of these so-called iPS cells is still quite inefficient and may be dominated by stochastic effects. In this work we build mass-action models of the core regulatory elements controlling stem cell induction and maintenance. The models include not only the network of transcription factors NANOG, OCT4, SOX2, but also important epigenetic regulatory features of DNA methylation and histone modification. We show that the network topology reported in the literature is consistent with the observed experimental behavior of bistability and inducibility. Based on simulations of stem cell generation protocols, and in particular focusing on changes in epigenetic cellular states, we show that cooperative and independent reaction mechanisms have experimentally identifiable differences in the dynamics of reprogramming, and we analyze such differences and their biological basis. It had been argued that stochastic and elite models of stem cell generation represent distinct fundamental mechanisms. Work presented here suggests an alternative possibility that they represent differences in the amount of information we have about the distribution of cellular states before and during reprogramming protocols. We show further that unpredictability and variation in reprogramming decreases as the cell progresses along the induction process, and that identifiable groups of cells with elite-seeming behavior can come about by a stochastic process. Finally we show how different mechanisms and kinetic properties impact the prospects of improving the efficiency of iPS cell generation protocols. PMID:23667423

  18. Making designer mutants in model organisms.

    PubMed

    Peng, Ying; Clark, Karl J; Campbell, Jarryd M; Panetta, Magdalena R; Guo, Yi; Ekker, Stephen C

    2014-11-01

    Recent advances in the targeted modification of complex eukaryotic genomes have unlocked a new era of genome engineering. From the pioneering work using zinc-finger nucleases (ZFNs), to the advent of the versatile and specific TALEN systems, and most recently the highly accessible CRISPR/Cas9 systems, we now possess an unprecedented ability to analyze developmental processes using sophisticated designer genetic tools. In this Review, we summarize the common approaches and applications of these still-evolving tools as they are being used in the most popular model developmental systems. Excitingly, these robust and simple genomic engineering tools also promise to revolutionize developmental studies using less well established experimental organisms.

  19. Integrated Language Design and Implementation Process

    DTIC Science & Technology

    2000-03-28

    …the language technology of the target environment … (MSL) [28] developed as part of the Software Design for Reliability and Reuse (SDRR) project [4…

  20. Model-Based Engineering Design for Trade Space Exploration throughout the Design Cycle

    NASA Technical Reports Server (NTRS)

    Lamassoure, Elisabeth S.; Wall, Stephen D.; Easter, Robert W.

    2004-01-01

    This paper presents ongoing work to standardize model-based systems engineering as a complement to point-design development in the conceptual design phase of deep space missions. It summarizes the first two steps toward practical application of this capability within the framework of concurrent engineering design teams and their customers. The first step is the standard generation of system sensitivity models as the output of concurrent engineering design sessions, representing the local trade space around a point design. A review of the chosen model development process and the results of three case-study examples demonstrate that a simple update to the concurrent engineering design process can easily capture sensitivities to key requirements. It can serve as a valuable tool for analyzing design drivers and uncovering breakpoints in the design. The second step is the development of rough-order-of-magnitude, broad-range-of-validity design models for rapid exploration of the trade space, before selection of a point design. At least one case study demonstrated the feasibility of generating such models in a concurrent engineering session. The experiment indicated that such a capability could yield valid system-level conclusions for a trade space composed of understood elements. Ongoing efforts are assessing the practicality of developing end-to-end system-level design models for use before even convening the first concurrent engineering session, starting with modeling an end-to-end Mars architecture.
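
    A minimal sketch of how a local trade space around a point design can be captured as a sensitivity matrix, assuming a hypothetical two-parameter, two-metric system model; the paper's actual models are produced by concurrent engineering sessions.

        import numpy as np

        # Hypothetical system model: maps a design/requirement vector to system
        # metrics (e.g., mass and cost). Stands in for a design-session output.
        def system_model(p):
            power, data_rate = p
            mass = 100.0 + 2.5 * power + 0.8 * data_rate
            cost = 50.0 + 1.2 * power + 3.0 * data_rate
            return np.array([mass, cost])

        p0 = np.array([40.0, 10.0])           # the point design
        y0 = system_model(p0)

        # One-at-a-time finite differences: rows = metrics, cols = parameters.
        eps = 1e-3
        J = np.empty((y0.size, p0.size))
        for j in range(p0.size):
            dp = np.zeros_like(p0)
            dp[j] = eps * max(1.0, abs(p0[j]))
            J[:, j] = (system_model(p0 + dp) - y0) / dp[j]

        print("local sensitivity matrix:\n", J)  # linear trade space around p0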

  1. Hydrothermal processing of Hanford tank wastes: Process modeling and control

    SciTech Connect

    Currier, R.P.

    1994-10-01

    In the Los Alamos National Laboratory (LANL) hydrothermal process, waste streams are first pressurized and heated as they pass through a continuous-flow tubular reactor vessel. The waste is maintained at reaction temperatures of 300-550 °C, where organic destruction and sludge reformation occur. This report documents LANL activities in process modeling and control undertaken in FY94 to support hydrothermal process development. Key issues discussed include non-ideal flow patterns (e.g., axial dispersion) and their effect on reactor performance, the use and interpretation of inert tracer experiments, and the use of computational fluid mechanics to evaluate novel hydrothermal reactor designs. In addition, the effects of axial dispersion (and of simplifications to rate expressions) on the estimated kinetic parameters are explored by non-linear regression against experimental data. Safety-related calculations are reported that estimate the explosion limits of effluent gases and the fate of hydrogen as it passes through the reactor. The development and numerical solution of a generalized one-dimensional mathematical model are also summarized, as are the difficulties encountered in using commercially available software to correlate the behavior of high-temperature, high-pressure aqueous electrolyte mixtures. Finally, details of the control system and of experiments conducted to empirically determine the system response are reported.
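
    For the tracer-interpretation step, a hedged sketch: fitting a tanks-in-series residence-time distribution (a common surrogate for the axial-dispersion model) to tracer data by nonlinear regression. The data here are synthetic and the parameter values illustrative, not the report's.

        import numpy as np
        from scipy.optimize import curve_fit
        from scipy.special import gamma

        # Tanks-in-series residence-time distribution, often used as a surrogate
        # for the axial-dispersion model when interpreting inert tracer tests.
        def rtd(t, tau, N):
            theta = t / tau
            return (N / tau) * (N * theta) ** (N - 1) * np.exp(-N * theta) / gamma(N)

        # Synthetic tracer response (the report fits real reactor data instead).
        t = np.linspace(0.1, 30, 120)
        rng = np.random.default_rng(1)
        meas = rtd(t, tau=8.0, N=6.0) + 0.002 * rng.standard_normal(t.size)

        (tau_fit, N_fit), _ = curve_fit(rtd, t, meas, p0=(5.0, 3.0))
        print(f"tau ~ {tau_fit:.2f}, N ~ {N_fit:.2f}  (larger N = less dispersion)")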

  2. Collapse models and perceptual processes

    NASA Astrophysics Data System (ADS)

    Carlo Ghirardi, Gian; Romano, Raffaele

    2014-04-01

    Theories including a collapse mechanism were presented several years ago. They are based on a modification of standard quantum mechanics in which nonlinear and stochastic terms are added to the evolution equation. Their principal merit derives from the fact that they are mathematically precise schemes accounting, on the basis of a unique universal dynamical principle, for the quantum behavior of microscopic systems as well as for the reduction associated with measurement processes and for the classical behavior of macroscopic objects. Since such theories qualify not as new interpretations but as modifications of the standard theory, they can, in principle, be tested against quantum mechanics. Recently, various investigations identifying possible crucial tests have been discussed. In spite of the extreme difficulty of performing such tests, recent technological developments seem at least to allow precise limits to be placed on the parameters characterizing the modifications of the evolution equation. Here we briefly mention some of the recent investigations in this direction, while concentrating our attention on the way in which collapse theories account for definite perceptual processes. The differences between reductions induced by perceptions and those related to measurement procedures performed with standard macroscopic devices are discussed. On this basis, we suggest a precise experimental test of collapse theories involving conscious observers. By discussing a toy model in detail, we make it plausible that the modified dynamics can give rise to quite small but systematic errors in the visual perceptual process.

  3. Study on Product Innovative Design Process Driven by Ideal Solution

    NASA Astrophysics Data System (ADS)

    Zhang, Fuying; Lu, Ximei; Wang, Ping; Liu, Hui

    Product innovative design in companies today relies heavily on individual members' experience and creative ideation, as well as on their skill in agilely integrating creativity and innovation tools with design methods. Creative ideation and inventive idea generation are two crucial stages in the product innovative design process. The ideal solution is the desired final idea for a given problem and the target toward which product design strives. In this paper, a product innovative design process driven by the ideal solution is proposed. This design process encourages designers to overcome their psychological inertia and to foster creativity in a systematic way, acquiring breakthrough creative and innovative solutions within a progressively narrowing solution-seeking space, and it rapidly results in effective product innovative design. A case study is also presented to illustrate the effectiveness of the proposed design process.

  4. Design-Tradeoff Model For Space Station

    NASA Technical Reports Server (NTRS)

    Chamberlain, Robert G.; Smith, Jeffrey L.; Borden, Chester S.; Deshpande, Govind K.; Fox, George; Duquette, William H.; Dilullo, Larry A.; Seeley, Larry; Shishko, Robert

    1990-01-01

    System Design Tradeoff Model (SDTM) computer program produces information which helps to enforce consistency of design objectives throughout system. Mathematical model of set of possible designs for Space Station Freedom. Program finds particular design enabling station to provide specified amounts of resources to users at lowest total (or life-cycle) cost. Compares alternative design concepts by changing set of possible designs, while holding specified services to users constant, and then comparing costs. Finally, both costs and services varied simultaneously when comparing different designs. Written in Turbo C 2.0.
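
    SDTM's internals are not given here, so the following is only a generic sketch of the stated idea, finding the design that provides specified resources at lowest cost, posed as a small linear program with hypothetical subsystems and coefficients (in Python rather than the original Turbo C).

        import numpy as np
        from scipy.optimize import linprog

        # Hypothetical stand-in for a design-tradeoff model: choose sizes of three
        # subsystem options to meet required power and crew-time resources at
        # minimum life-cycle cost.
        cost = np.array([4.0, 7.0, 3.0])          # cost per unit of each option
        # Resources delivered per unit: rows = (power, crew-time).
        A = np.array([[2.0, 5.0, 1.0],
                      [1.0, 2.0, 3.0]])
        required = np.array([20.0, 12.0])

        # linprog minimizes c @ x subject to A_ub @ x <= b_ub, so flip signs
        # to express the ">= required" resource constraints.
        res = linprog(cost, A_ub=-A, b_ub=-required, bounds=[(0, None)] * 3)
        print("design vector:", res.x, " life-cycle cost:", res.fun)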

  5. Learning Objects: A User-Centered Design Process

    ERIC Educational Resources Information Center

    Branon, Rovy F., III

    2011-01-01

    Design research systematically creates or improves processes, products, and programs through an iterative progression connecting practice and theory (Reinking, 2008; van den Akker, 2006). Developing new instructional systems design (ISD) processes through design research is necessary when new technologies emerge that challenge existing practices…

  6. Designing Educative Curriculum Materials: A Theoretically and Empirically Driven Process

    ERIC Educational Resources Information Center

    Davis, Elizabeth A.; Palincsar, Annemarie Sullivan; Arias, Anna Maria; Bismack, Amber Schultz; Marulis, Loren M.; Iwashyna, Stefanie K.

    2014-01-01

    In this article, the authors argue for a design process in the development of educative curriculum materials that is theoretically and empirically driven. Using a design-based research approach, they describe their design process for incorporating educative features intended to promote teacher learning into existing, high-quality curriculum…

  7. VCM Process Design: An ABET 2000 Fully Compliant Project

    ERIC Educational Resources Information Center

    Benyahia, Farid

    2005-01-01

    A long experience in undergraduate vinyl chloride monomer (VCM) process design projects is shared in this paper. The VCM process design is shown to be fully compliant with ABET 2000 criteria by virtue of its abundance of chemical engineering principles and its integration of interpersonal and interdisciplinary skills in design, safety, economics, and…

  8. Knowledge and Processes in Design. DPS Final Report.

    ERIC Educational Resources Information Center

    Pirolli, Peter

    Four papers from a project concerning information-processing characterizations of the knowledge and processes involved in design are presented. The project collected and analyzed verbal protocols from instructional designers, architects, and mechanical engineers. A framework was developed for characterizing the problem spaces of design that…

  9. Learning Objects: A User-Centered Design Process

    ERIC Educational Resources Information Center

    Branon, Rovy F., III

    2011-01-01

    Design research systematically creates or improves processes, products, and programs through an iterative progression connecting practice and theory (Reinking, 2008; van den Akker, 2006). Developing new instructional systems design (ISD) processes through design research is necessary when new technologies emerge that challenge existing practices…

  10. Course Design Using an Authentic Studio Model

    ERIC Educational Resources Information Center

    Wilson, Jay R.

    2013-01-01

    Educational Technology and Design 879 is a graduate course that introduces students to the basics of video design and production. In an attempt to improve the learning experience for students a redesign of the course was implemented for the summer of 2011 that incorporated an authentic design studio model. The design studio approach is based on…

  11. Modeling of pulverized coal combustion processes in a vortex furnace of improved design. Part 2: Combustion of brown coal from the Kansk-Achinsk Basin in a vortex furnace

    NASA Astrophysics Data System (ADS)

    Krasinsky, D. V.; Salomatov, V. V.; Anufriev, I. S.; Sharypov, O. V.; Shadrin, E. Yu.; Anikin, Yu. A.

    2015-03-01

    This paper continues the description of study results for a steam boiler vortex furnace of improved design. For its full-scale configuration, numerical modeling of a three-dimensional turbulent two-phase reacting flow has been performed, with allowance for all the principal heat and mass transfer processes in the torch combustion of pulverized Berezovsk brown coal from the Kansk-Achinsk Basin. Detailed distributions of the velocity, temperature, concentration, and heat flux fields in different cross sections of the improved vortex furnace have been obtained, and the principal thermal-engineering and environmental characteristics of this furnace are given.

  12. NUMERICAL MODELING OF FINE SEDIMENT PHYSICAL PROCESSES.

    USGS Publications Warehouse

    Schoellhamer, David H.

    1985-01-01

    Fine sediment in channels, rivers, estuaries, and coastal waters undergoes several physical processes, including flocculation, floc disruption, deposition, bed consolidation, and resuspension. This paper presents a conceptual model and reviews mathematical models of these physical processes. Several general fine-sediment models that simulate some of these processes are reviewed. These general models do not directly simulate flocculation and floc disruption, but the conceptual model and existing functions are shown to model these two processes adequately for one set of laboratory data.
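
    As a concrete example of the exchange functions such models use, below is a sketch of the classical Krone-type deposition and Partheniades-type erosion fluxes; the parameter values are illustrative only.

        # Classical fine-sediment exchange functions of the kind reviewed in such
        # models: Krone-type deposition and Partheniades-type erosion.
        def deposition_flux(ws, C, tau_b, tau_d):
            """Deposition flux (kg/m^2/s); active only when the bed shear stress
            tau_b is below the critical stress for deposition tau_d."""
            return ws * C * max(0.0, 1.0 - tau_b / tau_d)

        def erosion_flux(M, tau_b, tau_e):
            """Erosion flux (kg/m^2/s); active only when tau_b exceeds the
            critical stress for erosion tau_e."""
            return M * max(0.0, tau_b / tau_e - 1.0)

        # Example: settling velocity 0.5 mm/s, concentration 0.2 kg/m^3.
        print(deposition_flux(ws=5e-4, C=0.2, tau_b=0.05, tau_d=0.08))
        print(erosion_flux(M=5e-5, tau_b=0.3, tau_e=0.2))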

  13. The PIC [Process Individualization Curriculum] Model: Structure with Humanistic Goals.

    ERIC Educational Resources Information Center

    Gow, Doris T.

    This paper describes a curriculum design model used to train research and development personnel under USOE-NIE funding. This design model, called PIC (Process Individualization Curriculum), was chosen for converting on-campus courses to extramural self-instructional courses. The curriculum specialists who work with professors to individualize their…

  14. Processes and Knowledge in Designing Instruction.

    ERIC Educational Resources Information Center

    Greeno, James G.; And Others

    Results from a study of problem solving in the domain of instructional design are presented. Subjects were eight teacher trainees who were recent graduates of or were enrolled in the Stanford Teacher Education Program at Stanford University (California). Subjects studied a computer-based tutorial--the VST2000--about a fictitious vehicle. The…

  15. Design Criteria for Process Wastewater Pretreatment Facilities

    DTIC Science & Technology

    1988-05-01

    Only table-of-contents and table fragments are recoverable from this record; the report covers pretreatment processes including oil/water separation, air stripping, chemical reduction, reverse osmosis, granular activated carbon adsorption, biological treatment, and chemical precipitation, along with pretreatment process removal-efficiency ranges, average achievable effluent concentrations, and the combining of waste streams prior to treatment.

  16. Understanding the Processes behind Student Designing: Cases from Singapore

    ERIC Educational Resources Information Center

    Lim, Susan Siok Hiang; Lim-Ratnam, Christina; Atencio, Matthew

    2013-01-01

    A common perception of designing is that it represents a highly complex activity that is manageable by only a few. However it has also been argued that all individuals are innately capable of designing. Taking up this latter view, we explored the processes behind student designing in the context of Design and Technology (D&T), a subject taught…

  17. Physiological modeling for hearing aid design

    NASA Astrophysics Data System (ADS)

    Bruce, Ian C.; Young, Eric D.; Sachs, Murray B.

    2002-05-01

    Physiological data from hearing-impaired cats suggest that conventional hearing aid signal-processing schemes do not restore normal auditory-nerve responses to a vowel [Miller et al., J. Acoust. Soc. Am. 101, 3602 (1997)] and can even produce anomalous and potentially confounding patterns of activity [Schilling et al., Hear. Res. 117, 57 (1998)]. These deficits in the neural representation may account at least partially for poor speech perception in some hearing aid users. An amplification scheme has been developed that produces neural responses to a vowel more like those seen in normal cats and that reduces confounding responses [Miller et al., J. Acoust. Soc. Am. 106, 2693 (1999)]. A physiologically accurate model of the normal and impaired auditory periphery would provide simpler and quicker testing of such potential hearing aid designs. Details of such a model, based on that of Zhang et al. [J. Acoust. Soc. Am. 109, 648 (2001)], will be presented. Model predictions suggest that impairment of both outer and inner hair cells contributes to the degraded representation of vowels in hearing-impaired cats. The model is currently being used to develop and test a generalization of the Miller et al. speech-processing algorithm described above to running speech. [Work supported by NIDCD Grants DC00109 and DC00023.]

  18. Instructional Design Models: Future Trends and Issues.

    ERIC Educational Resources Information Center

    Dick, Walter

    1981-01-01

    Reviews the current status of instructional design models and discusses the implications of trends in education, training, and society that are likely to influence instructional design in the 1980s. Five references are appended. (Author/LLS)

  19. Reducing Design Cycle Time and Cost Through Process Resequencing

    NASA Technical Reports Server (NTRS)

    Rogers, James L.

    2004-01-01

    In today's competitive environment, companies are under enormous pressure to reduce the time and cost of their design cycle. One method for reducing both time and cost is to develop an understanding of the flow of the design processes and the effects of the iterative subcycles that are found in complex design projects. Once these aspects are understood, the design manager can make decisions that take advantage of decomposition, concurrent engineering, and parallel processing techniques to reduce the total time and the total cost of the design cycle. One software tool that can aid in this decision-making process is the Design Manager's Aid for Intelligent Decomposition (DeMAID). The DeMAID software minimizes the feedback couplings that create iterative subcycles, groups processes into iterative subcycles, and decomposes the subcycles into a hierarchical structure. The real benefits of producing the best design in the least time and at a minimum cost are obtained from sequencing the processes in the subcycles.
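
    DeMAID's exact heuristics are not given in this abstract, so the following is a toy sketch of the underlying idea: represent the processes as a design structure matrix and search for an ordering that minimizes feedback couplings (the source of iterative subcycles).

        import itertools
        import numpy as np

        # Design structure matrix: dsm[i, j] = 1 if process i needs output of j.
        # A dependency on a process executed later is a feedback coupling, which
        # forces an iterative subcycle. This toy matrix contains one cycle, so at
        # least one feedback is unavoidable.
        dsm = np.array([[0, 1, 0, 0],
                        [0, 0, 0, 1],
                        [1, 0, 0, 0],
                        [0, 0, 1, 0]])

        def feedbacks(order):
            pos = {p: k for k, p in enumerate(order)}
            return sum(1 for i, j in zip(*np.nonzero(dsm)) if pos[j] > pos[i])

        # Exhaustive search is fine for a toy problem; DeMAID uses smarter methods.
        best = min(itertools.permutations(range(len(dsm))), key=feedbacks)
        print("best sequence:", best, "feedback couplings:", feedbacks(best))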

  20. Instructional Design Models: What a Revolution!

    ERIC Educational Resources Information Center

    Faryadi, Qais

    2007-01-01

    This review examines instructional design models and the construction of knowledge. It further seeks to identify the benefits of these models for the inputs and outputs of knowledge transfer. This assessment also attempts to define instructional design models through the eyes and the minds of renowned scholars as well as the most…

  1. Development and validation of a building design waste reduction model.

    PubMed

    Llatas, C; Osmani, M

    2016-10-01

    Reduction in construction waste is a pressing need in many countries. The design of building elements is considered a pivotal process for achieving waste reduction at source, as it enables an informed prediction of wastage reduction levels. However, the lack of quantitative methods linking design strategies to waste reduction hinders designing-out-waste practice in building projects. This paper therefore addresses this knowledge gap through the design and validation of a Building Design Waste Reduction Strategies (Waste ReSt) model that investigates the relationships between design variables and their impact on onsite waste reduction. The Waste ReSt model was validated in a real-world case study involving 20 residential buildings in Spain. The validation process comprised three stages. First, design waste causes were analyzed. Second, design strategies were applied, leading to several alternative low-waste building elements. Finally, their potential source-reduction levels were quantified and discussed within the context of the literature. The Waste ReSt model could serve as an instrumental tool for simulating designing-out-waste strategies in building projects. The knowledge provided by the model could help project stakeholders better understand the correlation between the design process and waste sources and subsequently implement design practices for low-waste buildings.
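
    The Waste ReSt model's equations are not reproduced here; as a hedged sketch of the general quantification idea, the snippet below estimates baseline waste per building element as quantity times waste ratio and applies a fractional source-reduction factor attributed to a design strategy. All element names and numbers are hypothetical.

        # Generic sketch (not the Waste ReSt model's actual equations): estimate
        # waste per building element, then apply the fractional reduction
        # attributed to a design strategy.
        elements = [
            # (element, quantity m^2, baseline waste ratio kg/m^2, reduction)
            ("brick partition", 1200.0, 9.0, 0.30),   # e.g. modular coordination
            ("concrete slab",   2500.0, 4.0, 0.15),   # e.g. prefabrication
            ("plasterboard",     900.0, 3.5, 0.40),   # e.g. standard panel sizes
        ]

        for name, qty, ratio, red in elements:
            baseline = qty * ratio
            reduced = baseline * (1.0 - red)
            print(f"{name:18s} baseline {baseline:8.0f} kg -> designed {reduced:8.0f} kg")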

  2. Micromachined accelerometer design, modeling and validation

    SciTech Connect

    Davies, B.R.; Bateman, V.I.; Brown, F.A.; Montague, S.; Murray, J.R.; Rey, D.; Smith, J.H.

    1998-04-01

    Micromachining technologies enable the development of low-cost devices capable of sensing motion in a reliable and accurate manner. The development of various surface-micromachined accelerometers and gyroscopes to sense motion is an ongoing activity at Sandia National Laboratories. In addition, Sandia has developed a fabrication process for integrating both the micromechanical structures and the microelectronics circuitry of Micro-Electro-Mechanical Systems (MEMS) on the same chip. This integrated surface-micromachining process provides substantial performance and reliability advantages in the development of MEMS accelerometers and gyros. A Sandia MEMS team developed a single-axis, micromachined silicon accelerometer capable of surviving and measuring very high accelerations, up to 50,000 times the acceleration due to gravity, or 50 k-G (actually measured to 46,000 G). The Sandia integrated surface-micromachining process was selected for fabrication of the sensor because of the extreme measurement sensitivity potential associated with integrated microelectronics. Measurement electronics capable of resolving attofarad (10^-18 F) changes in capacitance were required because of the very small accelerometer proof mass (< 200 x 10^-9 gram) used in this surface-micromachining process. The small proof mass corresponded to small sensor deflections, which in turn required very sensitive electronics to enable accurate acceleration measurement over a range of 1 to 50 k-G. A prototype sensor, based on a suspended-plate mass configuration, was developed, and the details of the design, modeling, and validation of the device are presented in this paper. The device was analyzed using both conventional lumped-parameter modeling techniques and finite element analysis tools. The device was tested and performed well over its design range.
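
    A minimal lumped-parameter sketch of the sensing principle described above, assuming a generic parallel-plate capacitive accelerometer with round-number parameters (not the Sandia device's actual geometry): the proof mass deflects as x = m*a/k, producing a capacitance change.

        # Lumped-parameter sketch of a capacitive MEMS accelerometer: a proof mass
        # m on a spring k deflects x = m*a/k, changing a parallel-plate capacitance.
        EPS0 = 8.854e-12      # vacuum permittivity, F/m
        m = 200e-12           # proof mass, kg (~200 nanograms, per the abstract)
        k = 500.0             # spring constant, N/m (illustrative)
        A = 1e-7              # plate area, m^2 (illustrative)
        d0 = 2e-6             # nominal gap, m (illustrative)
        g = 9.81

        for n_g in (1, 1000, 50000):        # 1 G to 50 k-G
            x = m * n_g * g / k              # static deflection
            dC = EPS0 * A * (1/(d0 - x) - 1/d0)
            print(f"{n_g:>6} G: deflection {x*1e9:8.3f} nm, dC {dC*1e18:10.3f} aF")
        # At 1 G the change is below one attofarad, which is why such sensitive
        # integrated measurement electronics were required.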

  3. NASA Now: Engineering Design Process: Hubble Space Telescope

    NASA Image and Video Library

    In this episode of NASA Now, NASA engineer Russ Werneth discusses the continuous nature of the engineering design process and shares what it was like to design and plan the spacewalks that were key...

  4. Integrating Thermal Tools Into the Mechanical Design Process

    NASA Technical Reports Server (NTRS)

    Tsuyuki, Glenn T.; Siebes, Georg; Novak, Keith S.; Kinsella, Gary M.

    1999-01-01

    The intent of mechanical design is to deliver a hardware product that meets or exceeds customer expectations, while reducing cycle time and cost. To this end, an integrated mechanical design process enables the idea of parallel development (concurrent engineering). This represents a shift from the traditional mechanical design process. With such a concurrent process, there are significant issues that have to be identified and addressed before re-engineering the mechanical design process to facilitate concurrent engineering. These issues also assist in the integration and re-engineering of the thermal design sub-process since it resides within the entire mechanical design process. With these issues in mind, a thermal design sub-process can be re-defined in a manner that has a higher probability of acceptance, thus enabling an integrated mechanical design process. However, the actual implementation is not always problem-free. Experience in applying the thermal design sub-process to actual situations provides the evidence for improvement, but more importantly, for judging the viability and feasibility of the sub-process.

  5. Integrating Thermal Tools Into the Mechanical Design Process

    NASA Technical Reports Server (NTRS)

    Tsuyuki, Glenn T.; Siebes, Georg; Novak, Keith S.; Kinsella, Gary M.

    1999-01-01

    The intent of mechanical design is to deliver a hardware product that meets or exceeds customer expectations, while reducing cycle time and cost. To this end, an integrated mechanical design process enables the idea of parallel development (concurrent engineering). This represents a shift from the traditional mechanical design process. With such a concurrent process, there are significant issues that have to be identified and addressed before re-engineering the mechanical design process to facilitate concurrent engineering. These issues also assist in the integration and re-engineering of the thermal design sub-process since it resides within the entire mechanical design process. With these issues in mind, a thermal design sub-process can be re-defined in a manner that has a higher probability of acceptance, thus enabling an integrated mechanical design process. However, the actual implementation is not always problem-free. Experience in applying the thermal design sub-process to actual situations provides the evidence for improvement, but more importantly, for judging the viability and feasibility of the sub-process.

  6. Conceptual design of distillation-based hybrid separation processes.

    PubMed

    Skiborowski, Mirko; Harwardt, Andreas; Marquardt, Wolfgang

    2013-01-01

    Hybrid separation processes combine different separation principles and constitute a promising design option for the separation of complex mixtures. Particularly, the integration of distillation with other unit operations can significantly improve the separation of close-boiling or azeotropic mixtures. Although the design of single-unit operations is well understood and supported by computational methods, the optimal design of flowsheets of hybrid separation processes is still a challenging task. The large number of operational and design degrees of freedom requires a systematic and optimization-based design approach. To this end, a structured approach, the so-called process synthesis framework, is proposed. This article reviews available computational methods for the conceptual design of distillation-based hybrid processes for the separation of liquid mixtures. Open problems are identified that must be addressed to finally establish a structured process synthesis framework for such processes.

  7. The Process of Soviet Weapons Design

    DTIC Science & Technology

    1978-03-01

    Only fragments are recoverable from this record; they mention the derivation of a weapon system on the BMP from an early-1940s German design, the absence of an American response to the 1940 Soviet publication of the discovery of spontaneous fission, and sources including I. N. Golovin, I. V. Kurchatov (Atomizdat, Moscow, 1973) and Herbert York, The Advisors: Oppenheimer, Teller, and the Superbomb.

  8. Hotspot detection and design recommendation using silicon calibrated CMP model

    NASA Astrophysics Data System (ADS)

    Hui, Colin; Wang, Xian Bin; Huang, Haigou; Katakamsetty, Ushasree; Economikos, Laertis; Fayaz, Mohammed; Greco, Stephen; Hua, Xiang; Jayathi, Subramanian; Yuan, Chi-Min; Li, Song; Mehrotra, Vikas; Chen, Kuang Han; Gbondo-Tugbawa, Tamba; Smith, Taber

    2009-03-01

    Chemical Mechanical Polishing (CMP) has been used in the manufacture of copper (Cu) damascene interconnects. It is well known that dishing and erosion occur during the CMP process and that they depend strongly on metal density and line width. The inherent thickness and topography variations become an increasing concern for today's designs running through advanced process nodes (sub-65 nm). Excessive thickness and topography variations can have major impacts on chip yield and performance; as such, they need to be accounted for during the design stage. In this paper, we demonstrate an accurate physics-based CMP model and its application to CMP-related hotspot detection. Model-based checking is most useful for identifying highly environment-sensitive layouts that are prone to early process-window limitation and hence failure. Model-based checking, as opposed to rule-based checking, can identify the weak points in a design more accurately and enable designers to improve the layout in the areas with the highest leverage for manufacturability improvement. Further, CMP modeling can provide information on interlevel effects, such as copper puddling from underlying topography, that cannot be captured in Design-for-Manufacturing (DfM) recommended rules. The model has been calibrated against silicon produced with the 45 nm process of the Common Platform (IBM-Chartered-Samsung) technology and is one of the earliest 45 nm CMP models available today. We show that CMP-related hotspots often occur around the spaces between analog macros and digital blocks in SoC designs. With the help of CMP model-based prediction, the design, the dummy fill, or the placement of the blocks can be modified to improve planarity and eliminate CMP-related hotspots. The CMP model can be used to pass design recommendations to designers to improve chip yield and performance.
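
    The silicon-calibrated model itself is proprietary, so the snippet below is only a schematic stand-in for model-based hotspot checking: predict per-window copper loss from pattern density and line width, then flag windows exceeding a threshold. Coefficients and thresholds are illustrative.

        import numpy as np

        # Schematic stand-in for a CMP thickness model: predicted post-polish
        # copper loss grows with pattern density and shrinking line width
        # (dishing/erosion trends). A real flow uses a silicon-calibrated model.
        def predicted_loss(density, linewidth_um, a=40.0, b=15.0):
            return a * density + b / np.maximum(linewidth_um, 0.05)   # nm

        density   = np.array([[0.2, 0.8], [0.5, 0.95]])   # per-window metal density
        linewidth = np.array([[0.10, 0.10], [1.0, 0.07]]) # per-window min width, um

        loss = predicted_loss(density, linewidth)
        hotspots = loss > 150.0                            # nm, illustrative spec
        print(np.argwhere(hotspots))   # windows flagged for dummy fill / re-layout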

  9. Computational models of natural language processing

    SciTech Connect

    Bara, B.G.; Guida, G.

    1984-01-01

    The main concern in this work is the illustration of models for natural language processing, and the discussion of their role in the development of computational studies of language. Topics covered include the following: competence and performance in the design of natural language systems; planning and understanding speech acts by interpersonal games; a framework for integrating syntax and semantics; knowledge representation and natural language: extending the expressive power of proposition nodes; viewing parsing as word sense discrimination: a connectionist approach; a propositional language for text representation; from topic and focus of a sentence to linking in a text; language generation by computer; understanding the Chinese language; semantic primitives or meaning postulates: mental models of propositional representations; narrative complexity based on summarization algorithms; using focus to constrain language generation; and towards an integral model of language competence.

  10. The application of image processing software: Photoshop in environmental design

    NASA Astrophysics Data System (ADS)

    Dong, Baohua; Zhang, Chunmi; Zhuo, Chen

    2011-02-01

    In the process of environmental design and creation, the design sketch holds a very important position: it not only illuminates the design's idea and concept but also shows the design's visual effects to the client. In the field of environmental design, computer-aided design has brought significant improvements, and many types of specialized software for rendering environmental designs and for artistic post-processing have been implemented. With the use of this software, working efficiency has greatly increased and drawings have become more specific and more specialized. By analyzing the application of the Photoshop image-processing software in environmental design, and by comparing traditional hand drawing with drawing using modern technology, this essay explores how computer technology can play a bigger role in environmental design.

  11. CAD tool environment for MEMS process design support

    NASA Astrophysics Data System (ADS)

    Schmidt, T.; Wagener, A.; Popp, J.; Hahn, K.; Bruck, R.

    2005-07-01

    MEMS fabrication processes are characterized by numerous usable process steps, materials, and effects for fabricating the intended microstructure. Up to now, CAD support in this domain has concentrated mainly on structural design (e.g., simulation programs on an FEM basis). These tools often assume fixed interfaces to the fabrication process, such as material parameters or design rules. Because MEMS design requires structural design (defining the lateral two-dimensional shapes) concurrently with process design (responsible for the third dimension), technology interfaces consisting only of static data sets are no longer sufficient. For successful design flows in these areas it is necessary to incorporate a higher degree of process-related data; a broader interface between process configuration on the one side and application design on the other seems to be needed. This paper proposes a novel approach: a process management system that allows the specification of processes for specific applications. The system is based on a dedicated database environment that can store and manage all process-related design constraints linked to the fabrication process data itself. The interdependencies between application-specific processes and all stages of the design flow are discussed, and the complete software system PRINCE, which meets the requirements of this new approach, is introduced. Building on the concurrent design methodology presented at the beginning of the paper, a system supporting application-specific process design is presented. The paper highlights the incorporated tools and the present status of the software system, and a complete configuration of a silicon thin-film process example demonstrates the usage of PRINCE.
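
    PRINCE's actual schema is not given in this abstract; the sketch below illustrates the general idea of a process management data model, where each process step carries the design constraints it imposes so the structural design flow can query them. Step names and rules are hypothetical.

        from dataclasses import dataclass, field

        # Illustrative data model (not PRINCE's actual schema) for storing process
        # steps together with the design constraints they impose, so the design
        # flow can query them instead of relying on static rule files.
        @dataclass
        class ProcessStep:
            name: str
            material: str
            thickness_um: float
            constraints: dict = field(default_factory=dict)  # e.g. min feature size

        @dataclass
        class ProcessFlow:
            steps: list = field(default_factory=list)

            def add(self, step: ProcessStep):
                self.steps.append(step)

            def design_rules(self):
                """Merge per-step constraints into the rules seen by the designer."""
                rules = {}
                for s in self.steps:
                    rules.update(s.constraints)
                return rules

        flow = ProcessFlow()
        flow.add(ProcessStep("LPCVD poly-Si", "poly-Si", 2.0,
                             {"min_linewidth_um": 1.0}))
        flow.add(ProcessStep("thermal oxide", "SiO2", 0.5,
                             {"min_spacing_um": 1.5}))
        print(flow.design_rules())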

  12. Adding Users to the Website Design Process

    ERIC Educational Resources Information Center

    Tomeo, Megan L.

    2012-01-01

    Alden Library began redesigning its website over a year ago. Throughout the redesign process the students, faculty, and staff that make up the user base were added to the conversation by utilizing several usability test methods. This article focuses on the usability testing conducted at Alden Library and delves into future usability testing, which…

  13. Design and Processing of Electret Structures

    DTIC Science & Technology

    2009-10-31

    Only figure-caption residue is recoverable from this record: guided colloidal crystallization and aggregation in galvanic micro reactor arrays (scale bars = 50 µm), with a schematic of the micro reactor whose patterned electrode consists of a 100 nm thick gold layer; the applications mentioned range from nano- and microfluidics to cloud seeding in the atmosphere, corrosion inhibition, and heterogeneous catalysis.

  14. Adding Users to the Website Design Process

    ERIC Educational Resources Information Center

    Tomeo, Megan L.

    2012-01-01

    Alden Library began redesigning its website over a year ago. Throughout the redesign process the students, faculty, and staff that make up the user base were added to the conversation by utilizing several usability test methods. This article focuses on the usability testing conducted at Alden Library and delves into future usability testing, which…

  15. Model medication management process in Australian nursing homes using business process modeling.

    PubMed

    Qian, Siyu; Yu, Ping

    2013-01-01

    One of the reasons end users avoid or reject health information systems is poor alignment of the system with healthcare workflow, likely caused by system designers' lack of thorough understanding of the healthcare process. Therefore, understanding the healthcare workflow is the essential first step in the design of optimal technologies that will enable care staff to complete the intended tasks faster and better. The frequent use of multiple or "high-risk" medicines by older people in nursing homes has the potential to increase the medication error rate. To facilitate the design of information systems with the most potential to improve patient safety, this study aims to understand the medication management process in nursing homes using a business process modeling method. The paper presents the study design and preliminary findings from interviews with two registered nurses, who were team leaders in two nursing homes. Although there were subtle differences in medication management between the two homes, the major medication management activities were similar. Further field observation will be conducted. Based on the data collected from the observations, an as-is process model for medication management will be developed.
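
    As a minimal illustration of encoding an as-is process model for analysis (the activity names below are illustrative, not the study's findings), a workflow can be represented as a directed graph of activities and walked end to end:

        # A minimal encoding of an "as-is" process model as a directed graph of
        # activities; each activity maps to the activities that may follow it.
        process = {
            "order received":          ["check medication chart"],
            "check medication chart":  ["prepare medication"],
            "prepare medication":      ["administer medication"],
            "administer medication":   ["document administration"],
            "document administration": [],
        }

        # Walk the happy path to verify the model is connected end to end.
        step, path = "order received", []
        while step:
            path.append(step)
            nxt = process[step]
            step = nxt[0] if nxt else None
        print(" -> ".join(path))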

  16. Designing user models in a virtual cave environment

    SciTech Connect

    Brown-VanHoozer, S.; Hudson, R.; Gokhale, N.

    1995-12-31

    In this paper, the results of a first study into the use of virtual reality for human factors studies and for the design of simple and complex models of control systems, components, and processes are described. The objective was to design a model in a virtual environment that would reflect more characteristics of the user's mental model of a system and fewer of the designer's. The technology of a CAVE(TM) virtual environment and the methodology of Neuro Linguistic Programming were employed in this study.

  17. Designing control system information models

    NASA Technical Reports Server (NTRS)

    Panin, K. I.; Zinchenko, V. P.

    1973-01-01

    Problems encountered in designing information models are discussed. The data cover the condition and functioning of the object of control and the environment involved in the control. Other parameters needed for the model include: (1) information for forming an image of the real situation, (2) data for analyzing and evaluating an evolving situation, (3) plans for actions, and (4) data for observing and evaluating the results of model realization.

  18. Open source clinical portals: a model for healthcare information systems to support care processes and feed clinical research. An Italian case of design, development, reuse, and exploitation.

    PubMed

    Locatelli, Paolo; Baj, Emanuele; Restifo, Nicola; Origgi, Gianni; Bragagia, Silvia

    2011-01-01

    Open source is a still-unexploited opportunity for healthcare organizations and technology providers to answer a growing demand for innovation and to combine economic benefits with a new way of managing hospital information systems. This chapter presents the case of the web enterprise clinical portal developed in Italy by Niguarda Hospital in Milan, with the support of Fondazione Politecnico di Milano, to enable a paperless environment for clinical and administrative activities in the ward. It also represents a rare case of open source technology and reuse in the healthcare sector, as the system's porting is now taking place at the Besta Neurological Institute in Milan. This institute is customizing the portal to feed researchers with structured clinical data collected in its portal's patient records, so that the data can be analyzed, e.g., through business intelligence tools. Both organizational and clinical advantages are investigated, from process monitoring, to semantic data structuring, to recognition of common patterns in care processes.

  19. Estimation of Environment-Related Properties of Chemicals for Design of Sustainable Processes: Development of Group-Contribution+ (GC+) Property Models and Uncertainty Analysis

    EPA Science Inventory

    The aim of this work is to develop group-contribution+ (GC+) method (combined group-contribution (GC) method and atom connectivity index (CI) method) based property models to provide reliable estimations of environment-related properties of organic chemicals together with uncert...

  20. Estimation of Environment-Related Properties of Chemicals for Design of Sustainable Processes: Development of Group-Contribution+ (GC+) Property Models and Uncertainty Analysis

    EPA Science Inventory

    The aim of this work is to develop group-contribution+ (GC+) method (combined group-contribution (GC) method and atom connectivity index (CI) method) based property models to provide reliable estimations of environment-related properties of organic chemicals together with uncert...
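
    A minimal sketch of the group-contribution part of such a model: a property is estimated as a constant plus the sum of occurrence counts times group contributions. The group names and contribution values below are hypothetical, not the GC+ model's fitted parameters.

        # Minimal sketch of a group-contribution estimate: a property is modeled
        # as a function of summed group contributions. Values are hypothetical.
        contributions = {"CH3": 0.41, "CH2": 0.22, "OH": 1.73}

        def estimate_property(groups, f0=1.0):
            """Simple linear GC form: P = f0 + sum(n_i * c_i)."""
            return f0 + sum(n * contributions[g] for g, n in groups.items())

        # 1-propanol as CH3 + 2 CH2 + OH (occurrence counts from the structure).
        print(estimate_property({"CH3": 1, "CH2": 2, "OH": 1}))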