Science.gov

Sample records for process modeling design

  1. Managing Analysis Models in the Design Process

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2006-01-01

    Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.

  2. Total Ship Design Process Modeling

    DTIC Science & Technology

    2012-04-30

    Microsoft Project® or Primavera®, and perform process simulations that can investigate risk, cost, and schedule trade-offs. Prior efforts to capture...planning in the face of disruption, delay, and late-changing requirements. ADePT is interfaced with Primavera, the AEC industry's favorite program

  3. Reinventing The Design Process: Teams and Models

    NASA Technical Reports Server (NTRS)

    Wall, Stephen D.

    1999-01-01

    The future of space mission design will be dramatically different from the past. Formerly, performance-driven paradigms emphasized data return, with cost and schedule being secondary issues. Now and in the future, costs are capped and schedules fixed; these two variables must be treated as independent in the design process. Accordingly, JPL has redesigned its design process. At the conceptual level, design times have been reduced by properly defining the required design depth, improving the linkages between tools, and managing team dynamics. In implementation-phase design, system requirements will be held in crosscutting models, linked to subsystem design tools through a central database that captures the design and supplies needed configuration management and control. Mission goals will then be captured in timelining software that drives the models, testing their capability to execute the goals. Metrics are used to measure and control both processes and to ensure that design parameters converge through the design process within schedule constraints. This methodology manages margins controlled by acceptable risk levels. Thus, teams can evolve risk tolerance (and cost) as they would any engineering parameter. This new approach allows more design freedom for a longer time, which tends to encourage revolutionary and unexpected improvements in design.

  4. Application of Process Modeling Tools to Ship Design

    DTIC Science & Technology

    2011-05-01

    NAVSEA, Frank Waldman (LATTIX), May 2011. Application of Process Modeling Tools to Ship Design. Challenges cited include ...design teams, long design schedules, and complicated acquisition procedures; commercial process modeling techniques are being applied for better...

  5. Process Model Construction and Optimization Using Statistical Experimental Design,

    DTIC Science & Technology

    1988-04-01

    Memo No. 88-442, March 1988. "Process Model Construction and Optimization Using Statistical Experimental Design," by Emanuel Sachs, Assistant Professor, and George Prueger. Abstract: A methodology is presented for the construction of process models by the combination of physically based mechanistic...
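
    As an illustration of the statistical-experimental-design side of this record, the sketch below fits an empirical process model to a two-level full-factorial experiment. It is a minimal sketch in Python/NumPy; the three coded factors and the response function are hypothetical stand-ins, not taken from the memo.

        import itertools
        import numpy as np

        np.random.seed(0)
        levels = [-1, 1]                                   # coded factor levels
        runs = np.array(list(itertools.product(levels, repeat=3)))  # 2^3 design

        def measure(x1, x2, x3):
            # Stand-in for a real process measurement (hypothetical response).
            return 5.0 + 1.2 * x1 - 0.8 * x2 + 0.5 * x1 * x3 + np.random.normal(0, 0.1)

        y = np.array([measure(*run) for run in runs])

        # Regressors: intercept, main effects, and two-way interactions.
        X = np.column_stack([
            np.ones(len(runs)),
            runs,                                          # x1, x2, x3
            runs[:, 0] * runs[:, 1],
            runs[:, 0] * runs[:, 2],
            runs[:, 1] * runs[:, 2],
        ])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        print("fitted effects:", np.round(coef, 3))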

  6. Modeling Web-Based Educational Systems: Process Design Teaching Model

    ERIC Educational Resources Information Center

    Rokou, Franca Pantano; Rokou, Elena; Rokos, Yannis

    2004-01-01

    Using modeling languages is essential to the construction of educational systems based on software engineering principles and methods. Furthermore, the instructional design is undoubtedly the cornerstone of the design and development of educational systems. Although several methodologies and languages have been proposed for the specification of…

  7. Process Cost Modeling for Multi-Disciplinary Design Optimization

    NASA Technical Reports Server (NTRS)

    Bao, Han P.; Freeman, William (Technical Monitor)

    2002-01-01

    For early design concepts, the conventional approach to cost is normally some kind of parametric weight-based cost model. There is now ample evidence that this approach can be misleading and inaccurate. By the nature of its development, a parametric cost model requires historical data and is valid only if the new design is analogous to those for which the model was derived. Advanced aerospace vehicles have no historical production data and are nowhere near the vehicles of the past. Using an existing weight-based cost model would only lead to errors and distortions of the true production cost. This report outlines the development of a process-based cost model in which the physical elements of the vehicle are costed according to a first-order dynamics model. This theoretical cost model, first advocated by early work at MIT, has been expanded to cover the basic structures of an advanced aerospace vehicle. Elemental costs based on the geometry of the design can be summed up to provide an overall estimation of the total production cost for a design configuration. This capability to directly link any design configuration to realistic cost estimation is a key requirement for high payoff MDO problems. Another important consideration in this report is the handling of part or product complexity. Here the concept of cost modulus is introduced to take into account variability due to different materials, sizes, shapes, precision of fabrication, and equipment requirements. The most important implication of the development of the proposed process-based cost model is that different design configurations can now be quickly related to their cost estimates in a seamless calculation process easily implemented on any spreadsheet tool. In successive sections, the report addresses the issues of cost modeling as follows. First, an introduction is presented to provide the background for the research work. Next, a quick review of cost estimation techniques is made with the intention to
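
    The elemental cost roll-up with a complexity modulus reduces to a very small computation, which is part of why the report notes it fits on any spreadsheet. A minimal sketch in Python; the process rates, moduli, and part data below are invented for illustration:

        # Process-based cost roll-up with a "cost modulus" for complexity.
        PROCESS_RATE = {"layup": 40.0, "machining": 65.0}   # $ per unit area (hypothetical)

        def element_cost(area, process, material_mod, shape_mod, precision_mod):
            """First-order cost: geometry-driven base cost scaled by moduli."""
            base = PROCESS_RATE[process] * area
            return base * material_mod * shape_mod * precision_mod

        parts = [
            # (area, process, material, shape, precision) -- hypothetical parts
            (12.0, "layup", 1.3, 1.1, 1.0),
            (4.5, "machining", 1.0, 1.4, 1.2),
        ]
        total = sum(element_cost(*p) for p in parts)
        print(f"estimated production cost: ${total:,.0f}")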

  8. A Concept Transformation Learning Model for Architectural Design Learning Process

    ERIC Educational Resources Information Center

    Wu, Yun-Wu; Weng, Kuo-Hua; Young, Li-Ming

    2016-01-01

    Generally, in the foundation course of architectural design, much emphasis is placed on teaching of the basic design skills without focusing on teaching students to apply the basic design concepts in their architectural designs or promoting students' own creativity. Therefore, this study aims to propose a concept transformation learning model to…

  9. Coal conversion systems design and process modeling. Volume 1: Application of MPPR and Aspen computer models

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The development of a coal gasification system design and mass and energy balance simulation program for the TVA and other similar facilities is described. The materials-process-product model (MPPM) and the Advanced System for Process Engineering (ASPEN) computer program were selected from available steady-state and dynamic models. The MPPM was selected to serve as the basis for the system-level design model structure because it provided the capability for process-block material and energy balances and high-level system sizing and costing. The ASPEN simulation serves as the basis for assessing detailed component models for the system design modeling program. The ASPEN components were analyzed to identify particular process blocks and data packages (physical properties) which could be extracted and used in the system design modeling program. While the ASPEN physical property calculation routines are capable of generating the physical properties required for process simulation, not all required physical property data are available; missing data must be user-entered.

  10. The engineering design process as a model for STEM curriculum design

    NASA Astrophysics Data System (ADS)

    Corbett, Krystal Sno

    Engaging pedagogics have been proven to be effective in the promotion of deep learning for science, technology, engineering, and mathematics (STEM) students. In many cases, academic institutions have shown a desire to improve education by implementing more engaging techniques in the classroom. The research framework established in this dissertation has been governed by the axiom that students should obtain a deep understanding of fundamental topics while being motivated to learn through engaging techniques. This research lays a foundation for future analysis and modeling of the curriculum design process where specific educational research questions can be considered using standard techniques. Further, a clear curriculum design process is a key step towards establishing an axiomatic approach for engineering education. A danger is that poor implementation of engaging techniques will counteract the intended effects. Poor implementation might provide students with a "fun" project, but not the desired deep understanding of the fundamental STEM content. Knowing that proper implementation is essential, this dissertation establishes a model for STEM curriculum design, based on the well-established engineering design process. Using this process as a perspective to model curriculum design allows for a structured approach. Thus, the framework for STEM curriculum design, established here, provides a guided approach for seamless integration of fundamental topics and engaging pedagogics. The main steps, or phases, in engineering design are: Problem Formulation, Solution Generation, Solution Analysis, and Solution Implementation. Layering engineering design with education curriculum theory, this dissertation establishes a clear framework for curriculum design. Through ethnographic engagement by this researcher, several overarching themes are revealed through the creation of curricula using the design process. The application of the framework to specific curricula was part of this

  11. Predictive Modeling in Plasma Reactor and Process Design

    NASA Technical Reports Server (NTRS)

    Hash, D. B.; Bose, D.; Govindan, T. R.; Meyyappan, M.; Arnold, James O. (Technical Monitor)

    1997-01-01

    Research continues toward the improvement and increased understanding of high-density plasma tools. Such reactor systems are lauded for their independent control of ion flux and energy, enabling high etch rates with low ion damage, and for their improved ion velocity anisotropy resulting from thin collisionless sheaths and low neutral pressures. Still, with the transition to 300 mm processing, achieving etch uniformity and high etch rates concurrently may be a formidable task for such large diameter wafers, for which computational modeling can play an important role in successful reactor and process design. The inductively coupled plasma (ICP) reactor is the focus of the present investigation. The present work attempts to understand the fundamental physical phenomena of such systems through computational modeling. Simulations will be presented using both computational fluid dynamics (CFD) techniques and the direct simulation Monte Carlo (DSMC) method for argon and chlorine discharges. ICP reactors generally operate at pressures on the order of 1 to 10 mTorr. At such low pressures, rarefaction can be significant to the degree that the constitutive relations used in typical CFD techniques become invalid and a particle simulation must be employed. This work will assess the extent to which CFD can be applied and evaluate the degree to which accuracy is lost in predicting the phenomenon of interest, i.e., etch rate. If the CFD approach is found reasonably accurate when benchmarked against DSMC and experimental results, it has the potential to serve as a design tool due to its rapid turnaround relative to DSMC. The continuum CFD simulation solves the governing equations for plasma flow using a finite difference technique with an implicit Gauss-Seidel line relaxation method for time marching toward a converged solution. The equation set consists of mass conservation for each species, separate energy equations for the electrons and heavy species, and momentum equations for the gas
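
    A minimal analogue of the implicit Gauss-Seidel time-marching scheme, in Python/NumPy, with a 1D diffusion equation standing in for the full plasma equation set; grid size, diffusivity, and step sizes are illustrative assumptions:

        import numpy as np

        nx, D, dt = 50, 1.0, 1e-4
        dx = 1.0 / (nx - 1)
        r = D * dt / dx**2
        u = np.zeros(nx)
        u[nx // 2] = 1.0                       # initial condition

        for _ in range(100):                   # implicit time steps
            rhs = u.copy()
            for _ in range(50):                # Gauss-Seidel sweeps per step
                for i in range(1, nx - 1):     # in-place updates = Gauss-Seidel
                    u[i] = (rhs[i] + r * (u[i - 1] + u[i + 1])) / (1 + 2 * r)
        print("peak value after marching:", round(u.max(), 4))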

  12. Design, experimentation, and modeling of a novel continuous biodrying process

    NASA Astrophysics Data System (ADS)

    Navaee-Ardeh, Shahram

    Massive production of sludge in the pulp and paper industry has made effective sludge management an increasingly critical issue for the industry, due to high landfill and transportation costs and complex regulatory frameworks for options such as sludge landspreading and composting. Sludge dewatering challenges are exacerbated at many mills due to improved in-plant fiber recovery coupled with increased production of secondary sludge, leading to a mixed sludge with a high proportion of biological matter that is difficult to dewater. In this thesis, a novel continuous biodrying reactor was designed and developed for drying pulp and paper mixed sludge to an economic dry solids level, so that the dried sludge can be economically and safely combusted in a biomass boiler for energy recovery. In all experimental runs the economic dry solids level was achieved, proving the process successful. In the biodrying process, in addition to the forced aeration, the drying rates are enhanced by biological heat generated through the microbial activity of mesophilic and thermophilic microorganisms naturally present in the porous matrix of mixed sludge. This makes the biodrying process more attractive than conventional drying techniques because the reactor is self-heating. The reactor is divided into four nominal compartments and the mixed sludge dries as it moves downward in the reactor. The residence times were 4-8 days, which is 2-3 times shorter than the residence times achieved in a batch biodrying reactor previously studied by our research group for mixed sludge drying. A process variable analysis was performed to determine the key variable(s) in the continuous biodrying reactor. Several variables were investigated, namely: type of biomass feed, pH of biomass, nutrition level (C/N ratio), residence times, recycle ratio of biodried sludge, and outlet relative humidity profile along the reactor height. The key variables that were identified in the continuous

  13. Model-based design of peptide chromatographic purification processes.

    PubMed

    Gétaz, David; Stroehlein, Guido; Butté, Alessandro; Morbidelli, Massimo

    2013-04-05

    In this work we present a general procedure for the model-based optimization of a polypeptide crude mixture purification process through its application to a case of industrial relevance. This is done to show how beneficial modeling can be in optimizing complex chromatographic processes in the industrial environment. The target peptide elution profile was modeled with a two-site adsorption equilibrium isotherm exhibiting two inflection points. The variation of the isotherm parameters with the modifier concentration was accounted for. The adsorption isotherm parameters of the target peptide were obtained by the inverse method. The elution of the impurities was approximated by lumping them into pseudo-impurities and by regressing their adsorption isotherm parameters directly as a function of the corresponding parameters of the target peptide. After model calibration and validation by comparison with suitable experimental data, Pareto optimizations of the process were carried out so as to select the optimal batch process.
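
    A minimal sketch of the inverse-method style of isotherm fitting, in Python/SciPy. A generic two-site (bi-Langmuir) form and synthetic data are assumed; the isotherm in the paper is a different two-site form that additionally exhibits two inflection points:

        import numpy as np
        from scipy.optimize import curve_fit

        np.random.seed(0)

        def two_site_isotherm(c, H1, K1, H2, K2):
            # Two independent adsorption sites, each Langmuir-like.
            return H1 * c / (1 + K1 * c) + H2 * c / (1 + K2 * c)

        c = np.linspace(0.01, 5, 40)           # liquid-phase concentration
        q_obs = two_site_isotherm(c, 8, 2.0, 3, 0.3) + np.random.normal(0, 0.05, c.size)

        popt, _ = curve_fit(two_site_isotherm, c, q_obs, p0=[5, 1, 1, 0.1])
        print("fitted isotherm parameters:", np.round(popt, 2))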

  14. Rethinking Design Process: Using 3D Digital Models as an Interface in Collaborative Session

    ERIC Educational Resources Information Center

    Ding, Suining

    2008-01-01

    This paper describes a pilot study for an alternative design process by integrating a designer-user collaborative session with digital models. The collaborative session took place in a 3D AutoCAD class for a real world project. The 3D models served as an interface for designer-user collaboration during the design process. Students not only learned…

  15. A Digital Tool Set for Systematic Model Design in Process-Engineering Education

    ERIC Educational Resources Information Center

    van der Schaaf, Hylke; Tramper, Johannes; Hartog, Rob J.M.; Vermue, Marian

    2006-01-01

    One of the objectives of the process technology curriculum at Wageningen University is that students learn how to design mathematical models in the context of process engineering, using a systematic problem analysis approach. Students find it difficult to learn to design a model and little material exists to meet this learning objective. For these…

  16. Cost model relationships between textile manufacturing processes and design details for transport fuselage elements

    NASA Technical Reports Server (NTRS)

    Metschan, Stephen L.; Wilden, Kurtis S.; Sharpless, Garrett C.; Andelman, Rich M.

    1993-01-01

    Textile manufacturing processes offer potential cost and weight advantages over traditional composite materials and processes for transport fuselage elements. In the current study, design cost modeling relationships between textile processes and element design details were developed. Such relationships are expected to help future aircraft designers make timely decisions on the effect of design details and overall configurations on textile fabrication costs. The fundamental advantage of a design cost model is to ensure that the element design is cost-effective for the intended process. Trade studies on the effects of processing parameters also help to optimize the manufacturing steps for a particular structural element. Two methods of analyzing design detail/process cost relationships developed for the design cost model were pursued in the current study. The first makes use of existing databases and alternative cost modeling methods (e.g., detailed estimating). The second compares design cost model predictions with data collected during the fabrication of seven-foot circumferential frames for ATCAS crown test panels. The process used in this case involves 2D dry braiding and resin transfer molding of curved 'J' cross-section frame members having design details characteristic of the baseline ATCAS crown design.

  17. Business Process Design Method Based on Business Event Model for Enterprise Information System Integration

    NASA Astrophysics Data System (ADS)

    Kobayashi, Takashi; Komoda, Norihisa

    The traditional business process design methods, of which the use case is the most typical, have no useful framework for designing the activity sequence. Therefore, the design efficiency and quality vary widely according to the designer's experience and skill. In this paper, to solve this problem, we propose the business events and their state transition model (a basic business event model) based on the language/action perspective, a result from the cognitive science domain. In the business process design, using this model, we decide event occurrence conditions so that the events synchronize with one another. We also propose a design pattern for deciding the event occurrence conditions (a business event improvement strategy). Lastly, we apply the business process design method based on the business event model and the business event improvement strategy to the credit card issue process and estimate its effect.
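
    A minimal sketch of the occurrence-condition idea in Python: each business event fires only when its condition over the states of other events holds, which is what synchronizes the events. The events below are hypothetical stand-ins loosely echoing the credit card issue example:

        states = {"order_received": True, "credit_checked": False, "card_issued": False}
        conditions = {
            "credit_checked": lambda s: s["order_received"],
            "card_issued": lambda s: s["credit_checked"],
        }

        changed = True
        while changed:                         # fire events until a fixed point
            changed = False
            for event, cond in conditions.items():
                if not states[event] and cond(states):
                    states[event] = True
                    changed = True
        print(states)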

  18. High School Student Modeling in the Engineering Design Process

    ERIC Educational Resources Information Center

    Mentzer, Nathan; Huffman, Tanner; Thayer, Hilde

    2014-01-01

    A diverse group of 20 high school students from four states in the US were individually provided with an engineering design challenge. Students chosen were in capstone engineering courses and had taken multiple engineering courses. As students considered the problem and developed a solution, observational data were recorded and artifacts…

  19. SmartWeld/SmartProcess - intelligent model based system for the design and validation of welding processes

    SciTech Connect

    Mitchner, J.

    1996-04-01

    Diagrams are presented on an intelligent model-based system for the design and validation of welding processes. Key capabilities identified include 'right the first time' manufacturing, continuous improvement, and on-line quality assurance.

  20. Elementary teachers' mental models of engineering design processes: A comparison of two communities of practice

    NASA Astrophysics Data System (ADS)

    McMahon, Ann P.

    Educating K-12 students in the processes of design engineering is gaining popularity in public schools. Several states have adopted standards for engineering design despite the fact that no common agreement exists on what should be included in the K-12 engineering design process. Furthermore, little pre-service and in-service professional development exists that will prepare teachers to teach a design process that is fundamentally different from the science teaching process found in typical public schools. This study provides a glimpse into what teachers think happens in engineering design compared to articulated best practices in engineering design. Wenger's communities of practice work and van Dijk's multidisciplinary theory of mental models provide the theoretical bases for comparing the mental models of two groups of elementary teachers (one group that teaches engineering and one that does not) to the mental models of design engineers (including this engineer/researcher/educator and professionals described elsewhere). The elementary school teachers and this engineer/researcher/educator observed the design engineering process enacted by professionals, then answered questions designed to elicit their mental models of the process they saw in terms of how they would teach it to elementary students. The key finding is this: Both groups of teachers embedded the cognitive steps of the design process into the matrix of the social and emotional roles and skills of students. Conversely, the engineers embedded the social and emotional aspects of the design process into the matrix of the cognitive steps of the design process. In other words, teachers' mental models show that they perceive that students' social and emotional communicative roles and skills in the classroom drive their cognitive understandings of the engineering process, while the mental models of this engineer/researcher/educator and the engineers in the video show that we perceive that cognitive

  1. Modeling and Analysis of Power Processing Systems. [use of a digital computer for designing power plants

    NASA Technical Reports Server (NTRS)

    Fegley, K. A.; Hayden, J. H.; Rehmann, D. W.

    1974-01-01

    The feasibility of formulating a methodology for the modeling and analysis of aerospace electrical power processing systems is investigated. It is shown that a digital computer may be used in an interactive mode for the design, modeling, analysis, and comparison of power processing systems.

  2. Improving the Aircraft Design Process Using Web-Based Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.; Follen, Gregory J. (Technical Monitor)

    2000-01-01

    Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.

  3. Improving the Aircraft Design Process Using Web-based Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.

    2003-01-01

    Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.

  4. Model-based design space determination of peptide chromatographic purification processes.

    PubMed

    Gétaz, David; Butté, Alessandro; Morbidelli, Massimo

    2013-04-05

    Operating a chemical process at fixed operating conditions often leads to suboptimal process performance. It is important in fact to be able to vary the process operating conditions depending upon possible changes in feed composition, product requirements or economics. This flexibility in the manufacturing process was facilitated by the publication of the PAT initiative by the U.S. FDA [1]. In this work, the implementation of quality by design in the development of a chromatographic purification process is discussed. A procedure to determine the design space of the process using chromatographic modeling is presented. Moreover, the risk of batch failure and the critical process parameters (CPPs) are assessed by modeling. The ideal cut strategy is adopted and therefore only yield and productivity are considered as critical quality attributes (CQAs). The general trends in CQA variation within the design space are discussed. The effect of process disturbances is also considered. It is shown that process disturbances significantly shrink the design space and that only simultaneous and specific changes in multiple process parameters (i.e., the CPPs) lead to batch failure. The reliability of the obtained results is proven by comparing the model predictions to suitable experimental data. The case study presented in this work proves the reliability of process development using a model-based approach.
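
    A minimal sketch of design space determination by sweeping two process parameters through a model and keeping the points whose predicted CQAs meet targets. Python is assumed, and the toy surrogate and thresholds below are hypothetical, not the paper's chromatographic model:

        import numpy as np

        def predict_cqa(load, gradient):
            # Toy stand-in for the chromatographic model: (yield, productivity).
            y = 0.98 - 0.02 * load - 0.1 * abs(gradient - 0.5)
            prod = load * (1.0 - 0.3 * gradient)
            return y, prod

        loads = np.linspace(1, 10, 30)
        grads = np.linspace(0.1, 1.0, 30)
        inside = []
        for load in loads:
            for grad in grads:
                y, prod = predict_cqa(load, grad)
                if y >= 0.90 and prod >= 2.0:   # CQA acceptance limits (assumed)
                    inside.append((load, grad))
        print(f"{len(inside)} of {loads.size * grads.size} grid points lie in the design space")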

  5. Overall challenges in incorporating micro-mechanical models into materials design process

    NASA Astrophysics Data System (ADS)

    Bennoura, M.; Aboutajeddine, A.

    2016-10-01

    Using materials in engineering design has historically been handled through the paradigm of selecting appropriate materials from the finite set of available material databases. Recent trends, however, have moved toward the tailoring of materials that meet overall system performance requirements, based on a process called material design. An important building block of this process is micromechanical models that relate microstructure to properties. Unfortunately, these models fall short and carry many uncertainties from assumptions and idealizations, which unavoidably impacts the material design strategy. In this work, candidate methods to deal with micromechanical model uncertainties and their drawbacks in material design are investigated. Robust design methods for quantifying uncertainty and managing or mitigating its impact on design performance are reviewed first. These methods include principles for classifying uncertainty, mathematical techniques for evaluating its degree, and design methods for generating design alternatives that are relatively insensitive to sources of uncertainty and flexible enough to admit design changes or variations. The last section of this paper addresses the limits of the existing approaches from a material modelling perspective and identifies research opportunities to overcome the impediments to incorporating micromechanical models in the material design process.

  6. The Impact of Building Information Modeling on the Architectural Design Process

    NASA Astrophysics Data System (ADS)

    Moreira, P. F.; Silva, Neander F.; Lima, Ecilamar M.

    Many benefits of Building Information Modeling, BIM, have been suggested by several authors and by software vendors. In this paper we describe an experiment in which two groups of designers were observed developing an assigned design task. One of the groups used a BIM system, while the other used a standard computer-aided drafting system. The results show that some of the promises of BIM hold true, such as consistency maintenance and error avoidance in the design documentation process. Other promises, such as changing the design process itself, also seemed plausible, but more research is needed to determine the depth of such changes.

  7. Process modelling and die design concepts for forming aircraft sheet parts

    NASA Astrophysics Data System (ADS)

    Hatipoğlu, H. A.; Alkaş, C. O.

    2016-08-01

    This study covers typical sheet metal forming processes applied in the aerospace industry, including flexforming, stretch forming and stretch-draw forming. Each process is modelled using the finite element method for optimization. Tensile, bulge, forming limit and friction tests of commonly used materials are conducted to define the hardening curves, yield loci, anisotropic constants, forming limit curves and friction coefficients between die and sheet. Process-specific loadings and boundary conditions are applied to each model. The models are then validated by purpose-designed experiments that characterize the related forming processes. Lastly, several examples are given in which those models are used to predict forming defects before physical forming, and the necessary die design and process parameter changes are applied accordingly for successful forming operations.

  8. A novel double loop control model design for chemical unstable processes.

    PubMed

    Cong, Er-Ding; Hu, Ming-Hui; Tu, Shan-Tung; Xuan, Fu-Zhen; Shao, Hui-He

    2014-03-01

    In this manuscript, based on the Smith predictor control scheme for unstable processes in industry, an improved double-loop control model is proposed for chemical unstable processes. The inner loop stabilizes the unstable process and transforms the original process into a stable first-order plus pure dead-time process. The outer loop enhances the performance of the set-point response. A disturbance controller is designed to enhance the performance of the disturbance response. The improved control system is simple, with exact physical meaning, and its characteristic equation is easy to stabilize. The three controllers in the improved scheme are designed separately; each is easy to design and gives good control performance for its respective closed-loop transfer function. The robust stability of the proposed control scheme is analyzed. Finally, case studies illustrate that the improved method gives better system performance than existing design methods.
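
    A minimal sketch of the inner-loop idea in Python: a proportional gain k > a turns the unstable first-order process dy/dt = a*y + u into a stable process as seen by the outer loop. Dead time is omitted and all values are illustrative assumptions:

        a, k, dt = 0.5, 2.0, 0.01      # unstable pole, inner-loop gain, time step
        y, v = 0.0, 1.0                # process state and outer-loop command
        for _ in range(1000):
            u = -k * y + v             # inner stabilizing feedback
            y += dt * (a * y + u)      # process dynamics: dy/dt = a*y + u
        print("steady state ~ v/(k - a) =", round(v / (k - a), 3), "; simulated:", round(y, 3))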

  9. Functional Fault Model Development Process to Support Design Analysis and Operational Assessment

    NASA Technical Reports Server (NTRS)

    Melcher, Kevin J.; Maul, William A.; Hemminger, Joseph A.

    2016-01-01

    A functional fault model (FFM) is an abstract representation of the failure space of a given system. As such, it simulates the propagation of failure effects along paths between the origin of the system failure modes and points within the system capable of observing the failure effects. As a result, FFMs may be used to diagnose the presence of failures in the modeled system. FFMs necessarily contain a significant amount of information about the design, operations, and failure modes and effects. One of the important benefits of FFMs is that they may be qualitative, rather than quantitative and, as a result, may be implemented early in the design process when there is more potential to positively impact the system design. FFMs may therefore be developed and matured throughout the monitored system's design process and may subsequently be used to provide real-time diagnostic assessments that support system operations. This paper provides an overview of a generalized NASA process that is being used to develop and apply FFMs. FFM technology has been evolving for more than 25 years. The FFM development process presented in this paper was refined during NASA's Ares I, Space Launch System, and Ground Systems Development and Operations programs (i.e., from about 2007 to the present). Process refinement took place as new modeling, analysis, and verification tools were created to enhance FFM capabilities. In this paper, standard elements of a model development process (i.e., knowledge acquisition, conceptual design, implementation & verification, and application) are described within the context of FFMs. Further, newer tools and analytical capabilities that may benefit the broader systems engineering process are identified and briefly described. The discussion is intended as a high-level guide for future FFM modelers.
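
    A minimal sketch of qualitative failure-effect propagation in Python: the FFM is treated as a directed graph, and the effect cone of a failure mode is everything reachable from it. Component and sensor names are hypothetical, not drawn from any NASA model:

        from collections import deque

        edges = {                              # effect propagation paths
            "valve_stuck": ["low_flow"],
            "low_flow": ["pump_cavitation", "low_pressure_sensor"],
            "pump_cavitation": ["vibration_sensor"],
        }

        def propagate(failure_mode):
            """Return every node reachable from the failure mode."""
            seen, queue = set(), deque([failure_mode])
            while queue:
                node = queue.popleft()
                for nxt in edges.get(node, []):
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append(nxt)
            return seen

        print(propagate("valve_stuck"))        # observable effects aid diagnosis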

  10. A frequency response model matching method for PID controller design for processes with dead-time.

    PubMed

    Anwar, Md Nishat; Pan, Somnath

    2015-03-01

    In this paper, a PID controller design method for integrating processes based on frequency response matching is presented. Two approaches are proposed for the controller design. In the first approach, a double feedback loop configuration is considered where the inner loop is designed with a stabilizing gain. In the outer loop, the parameters of the PID controller are obtained by frequency response matching between the closed-loop system with the PID controller and a reference model with desired specifications. In the second approach, the design is directly carried out considering a desired load-disturbance rejection model of the system. In both approaches, two low-frequency points are considered for matching the frequency response, which yields linear algebraic equations whose solution gives the controller parameters. Several examples are taken from the literature to demonstrate the effectiveness and to compare with some well-known design methods.
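
    A minimal sketch of two-point frequency response matching in one simple variant (not necessarily the paper's exact formulation): compute the ideal controller response C(jw) = M/(G(1 - M)) from a reference model M and plant G, then solve the resulting linear equations for the PID gains. Python/NumPy is assumed, and the plant and reference model are illustrative:

        import numpy as np

        def G(s):                              # plant: integrating + dead time
            return np.exp(-0.5 * s) / (s * (1 + 2 * s))

        def M(s):                              # desired closed-loop reference model
            return np.exp(-0.5 * s) / (1 + 1.5 * s) ** 2

        rows, rhs = [], []
        for w in (0.05, 0.1):                  # two low-frequency points
            s = 1j * w
            C = M(s) / (G(s) * (1 - M(s)))     # ideal controller response
            # Kp + Ki/(jw) + Kd*(jw) = C, split into real and imaginary parts
            rows += [[1.0, (1 / s).real, s.real], [0.0, (1 / s).imag, s.imag]]
            rhs += [C.real, C.imag]

        Kp, Ki, Kd = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)[0]
        print(f"Kp={Kp:.3f}, Ki={Ki:.3f}, Kd={Kd:.3f}")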

  11. Model of Values-Based Management Process in Schools: A Mixed Design Study

    ERIC Educational Resources Information Center

    Dogan, Soner

    2016-01-01

    The aim of this paper is to evaluate the school administrators' values-based management behaviours according to the teachers' perceptions and opinions and, accordingly, to build a model of values-based management process in schools. The study was conducted using explanatory design which is inclusive of both quantitative and qualitative methods.…

  12. Letter Report. Defense Waste Processing Facility Pour Spout Heaters - Conceptual Designs and Modeling

    SciTech Connect

    SK Sundaram; JM Perez, Jr.

    2000-09-06

    The Tanks Focus Area (TFA) identified a major task to address performance limitations and deficiencies of the Defense Waste Processing Facility (DWPF), now in its sixth year of operation. Design, installation, testing, monitoring, operability, and a number of other characteristics were studied collaboratively by research personnel at a number of facilities: Savannah River Technology Center (SRTC), Clemson Environmental Technologies Laboratory (CETL), Pacific Northwest National Laboratory (PNNL), and the Idaho National Engineering and Environmental Laboratory (INEEL). Because the potential limiting feature of the DWPF was identified as the pour spout/riser heater, research on alternative design concepts originally proposed in the past was revisited. In the original works, finite element modeling was performed to evaluate the temperature distribution and stress of the design currently used at the DWPF. Studies were also made to define the requirements of the design and to consider approaches for remote removal/replacement. The five proposed alternative designs were characterized by their heater type and location, their remotely replaceable thermocouples, and their capabilities for remote handling. Review comments on the alternative designs indicated a relatively wide range of advantages and disadvantages. The present report provides an overview of the design criteria, modeling results, and alternative designs. Based on a review of past design optimization activities and an assessment of recent experience, recommendations are proposed for future consideration and improvement.

  13. A Conceptual Aerospace Vehicle Structural System Modeling, Analysis and Design Process

    NASA Technical Reports Server (NTRS)

    Mukhopadhyay, Vivek

    2007-01-01

    A process for aerospace structural concept analysis and design is presented, with examples of a blended-wing-body fuselage, a multi-bubble fuselage concept, a notional crew exploration vehicle, and a high altitude long endurance aircraft. Aerospace vehicle structures must withstand all anticipated mission loads, yet must be designed to have optimal structural weight with the required safety margins. For a viable systems study of advanced concepts, these conflicting requirements must be imposed and analyzed early in the conceptual design cycle, preferably with a high degree of fidelity. In this design process, integrated multidisciplinary analysis tools are used in a collaborative engineering environment. First, parametric solid and surface models including the internal structural layout are developed for detailed finite element analyses. Multiple design scenarios are generated for analyzing several structural configurations and material alternatives. The structural stress, deflection, strain, and margin-of-safety distributions are visualized and the design is improved. Over several design cycles, the refined vehicle parts and assembly models are generated. The accumulated design data are used for structural mass comparison and concept ranking. The present application's focus on the blended-wing-body vehicle structure and advanced composite materials is also discussed.

  14. Adapting Rational Unified Process (RUP) approach in designing a secure e-Tendering model

    NASA Astrophysics Data System (ADS)

    Mohd, Haslina; Robie, Muhammad Afdhal Muhammad; Baharom, Fauziah; Darus, Norida Muhd; Saip, Mohamed Ali; Yasin, Azman

    2016-08-01

    e-Tendering is the electronic processing of tender documents via the internet, allowing tenderers to publish, communicate, access, receive and submit all tender-related information and documentation online. This study aims to design an e-Tendering system using the Rational Unified Process (RUP) approach. RUP provides a disciplined approach to assigning tasks and responsibilities within the software development process. RUP has four phases that help researchers adjust to the requirements of projects with different scopes, problems and sizes. RUP is characterized as a use-case driven, architecture-centered, iterative and incremental process model. However, the scope of this study covers only the Inception and Elaboration phases as steps to develop the model, and performs only three of the nine workflows (business modeling, requirements, and analysis and design). RUP has a strong focus on documents, and the activities in the Inception and Elaboration phases mainly concern the creation of diagrams and the writing of textual descriptions. The UML notation and the software program StarUML are used to support the design of e-Tendering. The e-Tendering design based on the RUP approach can contribute to e-Tendering developers and researchers in the e-Tendering domain. In addition, this study shows that RUP is one of the best system development methodologies for use as a research methodology in the Software Engineering domain for the secure design of any observed application. This methodology has been tested in various studies in certain domains, such as simulation-based decision support, security requirement engineering, and business modeling and secure system requirements. In conclusion, these studies showed that RUP is a good research methodology that can be adapted in any Software Engineering (SE) research domain that requires a few artifacts to be generated, such as use case modeling, misuse case modeling, activity

  15. Error detection process - Model, design, and its impact on computer performance

    NASA Technical Reports Server (NTRS)

    Shin, K. G.; Lee, Y.-H.

    1984-01-01

    An analytical model is developed for computer error detection processes and applied to estimate their influence on system performance. Faults in the hardware, not in the design, are assumed to be the potential cause of transition to erroneous states during normal operations. The classification properties and associated recovery methods of error detection are discussed. The probability of obtaining an unreliable result is evaluated, along with the resulting computational loss. Error detection during design is considered and a feasible design space is outlined. Extension of the methods to account for the effects of extant multiple faults is indicated.

  16. Parameter-Level Data Flow Modeling Oriented to Product Design Process

    NASA Astrophysics Data System (ADS)

    Li, Shen; Shao, Xiao Dong; Zhang, Zhi Hua; Ge, Xiao Bo

    2015-12-01

    In this paper, a method of data flow modeling for a product design process, oriented to data parameters, is proposed. The data parameters are defined and classified as basic data parameters and complex data parameters. A mechanism for mapping between different forms of documents and the basic data parameters, and a parameter-based data transmission scheme, are constructed. Aiming at the characteristics of the iterative design process, a parameter version mechanism including node modification and iteration information is proposed. The data parameter transmission relationships are represented by a parameters network model (PNM) based on a directed graph. Finally, through the table mapping data parameters onto workflow nodes, together with the PNM, the data ports and data links in the data flow model are generated automatically by the program. Validation of the “Reflector, Back frame and Center part design” data flow model in the design process of a 15-meter-diameter S/Ka frequency band antenna shows that the method can effectively shorten data flow modeling time and improve data transmission efficiency.
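
    A minimal sketch of a parameters network model as a directed graph, with downstream parameters recomputed in topological order when sources change. Python 3.9+ (graphlib) is assumed; the parameters and rules are hypothetical:

        import graphlib

        deps = {"mass": {"density", "volume"}, "cost": {"mass"}}   # node -> inputs
        rules = {
            "mass": lambda v: v["density"] * v["volume"],
            "cost": lambda v: 42.0 * v["mass"],
        }
        values = {"density": 2.7, "volume": 1.5}   # source parameters (hypothetical)

        for node in graphlib.TopologicalSorter(deps).static_order():
            if node in rules:                      # recompute derived parameters
                values[node] = rules[node](values)
        print(values)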

  17. Ares Upper Stage Processes to Implement Model Based Design - Going Paperless

    NASA Technical Reports Server (NTRS)

    Gregory, Melanie

    2012-01-01

    Computer-Aided Design (CAD) has all but replaced the drafting board for design work. Increased productivity and accuracy should be natural outcomes of using CAD. Going from paper drawings only, to paper drawings based on CAD models, to CAD models and no drawings, or Model Based Design (MBD), is a natural progression in today's world. There are many advantages to MBD over traditional design methods. To make the most of those advantages, standards should be in place and the proper foundation should be laid prior to transitioning to MBD. However, without a full understanding of the implications of MBD and the proper control of the data, the advantages are greatly diminished. Transitioning from a paper design world to an electronic design world means re-thinking how information gets controlled at its origin and distributed from one point to another. It means design methodology is critical, especially for large projects. It means preparation of standardized parts and processes as well as strong communication between all parties in order to maximize the benefits of MBD.

  18. Introducing uncertainty analysis of nucleation and crystal growth models in Process Analytical Technology (PAT) system design of crystallization processes.

    PubMed

    Samad, Noor Asma Fazli Abdul; Sin, Gürkan; Gernaey, Krist V; Gani, Rafiqul

    2013-11-01

    This paper presents the application of uncertainty and sensitivity analysis as part of a systematic model-based process monitoring and control (PAT) system design framework for crystallization processes. For the uncertainty analysis, the Monte Carlo procedure is used to propagate input uncertainty, while for sensitivity analysis, global methods including the standardized regression coefficients (SRC) and Morris screening are used to identify the most significant parameters. The potassium dihydrogen phosphate (KDP) crystallization process is used as a case study, both in open-loop and closed-loop operation. In the uncertainty analysis, the impact on the predicted output of uncertain parameters related to the nucleation and the crystal growth model has been investigated for both a one- and two-dimensional crystal size distribution (CSD). The open-loop results show that the input uncertainties lead to significant uncertainties on the CSD, with appearance of a secondary peak due to secondary nucleation for both cases. The sensitivity analysis indicated that the most important parameters affecting the CSDs are nucleation order and growth order constants. In the proposed PAT system design (closed-loop), the target CSD variability was successfully reduced compared to the open-loop case, also when considering uncertainty in nucleation and crystal growth model parameters. The latter forms a strong indication of the robustness of the proposed PAT system design in achieving the target CSD and encourages its transfer to full-scale implementation.
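
    A minimal sketch of the Monte Carlo propagation plus SRC workflow in Python/NumPy; the scalar toy model and the parameter distributions stand in for the paper's nucleation and growth kinetics:

        import numpy as np

        rng = np.random.default_rng(0)
        N = 2000
        kb = rng.normal(1.0, 0.10, N)          # nucleation rate constant (assumed)
        b = rng.normal(2.0, 0.20, N)           # nucleation order (assumed)
        g = rng.normal(1.5, 0.15, N)           # growth order (assumed)

        def model(kb, b, g):
            # Toy scalar output standing in for a CSD summary statistic.
            return kb * 0.5**b + 2.0 * 0.5**g

        y = model(kb, b, g)

        # SRC: regress the standardized output on the standardized inputs.
        X = np.column_stack([kb, b, g])
        Xs = (X - X.mean(0)) / X.std(0)
        ys = (y - y.mean()) / y.std()
        src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
        for name, s in zip(["kb", "b", "g"], src):
            print(f"SRC[{name}] = {s:+.2f}")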

  19. Future integrated design process

    NASA Technical Reports Server (NTRS)

    Meyer, D. D.

    1980-01-01

    The design process is one of the sources used to produce requirements for a computer system to integrate and manage product design data, program management information, and the technical computation and engineering data management activities of the aerospace design process. Design activities were grouped chronologically and explored for activity type, activity interface, data quantity, and data flow. The work was based on analysis of the design process of several typical aerospace products, including both conventional and supersonic airplanes and a hydrofoil design. Activities examined included research, preliminary design, detail design, manufacturing interface, product verification, and product support. The design process was then described in an IPAD environment: the future.

  20. A Model of Creative Design Process for Fostering Creativity of Students in Design Education

    ERIC Educational Resources Information Center

    Wong, Yi Lin; Siu, Kin Wai Michael

    2012-01-01

    Creativity, which is concerned with problem solving, is essential if we are to generate new solutions to the massive and complex problems in the unknown future. Our next generation needs an educational platform where they can be taught to possess creativity. Design education is such a way to foster students' creativity. Therefore, it is essential…

  1. Lyophilization process design space.

    PubMed

    Patel, Sajal Manubhai; Pikal, Michael J

    2013-11-01

    The application of key elements of quality by design (QbD), such as risk assessment, process analytical technology, and design space, is discussed widely as it relates to freeze-drying process design and development. However, this commentary focuses on constructing the Design and Control Space, particularly for the primary drying step of the freeze-drying process. Also, practical applications and considerations of claiming a process Design Space under the QbD paradigm have been discussed.

  2. Physics-based process modeling, reliability prediction, and design guidelines for flip-chip devices

    NASA Astrophysics Data System (ADS)

    Michaelides, Stylianos

    Flip Chip on Board (FCOB) and Chip-Scale Packages (CSPs) are relatively new technologies that are being increasingly used in the electronic packaging industry. Compared to the more widely used face-up wirebonding and TAB technologies, flip-chips and most CSPs provide the shortest possible leads, lower inductance, higher frequency, better noise control, higher density, greater input/output (I/O), smaller device footprint and lower profile. However, due to the short history and due to the introduction of several new electronic materials, designs, and processing conditions, very limited work has been done to understand the role of material, geometry, and processing parameters on the reliability of flip-chip devices. Also, with the ever-increasing complexity of semiconductor packages and with the continued reduction in time to market, it is too costly to wait until the later stages of design and testing to discover that the reliability is not satisfactory. The objective of the research is to develop integrated process-reliability models that will take into consideration the mechanics of assembly processes to be able to determine the reliability of face-down devices under thermal cycling and long-term temperature dwelling. The models incorporate the time and temperature-dependent constitutive behavior of various materials in the assembly to be able to predict failure modes such as die cracking and solder cracking. In addition, the models account for process-induced defects and macro-micro features of the assembly. Creep-fatigue and continuum-damage mechanics models for the solder interconnects and fracture-mechanics models for the die have been used to determine the reliability of the devices. The results predicted by the models have been successfully validated against experimental data. The validated models have been used to develop qualification and test procedures for implantable medical devices. In addition, the research has helped develop innovative face

  3. A fuzzy model based adaptive PID controller design for nonlinear and uncertain processes.

    PubMed

    Savran, Aydogan; Kahraman, Gokalp

    2014-03-01

    We develop a novel adaptive tuning method for the classical proportional-integral-derivative (PID) controller to adjust the PID gains when controlling nonlinear processes, a problem that is very difficult to overcome with classical PID controllers. By incorporating classical PID control, which is well known in industry, into the control of nonlinear processes, we introduce a method that can readily be used by industry. In this method, controller design does not require a first-principles model of the process, which is usually very difficult to obtain. Instead, it depends on a fuzzy process model constructed from the measured input-output data of the process. A soft limiter is used to impose industrial limits on the control input. The performance of the system is successfully tested on the bioreactor, a highly nonlinear process involving instabilities. Several tests showed the method's success in tracking, robustness to noise, and adaptation properties. We also compared our system's performance on a plant with altered parameters under measurement noise, and obtained less ringing and better tracking. To conclude, we present a novel adaptive control method, built upon the well-known PID architecture, that successfully controls highly nonlinear industrial processes, even under conditions such as strong parameter variations, noise, and instabilities.

  4. Statistical metrology—measurement and modeling of variation for advanced process development and design rule generation

    NASA Astrophysics Data System (ADS)

    Boning, Duane S.; Chung, James E.

    1998-11-01

    Advanced process technology will require more detailed understanding and tighter control of variation in devices and interconnects. The purpose of statistical metrology is to provide methods to measure and characterize variation, to model systematic and random components of that variation, and to understand the impact of variation on both yield and performance of advanced circuits. Of particular concern are spatial or pattern-dependencies within individual chips; such systematic variation within the chip can have a much larger impact on performance than wafer-level random variation. Statistical metrology methods will play an important role in the creation of design rules for advanced technologies. For example, a key issue in multilayer interconnect is the uniformity of interlevel dielectric (ILD) thickness within the chip. For the case of ILD thickness, we describe phases of statistical metrology development and application to understanding and modeling thickness variation arising from chemical-mechanical polishing (CMP). These phases include screening experiments (including the design of test structures and test masks to gather electrical or optical data), techniques for statistical decomposition and analysis of the data, and approaches to calibrating empirical and physical variation models. These models can be integrated with circuit CAD tools to evaluate different process integration or design rule strategies. One focus for the generation of interconnect design rules is guidelines for the use of "dummy fill" or "metal fill" to improve the uniformity of underlying metal density and thus improve the uniformity of oxide thickness within the die. Trade-offs that can be evaluated via statistical metrology include the improvements to uniformity possible versus the effect of increased capacitance due to additional metal.
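
    A minimal sketch of separating within-die variation into systematic and random components, in Python/NumPy with synthetic thickness data; a simple moving average stands in for the statistical decomposition techniques the paper describes:

        import numpy as np

        rng = np.random.default_rng(3)
        x = np.linspace(0, 1, 200)                                # die coordinate
        thickness = 800 + 40 * np.sin(2 * np.pi * x) + rng.normal(0, 5, x.size)  # nm

        kernel = np.ones(15) / 15
        systematic = np.convolve(thickness, kernel, mode="same")  # pattern-dependent part
        random_part = thickness - systematic

        print("systematic range (nm):", np.ptp(systematic).round(1))
        print("random sigma (nm):", random_part.std().round(1))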

  5. Application of Design of Experiments and Surrogate Modeling within the NASA Advanced Concepts Office, Earth-to-Orbit Design Process

    NASA Technical Reports Server (NTRS)

    Zwack, Mathew R.; Dees, Patrick D.; Holt, James B.

    2016-01-01

    Decisions made during early conceptual design have a large impact upon the expected life-cycle cost (LCC) of a new program. It is widely accepted that up to 80% of such cost is committed during these early design phases [1]. Therefore, to help minimize LCC, decisions made during conceptual design must be based upon as much information as possible. To aid in the decision making for new launch vehicle programs, the Advanced Concepts Office (ACO) at NASA Marshall Space Flight Center (MSFC) provides rapid turnaround pre-phase A and phase A concept definition studies. The ACO team utilizes a proven set of tools to provide customers with a full vehicle mass breakdown to tertiary subsystems, preliminary structural sizing based upon worst-case flight loads, and trajectory optimization to quantify integrated vehicle performance for a given mission [2]. Although the team provides rapid turnaround for single vehicle concepts, the scope of the trade space can be limited due to analyst availability and the manpower requirements for manual execution of the analysis tools. In order to enable exploration of a broader design space, the ACO team has implemented an advanced design methods (ADM) based approach. This approach applies the concepts of design of experiments (DOE) and surrogate modeling to more exhaustively explore the trade space and provide the customer with additional design information to inform decision making. This paper will first discuss the automation of the ACO tool set, which represents a majority of the development effort. In order to fit a surrogate model within tolerable error bounds a number of DOE cases are needed. This number will scale with the number of variable parameters desired and the complexity of the system's response to those variables. For all but the smallest design spaces, the number of cases required cannot be produced within an acceptable timeframe using a manual process. Therefore, automation of the tools was a key enabler for the successful
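
    A minimal sketch of the DOE-plus-surrogate pattern in Python/SciPy: sample the design space with Latin hypercube cases, run the expensive analysis on each, and fit a cheap quadratic response surface. The two design variables and the analysis function are hypothetical stand-ins for the automated ACO tool chain:

        import numpy as np
        from scipy.stats import qmc

        def analysis(thrust, prop_mass):
            # Stand-in for an automated trajectory/sizing run (hypothetical).
            return 100 + 5 * thrust - 0.02 * thrust**2 + 3 * prop_mass

        sampler = qmc.LatinHypercube(d=2, seed=1)
        X = qmc.scale(sampler.random(n=40), [50, 10], [150, 60])  # DOE cases
        y = np.array([analysis(*x) for x in X])

        # Quadratic response-surface surrogate fitted by least squares.
        t, m = X[:, 0], X[:, 1]
        A = np.column_stack([np.ones_like(t), t, m, t * m, t**2, m**2])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        print("max surrogate error:", np.abs(A @ coef - y).max())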

  6. Design and tuning of standard additive model based fuzzy PID controllers for multivariable process systems.

    PubMed

    Harinath, Eranda; Mann, George K I

    2008-06-01

    This paper describes a design and two-level tuning method for fuzzy proportional-integral-derivative (FPID) controllers for a multivariable process, where the fuzzy inference uses the standard additive model. The proposed method can be used for any n x n multi-input-multi-output process and guarantees closed-loop stability. In the two-level tuning scheme, the tuning follows two steps: low-level tuning followed by high-level tuning. The low-level tuning adjusts apparent linear gains, whereas the high-level tuning changes the nonlinearity in the normalized fuzzy output. In this paper, two types of FPID configurations are considered, and their performances are evaluated using a real-time multizone temperature control problem having a 3 x 3 process system.
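    The two-level tuning idea can be sketched generically as follows. This is not the authors' standard-additive-model inference: it assumes a plain saturated PID whose normalized output is reshaped by a single high-level nonlinearity knob, and all gains are illustrative.

```python
import numpy as np

class TwoLevelFuzzyPID:
    """Generic two-level-tuning sketch (not the paper's SAM inference):
    low-level linear gains Kp/Ki/Kd plus a high-level knob `alpha` that
    bends the nonlinearity of the normalized output."""
    def __init__(self, Kp, Ki, Kd, alpha, dt):
        self.Kp, self.Ki, self.Kd, self.alpha, self.dt = Kp, Ki, Kd, alpha, dt
        self.integral, self.prev_err = 0.0, 0.0

    def step(self, setpoint, measurement):
        err = setpoint - measurement
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        u_lin = self.Kp * err + self.Ki * self.integral + self.Kd * deriv
        u_norm = np.clip(u_lin, -1.0, 1.0)          # normalized output
        # High-level tuning: larger alpha bends the normalized output,
        # mimicking a fuzzy control surface; alpha -> 0 recovers linear PID.
        if self.alpha > 0:
            return np.tanh(self.alpha * u_norm) / np.tanh(self.alpha)
        return u_norm

# One controller per loop would be used for an n x n multivariable process.
ctrl = TwoLevelFuzzyPID(Kp=2.0, Ki=0.5, Kd=0.1, alpha=1.5, dt=0.1)
print(ctrl.step(setpoint=1.0, measurement=0.2))
```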

  7. Process modeling and supply chain design for advanced biofuel production based on bio-oil gasification

    NASA Astrophysics Data System (ADS)

    Li, Qi

    As a potential substitute for petroleum-based fuel, second-generation biofuels are playing an increasingly important role due to their economic, environmental, and social benefits. With the rapid development of the biofuel industry, there has been a growing literature on techno-economic analysis and supply chain design for biofuel production based on a variety of production pathways. A recently proposed production pathway for advanced biofuel is to convert biomass to bio-oil at widely distributed small-scale fast pyrolysis plants, then gasify the bio-oil to syngas and upgrade the syngas to transportation fuels in a centralized biorefinery. This thesis investigates two types of assessment of this bio-oil gasification pathway: techno-economic analysis based on process modeling and literature data, and supply chain design with a focus on the optimal number of facilities to build, facility capacities, and logistics decisions under uncertainty. A detailed process model with corn stover as feedstock and liquid fuels as the final products is presented. Techno-economic analysis of the bio-oil gasification pathway is also discussed to assess economic feasibility. Preliminary results show a capital investment of $438 million and a minimum fuel selling price (MSP) of $5.6 per gallon of gasoline equivalent. The sensitivity analysis finds that MSP is most sensitive to the internal rate of return (IRR), biomass feedstock cost, and fixed capital cost. A two-stage stochastic program is formulated to solve the supply chain design problem considering uncertainties in biomass availability, technology advancement, and biofuel price. The first stage makes the capital investment decisions, including the locations and capacities of the decentralized fast pyrolysis plants and the centralized biorefinery, while the second stage determines the biomass and biofuel flows. The numerical results and case study illustrate that considering uncertainties can be
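    A toy version of the two-stage stochastic program described above can be written with the open-source PuLP modeler. All coefficients and scenario data below are hypothetical, and the real model would also carry facility-location and logistics variables; the sketch only shows the stage-1 capacity decision coupled to probability-weighted stage-2 recourse.

```python
from pulp import LpProblem, LpVariable, LpMinimize, lpSum

# Toy two-stage stochastic program (all numbers hypothetical). Stage 1:
# biorefinery capacity (gal/yr). Stage 2, per scenario: production limited
# by built capacity and by the scenario's biomass availability.
scenarios = {"low":  {"p": 0.3, "biomass": 6e6},
             "base": {"p": 0.5, "biomass": 1e7},
             "high": {"p": 0.2, "biomass": 1.4e7}}
demand, cap_cost, op_cost, shortfall_penalty = 1e7, 0.8, 1.5, 4.0

m = LpProblem("biofuel_supply_chain", LpMinimize)
cap = LpVariable("capacity", lowBound=0)
prod = {s: LpVariable(f"prod_{s}", lowBound=0) for s in scenarios}
short = {s: LpVariable(f"short_{s}", lowBound=0) for s in scenarios}

# Expected-cost objective: capital + probability-weighted operating/penalty costs.
m += cap_cost * cap + lpSum(d["p"] * (op_cost * prod[s] + shortfall_penalty * short[s])
                            for s, d in scenarios.items())
for s, d in scenarios.items():
    m += prod[s] <= cap                # cannot exceed built capacity
    m += prod[s] <= d["biomass"]       # cannot exceed scenario feedstock
    m += prod[s] + short[s] >= demand  # meet demand or pay the shortfall penalty
m.solve()
print(f"build capacity: {cap.value():.3e} gal/yr")
```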

  10. Process Design of Wastewater Treatment for the NREL Cellulosic Ethanol Model

    SciTech Connect

    Steinwinder, T.; Gill, E.; Gerhardt, M.

    2011-09-01

    This report describes a preliminary process design for treating the wastewater from NREL's cellulosic ethanol production process to the quality levels required for recycling. Brown and Caldwell report on three main tasks: 1) characterization of the effluent from NREL's ammonia-conditioned hydrolyzate fermentation process; 2) development of the wastewater treatment process design; and 3) development of a capital and operational cost estimate for the treatment concept option. This wastewater treatment design was incorporated into NREL's cellulosic ethanol process design update published in May 2011 (NREL/TP-5100-47764).

  11. The Sulfur-Iodine Cycle: Process Analysis and Design Using Comprehensive Phase Equilibrium Measurements and Modeling

    SciTech Connect

    Thies, Mark C.; O'Connell, J. P.; Gorensek, Maximilian B.

    2010-01-10

    Of the 100+ thermochemical hydrogen cycles that have been proposed, the Sulfur-Iodine (S-I) Cycle is a primary target of international interest for the centralized production of hydrogen from nuclear power. However, the cycle involves complex and highly nonideal phase behavior at extreme conditions that is only beginning to be understood and modeled for process simulation. The consequence is that current designs and efficiency projections have large uncertainties, as they are based on incomplete data that must be extrapolated from property models. This situation prevents reliable assessment of the potential viability of the system and, even more, a basis for efficient process design. The goal of this NERI award (05-006) was to generate phase-equilibrium data, property models, and comprehensive process simulations so that an accurate evaluation of the S-I Cycle could be made. Our focus was on Section III of the Cycle, where the hydrogen is produced by decomposition of hydroiodic acid (HI) in the presence of water and iodine (I2) in a reactive distillation (RD) column. The results of this project were to be transferred to the nuclear hydrogen community in the form of reliable flowsheet models for the S-I process. Many of the project objectives were achieved. At Clemson University, a unique, tantalum-based, phase-equilibrium apparatus incorporating a view cell was designed and constructed for measuring fluid-phase equilibria for mixtures of iodine, HI, and water (known as HIx) at temperatures to 350 °C and pressures to 100 bar. Such measurements were of particular interest for developing a working understanding of the expected operation of the RD column in Section III. The view cell allowed for the IR observation and discernment of vapor-liquid (VL), liquid-liquid, and liquid-liquid-vapor (LLVE) equilibria for HIx systems. For the I2-H2O system, liquid-liquid equilibrium (LLE) was discovered to exist at temperatures up to 310-315 °C, in contrast to the models and

  12. Statistics-enhanced multistage process models for integrated design &manufacturing of poly (vinyl alcohol) treated buckypaper

    NASA Astrophysics Data System (ADS)

    Wang, Kan

    Carbon nanotube (CNT) is considered a promising engineering material because of its exceptional mechanical, electrical, and thermal properties. Buckypaper (BP), a thin sheet of assembled CNTs, is an effective way to handle CNTs at the macro scale. Pristine BP is a fragile material which is held together by weak van der Waals attractions among CNTs. This dissertation introduces a modified filtration-based manufacturing process which uses poly (vinyl alcohol) (PVA) to treat BP. This treatment greatly improves the handleability of BP, reduces spoilage during transfer, and shortens the production time. The multistage manufacturing process of PVA-treated BP is discussed in this dissertation, and process models are developed to predict the nanostructure of final products from the process parameters. Based on the nanostructure, a finite-element-based physical model for prediction of Young's modulus is also developed. The accuracy of this physical model is further improved by statistical methods. The aim of this study is to investigate and improve the scalability of the manufacturing process of PVA-treated BP. To achieve this goal, various statistical tools are employed. The unique issues in nanomanufacturing also motivate the development of new statistical tools and the modification of existing tools. Those issues include uncertainty in nanostructure characterization due to the scale, limited experimental data due to the high cost of raw materials, large variation in the final product due to the random nature of the structure, and high complexity in the physical models due to the small scale of the structural building blocks. This dissertation addresses those issues by combining engineering field knowledge and statistical methods. The resulting statistics-enhanced physical model provides an approach to design the manufacturing process of PVA-treated BP for a target property and to tailor the robustness of the final product by manipulating the process parameters. In addition

  13. Modern meta-heuristics based on nonlinear physics processes: A review of models and design procedures

    NASA Astrophysics Data System (ADS)

    Salcedo-Sanz, S.

    2016-10-01

    Meta-heuristic algorithms are problem-solving methods which try to find good-enough solutions to very hard optimization problems, at a reasonable computation time, where classical approaches fail or cannot even be applied. Many existing meta-heuristic approaches are nature-inspired techniques, which work by simulating or modeling different natural processes in a computer. Historically, many of the most successful meta-heuristic approaches have had a biological inspiration, such as the evolutionary computation or swarm intelligence paradigms, but in the last few years new approaches based on modeling nonlinear physics processes have been proposed and applied with success. Non-linear physics processes, modeled as optimization algorithms, are able to produce completely new search procedures, in many cases with extremely effective exploration capabilities that outperform existing optimization approaches. In this paper we review the most important optimization algorithms based on nonlinear physics, how they have been constructed from the specific modeling of real phenomena, and their novelty in comparison with alternative existing algorithms for optimization. We first review important concepts on optimization problems, search spaces and problem difficulty. Then, the usefulness of heuristic and meta-heuristic approaches in facing hard optimization problems is introduced, and some of the main existing classical versions of these algorithms are reviewed. The mathematical framework of different nonlinear physics processes is then introduced as a preparatory step to reviewing in detail the most important meta-heuristics based on them. A discussion of the novelty of these approaches, their main computational implementation and design issues, and the evaluation of a novel meta-heuristic based on Strange Attractors mutation completes the review of these techniques. We also describe some of the most important application areas, in

  14. Robust Brain-Machine Interface Design Using Optimal Feedback Control Modeling and Adaptive Point Process Filtering

    PubMed Central

    Carmena, Jose M.

    2016-01-01

    Much progress has been made in brain-machine interfaces (BMI) using decoders such as Kalman filters and finding their parameters with closed-loop decoder adaptation (CLDA). However, current decoders do not model the spikes directly, and hence may limit the processing time-scale of BMI control and adaptation. Moreover, while specialized CLDA techniques for intention estimation and assisted training exist, a unified and systematic CLDA framework that generalizes across different setups is lacking. Here we develop a novel closed-loop BMI training architecture that allows for processing, control, and adaptation using spike events, enables robust control and extends to various tasks. Moreover, we develop a unified control-theoretic CLDA framework within which intention estimation, assisted training, and adaptation are performed. The architecture incorporates an infinite-horizon optimal feedback-control (OFC) model of the brain’s behavior in closed-loop BMI control, and a point process model of spikes. The OFC model infers the user’s motor intention during CLDA—a process termed intention estimation. OFC is also used to design an autonomous and dynamic assisted training technique. The point process model allows for neural processing, control and decoder adaptation with every spike event and at a faster time-scale than current decoders; it also enables dynamic spike-event-based parameter adaptation unlike current CLDA methods that use batch-based adaptation on much slower adaptation time-scales. We conducted closed-loop experiments in a non-human primate over tens of days to dissociate the effects of these novel CLDA components. The OFC intention estimation improved BMI performance compared with current intention estimation techniques. OFC assisted training allowed the subject to consistently achieve proficient control. Spike-event-based adaptation resulted in faster and more consistent performance convergence compared with batch-based methods, and was robust to

  15. Robust Brain-Machine Interface Design Using Optimal Feedback Control Modeling and Adaptive Point Process Filtering.

    PubMed

    Shanechi, Maryam M; Orsborn, Amy L; Carmena, Jose M

    2016-04-01

    Much progress has been made in brain-machine interfaces (BMI) using decoders such as Kalman filters and finding their parameters with closed-loop decoder adaptation (CLDA). However, current decoders do not model the spikes directly, and hence may limit the processing time-scale of BMI control and adaptation. Moreover, while specialized CLDA techniques for intention estimation and assisted training exist, a unified and systematic CLDA framework that generalizes across different setups is lacking. Here we develop a novel closed-loop BMI training architecture that allows for processing, control, and adaptation using spike events, enables robust control and extends to various tasks. Moreover, we develop a unified control-theoretic CLDA framework within which intention estimation, assisted training, and adaptation are performed. The architecture incorporates an infinite-horizon optimal feedback-control (OFC) model of the brain's behavior in closed-loop BMI control, and a point process model of spikes. The OFC model infers the user's motor intention during CLDA-a process termed intention estimation. OFC is also used to design an autonomous and dynamic assisted training technique. The point process model allows for neural processing, control and decoder adaptation with every spike event and at a faster time-scale than current decoders; it also enables dynamic spike-event-based parameter adaptation unlike current CLDA methods that use batch-based adaptation on much slower adaptation time-scales. We conducted closed-loop experiments in a non-human primate over tens of days to dissociate the effects of these novel CLDA components. The OFC intention estimation improved BMI performance compared with current intention estimation techniques. OFC assisted training allowed the subject to consistently achieve proficient control. Spike-event-based adaptation resulted in faster and more consistent performance convergence compared with batch-based methods, and was robust to parameter
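    The spike-event-based adaptation idea can be illustrated generically: for a log-linear point-process rate model, the parameters can be updated by stochastic gradient ascent on the point-process log-likelihood at every time bin. The sketch below (Python, synthetic data) is an illustration of that principle only, not the authors' CLDA algorithm, and all rates and learning constants are invented.

```python
import numpy as np

# Toy spike-event-based adaptation: rate model lambda_t = exp(w . x_t),
# updated by stochastic gradient ascent on the point-process log-likelihood.
rng = np.random.default_rng(2)
dt, T, lr = 0.005, 200_000, 0.05
w_true = np.array([1.2, -0.7, 0.4])
w = np.zeros(3)

for _ in range(T):
    x = np.append(rng.normal(size=2), 1.0)        # kinematic features + bias
    lam = np.exp(w_true @ x)
    spike = rng.random() < lam * dt               # thin a Poisson process
    # Gradient of sum(log(lambda) dN - lambda dt) w.r.t. w is (dN - lambda dt) x.
    w += lr * (spike - np.exp(w @ x) * dt) * x

# The estimate fluctuates around the true weights (constant step size).
print("adapted weights:", np.round(w, 2), "  true:", w_true)
```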

  16. A Process Model for Developing Learning Design Patterns with International Scope

    ERIC Educational Resources Information Center

    Lotz, Nicole; Law, Effie Lai-Chong; Nguyen-Ngoc, Anh Vu

    2014-01-01

    This paper investigates the process of identifying design patterns in international collaborative learning environments. In this context, design patterns are referred to as structured descriptions of best practice with pre-defined sections such as problem, solution and consequences. We pay special attention to how the scope of a design pattern is…

  17. A practical approach for exploration and modeling of the design space of a bacterial vaccine cultivation process.

    PubMed

    Streefland, M; Van Herpen, P F G; Van de Waterbeemd, B; Van der Pol, L A; Beuvery, E C; Tramper, J; Martens, D E; Toft, M

    2009-10-15

    A licensed pharmaceutical process is required to be executed within the validated ranges throughout the lifetime of product manufacturing. Changes to the process, especially for processes involving biological products, usually require the manufacturer to demonstrate through new or additional clinical testing that the safety and efficacy of the product remain unchanged. Recent changes in the regulations for pharmaceutical processing allow broader ranges of process settings to be submitted for regulatory approval, the so-called process design space, which means that a manufacturer can optimize the process within the submitted ranges after the product has entered the market, allowing flexible processes. In this article, the applicability of this concept of the process design space is investigated for the cultivation step of a vaccine against whooping cough. A design of experiments (DoE) is applied to investigate the ranges of critical process parameters that still result in a product that meets specifications. The on-line process data, including near-infrared spectroscopy, are used to build a descriptive model of the processes used in the experimental design. Finally, the data of all processes are integrated in a multivariate batch monitoring model that represents the investigated process design space. This article demonstrates how the general principles of PAT and process design space can be applied to an undefined biological product such as a whole-cell vaccine. The approach chosen for model development described here allows on-line monitoring and control of cultivation batches in order to assure in real time that a process is running within the process design space.

  18. State of the art in pathology business process analysis, modeling, design and optimization.

    PubMed

    Schrader, Thomas; Blobel, Bernd; García-Rojo, Marcial; Daniel, Christel; Słodkowska, Janina

    2012-01-01

    For analyzing current workflows and processes, for improving them, for quality management and quality assurance, for integrating hardware and software components, but also for education, training and communication between experts from different domains, modeling the business processes in a pathology department is indispensable. The authors highlight three main processes in pathology: general diagnostics, cytology diagnostics, and autopsy. In this chapter, those processes are formally modeled and described in detail. Finally, specialized processes such as immunohistochemistry and frozen section are also considered.

  19. Gaussian process based modeling and experimental design for sensor calibration in drifting environments.

    PubMed

    Geng, Zongyu; Yang, Feng; Chen, Xi; Wu, Nianqiang

    2015-09-01

    It remains a challenge to accurately calibrate a sensor subject to environmental drift. The calibration task for such a sensor is to quantify the relationship between the sensor's response and its exposure condition, which is specified by not only the analyte concentration but also the environmental factors such as temperature and humidity. This work developed a Gaussian Process (GP)-based procedure for the efficient calibration of sensors in drifting environments. Adopted as the calibration model, GP is not only able to capture the possibly nonlinear relationship between the sensor responses and the various exposure-condition factors, but also able to provide valid statistical inference for uncertainty quantification of the target estimates (e.g., the estimated analyte concentration of an unknown environment). Built on GP's inference ability, an experimental design method was developed to achieve efficient sampling of calibration data in a batch sequential manner. The resulting calibration procedure, which integrates the GP-based modeling and experimental design, was applied on a simulated chemiresistor sensor to demonstrate its effectiveness and its efficiency over the traditional method.
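    A minimal sketch of the GP-calibration-plus-batch-sequential-design idea follows, assuming a hypothetical chemiresistor response and a simple maximum-posterior-variance batch criterion; the paper's actual design criterion and sensor model may differ.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Exposure condition: (analyte concentration, temperature, humidity).
rng = np.random.default_rng(3)
def sensor(x):  # hypothetical drifting chemiresistor response
    c, t, h = x[:, 0], x[:, 1], x[:, 2]
    return 5 * np.sqrt(c) + 0.08 * (t - 25) + 0.03 * h + rng.normal(0, 0.05, len(x))

bounds = np.array([[0.1, 10], [15, 40], [20, 80]])
X = rng.uniform(bounds[:, 0], bounds[:, 1], size=(10, 3))   # initial batch
y = sensor(X)

# Batch-sequential design: repeatedly add the candidate conditions where the
# GP is least certain, then refit (a simple stand-in for the paper's criterion).
for _ in range(4):
    gp = GaussianProcessRegressor(RBF([2.0, 10.0, 20.0]), alpha=0.05 ** 2,
                                  normalize_y=True).fit(X, y)
    cand = rng.uniform(bounds[:, 0], bounds[:, 1], size=(500, 3))
    _, sd = gp.predict(cand, return_std=True)
    batch = cand[np.argsort(sd)[-5:]]          # 5 most uncertain conditions
    X, y = np.vstack([X, batch]), np.append(y, sensor(batch))

print(f"calibration set size: {len(X)}; max predictive sd over candidates: {sd.max():.3f}")
```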

  20. Gaussian process based modeling and experimental design for sensor calibration in drifting environments

    PubMed Central

    Geng, Zongyu; Yang, Feng; Chen, Xi; Wu, Nianqiang

    2016-01-01

    It remains a challenge to accurately calibrate a sensor subject to environmental drift. The calibration task for such a sensor is to quantify the relationship between the sensor’s response and its exposure condition, which is specified by not only the analyte concentration but also the environmental factors such as temperature and humidity. This work developed a Gaussian Process (GP)-based procedure for the efficient calibration of sensors in drifting environments. Adopted as the calibration model, GP is not only able to capture the possibly nonlinear relationship between the sensor responses and the various exposure-condition factors, but also able to provide valid statistical inference for uncertainty quantification of the target estimates (e.g., the estimated analyte concentration of an unknown environment). Built on GP’s inference ability, an experimental design method was developed to achieve efficient sampling of calibration data in a batch sequential manner. The resulting calibration procedure, which integrates the GP-based modeling and experimental design, was applied on a simulated chemiresistor sensor to demonstrate its effectiveness and its efficiency over the traditional method. PMID:26924894

  1. Thermal system design and modeling of meniscus controlled silicon growth process for solar applications

    NASA Astrophysics Data System (ADS)

    Wang, Chenlei

    The direct conversion of solar radiation to electricity by photovoltaics has a number of significant advantages as an electricity generator: solar photovoltaic conversion systems tap an inexhaustible resource which is free of charge and available anywhere in the world. Roofing tile photovoltaic generation, for example, saves excess thermal heat and preserves the local heat balance, meaning that a considerable reduction of thermal pollution in densely populated city areas can be attained. A semiconductor can convert with good efficiency only photons with energy near its band gap; silicon is not at the maximum efficiency but relatively close to it. The main photovoltaic materials include single- and poly-crystalline silicon, ribbon silicon, crystalline thin-film silicon, amorphous silicon, copper indium diselenide and related compounds, and cadmium telluride. In this dissertation, we focus on melt growth of single- and poly-crystalline silicon manufactured by the Czochralski (Cz) crystal growth process, and on ribbon silicon produced by the edge-defined film-fed growth (EFG) process. These two methods are the most commonly used techniques for growing photovoltaic semiconductors. For each crystal growth process, we introduce the growth mechanism, growth system design, general applications, and progress in numerical simulation. Simulation results are shown for both Czochralski and EFG systems, including the temperature distribution of the growth system, the velocity field inside the silicon melt, and the electromagnetic field for the EFG growth system. A magnetic field is applied to the Cz system to reduce the melt convection inside the crucible, and this has been simulated in our numerical model. Parametric studies are performed through numerical and analytical models to investigate the relationship between heater power levels and solidification interface movement and shape. An inverse problem control scheme is developed to

  2. The PhOCoe Model--ergonomic pattern mapping in participatory design processes.

    PubMed

    Silva e Santos, Marcello

    2012-01-01

    The discipline and practice of human factors and ergonomics is quite rich in terms of the availability of analysis, development and evaluation tools and methods for its various processes. However, we lack effective instruments to comprehensively map or regulate cognitive, organizational and, especially, environmental impacts. Moreover, when ergonomic transformation through design - such as a new workstation design or even an entire new facility - is at play, ergonomics professionals tend to stay at bay, relying solely on design professionals and engineers. There is vast empirical evidence showing that the participation of ergonomists as project facilitators may contribute to an effective professional synergy among the various stakeholders in a multidisciplinary venue. When that happens, everyone wins - users and designers alike - because conflicts raised in the midst of option selection are dissipated in exchange for more convergent design alternatives. This paper presents a method for participatory design, in which users are encouraged to actively participate in the whole design process by sharing their real work activities with the design team. The negotiated results inferred from the ergonomic action and translated into a new design are then compiled into an "Ergonomic Pattern Manual". This handbook of ergonomics-oriented design guidelines contains essential guidelines to be consulted in recurrent design project situations in which similar patterns might be used. The main drive is simple: nobody knows better than the workers themselves what an adequate workplace design solution (equipment, workstation, office layout) should be.

  3. Standardization Process for Space Radiation Models Used for Space System Design

    NASA Technical Reports Server (NTRS)

    Barth, Janet; Daly, Eamonn; Brautigam, Donald

    2005-01-01

    The space system design community has three concerns related to models of the radiation belts and plasma: 1) AP-8 and AE-8 models are not adequate for modern applications; 2) Data that have become available since the creation of AP-8 and AE-8 are not being fully exploited for modeling purposes; 3) When new models are produced, there is no authorizing organization identified to evaluate the models or their datasets for accuracy and robustness. This viewgraph presentation provided an overview of the roadmap adopted by the Working Group Meeting on New Standard Radiation Belt and Space Plasma Models.

  4. A mathematical examination of the press model for atmospheric turbulence. [aircraft design/random processes

    NASA Technical Reports Server (NTRS)

    Sidwell, K.

    1975-01-01

    The random process used to model atmospheric turbulence in aircraft response problems is examined. The first, second, and higher-order probability density and characteristic functions are developed. The concepts of the Press model lead to an approximate procedure for the analysis of the response of linear dynamic systems to a class of non-Gaussian random processes. The Press model accounts for both the Gaussian and non-Gaussian forms of measured turbulence data. The nonstationary aspects of measured data are explicitly described by the transition properties of the random process. The effects of the distribution of the intensity process upon calculated exceedances are examined. It is concluded that the Press model with a Gaussian intensity distribution gives a conservative prediction of limit load values.
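    The product structure of a Press-type model - a Gaussian core modulated by a slowly varying random intensity - and its effect on exceedance counts can be illustrated with a few lines of simulation. All parameters below (Rayleigh intensity, patch length, threshold) are illustrative choices, not those of the report.

```python
import numpy as np

# Press-type turbulence sketch: a zero-mean Gaussian process modulated by a
# slowly varying random intensity yields a non-Gaussian product process whose
# exceedance rates are compared against a constant-intensity Gaussian.
rng = np.random.default_rng(4)
n, block = 2_000_000, 2000                     # samples; intensity patch length

gauss = rng.normal(size=n)                     # unit-variance Gaussian core
sigma = rng.rayleigh(scale=1.0, size=n // block).repeat(block)
press = sigma * gauss                          # product process

level = 4.0                                    # exceedance threshold (std units)
for name, proc in [("constant-sigma Gaussian", gauss), ("Press (Rayleigh sigma)", press)]:
    rate = np.mean(np.abs(proc) > level * proc.std())
    print(f"{name:>24s}: P(|y| > {level} sd) = {rate:.2e}")
```

    The heavier tails of the product process show up as a markedly higher exceedance rate at the same normalized level, which is the behavior that makes the intensity distribution matter for limit-load predictions.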

  5. Optimized aerodynamic design process for subsonic transport wing fitted with winglets. [wind tunnel model

    NASA Technical Reports Server (NTRS)

    Kuhlman, J. M.

    1979-01-01

    The aerodynamic design of a wind-tunnel model of a wing representative of that of a subsonic jet transport aircraft, fitted with winglets, was performed using two recently developed optimal wing-design computer programs. Both potential flow codes use a vortex lattice representation of the near-field of the aerodynamic surfaces for determination of the required mean camber surfaces for minimum induced drag, and both codes use far-field induced drag minimization procedures to obtain the required spanloads. One code uses a discrete vortex wake model for this far-field drag computation, while the second uses a 2-D advanced panel wake model. Wing camber shapes for the two codes are very similar, but the resulting winglet camber shapes differ widely. Design techniques and considerations for these two wind-tunnel models are detailed, including a description of the necessary modifications of the design geometry to format it for use by a numerically controlled machine for the actual model construction.

  6. Using a logical information model-driven design process in healthcare.

    PubMed

    Cheong, Yu Chye; Bird, Linda; Tun, Nwe Ni; Brooks, Colleen

    2011-01-01

    A hybrid standards-based approach has been adopted in Singapore to develop a Logical Information Model (LIM) for healthcare information exchange. The Singapore LIM uses a combination of international standards, including ISO13606-1 (a reference model for electronic health record communication), ISO21090 (healthcare datatypes), SNOMED CT (healthcare terminology) and HL7 v2 (healthcare messaging). This logic-based design approach also incorporates mechanisms for achieving bi-directional semantic interoperability.

  7. Rapid Processing of Turner Designs Model 10-Au-005 Internally Logged Fluorescence Data

    EPA Science Inventory

    Continuous recording of dye fluorescence using field fluorometers at selected sampling sites facilitates acquisition of real-time dye tracing data. The Turner Designs Model 10-AU-005 field fluorometer allows for frequent fluorescence readings, data logging, and easy downloading t...

  8. Self-optimisation and model-based design of experiments for developing a C–H activation flow process

    PubMed Central

    Echtermeyer, Alexander; Amar, Yehia; Zakrzewski, Jacek

    2017-01-01

    A recently described C(sp3)–H activation reaction to synthesise aziridines was used as a model reaction to demonstrate the methodology of developing a process model using model-based design of experiments (MBDoE) and self-optimisation approaches in flow. The two approaches are compared in terms of experimental efficiency. The self-optimisation approach required the least number of experiments to reach the specified objectives of cost and product yield, whereas the MBDoE approach enabled a rapid generation of a process model. PMID:28228856

  9. Self-optimisation and model-based design of experiments for developing a C-H activation flow process.

    PubMed

    Echtermeyer, Alexander; Amar, Yehia; Zakrzewski, Jacek; Lapkin, Alexei

    2017-01-01

    A recently described C(sp3)-H activation reaction to synthesise aziridines was used as a model reaction to demonstrate the methodology of developing a process model using model-based design of experiments (MBDoE) and self-optimisation approaches in flow. The two approaches are compared in terms of experimental efficiency. The self-optimisation approach required the least number of experiments to reach the specified objectives of cost and product yield, whereas the MBDoE approach enabled a rapid generation of a process model.
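    As a rough illustration of the self-optimisation loop, the sketch below drives a derivative-free optimizer (Nelder-Mead here; the study's own optimizer may differ) against a hypothetical reactor response that stands in for the assayed yield. The response surface, cost term, and starting point are all invented for illustration.

```python
import numpy as np
from scipy.optimize import minimize

# Self-optimisation sketch: the optimizer proposes flow conditions, and the
# "reactor" (a hypothetical response surface standing in for the assayed
# yield) returns an objective combining product yield and operating cost.
def reactor_yield(temp_C, res_time_min):
    return 0.85 * np.exp(-((temp_C - 120) / 25) ** 2) * (1 - np.exp(-res_time_min / 3))

def objective(x):
    temp, tau = x
    cost = 0.01 * temp + 0.05 * tau            # hypothetical operating-cost term
    return -(reactor_yield(temp, tau) - cost)  # minimize negative (yield - cost)

result = minimize(objective, x0=[100.0, 2.0], method="Nelder-Mead",
                  options={"xatol": 0.5, "fatol": 1e-3})
print(f"optimum at T = {result.x[0]:.1f} C, tau = {result.x[1]:.1f} min "
      f"after {result.nfev} experiments")
```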

  10. A process: development of a model multiculturalism curriculum designed for mobility across geographic borders.

    PubMed

    Frels, L; Scott, J; Schramm, M A

    1997-08-01

    The Council on Accreditation Project, Nurse Anesthesia Educational Requirements and Mobility Between North American Free Trade Agreement (NAFTA) Countries, has as one of its outcomes the development of a model curriculum that would minimize educational barriers for mobility of nurse anesthetists across NAFTA geographical borders with a focus on the blending of professional and technical expertise with issues of human diversity and/or cultural differences. The overall long-term outcome of the project is to test a process. The manuscript discusses the process used in year III of the project to integrate cultural concepts into a nurse anesthesia model curriculum.

  11. Strategy for design NIR calibration sets based on process spectrum and model space: An innovative approach for process analytical technology.

    PubMed

    Cárdenas, V; Cordobés, M; Blanco, M; Alcalà, M

    2015-10-10

    The pharmaceutical industry is under stringent regulations on the quality control of its products, which is critical for both the production process and consumer safety. According to the framework of "process analytical technology" (PAT), a complete understanding of the process and stepwise monitoring of manufacturing are required. Near-infrared spectroscopy (NIRS) combined with chemometrics has lately proven efficient, useful and robust for pharmaceutical analysis. One crucial step in developing effective NIRS-based methodologies is selecting an appropriate calibration set to construct models affording accurate predictions. In this work, we developed calibration models for a pharmaceutical formulation during its three manufacturing stages: blending, compaction and coating. A novel methodology is proposed for selecting the calibration set - the "process spectrum" - into which physical changes in the samples at each stage are algebraically incorporated. Also, we established a "model space" defined by Hotelling's T² and Q-residuals statistics for outlier identification - inside/outside the defined space - in order to objectively select the factors to be used in calibration set construction. The results obtained confirm the efficacy of the proposed methodology for stepwise pharmaceutical quality control, and the relevance of the study as a guideline for the implementation of this easy and fast methodology in the pharma industry.
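    The "model space" idea can be sketched as follows: fit a PCA model on calibration spectra and flag samples whose Hotelling's T² or Q residual (squared prediction error) falls outside control limits. The spectra below are synthetic and the limits are simple empirical quantiles; parametric limits would normally be used in practice.

```python
import numpy as np
from sklearn.decomposition import PCA

# Sketch of the "model space": fit PCA on calibration spectra and flag
# samples whose Hotelling T2 or Q residual falls outside the space.
rng = np.random.default_rng(5)
spectra = rng.normal(size=(60, 200)) @ rng.normal(size=(200, 200)) * 0.01 \
          + np.sin(np.linspace(0, 3, 200))     # synthetic NIR-like spectra

pca = PCA(n_components=3).fit(spectra)
scores = pca.transform(spectra)
t2 = np.sum(scores ** 2 / scores.var(axis=0, ddof=1), axis=1)  # Hotelling's T2
resid = spectra - pca.inverse_transform(scores)
q = np.sum(resid ** 2, axis=1)                                 # Q (SPE) residual

# Simple empirical 95% control limits (parametric limits used in practice).
t2_lim, q_lim = np.quantile(t2, 0.95), np.quantile(q, 0.95)
outside = (t2 > t2_lim) | (q > q_lim)
print(f"{outside.sum()} of {len(spectra)} samples fall outside the model space")
```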

  12. Incorporating Eco-Evolutionary Processes into Population Models:Design and Applications

    EPA Science Inventory

    Eco-evolutionary population models are powerful new tools for exploring how evolutionary processes influence plant and animal population dynamics and vice-versa. The need to manage for climate change and other dynamic disturbance regimes is creating a demand for the incorporation of...

  13. Quality-by-Design: Multivariate Model for Multicomponent Quantification in Refining Process of Honey

    PubMed Central

    Li, Xiaoying; Wu, Zhisheng; Feng, Xin; Liu, Shanshan; Yu, Xiaojie; Ma, Qun; Qiao, Yanjiang

    2017-01-01

    Objective: A method for rapid analysis of the refining process of honey was developed based on near-infrared (NIR) spectroscopy. Methods: Partial least squares calibration models were built for the four components after selection of the optimal spectral pretreatment method and number of latent factors. Results: The models covered samples of different temperatures and time points, and were therefore robust and universal. Conclusions: These results highlight that NIR technology can extract information about the critical process and provide essential process knowledge of the honey refining process. PMID:28216906
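    A minimal PLS calibration sketch in the same spirit follows, assuming synthetic NIR-like spectra and choosing the number of latent factors by cross-validated RMSE; the paper's actual spectra, components, and pretreatments are not reproduced here.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# PLS calibration sketch: predict one quality attribute (e.g. a sugar content)
# from synthetic spectra; latent-factor count chosen by cross-validation RMSE.
rng = np.random.default_rng(6)
conc = rng.uniform(10, 60, size=80)                      # reference values
wl = np.linspace(0, 1, 150)
spectra = np.outer(conc, np.exp(-((wl - 0.4) / 0.1) ** 2)) \
          + rng.normal(0, 0.5, size=(80, 150))           # peak + noise

for n_lv in (1, 2, 4, 8):
    pred = cross_val_predict(PLSRegression(n_components=n_lv), spectra, conc, cv=10)
    rmsecv = np.sqrt(np.mean((pred.ravel() - conc) ** 2))
    print(f"{n_lv} latent factors: RMSECV = {rmsecv:.2f}")
```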

  14. Design of RTDA controller for industrial process using SOPDT model with minimum or non-minimum zero.

    PubMed

    Anbarasan, K; Srinivasan, K

    2015-07-01

    This research paper focuses on the design and development of simplified RTDA control law computation formulae for a SOPDT process with a minimum or non-minimum zero. The RTDA control scheme consists of three main components, namely process output prediction, model prediction update and control action computation. A systematic approach to computing these three components for a SOPDT process with a minimum or non-minimum zero is developed in this paper. The design, implementation and performance evaluation of the developed controller are demonstrated via simulation examples. The closed-loop equation, block diagram representation and theoretical stability derivation for the RTDA controller are developed. The performance of the proposed controller is compared with IMC, SPC, MPC and PID controllers, and it is demonstrated on an industrial non-linear CSTR process.

  15. Testing the Theoretical Design of a Health Risk Message: Reexamining the Major Tenets of the Extended Parallel Process Model

    ERIC Educational Resources Information Center

    Gore, Thomas D.; Bracken, Cheryl Campanella

    2005-01-01

    This study examined the fear control/danger control responses that are predicted by the Extended Parallel Process Model (EPPM). In a campaign designed to inform college students about the symptoms and dangers of meningitis, participants were given either a high-threat/no-efficacy or high-efficacy/no-threat health risk message, thus testing the…

  16. Model Guided Design and Development Process for an Electronic Health Record Training Program.

    PubMed

    He, Ze; Marquard, Jenna; Henneman, Elizabeth

    2016-01-01

    Effective user training is important to ensure electronic health record (EHR) implementation success. Though many previous studies report best practice principles and success and failure stories, current EHR training is largely empirically-based and often lacks theoretical guidance. In addition, the process of training development is underemphasized and underreported. A white paper by the American Medical Informatics Association called for models of user training for clinical information system implementation; existing instructional development models from learning theory provide a basis to meet this call. We describe in this paper our experiences and lessons learned as we adapted several instructional development models to guide our development of EHR user training. Specifically, we focus on two key aspects of this training development: training content and training process.

  17. Model Guided Design and Development Process for an Electronic Health Record Training Program

    PubMed Central

    He, Ze; Marquard, Jenna; Henneman, Elizabeth

    2016-01-01

    Effective user training is important to ensure electronic health record (EHR) implementation success. Though many previous studies report best practice principles and success and failure stories, current EHR training is largely empirically-based and often lacks theoretical guidance. In addition, the process of training development is underemphasized and underreported. A white paper by the American Medical Informatics Association called for models of user training for clinical information system implementation; existing instructional development models from learning theory provide a basis to meet this call. We describe in this paper our experiences and lessons learned as we adapted several instructional development models to guide our development of EHR user training. Specifically, we focus on two key aspects of this training development: training content and training process. PMID:28269940

  18. Enhancing the Design Process for Complex Space Systems through Early Integration of Risk and Variable-Fidelity Modeling

    NASA Technical Reports Server (NTRS)

    Mavris, Dimitri; Osburg, Jan

    2005-01-01

    An important enabler of the new national Vision for Space Exploration is the ability to rapidly and efficiently develop optimized concepts for the manifold future space missions that this effort calls for. The design of such complex systems requires a tight integration of all the engineering disciplines involved, in an environment that fosters interaction and collaboration. The research performed under this grant explored areas where the space systems design process can be enhanced: by integrating risk models into the early stages of the design process, and by including rapid-turnaround variable-fidelity tools for key disciplines. Enabling early assessment of mission risk will allow designers to perform trades between risk and design performance during the initial design space exploration. Entry into planetary atmospheres will require an increased emphasis on the critical disciplines of aero- and thermodynamics. This necessitates pulling EDL disciplinary expertise forward into the early stages of the design process. Radiation can have a large potential impact on overall mission designs, in particular for the planned nuclear-powered robotic missions under Project Prometheus and for long-duration manned missions to the Moon, Mars and beyond under Project Constellation. This requires that radiation and the associated risks and hazards be assessed and mitigated at the earliest stages of the design process. Hence, RPS is another discipline needed to enhance the engineering competencies of conceptual design teams. Researchers collaborated closely with NASA experts in those disciplines, and in overall space systems design, at Langley Research Center and at the Jet Propulsion Laboratory. This report documents the results of this initial effort.

  19. Paediatric CT protocol optimisation: a design of experiments to support the modelling and optimisation process.

    PubMed

    Rani, K; Jahnen, A; Noel, A; Wolf, D

    2015-07-01

    In the last decade, several studies have emphasised the need to understand and optimise computed tomography (CT) procedures in order to reduce the radiation dose applied to paediatric patients. To evaluate the influence of the technical parameters on radiation dose and image quality, a statistical model has been developed using the design of experiments (DOE) method, which has been used successfully in various fields (industry, biology and finance) and is here applied to CT procedures for the abdomen of paediatric patients. A Box-Behnken DOE was used in this study. Three mathematical models (contrast-to-noise ratio, noise and CTDIvol), each depending on three factors (tube current, tube voltage and level of iterative reconstruction), were developed and validated. They will serve as a basis for the development of a CT protocol optimisation model.
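    For three factors, the Box-Behnken layout and the full quadratic response model it supports can be reproduced in a few lines. The factor names and fitted coefficients below are illustrative inventions, not the study's models.

```python
import numpy as np
from itertools import combinations

# Box-Behnken design built directly: all (+/-1, +/-1) combinations on each
# pair of factors with the remaining factor at its midpoint, plus center points.
def box_behnken(k, n_center=3):
    runs = []
    for i, j in combinations(range(k), 2):
        for a in (-1, 1):
            for b in (-1, 1):
                row = [0] * k
                row[i], row[j] = a, b
                runs.append(row)
    runs += [[0] * k] * n_center
    return np.array(runs, dtype=float)

X = box_behnken(3)                 # columns: kV, mAs, iterative-recon level
print(X.shape)                     # (15, 3): 12 edge runs + 3 center points

# Fit a full quadratic response model y ~ 1 + x + x^2 + pairwise interactions
# to a synthetic response (coefficients invented for illustration).
y = 1 - 0.3 * X[:, 0] + 0.5 * X[:, 1] - 0.2 * X[:, 1] ** 2 \
    + 0.1 * X[:, 0] * X[:, 2] + np.random.default_rng(7).normal(0, 0.02, len(X))
terms = np.column_stack([np.ones(len(X)), X, X ** 2,
                         X[:, [0]] * X[:, [1]], X[:, [0]] * X[:, [2]],
                         X[:, [1]] * X[:, [2]]])
coef, *_ = np.linalg.lstsq(terms, y, rcond=None)
print(np.round(coef, 2))           # 10 coefficients from 15 runs
```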

  20. Signal Processing Suite Design

    NASA Technical Reports Server (NTRS)

    Sahr, John D.; Mir, Hasan; Morabito, Andrew; Grossman, Matthew

    2003-01-01

    Our role in this project was to participate in the design of the signal processing suite to analyze plasma density measurements on board a small constellation (3 or 4) of satellites in Low Earth Orbit. As we are new to spacecraft experiments, one of the challenges was simply to gain an understanding of the quantity of data which would flow from the satellites, and possibly to interact with the design teams in generating optimal sampling patterns. For example, as the fleet of satellites was intended to fly through the same volume of space (displaced slightly in time and space), the bulk plasma structure should be common among the spacecraft. Therefore, an optimal, limited-bandwidth data downlink would take advantage of this commonality. Also, motivated by techniques in ionospheric radar, we hoped to investigate the possibility of employing aperiodic sampling in order to gain access to a wider spatial spectrum without suffering aliasing in k-space.

  1. Designer's unified cost model

    NASA Technical Reports Server (NTRS)

    Freeman, William T.; Ilcewicz, L. B.; Swanson, G. D.; Gutowski, T.

    1992-01-01

    A conceptual and preliminary designers' cost prediction model has been initiated. The model will provide a technically sound method for evaluating the relative cost of different composite structural designs, fabrication processes, and assembly methods that can be compared to equivalent metallic parts or assemblies. The feasibility of developing cost prediction software in a modular form for interfacing with state-of-the-art preliminary design tools and computer-aided design programs is being evaluated. The goal of this task is to establish theoretical cost functions that relate geometric design features to summed material cost and labor content in terms of process mechanics and physics. The output of the designers' present analytical tools will be input for the designers' cost prediction model to provide the designer with a database and deterministic cost methodology that allows one to trade and synthesize designs with both cost and weight as objective functions for optimization. The approach, goals, plans, and progress are presented for the development of COSTADE (Cost Optimization Software for Transport Aircraft Design Evaluation).

  2. Designers' unified cost model

    NASA Technical Reports Server (NTRS)

    Freeman, W.; Ilcewicz, L.; Swanson, G.; Gutowski, T.

    1992-01-01

    The Structures Technology Program Office (STPO) at NASA LaRC has initiated development of a conceptual and preliminary designers' cost prediction model. The model will provide a technically sound method for evaluating the relative cost of different composite structural designs, fabrication processes, and assembly methods that can be compared to equivalent metallic parts or assemblies. The feasibility of developing cost prediction software in a modular form for interfacing with state-of-the-art preliminary design tools and computer aided design programs is being evaluated. The goal of this task is to establish theoretical cost functions that relate geometric design features to summed material cost and labor content in terms of process mechanics and physics. The output of the designers' present analytical tools will be input for the designers' cost prediction model to provide the designer with a database and deterministic cost methodology that allows one to trade and synthesize designs with both cost and weight as objective functions for optimization. This paper presents the team members, approach, goals, plans, and progress to date for development of COSTADE (Cost Optimization Software for Transport Aircraft Design Evaluation).

  3. Prior Design for Dependent Dirichlet Processes: An Application to Marathon Modeling

    PubMed Central

    Pradier, Melanie F.; Ruiz, Francisco J. R.; Perez-Cruz, Fernando

    2016-01-01

    This paper presents a novel application of Bayesian nonparametrics (BNP) for marathon data modeling. We make use of two well-known BNP priors, the single-p dependent Dirichlet process and the hierarchical Dirichlet process, in order to address two different problems. First, we study the impact of age, gender and environment on the runners’ performance. We derive a fair grading method that allows direct comparison of runners regardless of their age and gender. Unlike current grading systems, our approach is based not only on top world records, but on the performances of all runners. The presented methodology for comparison of densities can be adopted in many other applications straightforwardly, providing an interesting perspective to build dependent Dirichlet processes. Second, we analyze the running patterns of the marathoners in time, obtaining information that can be valuable for training purposes. We also show that these running patterns can be used to predict finishing time given intermediate interval measurements. We apply our models to New York City, Boston and London marathons. PMID:26821155
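    As background for the two BNP priors mentioned above, the sketch below draws from a single Dirichlet process via its (truncated) stick-breaking construction, with an illustrative Gaussian base measure; the dependent and hierarchical extensions used in the paper build on this basic object.

```python
import numpy as np

# Truncated stick-breaking construction of G ~ DP(alpha, H), the building
# block of the dependent and hierarchical Dirichlet processes.
rng = np.random.default_rng(10)
alpha, n_atoms = 2.0, 200

v = rng.beta(1, alpha, size=n_atoms)                       # stick proportions
w = v * np.concatenate(([1.0], np.cumprod(1 - v)[:-1]))    # atom weights
atoms = rng.normal(180, 25, size=n_atoms)                  # draws from base H
                                                           # (e.g. finishing times, min)
draws = rng.choice(atoms, p=w / w.sum(), size=1000)        # samples from G
print(f"{(w > 0.01).sum()} atoms carry >1% mass; sample mean = {draws.mean():.0f}")
```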

  4. Estimation of design space for an extrusion-spheronization process using response surface methodology and artificial neural network modelling.

    PubMed

    Sovány, Tamás; Tislér, Zsófia; Kristó, Katalin; Kelemen, András; Regdon, Géza

    2016-09-01

    The application of the Quality by Design principles is one of the key issues of recent pharmaceutical developments. In the past decade a lot of knowledge has been collected about the practical realization of the concept, but there are still many unanswered questions. The key requirement of the concept is the mathematical description of the effect of the critical factors and their interactions on the critical quality attributes (CQAs) of the product. The process design space (PDS) is usually determined by the use of design of experiments (DoE) based response surface methodologies (RSM), but inaccuracies in the applied polynomial models often result in over- or underestimation of the real trends and changes, making the calculations uncertain, especially in the edge regions of the PDS. Completing RSM with artificial neural network (ANN) based models is therefore a commonly used method to reduce the uncertainties. Nevertheless, since most studies focus on the use of a given DoE, there is a lack of comparative studies on different experimental layouts. The aim of the present study was therefore to investigate the effect of different DoE layouts (2-level full factorial, central composite, Box-Behnken, 3-level fractional and 3-level full factorial designs) on model predictability and to compare model sensitivities according to the organization of the experimental data set. It was revealed that the size of the design space could differ by more than 40% when calculated with different polynomial models, which was associated with a considerable shift in its position when higher-level layouts were applied. The shift was more considerable when the calculation was based on RSM. Model predictability was also better with ANN-based models. Nevertheless, both modelling methods exhibit considerable sensitivity to the organization of the experimental data set, and the use of design layouts in which the extreme factor levels are better represented is recommended.
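    The RSM-versus-ANN comparison can be sketched generically: fit a quadratic polynomial and a small neural network to the same DoE data and compare predictions in the edge regions of the space. The response surface below is synthetic, chosen to have curvature a quadratic misses near the edges; it is not the paper's formulation data.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.neural_network import MLPRegressor

# Quadratic RSM vs. small ANN on the same DoE data (synthetic response).
rng = np.random.default_rng(8)
X = rng.uniform(-1, 1, size=(27, 2))                   # stand-in DoE points
y = np.tanh(3 * X[:, 0]) + 0.5 * X[:, 1] ** 2 + rng.normal(0, 0.02, 27)

rsm = make_pipeline(PolynomialFeatures(2), LinearRegression()).fit(X, y)
ann = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=20_000,
                   random_state=0).fit(X, y)

edge = np.array([[0.95, 0.95], [-0.95, 0.9], [0.9, -0.95]])  # edge-of-space checks
truth = np.tanh(3 * edge[:, 0]) + 0.5 * edge[:, 1] ** 2
for name, model in (("RSM", rsm), ("ANN", ann)):
    err = np.abs(model.predict(edge) - truth).max()
    print(f"{name}: worst edge-region error = {err:.3f}")
```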

  5. Model-based control structure design of a full-scale WWTP under the retrofitting process.

    PubMed

    Machado, V C; Lafuente, J; Baeza, J A

    2015-01-01

    The anoxic-oxic (A/O) municipal wastewater treatment plant (WWTP) of Manresa (Catalonia, Spain) was studied for a possible conversion to an anaerobic/anoxic/oxic (A2/O) configuration to promote enhanced biological phosphorus removal. The control structure had to be redesigned to satisfy the new need to control phosphorus concentration, besides ammonium and nitrate concentrations (the main pollutant concentrations). To this end, decentralized control structures with proportional-integral-derivative (PID) controllers and centralized control structures with model predictive controllers (MPC) were designed and tested. All the designed control structures had their performance systematically tested with regard to effluent quality and operating costs. The centralized control structure, A2/O-3-MPC, achieved the lowest operating costs with the best effluent quality using the A2/O plant configuration for the Manresa WWTP. The controlled variables used in this control structure were ammonium in the effluent, nitrate at the end of the anoxic zone and phosphate at the end of the anaerobic zone, while the manipulated variables were the internal and external recycle flow rates and the dissolved oxygen setpoint in the aerobic reactors.
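    A generic receding-horizon MPC step, written with the cvxpy modeling library for a toy first-order plant, shows the kind of finite-horizon optimization solved at each control interval; the actual WWTP models, constraints, and tuning are far richer than this sketch.

```python
import numpy as np
import cvxpy as cp

# Generic linear MPC sketch: track a concentration setpoint for the toy plant
# x[k+1] = A x[k] + B u[k] over a finite horizon, subject to actuator limits.
A, B = np.array([[0.9]]), np.array([[0.1]])
setpoint, horizon = 1.0, 15

x = cp.Variable((1, horizon + 1))
u = cp.Variable((1, horizon))
x0 = cp.Parameter((1,), value=np.array([0.2]))   # current measured state

cost = cp.sum_squares(x[0, 1:] - setpoint) + 0.05 * cp.sum_squares(u)
cons = [x[:, 0] == x0]
for k in range(horizon):
    cons += [x[:, k + 1] == A @ x[:, k] + B @ u[:, k],
             u[:, k] >= 0, u[:, k] <= 5]         # e.g. recycle-flow limits
prob = cp.Problem(cp.Minimize(cost), cons)
prob.solve()

# In closed loop, only the first move is applied and the problem is re-solved
# at the next sampling instant with the newly measured state.
print("first control move:", float(u.value[0, 0]))
```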

  6. Application of Design of Experiments and Surrogate Modeling within the NASA Advanced Concepts Office, Earth-to-Orbit Design Process

    NASA Technical Reports Server (NTRS)

    Zwack, Matthew R.; Dees, Patrick D.; Holt, James B.

    2016-01-01

    Decisions made during early conceptual design can have a profound impact on life-cycle cost (LCC); it is widely accepted that nearly 80% of LCC is committed during these phases, so early design decisions must be well informed. The Advanced Concepts Office (ACO) at Marshall Space Flight Center aids in decision making for launch vehicles by providing rapid-turnaround pre-phase A and phase A studies, supplying the customer with preliminary vehicle sizing information, vehicle feasibility, and expected performance.

  7. Analysis and simulation of industrial distillation processes using a graphical system design model

    NASA Astrophysics Data System (ADS)

    Boca, Maria Loredana; Dobra, Remus; Dragos, Pasculescu; Ahmad, Mohammad Ayaz

    2016-12-01

    The separation column of the experimental installation can be configured in two ways: as a cascade of two columns of different diameters, placed one in the extension of the other, or as a single column of set diameter [1], [2]. The column separates the carbon isotopes by cryogenic distillation of pure carbon monoxide, which is fed at a constant flow rate as a gas through the feeding system [1], [2]. Based on numerical control systems used in virtual instrumentation, simulations of the distillation process were performed in order to obtain the isotope 13C at high concentrations. It is proposed that this installation be controlled, and its data acquired, using a data acquisition tool and professional software that processes information from the isotopic column based on a dedicated logical algorithm. The classical isotopic column will be controlled automatically, and information about the main parameters will be monitored and properly displayed by one program. Considering the very low operating temperature, an efficient thermal-insulation vacuum jacket is necessary. Since the "elementary separation ratio" [2] is very close to unity, in order to raise the 13C isotope concentration up to a desired level a permanent countercurrent of the liquid and gaseous phases of the carbon monoxide is created by the main elements of the equipment: the boiler at the bottom of the column and the condenser at the top.

  8. Exploring business process modelling paradigms and design-time to run-time transitions

    NASA Astrophysics Data System (ADS)

    Caron, Filip; Vanthienen, Jan

    2016-09-01

    The business process management literature describes a multitude of approaches (e.g. imperative, declarative or event-driven) that each result in a different mix of process flexibility, compliance, effectiveness and efficiency. Although the use of a single approach over the process lifecycle is often assumed, transitions between approaches at different phases in the process lifecycle may also be considered. This article explores several business process strategies by analysing the approaches at different phases in the process lifecycle as well as the various transitions.

  9. Introducing the "Decider" Design Process

    ERIC Educational Resources Information Center

    Prasa, Anthony R., Jr.; Del Guercio, Ryan

    2016-01-01

    Engineers are faced with solving important problems every day and must follow a step-by-step design process to arrive at solutions. Students who are taught an effective design process to apply to engineering projects begin to see problems as an engineer would, consider all ideas, and arrive at the best solution. Using an effective design process…

  10. DESIGNING PROCESSES FOR ENVIRONMENTAL PROBLEMS

    EPA Science Inventory

    Designing for the environment requires consideration of environmental impacts. The Generalized WAR Algorithm is the methodology that allows the user to evaluate the potential environmental impact of the design of a chemical process. In this methodology, chemicals are assigned val...

  11. Understanding the Process Model of Leadership: Follower Attribute Design and Assessment

    ERIC Educational Resources Information Center

    Antelo, Absael; Henderson, Richard L.; St. Clair, Norman

    2010-01-01

    Early leadership studies produced significant research findings that have helped differentiate between leader and follower personal attributes and their consequent behaviors (SEDL, 1992), but little attention was given to the follower's contribution to the leadership process. This study represents a continuation of research by Henderson, Antelo, &…

  12. Model parameters conditioning on regional hydrologic signatures for process-based design flood estimation in ungauged basins.

    NASA Astrophysics Data System (ADS)

    Biondi, Daniela; De Luca, Davide Luciano

    2015-04-01

    The use of rainfall-runoff models represents an alternative to statistical approaches (such as at-site or regional flood frequency analysis) for design flood estimation, and constitutes an answer to the increasing need for synthetic design hydrographs (SDHs) associated with a specific return period. However, the lack of streamflow observations and the consequent high uncertainty associated with parameter estimation usually pose serious limitations to the use of process-based approaches in ungauged catchments, which in contrast represent the majority in practical applications. This work presents the application of a Bayesian procedure that, for a predefined rainfall-runoff model, allows for the assessment of the posterior parameter distribution, using the limited and uncertain information available for the response of an ungauged catchment (Bulygina et al. 2009; 2011). The use of regional estimates of river flow statistics, interpreted as hydrological signatures that measure theoretically relevant system process behaviours (Gupta et al. 2008), within this framework represents a valuable option and has shown significant developments in recent literature to constrain the plausible model response and to reduce the uncertainty in ungauged basins. In this study we rely on the first three L-moments of annual streamflow maxima, for which regressions are available from previous studies (Biondi et al. 2012; Laio et al. 2011). The methodology was carried out for a catchment located in southern Italy, and used within a Monte Carlo scheme (MCs) considering both event-based and continuous simulation approaches for design flood estimation. The applied procedure offers promising perspectives to perform model calibration and uncertainty analysis in ungauged basins; moreover, in the context of design flood estimation, process-based methods coupled with the MCs approach have the advantage of providing uncertainty analysis of simulated floods, which represents an asset in risk-based decision
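
    A schematic, ABC-flavored rendering of the conditioning idea, with a toy annual-maxima generator standing in for the rainfall-runoff model and invented regional L-moment values: prior parameter samples are kept only if their simulated L-moments fall near the regional estimates. This is a sketch of the general technique, not the authors' exact Bayesian scheme.

      # Rejection-style conditioning on regional L-moments (toy example).
      import numpy as np

      rng = np.random.default_rng(1)

      def simulate_annual_maxima(k, n_years=50):
          # toy "rainfall-runoff model": Gumbel-like maxima scaled by parameter k
          return k * rng.gumbel(loc=10.0, scale=3.0, size=n_years)

      def l_moments(x):
          """First two L-moments and L-skewness from probability-weighted moments."""
          x = np.sort(x)
          n = len(x)
          j = np.arange(n)
          b0 = x.mean()
          b1 = np.sum(j * x) / (n * (n - 1))
          b2 = np.sum(j * (j - 1) * x) / (n * (n - 1) * (n - 2))
          l1, l2 = b0, 2 * b1 - b0
          l3 = 6 * b2 - 6 * b1 + b0
          return np.array([l1, l2, l3 / l2])

      regional = np.array([20.0, 3.5, 0.17])    # assumed regional estimates
      prior = rng.uniform(0.5, 4.0, 5000)       # prior samples of parameter k

      accepted = [k for k in prior
                  if np.all(np.abs(l_moments(simulate_annual_maxima(k)) - regional)
                            / regional < 0.10)]  # keep behavioural parameter sets
      print(len(accepted), np.mean(accepted) if accepted else None)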

  13. Teaching Process Design through Integrated Process Synthesis

    ERIC Educational Resources Information Center

    Metzger, Matthew J.; Glasser, Benjamin J.; Patel, Bilal; Hildebrandt, Diane; Glasser, David

    2012-01-01

    The design course is an integral part of chemical engineering education. A novel approach to the design course was recently introduced at the University of the Witwatersrand, Johannesburg, South Africa. The course aimed to introduce students to systematic tools and techniques for setting and evaluating performance targets for processes, as well as…

  14. Modeling of the flow continuum and optimal design of control-oriented injection systems in liquid composite molding processes

    NASA Astrophysics Data System (ADS)

    Gokce, Ali

    Several methodologies are presented in this dissertation that aim to ensure consistently successful filling of the mold cavity during the mold filling stage of Liquid Composite Molding (LCM) processes such as Resin Transfer Molding (RTM), Vacuum Assisted Resin Transfer Molding (VARTM) and Seemann Composites Resin Infusion Molding (SCRIMP). Key parameters that affect the resin flow in the mold cavity can be divided into two main groups: continuum-related parameters and injection-related parameters. Flow continuum, which consists of all the spaces resin can reach in the mold cavity, has two major components: the porous medium, which is made up of the fiber reinforcements, and the flow channels that are introduced into the flow continuum unintentionally and offer an easy flow path to the resin. The properties that characterize the porous medium and the unintentional flow channels are continuum-related parameters. The injection-related parameters include resin injection locations (gates), resin injection conditions and air drainage locations (vents). Modeling the flow continuum is crucial in predicting the resin flow in the mold cavity. In this study, permeability, the key property of the porous medium, is predicted using the Method of Cells, a proven method to predict macroscopic properties of heterogeneous materials. Unintentional flow channels, which are also called racetracking channels, are modeled using a probabilistic approach. Injection-related parameters are the key tools to influence the resin flow in the mold cavity. In this study, Branch and Bound Search is modified for single gate optimization. Due to its pertinence to injection system design, the parameters that govern gate effectiveness in steering the resin advance are studied. A combinatorial search algorithm is proposed for vent optimization. Vent optimization and gate optimization algorithms are integrated for simultaneous gate and vent optimization. Overall, these methodologies reduce the cycle

  15. A Team-Based Process for Designing Comprehensive, Integrated, Three-Tiered (CI3T) Models of Prevention: How Does My School-Site Leadership Team Design a CI3T Model?

    ERIC Educational Resources Information Center

    Lane, Kathleen Lynne; Oakes, Wendy Peia; Jenkins, Abbie; Menzies, Holly Mariah; Kalberg, Jemma Robertson

    2014-01-01

    Comprehensive, integrated, three-tiered models are context specific and developed by school-site teams according to the core values held by the school community. In this article, the authors provide a step-by-step, team-based process for designing comprehensive, integrated, three-tiered models of prevention that integrate academic, behavioral, and…

  16. Computational Modeling and High Performance Computing in Advanced Materials Processing, Synthesis, and Design

    DTIC Science & Technology

    2014-12-07

    The research efforts in this project focused on the synergistic coupling of computational materials science and mechanics of hybrid and lightweight polymeric composite structures, including atomistic modeling in polymer nanocomposite systems.

  17. Reengineering the Project Design Process

    NASA Technical Reports Server (NTRS)

    Casani, E.; Metzger, R.

    1994-01-01

    In response to NASA's goal of working faster, better and cheaper, JPL has developed extensive plans to minimize cost, maximize customer and employee satisfaction, and implement small- and moderate-size missions. These plans include improved management structures and processes, enhanced technical design processes, the incorporation of new technology, and the development of more economical space- and ground-system designs. The Laboratory's new Flight Projects Implementation Office has been chartered to oversee these innovations and the reengineering of JPL's project design process, including establishment of the Project Design Center and the Flight System Testbed. Reengineering at JPL implies a cultural change whereby the character of its design process will change from sequential to concurrent and from hierarchical to parallel. The Project Design Center will support missions offering high science return, design to cost, demonstrations of new technology, and rapid development. Its computer-supported environment will foster high-fidelity project life-cycle development and cost estimating.

  18. Computational design of the basic dynamical processes of the UCLA general circulation model

    NASA Technical Reports Server (NTRS)

    Arakawa, A.; Lamb, V. R.

    1977-01-01

    The 12-layer UCLA general circulation model encompassing troposphere and stratosphere (and superjacent 'sponge layer') is described. Prognostic variables are: surface pressure, horizontal velocity, temperature, water vapor and ozone in each layer, planetary boundary layer (PBL) depth, temperature, moisture and momentum discontinuities at PBL top, ground temperature and water storage, and mass of snow on ground. The selection of space finite-difference schemes for homogeneous incompressible flow with/without a free surface and for nonlinear two-dimensional nondivergent flow, enstrophy-conserving schemes, momentum advection schemes, vertical and horizontal difference schemes, and time-differencing schemes is discussed.

  19. Development of a Model for Measuring Scientific Processing Skills Based on Brain-Imaging Technology: Focused on the Experimental Design Process

    ERIC Educational Resources Information Center

    Lee, Il-Sun; Byeon, Jung-Ho; Kim, Young-shin; Kwon, Yong-Ju

    2014-01-01

    The purpose of this study was to develop a model for measuring experimental design ability based on functional magnetic resonance imaging (fMRI) during biological inquiry. More specifically, the researchers developed an experimental design task that measures experimental design ability. Using the developed experimental design task, they measured…

  20. Process of inorganic nitrogen transformation and design of kinetics model in the biological aerated filter reactor.

    PubMed

    Yan, Gang; Xu, Xia; Yao, Lirong; Lu, Liqiao; Zhao, Tingting; Zhang, Wenyi

    2011-04-01

    As a plug-flow reactor, the biological aerated filter (BAF) was divided into four sampling sections to understand the characteristics of elemental nitrogen transformation during the reaction process, and the different characteristics of elemental nitrogen transformation caused by different NH3-N loadings, biological quantities and activities in each section were obtained. The results showed that the total transformation ratio in the nitrifying reactor was more than 90% in the absence of any organic carbon source; at the same time, more than 65% of the influent NH3-N was nitrified within the lower 70 cm of the filter under the conditions of an influent flow of 9-19 L/h, a gas-water ratio of 4-5:1, dissolved oxygen of 3.0-5.8 mg/L and an NH3-N load of 0.28-0.48 kg NH3-N/(m^3 d). On the basis of the Eckenfelder model, the kinetics equation of the NH3-N transformation along the reactor was S_e = S_0 exp(-0.0134 D / L^1.2612).
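
    The fitted profile can be evaluated directly; the sketch below assumes D is the filter depth and L the loading, following the usual Eckenfelder form, and the example inputs are illustrative rather than taken from the paper.

      # Fitted Eckenfelder-type profile from the abstract:
      # S_e = S_0 * exp(-0.0134 * D / L**1.2612)
      import math

      def effluent_nh3(s0, depth_cm, loading):
          """Predicted NH3-N concentration remaining after a given filter depth."""
          return s0 * math.exp(-0.0134 * depth_cm / loading**1.2612)

      # Example: 40 mg/L influent through 70 cm of media at a unit loading.
      print(effluent_nh3(40.0, 70.0, 1.0))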

  1. Biological neural networks as model systems for designing future parallel processing computers

    NASA Technical Reports Server (NTRS)

    Ross, Muriel D.

    1991-01-01

    One of the more interesting debates of the present day centers on whether human intelligence can be simulated by computer. The author works under the premise that neurons individually are not smart at all. Rather, they are physical units which are impinged upon continuously by other matter that influences the direction of voltage shifts across the units' membranes. It is only through the action of a great many neurons, billions in the case of the human nervous system, that intelligent behavior emerges. What is required to understand even the simplest neural system is painstaking analysis, bit by bit, of the architecture and the physiological functioning of its various parts. The biological neural networks studied, the vestibular utricular and saccular maculas of the inner ear, are among the simplest of the mammalian neural networks to understand and model. While there is still a long way to go to understand even this most simple neural network in sufficient detail for extrapolation to computers and robots, a start has been made. Moreover, the insights obtained and the technologies developed help advance the understanding of the more complex neural networks that underlie human intelligence.

  2. Fully Integrating the Design Process

    SciTech Connect

    T.A. Bjornard; R.S. Bean

    2008-03-01

    The basic approach to designing nuclear facilities in the United States does not currently reflect the routine consideration of proliferation resistance and international safeguards. The fully integrated design process is an approach for bringing consideration of international safeguards and proliferation resistance, together with state safeguards and security, fully into the design process from the very beginning, while integrating them sensibly and synergistically with the other project functions. In view of the recently established GNEP principles agreed to by the United States and at least eighteen other countries, this paper explores such an integrated approach, and its potential to help fulfill the new internationally driven design requirements with improved efficiencies and reduced costs.

  3. Modeling Temporal Processes in Early Spacecraft Design: Application of Discrete-Event Simulations for Darpa's F6 Program

    NASA Technical Reports Server (NTRS)

    Dubos, Gregory F.; Cornford, Steven

    2012-01-01

    While the ability to model the state of a space system over time is essential during spacecraft operations, the use of time-based simulations remains rare in preliminary design. The absence of the time dimension in most traditional early design tools can however become a hurdle when designing complex systems whose development and operations can be disrupted by various events, such as delays or failures. As the value delivered by a space system is highly affected by such events, exploring the trade space for designs that yield the maximum value calls for the explicit modeling of time. This paper discusses the use of discrete-event models to simulate spacecraft development schedules as well as operational scenarios and on-orbit resources in the presence of uncertainty. It illustrates how such simulations can be utilized to support trade studies, through the example of a tool developed for DARPA's F6 program to assist the design of "fractionated spacecraft".
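
    A minimal Monte Carlo discrete-event sketch of the kind of schedule simulation described, with invented task durations and disruption probabilities; it is meant only to show the mechanics, not the F6 tool itself.

      # Monte Carlo schedule simulation with uncertain durations and a
      # possible disruption; all numbers are illustrative.
      import random

      def one_run(rng):
          t = 0.0
          for mean in (12.0, 8.0, 6.0):          # design, build, test (months)
              t += rng.expovariate(1.0 / mean)   # stochastic task duration
          if rng.random() < 0.2:                 # 20% chance of a launch delay
              t += rng.uniform(3.0, 9.0)
          return t

      rng = random.Random(42)
      runs = sorted(one_run(rng) for _ in range(10_000))
      print("median:", runs[len(runs)//2], "90th pct:", runs[int(0.9*len(runs))])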

  4. Ligand modeling and design

    SciTech Connect

    Hay, B.P.

    1997-10-01

    The purpose of this work is to develop and implement a molecular design basis for selecting organic ligands that would be used in the cost-effective removal of specific radionuclides from nuclear waste streams. Organic ligands with metal ion specificity are critical components in the development of solvent extraction and ion exchange processes that are highly selective for targeted radionuclides. The traditional approach to the development of such ligands involves lengthy programs of organic synthesis and testing, which in the absence of reliable methods for screening compounds before synthesis, results in wasted research effort. The author's approach breaks down and simplifies this costly process with the aid of computer-based molecular modeling techniques. Commercial software for organic molecular modeling is being configured to examine the interactions between organic ligands and metal ions, yielding an inexpensive, commercially or readily available computational tool that can be used to predict the structures and energies of ligand-metal complexes. Users will be able to correlate the large body of existing experimental data on structure, solution binding affinity, and metal ion selectivity to develop structural design criteria. These criteria will provide a basis for selecting ligands that can be implemented in separations technologies through collaboration with other DOE national laboratories and private industry. The initial focus will be to select ether-based ligands that can be applied to the recovery and concentration of the alkali and alkaline earth metal ions including cesium, strontium, and radium.

  5. Process simulation and design '94

    SciTech Connect

    Not Available

    1994-06-01

    This first-of-a-kind report describes today's process simulation and design technology for specific applications. It includes process names, diagrams, applications, descriptions, objectives, economics, installations, licensors, and a complete list of process submissions. Processes include: alkylation, aromatics extraction, catalytic reforming, cogeneration, dehydration, delayed coking, distillation, energy integration, catalytic cracking, gas sweetening, glycol/methanol injection, hydrocracking, NGL recovery and stabilization, solvent dewaxing, visbreaking. Equipment simulations include: amine plant, ammonia plant, heat exchangers, cooling water network, crude preheat train, crude unit, ethylene furnace, nitrogen rejection unit, refinery, sulfur plant, and VCM furnace. By-product processes include: olefins, polyethylene terephthalate, and styrene.

  6. The Process Design Courses at Pennsylvania: Impact of Process Simulators.

    ERIC Educational Resources Information Center

    Seider, Warren D.

    1984-01-01

    Describes the use and impact of process design simulators in process design courses. Discusses topics covered, texts used, computer design simulations, and how they are integrated into the process survey course as well as in plant design projects. (JM)

  7. Review on Biomass Torrefaction Process and Product Properties and Design of Moving Bed Torrefaction System Model Development

    SciTech Connect

    Jaya Shankar Tumuluru; Christopher T. Wright; Shahab Sokhansanj

    2011-08-01

    Torrefaction is currently developing as an important preprocessing step to improve the quality of biomass in terms of physical properties and proximate and ultimate composition. Torrefaction is a slow heating of biomass in an inert or reduced environment to a maximum temperature of 300 °C. It can also be defined as a group of products resulting from the partially controlled and isothermal pyrolysis of biomass occurring in the temperature ranges of 200-230 °C and 270-280 °C; thus, the process can also be called mild pyrolysis, as it occurs at the lower temperature range of the pyrolysis process. At the end of the torrefaction process, a solid uniform product with lower moisture content and higher energy content than raw biomass is produced. Most of the smoke-producing compounds and other volatiles are removed during torrefaction, which produces a final product with a lower mass but a higher heating value. There is a lack of literature on the design aspects of torrefaction reactors and of design sheets for estimating the dimensions of a torrefier based on capacity. This study includes (a) a detailed review of the torrefaction of biomass in terms of understanding the process, product properties, off-gas compositions, and methods used, and (b) the design of a moving bed torrefier, taking into account basic fundamental heat and mass transfer calculations. Specific objectives include calculating dimensions such as the diameter and height of the moving packed bed torrefier for different capacities ranging from 25-1000 kg/hr, designing the heat loads and gas flow rates, and

  8. GREENSCOPE: Sustainable Process Modeling

    EPA Science Inventory

    EPA researchers are responding to environmental problems by incorporating sustainability into process design and evaluation. EPA researchers are also developing a tool that allows users to assess modifications to existing and new chemical processes to determine whether changes in...

  9. Human Integration Design Processes (HIDP)

    NASA Technical Reports Server (NTRS)

    Boyer, Jennifer

    2014-01-01

    The purpose of the Human Integration Design Processes (HIDP) document is to provide human-systems integration design processes, including methodologies and best practices that NASA has used to meet human systems and human rating requirements for developing crewed spacecraft. HIDP content is framed around human-centered design methodologies and processes in support of human-system integration requirements and human rating. NASA-STD-3001, Space Flight Human-System Standard, is a two-volume set of National Aeronautics and Space Administration (NASA) Agency-level standards established by the Office of the Chief Health and Medical Officer, directed at minimizing health and performance risks for flight crews in human space flight programs. Volume 1 of NASA-STD-3001, Crew Health, sets standards for fitness for duty, space flight permissible exposure limits, permissible outcome limits, levels of medical care, medical diagnosis, intervention, treatment and care, and countermeasures. Volume 2 of NASA-STD-3001, Human Factors, Habitability, and Environmental Health, focuses on human physical and cognitive capabilities and limitations and defines standards for spacecraft (including orbiters, habitats, and suits), internal environments, facilities, payloads, and related equipment, hardware, and software with which the crew interfaces during space operations. The NASA Procedural Requirements (NPR) 8705.2B, Human-Rating Requirements for Space Systems, specifies the Agency's human-rating processes, procedures, and requirements. The HIDP was written to share NASA's knowledge of processes directed toward achieving human certification of a spacecraft through implementation of human-systems integration requirements. Although the HIDP speaks directly to implementation of NASA-STD-3001 and NPR 8705.2B requirements, the human-centered design, evaluation, and design processes described in this document can be applied to any set of human-systems requirements and are independent of reference

  10. DESIGNING ENVIRONMENTALLY FRIENDLY CHEMICAL PROCESSES

    EPA Science Inventory

    The design of a chemical process involves many aspects: from profitability, flexibility and reliability to safety to the environment. While each of these is important, in this work, the focus will be on profitability and the environment. Key to the study of these aspects is the ...

  11. Interface design in the process industries

    NASA Technical Reports Server (NTRS)

    Beaverstock, M. C.; Stassen, H. G.; Williamson, R. A.

    1977-01-01

    Every operator runs his plant in accord with his own mental model of the process. In this sense, one characteristic of an ideal man-machine interface is that it be in harmony with that model. With this theme in mind, the paper first reviews the functions of the process operator and compares them with human operators involved in control situations previously studied outside the industrial environment (pilots, air traffic controllers, helmsmen, etc.). A brief history of the operator interface in the process industry and the traditional methodology employed in its design is then presented. Finally, a much more fundamental approach utilizing a model definition of the human operator's behavior is presented.

  12. Feasibility study of an Integrated Program for Aerospace vehicle Design (IPAD). Volume 2: The design process

    NASA Technical Reports Server (NTRS)

    Gillette, W. B.; Turner, M. J.; Southall, J. W.; Whitener, P. C.; Kowalik, J. S.

    1973-01-01

    The extent to which IPAD is to support the design process is identified. Case studies of representative aerospace products were developed as models to characterize the design process and to provide design requirements for the IPAD computing system.

  13. ESS Cryogenic System Process Design

    NASA Astrophysics Data System (ADS)

    Arnold, P.; Hees, W.; Jurns, J.; Su, X. T.; Wang, X. L.; Weisend, J. G., II

    2015-12-01

    The European Spallation Source (ESS) is a neutron-scattering facility funded and supported in collaboration with 17 European countries in Lund, Sweden. Cryogenic cooling at ESS is vital particularly for the linear accelerator, the hydrogen target moderators, a test stand for cryomodules, the neutron instruments and their sample environments. The paper will focus on specific process design criteria, design decisions and their motivations for the helium cryoplants and auxiliary equipment. Key issues for all plants and their process concepts are energy efficiency, reliability, smooth turn-down behaviour and flexibility. The accelerator cryoplant (ACCP) and the target moderator cryoplant (TMCP) in particular need to be prepared for a range of refrigeration capacities due to the intrinsic uncertainties regarding heat load definitions. Furthermore the paper addresses questions regarding process arrangement, 2 K cooling methodology, LN2 precooling, helium storage, helium purification and heat recovery.

  14. Distributed Computing for Signal Processing: Modeling of Asynchronous Parallel Computation. Appendix G. On the Design and Modeling of Special Purpose Parallel Processing Systems.

    DTIC Science & Technology

    1985-05-01

    [CDC77a] Control Data Corp., Cyber-Ikon Image Processing System Concepts, Digital Systems Division, Control Data Corp., Minneapolis, MN, Jan. 1977. [CDC77b] Control Data Corp., Cyber-Ikon Flexible Processor Programming Textbook, Digital Systems Division, Control Data Corp., Minneapolis, MN, Nov. 1977. [Che80

  15. Optimal design activated sludge process by means of multi-objective optimization: case study in Benchmark Simulation Model 1 (BSM1).

    PubMed

    Chen, Wenliang; Yao, Chonghua; Lu, Xiwu

    2014-01-01

    Optimal design of the activated sludge process (ASP) using multi-objective optimization was studied, with a benchmark process from Benchmark Simulation Model 1 (BSM1) taken as the target process. The objectives of the study were four indexes: percentage of effluent violation (PEV), overall cost index (OCI), total volume, and total suspended solids, making up four cases for comparative analysis. Models were solved by the non-dominated sorting genetic algorithm in MATLAB. Results show that ineffective solutions can be rejected by adding constraints, and newly added objectives can affect the relationship between the existing objectives; taking Pareto solutions as process parameters, the performance indexes of PEV and OCI can be improved beyond those achieved with the default process parameters of BSM1, especially for N removal and resistance against dynamic NH4+-N in the influent. The results indicate that multi-objective optimization is a useful method for the optimal design of ASP.
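
    The selection step behind such non-dominated sorting can be illustrated compactly: given candidate designs scored on several objectives, keep the rows no other row dominates. The sketch below uses invented objective values for (PEV, OCI, volume), all minimized; it is the Pareto-filter step only, not a full NSGA implementation.

      # Extract the non-dominated (Pareto) set from candidate designs.
      import numpy as np

      def pareto_front(F):
          """Return indices of rows of F not dominated by any other row."""
          keep = []
          for i in range(len(F)):
              dominated = any(np.all(F[j] <= F[i]) and np.any(F[j] < F[i])
                              for j in range(len(F)) if j != i)
              if not dominated:
                  keep.append(i)
          return keep

      F = np.array([[2.1, 100.0, 9000.0],
                    [1.5, 120.0, 9500.0],
                    [2.5,  95.0, 8800.0],
                    [2.2, 130.0, 9900.0]])   # last row is dominated by the first
      print(pareto_front(F))                 # -> [0, 1, 2]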

  16. Planar Inlet Design and Analysis Process (PINDAP)

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Gruber, Christopher R.

    2005-01-01

    The Planar Inlet Design and Analysis Process (PINDAP) is a collection of software tools that allow the efficient aerodynamic design and analysis of planar (two-dimensional and axisymmetric) inlets. The aerodynamic analysis is performed using the Wind-US computational fluid dynamics (CFD) program. A major element in PINDAP is a Fortran 90 code named PINDAP that can establish the parametric design of the inlet and efficiently model the geometry and generate the grid for CFD analysis with design changes to those parameters. The use of PINDAP is demonstrated for subsonic, supersonic, and hypersonic inlets.

  17. Designing cyclic universe models.

    PubMed

    Khoury, Justin; Steinhardt, Paul J; Turok, Neil

    2004-01-23

    The phenomenological constraints on the scalar field potential in cyclic models of the Universe are presented. We show that cyclic models require a comparable degree of tuning to that needed for inflationary models. The constraints are reduced to a set of simple design rules including "fast-roll" parameters analogous to the "slow-roll" parameters in inflation.

  18. Engineering design: A cognitive process approach

    NASA Astrophysics Data System (ADS)

    Strimel, Greg Joseph

    The intent of this dissertation was to identify the cognitive processes used by advanced pre-engineering students to solve complex engineering design problems. Students in technology and engineering education classrooms are often taught to use an ideal engineering design process that has been generated mostly by educators and curriculum developers. However, the review of literature showed that it is unclear as to how advanced pre-engineering students cognitively navigate solving a complex and multifaceted problem from beginning to end. Additionally, it was unclear how a student thinks and acts throughout their design process and how this affects the viability of their solution. Therefore, Research Objective 1 was to identify the fundamental cognitive processes students use to design, construct, and evaluate operational solutions to engineering design problems. Research Objective 2 was to determine identifiers within student cognitive processes for monitoring aptitude to successfully design, construct, and evaluate technological solutions. Lastly, Research Objective 3 was to create a conceptual technological and engineering problem-solving model integrating student cognitive processes for the improved development of problem-solving abilities. The methodology of this study included multiple forms of data collection. The participants were first given a survey to determine their prior experience with engineering and to provide a description of the subjects being studied. The participants were then presented an engineering design challenge to solve individually. While they completed the challenge, the participants verbalized their thoughts using an established "think aloud" method. These verbalizations were captured along with participant observational recordings using point-of-view camera technology. Additionally, the participant design journals, design artifacts, solution effectiveness data, and teacher evaluations were collected for analysis to help achieve the

  19. Automation of Design Engineering Processes

    NASA Technical Reports Server (NTRS)

    Torrey, Glenn; Sawasky, Gerald; Courey, Karim

    2004-01-01

    A method, and a computer program that helps to implement the method, have been developed to automate and systematize the retention and retrieval of all the written records generated during the process of designing a complex engineering system. It cannot be emphasized strongly enough that "all the written records" as used here is meant to be taken literally: it signifies not only final drawings and final engineering calculations but also such ancillary documents as minutes of meetings, memoranda, requests for design changes, approval and review documents, and reports of tests. One important purpose served by the method is to make the records readily available to all involved users via their computer workstations from one computer archive while eliminating the need for voluminous paper files stored in different places. Another important purpose served by the method is to facilitate the work of engineers who are charged with sustaining the system and were not involved in the original design decisions. The method helps the sustaining engineers to retrieve information that enables them to retrace the reasoning that led to the original design decisions, thereby helping them to understand the system better and to make informed engineering choices pertaining to maintenance and/or modifications of the system. The software used to implement the method is written in Microsoft Access. All of the documents pertaining to the design of a given system are stored in one relational database in such a manner that they can be related to each other via a single tracking number.

  20. Investigation on the Flexural Creep Stiffness Behavior of PC-ABS Material Processed by Fused Deposition Modeling Using Response Surface Definitive Screening Design

    NASA Astrophysics Data System (ADS)

    Mohamed, Omar Ahmed; Masood, Syed Hasan; Bhowmik, Jahar Lal

    2017-03-01

    The resistance of polymeric materials to time-dependent plastic deformation is an important requirement of the fused deposition modeling (FDM) design process, its processed products, and their application for long-term loading, durability, and reliability. The creep performance of the material and part processed by FDM is the fundamental criterion for many applications with strict dimensional stability requirements, including medical implants, electrical and electronic products, and various automotive applications. Herein, the effect of FDM fabrication conditions on the flexural creep stiffness behavior of polycarbonate-acrylonitrile-butadiene-styrene processed parts was investigated. A relatively new class of experimental design called "definitive screening design" was adopted for this investigation. The effects of process variables on flexural creep stiffness behavior were monitored, and the best-suited quadratic polynomial model with a high coefficient of determination (R2) value was developed. This study highlights the value of response surface definitive screening design in optimizing properties for the products and materials, and it demonstrates its role and potential application in material processing and additive manufacturing.

  1. Modelling health care processes for eliciting user requirements: a way to link a quality paradigm and clinical information system design.

    PubMed

    Staccini, P; Joubert, M; Quaranta, J F; Fieschi, D; Fieschi, M

    2001-12-01

    Healthcare institutions are looking at ways to increase their efficiency by reducing costs while providing care services with a high level of safety. Thus, hospital information systems have to support quality improvement objectives. The elicitation of the requirements has to meet users' needs in relation to both the quality (efficacy, safety) and the monitoring of all health care activities (traceability). Information analysts need methods to conceptualise clinical information systems that provide actors with individual benefits and guide behavioural changes. A methodology is proposed to elicit and structure users' requirements using a process-oriented analysis, and it is applied to the blood transfusion process. An object-oriented data model of a process has been defined in order to organise the data dictionary. Although some aspects of activity, such as 'where', 'what else', and 'why' are poorly represented by the data model alone, this method of requirement elicitation fits the dynamics of data input for the process to be traced. A hierarchical representation of hospital activities has to be found for the processes to be interrelated, and for their characteristics to be shared, in order to avoid data redundancy and to fit the gathering of data with the provision of care.
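
    A minimal object-oriented sketch of such a process model, with illustrative field names rather than the authors' schema, might organise activities and their traced data items as follows.

      # Process -> activities -> actors and traced data (illustrative schema).
      from dataclasses import dataclass, field
      from typing import List

      @dataclass
      class Activity:
          name: str            # "what"
          actor: str           # "who"
          when: str            # position in the process flow
          data_items: List[str] = field(default_factory=list)  # traced data

      @dataclass
      class Process:
          name: str
          activities: List[Activity] = field(default_factory=list)

      transfusion = Process("blood transfusion", [
          Activity("prescribe units", "physician", "pre-transfusion",
                   ["patient id", "blood group", "units requested"]),
          Activity("administer unit", "nurse", "transfusion",
                   ["unit id", "start time", "vital signs"]),
      ])
      print([a.name for a in transfusion.activities])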

  2. Conceptual models of information processing

    NASA Technical Reports Server (NTRS)

    Stewart, L. J.

    1983-01-01

    The conceptual information processing issues are examined. Human information processing is defined as an active cognitive process that is analogous to a system. It is the flow and transformation of information within a human. The human is viewed as an active information seeker who is constantly receiving, processing, and acting upon the surrounding environmental stimuli. Human information processing models are conceptual representations of cognitive behaviors. Models of information processing are useful in representing the different theoretical positions and in attempting to define the limits and capabilities of human memory. It is concluded that an understanding of conceptual human information processing models and their applications to systems design leads to a better human factors approach.

  3. Modeling nuclear processes by Simulink

    SciTech Connect

    Rashid, Nahrul Khair Alang Md

    2015-04-29

    Modelling and simulation are essential parts of the study of the behaviour of dynamic systems. In nuclear engineering, modelling and simulation are important to assess the expected results of an experiment before the actual experiment is conducted or in the design of nuclear facilities. In education, modelling can give insight into the dynamics of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve the equations using analytical or numerical solutions consume time and distract attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox that is widely used in control engineering, as a modelling platform for the study of nuclear processes including nuclear reactor behaviour. Starting from the describing equations, Simulink models for heat transfer, the radionuclide decay process, the delayed neutron effect, reactor point kinetics equations with delayed neutron groups, and the effect of temperature feedback are used as examples.

  4. Modeling nuclear processes by Simulink

    NASA Astrophysics Data System (ADS)

    Rashid, Nahrul Khair Alang Md

    2015-04-01

    Modelling and simulation are essential parts of the study of the behaviour of dynamic systems. In nuclear engineering, modelling and simulation are important to assess the expected results of an experiment before the actual experiment is conducted or in the design of nuclear facilities. In education, modelling can give insight into the dynamics of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve the equations using analytical or numerical solutions consume time and distract attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox that is widely used in control engineering, as a modelling platform for the study of nuclear processes including nuclear reactor behaviour. Starting from the describing equations, Simulink models for heat transfer, the radionuclide decay process, the delayed neutron effect, reactor point kinetics equations with delayed neutron groups, and the effect of temperature feedback are used as examples.
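
    At script level, the describing equations of one such example, a two-member radionuclide decay chain, can be integrated directly; the sketch below uses Python with SciPy as a stand-in for the Simulink blocks, and the half-lives are illustrative.

      # Two-member decay chain A -> B -> (stable), integrated from the ODEs.
      import numpy as np
      from scipy.integrate import solve_ivp

      lam_a = np.log(2) / 8.02   # parent half-life, days (I-131-like, illustrative)
      lam_b = np.log(2) / 2.35   # hypothetical daughter half-life, days

      def decay(t, n):
          na, nb = n
          return [-lam_a * na, lam_a * na - lam_b * nb]

      sol = solve_ivp(decay, (0.0, 30.0), [1.0, 0.0])
      print(sol.y[:, -1])        # remaining fractions of A and B after 30 days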

  5. Parametric Design within an Atomic Design Process (ADP) applied to Spacecraft Design

    NASA Astrophysics Data System (ADS)

    Ramos Alarcon, Rafael

    This thesis describes research investigating the development of a model for the initial design of complex systems, with application to spacecraft design. The design model is called an atomic design process (ADP) and contains four fundamental stages (specifications, configurations, trade studies and drivers) that constitute the minimum steps of an iterative process that helps designers find a feasible solution. Representative design models from the aerospace industry are reviewed and are compared with the proposed model. The design model's relevance, adaptability and scalability features are evaluated through a focused design task exercise with two undergraduate teams and a long-term design exercise performed by a spacecraft payload team. The implementation of the design model is explained in the context in which the model has been researched. This context includes the organization (a student-run research laboratory at the University of Michigan), its culture (academically oriented), members that have used the design model and the description of the information technology elements meant to provide support while using the model. This support includes a custom-built information management system that consolidates relevant information that is currently being used in the organization. The information is divided in three domains: personnel development history, technical knowledge base and laboratory operations. The focused study with teams making use of the design model to complete an engineering design exercise consists of the conceptual design of an autonomous system, including a carrier and a deployable lander that form the payload of a rocket with an altitude range of over 1000 meters. Detailed results from each of the stages of the design process while implementing the model are presented, and an increase in awareness of good design practices in the teams while using the model are explained. A long-term investigation using the design model consisting of the

  6. Application of central composite design and artificial neural network in modeling of reactive blue 21 dye removal by photo-ozonation process.

    PubMed

    Mehrizad, Ali; Gharbani, Parvin

    2016-01-01

    The present study deals with the use of central composite design (CCD) and artificial neural network (ANN) modeling in the optimization of reactive blue 21 (RB21) removal from aqueous media under a photo-ozonation process. Four effective operational parameters (initial concentration of RB21, O3 concentration, UV light intensity, and reaction time) were chosen, and the experiments were designed by CCD based on response surface methodology (RSM). The results obtained from the CCD model were used in modeling the process by ANN. Under the optimum conditions (O3 concentration of 3.95 mg/L, UV intensity of 20.5 W/m2, reaction time of 7.77 min and initial dye concentration of 40.21 mg/L), RB21 removal efficiency reached up to 98.88%. An ANN topology with three layers, consisting of four input neurons, 14 hidden neurons, and one output neuron, was designed. The relative significance of each major factor was calculated based on the connection weights of the ANN model. Dye and ozone concentrations were the most important variables in the photo-ozonation of RB21, followed by reaction time and UV light intensity. The comparison of the values predicted by CCD and ANN with the experimental results showed that both methods were highly efficient in modeling the process.
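
    The 4-14-1 topology described can be reproduced schematically; the sketch below fits such a network on synthetic data, since the actual CCD experimental data are not reproduced here (Python with scikit-learn; the synthetic response function is invented).

      # Fit a 4-14-1 network of the kind described, on synthetic data.
      import numpy as np
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(0)
      # columns: dye conc., O3 conc., UV intensity, reaction time (scaled 0-1)
      X = rng.uniform(0.0, 1.0, size=(60, 4))
      y = 100.0 * (0.3*X[:, 1] + 0.2*X[:, 2] + 0.4*X[:, 3] - 0.25*X[:, 0] + 0.3)
      y = np.clip(y + rng.normal(0.0, 1.0, 60), 0.0, 100.0)   # removal %

      ann = MLPRegressor(hidden_layer_sizes=(14,), activation="tanh",
                         max_iter=5000, random_state=0).fit(X, y)
      print(ann.predict([[0.5, 0.9, 0.6, 0.8]]))   # predicted removal efficiency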

  7. Exploring the linkage between cell culture process parameters and downstream processing utilizing a plackett-burman design for a model monoclonal antibody.

    PubMed

    Agarabi, Cyrus D; Chavez, Brittany K; Lute, Scott C; Read, Erik K; Rogstad, Sarah; Awotwe-Otoo, David; Brown, Matthew R; Boyne, Michael T; Brorson, Kurt A

    2017-01-01

    Linkage of upstream cell culture with downstream processing and purification is an aspect of Quality by Design crucial for efficient and consistent production of high quality biopharmaceutical proteins. In a previous Plackett-Burman screening study of parallel bioreactor cultures, we evaluated the main effects of 11 process variables, such as agitation, sparge rate, feeding regimens, dissolved oxygen set point, inoculation density, supplement addition, temperature, and pH shifts. In this follow-up study, we observed linkages between cell culture process parameters and downstream capture chromatography performance and subsequent antibody attributes. In-depth analysis of the capture chromatography purification of harvested cell culture fluid yielded significant effects of upstream process parameters on host cell protein abundance and behavior. A variety of methods were used to characterize the antibody both after purification and buffer formulation. This analysis provided insight into the significant impacts of upstream process parameters on aggregate formation, impurities, and protein structure. This report highlights the utility of linkage studies in identifying how changes in upstream parameters can impact downstream critical quality attributes. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 33:163-170, 2017.
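
    For reference, the standard 12-run Plackett-Burman design for screening 11 two-level factors, the size used in the prior screening study, is built from cyclic shifts of a generator row plus an all-minus run; the sketch below constructs it and checks column orthogonality.

      # Standard 12-run Plackett-Burman design for 11 two-level factors.
      import numpy as np

      generator = np.array([+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1])
      rows = [np.roll(generator, k) for k in range(11)]
      rows.append(-np.ones(11, dtype=int))
      design = np.array(rows)            # 12 runs x 11 factors, entries +/-1

      # Orthogonality check: every pair of columns is balanced (X'X = 12 I).
      print((design.T @ design == 12 * np.eye(11)).all())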

  8. Creativity Processes of Students in the Design Studio

    ERIC Educational Resources Information Center

    Huber, Amy Mattingly; Leigh, Katharine E.; Tremblay, Kenneth R., Jr.

    2012-01-01

    The creative process is a multifaceted and dynamic path of thinking required to execute a project in design-based disciplines. The goal of this research was to test a model outlining the creative design process by investigating student experiences in a design project assignment. The study used an exploratory design to collect data from student…

  9. Hardware-software-co-design of parallel and distributed systems using a behavioural programming and multi-process model with high-level synthesis

    NASA Astrophysics Data System (ADS)

    Bosse, Stefan

    2011-05-01

    A new design methodology for parallel and distributed embedded systems is presented using the behavioural hardware compiler ConPro, which provides an imperative programming model based on concurrently communicating sequential processes (CSP) with an extensive set of interprocess-communication primitives and guarded atomic actions. The programming language and the compiler-based synthesis process enable the design of constrained power- and resource-aware embedded systems with pure Register-Transfer-Logic (RTL) efficiently mapped to FPGA and ASIC technologies. Concurrency is modelled explicitly on the control- and datapath level. Additionally, concurrency on the datapath level can be automatically explored and optimized by different schedulers. The CSP programming model can be synthesized to hardware (SoC) and software (C, ML) models and targets. A common source for both hardware and software implementation with identical functional behaviour is used. Processes and objects of the entire design can be distributed on different hardware and software platforms, for example, several FPGA components and software executed on several microprocessors, providing a parallel and distributed system. Intersystem, interprocess, and object communication is automatically implemented with serial links, not visible at the programming level. The presented design methodology has the benefits of high modularity and freedom of choice of target technologies and system architecture. Algorithms can be well matched to and distributed on different suitable execution platforms and implementation technologies, using a unique programming model that provides a balance of concurrency and resource complexity. An extended case study of a communication protocol used in high-density sensor-actuator networks demonstrates and compares the design of hardware and software targets. The communication protocol is suited for high-density intra- and inter-chip networks.
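
    The CSP programming model itself can be sketched at language level: two sequential processes communicating only over a channel. In the sketch below, Python threads and a bounded queue stand in for the synthesized RTL processes and hardware links; it illustrates the model, not ConPro's syntax.

      # Two communicating sequential processes over a typed channel.
      import queue
      import threading

      chan: "queue.Queue[int]" = queue.Queue(maxsize=1)   # rendezvous-like channel

      def producer():
          for sample in range(5):
              chan.put(sample)        # blocking send
          chan.put(-1)                # end-of-stream marker

      def consumer(out):
          while True:
              v = chan.get()          # blocking receive
              if v < 0:
                  break
              out.append(v * v)       # some per-sample computation

      results: list = []
      t1 = threading.Thread(target=producer)
      t2 = threading.Thread(target=consumer, args=(results,))
      t1.start(); t2.start(); t1.join(); t2.join()
      print(results)                  # [0, 1, 4, 9, 16]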

  10. Modeling of pulverized coal combustion processes in a vortex furnace of improved design. Part 1: Flow aerodynamics in a vortex furnace

    NASA Astrophysics Data System (ADS)

    Krasinsky, D. V.; Salomatov, V. V.; Anufriev, I. S.; Sharypov, O. V.; Shadrin, E. Yu.; Anikin, Yu. A.

    2015-02-01

    Some results of the complex experimental and numerical study of aerodynamics and transfer processes in a vortex furnace, whose design was improved via the distributed tangential injection of fuel-air flows through the upper and lower burners, were presented. The experimental study of the aerodynamic characteristics of a spatial turbulent flow was performed on the isothermal laboratory model (at a scale of 1 : 20) of an improved vortex furnace using a laser Doppler measurement system. The comparison of experimental data with the results of the numerical modeling of an isothermal flow for the same laboratory furnace model demonstrated their agreement to be acceptable for engineering practice.

  11. The watershed-scale optimized and rearranged landscape design (WORLD) model and local biomass processing depots for sustainable biofuel production: Integrated life cycle assessments

    SciTech Connect

    Eranki, Pragnya L.; Manowitz, David H.; Bals, Bryan D.; Izaurralde, Roberto C.; Kim, Seungdo; Dale, Bruce E.

    2013-07-23

    An array of feedstocks is being evaluated as potential raw material for cellulosic biofuel production. Thorough assessments are required in regional landscape settings before these feedstocks can be cultivated and sustainable management practices can be implemented. On the processing side, a potential solution to the logistical challenges of large biorefineries is provided by a network of distributed processing facilities called local biomass processing depots. A large-scale cellulosic ethanol industry is likely to emerge soon in the United States, and we have the opportunity to influence the sustainability of this emerging industry. The watershed-scale optimized and rearranged landscape design (WORLD) model estimates land allocations for different cellulosic feedstocks at biorefinery scale without displacing current animal nutrition requirements. This model also incorporates a network of the aforementioned depots. An integrated life cycle assessment is then conducted over the unified system of optimized feedstock production, processing, and associated transport operations to evaluate net energy yields (NEYs) and environmental impacts.

  12. SU-E-T-760: Tolerance Design for Site-Specific Range in Proton Patient QA Process Using the Six Sigma Model

    SciTech Connect

    Lah, J; Shin, D; Kim, G

    2015-06-15

    Purpose: To show how tolerance design and tolerancing approaches can be used to predict and improve the site-specific range in the proton patient QA process when implementing Six Sigma. Methods: In this study, patient QA plans were selected according to six site-treatment groups: head & neck (94 cases), spine (76 cases), lung (89 cases), liver (53 cases), pancreas (55 cases), and prostate (121 cases), treated between 2007 and 2013. We evaluated a Six Sigma model that determines allowable deviations in design parameters and process variables in patient-specific QA; where possible, tolerance may be loosened and then customized if necessary to meet the functional requirements. The Six Sigma problem-solving methodology is known by its DMAIC phases, which stand for: Define a problem or improvement opportunity, Measure process performance, Analyze the process to determine the root causes of poor performance, Improve the process by fixing root causes, and Control the improved process to hold the gains. Results: The process capability for patient-specific range QA is 0.65 with a tolerance criterion of only ±1 mm. Our results suggested a tolerance level of ±2-3 mm for prostate and liver cases and ±5 mm for lung cases. We found that customized tolerances between calculated and measured range reduced patient QA plan failures, and almost all sites had failure rates less than 1%. The average QA time also improved from 2 hr to less than 1 hr for all sites, including the planning and converting process, depth-dose measurement, and evaluation. Conclusion: The objective of tolerance design is to achieve optimization beyond that obtained through QA process improvement and statistical analysis function detailing to implement a Six Sigma-capable design.
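
    The capability computation underlying such a tolerance decision can be sketched as follows, with an invented error distribution standing in for the clinical range data (calculated minus measured range, in mm) and a symmetric ±T tolerance band.

      # Cp/Cpk against candidate tolerance bands; numbers are illustrative.
      import numpy as np

      def cp_cpk(errors, tol):
          mu, sigma = np.mean(errors), np.std(errors, ddof=1)
          cp = (2 * tol) / (6 * sigma)                 # band width / 6 sigma
          cpk = min(tol - mu, tol + mu) / (3 * sigma)  # accounts for off-center mean
          return cp, cpk

      rng = np.random.default_rng(3)
      errors = rng.normal(0.2, 0.5, 120)      # mm, hypothetical site distribution
      for tol in (1.0, 2.0, 3.0):             # candidate tolerances, mm
          print(tol, cp_cpk(errors, tol))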

  13. Biosphere Process Model Report

    SciTech Connect

    J. Schmitt

    2000-05-25

    To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor

  14. Design of intelligent controllers for exothermal processes

    NASA Astrophysics Data System (ADS)

    Nagarajan, Ramachandran; Yaacob, Sazali

    2001-10-01

    Chemical industries, such as resin or soap manufacturing, have reaction systems that work with at least two chemicals. Mixing of chemicals, even at room temperature, can initiate an exothermic reaction, which produces a sudden increase of heat energy within the mixture. The quantity of heat and the dynamics of heat generation are unknown, unpredictable, and time varying. Proper control of heat has to be accomplished in order to achieve a high quality product. Uncontrolled or poorly controlled heat yields an unusable product, and the process may damage materials and systems; even human beings may be harmed. Control of the heat due to an exothermic reaction cannot be achieved using conventional control methods such as PID control, identification and control, etc. All of the conventional methods require at least an approximate mathematical model of the exothermic process, and modeling an exothermal process is yet to be properly conceived. This paper discusses a design methodology for controlling such a process. A pilot plant of a reaction system has been constructed and utilized for designing and incorporating the proposed fuzzy logic based intelligent controller. Both the conventional and then an adaptive form of fuzzy logic control were used in testing the performance. The test results ensure the effectiveness of the controllers in controlling exothermic heat.
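
    A minimal two-rule fuzzy controller in the spirit described, mapping temperature error to coolant valve opening with triangular memberships and centroid defuzzification; the membership ranges and rules are invented for illustration.

      # Two-rule Mamdani-style fuzzy controller sketch.
      import numpy as np

      def tri(x, a, b, c):
          """Triangular membership function peaking at b."""
          return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

      def coolant_valve(error_c):
          """error_c = measured temperature minus setpoint, in deg C."""
          hot  = tri(error_c, 0.0, 10.0, 20.0)      # rule 1: if HOT  -> OPEN
          warm = tri(error_c, -5.0, 0.0, 10.0)      # rule 2: if WARM -> HALF
          u = np.linspace(0.0, 1.0, 101)            # valve-opening universe
          agg = np.maximum(np.minimum(hot,  tri(u, 0.6, 1.0, 1.4)),
                           np.minimum(warm, tri(u, 0.1, 0.5, 0.9)))
          s = agg.sum()
          return float((agg * u).sum() / s) if s > 0 else 0.0  # centroid

      print(coolant_valve(12.0))   # strongly open for a large exotherm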

  15. Design Thinking in Elementary Students' Collaborative Lamp Designing Process

    ERIC Educational Resources Information Center

    Kangas, Kaiju; Seitamaa-Hakkarainen, Pirita; Hakkarainen, Kai

    2013-01-01

    Design and Technology education is potentially a rich environment for successful learning, if the management of the whole design process is emphasised, and students' design thinking is promoted. The aim of the present study was to unfold the collaborative design process of one team of elementary students, in order to understand their multimodal…

  16. Using an Analogical Thinking Model as an Instructional Tool to Improve Student Cognitive Ability in Architecture Design Learning Process

    ERIC Educational Resources Information Center

    Wu, Yun-Wu; Weng, Kuo-Hua

    2013-01-01

    Lack of creativity is a problem often plaguing students from design-related departments. Therefore, this study is intended to incorporate analogical thinking in the education of architecture design to enhance students' learning and their future career performance. First, this study explores the three aspects of architecture design curricula,…

  17. Microwave sintering process model.

    PubMed

    Peng, Hu; Tinga, W R; Sundararaj, U; Eadie, R L

    2003-01-01

    In order to simulate and optimize the microwave sintering of silicon nitride and tungsten carbide/cobalt toolbits, a microwave sintering process model has been built. A cylindrical sintering furnace was used, containing a heat-insulating layer, a susceptor layer, and an alumina tube holding the green toolbit parts between parallel, electrically conductive graphite plates. Dielectric and absorption properties of the silicon nitride green parts, the tungsten carbide/cobalt green parts, and an oxidizable susceptor material were measured using perturbation and waveguide transmission methods. Microwave absorption data were measured over a temperature range from 20 degrees C to 800 degrees C. These data were then used in the microwave process model, which assumed plane wave propagation along the radial direction and included the microwave reflection at each interface between the materials and the microwave absorption in the bulk materials. Heat transfer between the components inside the cylindrical sintering furnace was also included in the model. The simulated heating process data for both silicon nitride and tungsten carbide/cobalt samples closely follow the experimental data. By varying the physical parameters of the sintering furnace model, such as the thickness of the susceptor layer, the thickness of the alumina tube wall, the sample load volume, and the graphite plate mass, the model predicts their effects, which is helpful in optimizing those parameters in the industrial sintering process.
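
    The plane-wave power balance at the heart of such a model can be sketched in a few lines: each interface reflects a fraction of the incident power according to the refractive-index mismatch, and each layer attenuates the transmitted power exponentially. The layer properties below are placeholders, not the measured data of the paper.

        # Sketch of a 1-D plane-wave power balance through concentric layers.
        # All property values are illustrative placeholders.
        import math

        # (refractive index n, absorption coefficient alpha [1/m], thickness [m])
        layers = [
            (1.0, 0.0, 0.000),    # entry medium (air gap)
            (2.0, 5.0, 0.010),    # heat-insulating layer
            (3.5, 60.0, 0.005),   # susceptor
            (3.1, 2.0, 0.008),    # alumina tube wall
            (2.8, 40.0, 0.020),   # green compact (load)
        ]

        def absorbed_power(p_in, layers):
            """Power absorbed in each layer for a single inward pass."""
            absorbed, p = [], p_in
            for i in range(1, len(layers)):
                n1 = layers[i - 1][0]
                n2, alpha, d = layers[i]
                refl = ((n1 - n2) / (n1 + n2)) ** 2   # normal-incidence reflectance
                p_t = p * (1.0 - refl)                # power entering layer i
                p_out = p_t * math.exp(-alpha * d)    # Beer-Lambert decay in the bulk
                absorbed.append(p_t - p_out)
                p = p_out
            return absorbed

        print(absorbed_power(1000.0, layers))  # watts deposited per layer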

  18. Foam process models.

    SciTech Connect

    Moffat, Harry K.; Noble, David R.; Baer, Thomas A.; Adolf, Douglas Brian; Rao, Rekha Ranjana; Mondy, Lisa Ann

    2008-09-01

    In this report, we summarize our work on developing a production level foam processing computational model suitable for predicting the self-expansion of foam in complex geometries. The model is based on a finite element representation of the equations of motion, with the movement of the free surface represented using the level set method, and has been implemented in SIERRA/ARIA. An empirically based time- and temperature-dependent density model is used to encapsulate the complex physics of foam nucleation and growth in a numerically tractable model. The change in density with time is at the heart of the foam self-expansion, as it creates the motion of the foam. This continuum-level model uses a homogenized description of foam, which does not include the gas explicitly. Results from the model are compared to temperature-instrumented flow visualization experiments giving the location of the foam front as a function of time for our EFAR model system.

  19. ESS Accelerator Cryoplant Process Design

    NASA Astrophysics Data System (ADS)

    Wang, X. L.; Arnold, P.; Hees, W.; Hildenbeutel, J.; Weisend, J. G., II

    2015-12-01

    The European Spallation Source (ESS) is a neutron-scattering facility being built with extensive international collaboration in Lund, Sweden. The ESS accelerator will deliver protons with 5 MW of power to the target at 2.0 GeV, with a nominal current of 62.5 mA. The superconducting part of the accelerator is about 300 meters long and contains 43 cryomodules. The ESS accelerator cryoplant (ACCP) will provide the cooling for the cryomodules and the cryogenic distribution system that delivers the helium to the cryomodules. The ACCP will cover three cryogenic circuits: bath cooling for the cavities at 2 K, the thermal shields at around 40 K, and the power couplers' thermalisation with 4.5 K forced helium cooling. The open competitive bid for the ACCP took place in 2014, with Linde Kryotechnik AG selected as the vendor. This paper summarizes the progress in the ACCP development and engineering. The current status is presented, including final cooling requirements, preliminary process design, system configuration, machine concept and layout, main parameters and features, the solution for the acceptance tests, and exergy analysis and efficiency.

  20. Optimal design of solidification processes

    NASA Technical Reports Server (NTRS)

    Dantzig, Jonathan A.; Tortorelli, Daniel A.

    1991-01-01

    An optimal design algorithm is presented for the analysis of general solidification processes, and is demonstrated for the growth of GaAs crystals in a Bridgman furnace. The system is optimal in the sense that the prespecified temperature distribution in the solidifying materials is obtained to maximize product quality. The optimization uses traditional numerical programming techniques which require the evaluation of cost and constraint functions and their sensitivities. The finite element method is incorporated to analyze the crystal solidification problem, evaluate the cost and constraint functions, and compute the sensitivities. These techniques are demonstrated in the crystal growth application by determining an optimal furnace wall temperature distribution to obtain the desired temperature profile in the crystal, and hence to maximize the crystal's quality. Several numerical optimization algorithms are studied to determine the proper convergence criteria, effective 1-D search strategies, appropriate forms of the cost and constraint functions, etc. In particular, we incorporate the conjugate gradient and quasi-Newton methods for unconstrained problems. The efficiency and effectiveness of each algorithm are presented in the example problem.
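
    A toy analogue of this loop, with a linear map standing in for the finite element thermal model, can be run with SciPy's conjugate-gradient and quasi-Newton solvers; everything below is an assumption made for illustration.

        # Choose furnace wall temperatures so a modeled crystal profile matches
        # a prescribed target; compare CG and BFGS. The linear map A is a
        # stand-in for the finite element model, not the paper's analysis.
        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(0)
        n_wall, n_crystal = 8, 25
        A = rng.uniform(0.0, 1.0, (n_crystal, n_wall))
        A /= A.sum(axis=1, keepdims=True)        # crystal nodes average wall temps
        target = np.linspace(1500.0, 1250.0, n_crystal)  # desired profile [K]

        def cost(t_wall):
            resid = A @ t_wall - target
            return 0.5 * resid @ resid

        def grad(t_wall):                        # analytic sensitivities
            return A.T @ (A @ t_wall - target)

        t0 = np.full(n_wall, 1400.0)
        for method in ("CG", "BFGS"):            # conjugate gradient vs quasi-Newton
            res = minimize(cost, t0, jac=grad, method=method)
            print(method, "final cost:", res.fun, "iterations:", res.nit)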

  1. Computer-aided software development process design

    NASA Technical Reports Server (NTRS)

    Lin, Chi Y.; Levary, Reuven R.

    1989-01-01

    The authors describe an intelligent tool designed to aid managers of software development projects in planning, managing, and controlling the development process of medium- to large-scale software projects. Its purpose is to reduce uncertainties in the budget, personnel, and schedule planning of software development projects. It is based on a dynamic model for the software development and maintenance life-cycle process. This dynamic process is composed of a number of time-varying, interacting developmental phases, each characterized by its intended functions and requirements. System dynamics is used as a modeling methodology. The resulting Software LIfe-Cycle Simulator (SLICS) and the hybrid expert simulation system of which it is a subsystem are described.

  2. "From the Formal to the Innovative": The Use of Case Studies and Sustainable Projects in Developing a Design Process Model for Educating Product/Industrial Designers

    ERIC Educational Resources Information Center

    Oakes, G. L.; Felton, A. J.; Garner, K. B.

    2006-01-01

    The BSc in computer aided product design (CAPD) course at the University of Wolverhampton was conceived as a collaborative venture in 1989 between the School of Engineering and the School of Art and Design. The award was at the forefront of forging interdisciplinary collaboration at undergraduate level in the field of product design. It has…

  3. Design of forging process variables under uncertainties

    NASA Astrophysics Data System (ADS)

    Repalle, Jalaja; Grandhi, Ramana V.

    2005-02-01

    Forging is a complex nonlinear process that is vulnerable to various manufacturing anomalies, such as variations in billet geometry, billet/die temperatures, material properties, and workpiece and forging equipment positional errors. A combination of these uncertainties could induce heavy manufacturing losses through premature die failure, final part geometric distortion, and reduced productivity. Identifying, quantifying, and controlling the uncertainties will reduce variability risk in a manufacturing environment, which will minimize the overall production cost. In this article, various uncertainties that affect the forging process are identified, and their cumulative effect on the forging tool life is evaluated. Because the forging process simulation is time-consuming, a response surface model is used to reduce computation time by establishing a relationship between the process performance and the critical process variables. A robust design methodology is developed by incorporating reliability-based optimization techniques to obtain sound forging components. A case study of an automotive-component forging-process design is presented to demonstrate the applicability of the method.
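
    The response-surface step can be sketched as an ordinary least-squares fit of a quadratic surrogate over two process variables; the synthetic "simulation" outputs below stand in for the expensive forging analysis, and the variable names are assumptions.

        # Fit a quadratic response surface linking two process variables to a
        # performance measure (e.g., die stress). Synthetic data only.
        import numpy as np

        rng = np.random.default_rng(1)
        temp = rng.uniform(900.0, 1200.0, 30)    # billet temperature [C]
        speed = rng.uniform(5.0, 20.0, 30)       # die speed [mm/s]
        stress = (400 - 0.2 * temp + 8.0 * speed + 1e-4 * temp**2
                  + rng.normal(0.0, 2.0, 30))    # stand-in simulation output

        # Quadratic basis: 1, T, v, T^2, v^2, T*v
        X = np.column_stack([np.ones_like(temp), temp, speed,
                             temp**2, speed**2, temp * speed])
        coef, *_ = np.linalg.lstsq(X, stress, rcond=None)

        def surrogate(T, v):
            """Cheap surrogate usable inside a reliability-based optimizer."""
            return np.array([1.0, T, v, T**2, v**2, T * v]) @ coef

        print(surrogate(1000.0, 10.0))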

  4. An Integrative Model for Teaching Research Design.

    ERIC Educational Resources Information Center

    Packard, Richard D.; Dereshiwsky, Mary I.

    This paper presents a model which illustrates the cyclical and interactive nature of the basic elements of the research design process. Rather than presenting each research design component in isolation, the model emphasizes their interrelationships. A brief discussion is presented on each of the following components of the model: (1) the "words"…

  5. Design Expert's Participation in Elementary Students' Collaborative Design Process

    ERIC Educational Resources Information Center

    Kangas, Kaiju; Seitamaa-Hakkarainen, Pirita; Hakkarainen, Kai

    2013-01-01

    The main goal of the present study was to provide insights into how disciplinary expertise might be infused into Design and Technology classrooms and how authentic processes based on professional design practices might be constructed. We describe elementary students' collaborative lamp designing process, where the leadership was provided by a…

  6. Gaps in the Design Process

    SciTech Connect

    Veers, Paul

    2016-10-04

    The design of offshore wind plants is a relatively new field. The move into U.S. waters will bring unique environmental conditions, as well as expectations from the authorities responsible for managing the development. Wind turbine designs are required to have their assumed design conditions checked against the site conditions of the plant. There are still outstanding issues regarding how to assure that the designs of both the turbine and the foundation are appropriate for the site and will carry an acceptable level of risk for the particular installation.

  7. Reengineering the JPL Spacecraft Design Process

    NASA Technical Reports Server (NTRS)

    Briggs, C.

    1995-01-01

    This presentation describes the factors that have emerged in the evolved process of reengineering the unmanned spacecraft design process at the Jet Propulsion Laboratory in Pasadena, California. Topics discussed include: New facilities, new design factors, new system-level tools, complex performance objectives, changing behaviors, design integration, leadership styles, and optimization.

  8. Graphic Design in Libraries: A Conceptual Process

    ERIC Educational Resources Information Center

    Ruiz, Miguel

    2014-01-01

    Providing successful library services requires efficient and effective communication with users; therefore, it is important that content creators who develop visual materials understand key components of design and, specifically, develop a holistic graphic design process. Graphic design, as a form of visual communication, is the process of…

  9. Launch Vehicle Design Process: Characterization, Technical Integration, and Lessons Learned

    NASA Technical Reports Server (NTRS)

    Blair, J. C.; Ryan, R. S.; Schutzenhofer, L. A.; Humphries, W. R.

    2001-01-01

    Engineering design is a challenging activity for any product. Since launch vehicles are highly complex and interconnected and have extreme energy densities, their design represents a challenge of the highest order. The purpose of this document is to delineate and clarify the design process associated with the launch vehicle for space flight transportation. The goal is to define and characterize a baseline for the space transportation design process. This baseline can be used as a basis for improving effectiveness and efficiency of the design process. The baseline characterization is achieved via compartmentalization and technical integration of subsystems, design functions, and discipline functions. First, a global design process overview is provided in order to show responsibility, interactions, and connectivity of overall aspects of the design process. Then design essentials are delineated in order to emphasize necessary features of the design process that are sometimes overlooked. Finally the design process characterization is presented. This is accomplished by considering project technical framework, technical integration, process description (technical integration model, subsystem tree, design/discipline planes, decision gates, and tasks), and the design sequence. Also included in the document are a snapshot relating to process improvements, illustrations of the process, a survey of recommendations from experienced practitioners in aerospace, lessons learned, references, and a bibliography.

  10. Estimation of environment-related properties of chemicals for design of sustainable processes: development of group-contribution+ (GC+) property models and uncertainty analysis.

    PubMed

    Hukkerikar, Amol Shivajirao; Kalakul, Sawitree; Sarup, Bent; Young, Douglas M; Sin, Gürkan; Gani, Rafiqul

    2012-11-26

    The application of the developed property models for the estimation of environment-related properties, and of the uncertainties of the estimated property values, is highlighted through an illustrative example. The developed property models provide reliable estimates of environment-related properties needed to perform process synthesis, design, and analysis of sustainable chemical processes, and allow one to evaluate the effect of uncertainties in estimated property values on the calculated performance of processes, giving useful insights into the quality and reliability of the design of sustainable processes.

  11. Three-dimensional model of the skull and the cranial bones reconstructed from CT scans designed for rapid prototyping process.

    PubMed

    Skrzat, Janusz; Spulber, Alexandru; Walocha, Jerzy

    2016-01-01

    This paper presents the effects of building mesh models of the human skull and the cranial bones from a series of CT scans. With the aid of computer software, 3D reconstructions of the whole skull and segmented cranial bones were performed and visualized by surface rendering techniques. The article briefly discusses clinical and educational applications of 3D cranial models created using stereolithographic reproduction.

  12. NASA System Engineering Design Process

    NASA Technical Reports Server (NTRS)

    Roman, Jose

    2011-01-01

    This slide presentation reviews NASA's use of systems engineering for the complete life cycle of a project. Systems engineering is a methodical, disciplined approach for the design, realization, technical management, operations, and retirement of a system. Each phase of a NASA project is terminated with a Key Decision Point (KDP), which is supported by major reviews.

  13. The Architectural and Interior Design Planning Process.

    ERIC Educational Resources Information Center

    Cohen, Elaine

    1994-01-01

    Explains the planning process in designing effective library facilities and discusses library building requirements that result from electronic information technologies. Highlights include historical structures; Americans with Disabilities Act; resource allocation; electrical power; interior spaces; lighting; design development; the roles of…

  14. Biomimetic design processes in architecture: morphogenetic and evolutionary computational design.

    PubMed

    Menges, Achim

    2012-03-01

    Design computation has profound impact on architectural design methods. This paper explains how computational design enables the development of biomimetic design processes specific to architecture, and how they need to be significantly different from established biomimetic processes in engineering disciplines. The paper first explains the fundamental difference between computer-aided and computational design in architecture, as the understanding of this distinction is of critical importance for the research presented. Thereafter, the conceptual relation and possible transfer of principles from natural morphogenesis to design computation are introduced and the related developments of generative, feature-based, constraint-based, process-based and feedback-based computational design methods are presented. This morphogenetic design research is then related to exploratory evolutionary computation, followed by the presentation of two case studies focusing on the exemplary development of spatial envelope morphologies and urban block morphologies.

  15. A Model of Continuous Improvement in High Schools: A Process for Research, Innovation Design, Implementation, and Scale

    ERIC Educational Resources Information Center

    Cohen-Vogel, Lora; Cannata, Marisa; Rutledge, Stacey A.; Socol, Allison Rose

    2016-01-01

    This chapter describes a model for continuous improvement that guides the work of the National Center on Scaling Up Effective Schools, or NCSU. NCSU is a research and development center funded by the Institute for Education Sciences, the research arm of the United States Department of Education. At the core of the Center's work is an innovative…

  16. Modeling hyporheic zone processes

    USGS Publications Warehouse

    Runkel, Robert L.; McKnight, Diane M.; Rajaram, Harihar

    2003-01-01

    Stream biogeochemistry is influenced by the physical and chemical processes that occur in the surrounding watershed. These processes include the mass loading of solutes from terrestrial and atmospheric sources, the physical transport of solutes within the watershed, and the transformation of solutes due to biogeochemical reactions. Research over the last two decades has identified the hyporheic zone as an important part of the stream system in which these processes occur. The hyporheic zone may be loosely defined as the porous areas of the stream bed and stream bank in which stream water mixes with shallow groundwater. Exchange of water and solutes between the stream proper and the hyporheic zone has many biogeochemical implications, due to differences in the chemical composition of surface and groundwater. For example, surface waters are typically oxidized environments with relatively high dissolved oxygen concentrations. In contrast, reducing conditions are often present in groundwater systems leading to low dissolved oxygen concentrations. Further, microbial oxidation of organic materials in groundwater leads to supersaturated concentrations of dissolved carbon dioxide relative to the atmosphere. Differences in surface and groundwater pH and temperature are also common. The hyporheic zone is therefore a mixing zone in which there are gradients in the concentrations of dissolved gasses, the concentrations of oxidized and reduced species, pH, and temperature. These gradients lead to biogeochemical reactions that ultimately affect stream water quality. Due to the complexity of these natural systems, modeling techniques are frequently employed to quantify process dynamics.
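
    One widely used quantification reduces the stream-storage exchange to a pair of coupled rate equations; the minimal sketch below, with illustrative (not site-specific) parameter values, shows a well-mixed stream segment exchanging solute with a hyporheic storage zone at a first-order rate.

        # Minimal transient-storage sketch: stream box exchanging solute with
        # a storage zone. Parameter values are illustrative, not site data.
        import numpy as np
        from scipy.integrate import solve_ivp

        alpha = 1e-4     # exchange coefficient [1/s]
        ratio = 0.3      # storage-to-stream cross-sectional area ratio As/A

        def rhs(t, y):
            c, cs = y                            # stream, storage concentrations
            return [alpha * (cs - c),            # stream gains from storage
                    (alpha / ratio) * (c - cs)]  # mirror term scaled by A/As

        # Slug of solute in the stream, initially clean storage zone.
        sol = solve_ivp(rhs, (0.0, 12 * 3600.0), [1.0, 0.0], dense_output=True)
        t = np.linspace(0.0, 12 * 3600.0, 5)
        print(sol.sol(t))   # both concentrations relax toward equilibrium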

  17. Radiolysis Process Model

    SciTech Connect

    Buck, Edgar C.; Wittman, Richard S.; Skomurski, Frances N.; Cantrell, Kirk J.; McNamara, Bruce K.; Soderquist, Chuck Z.

    2012-07-17

    Assessing the performance of spent (used) nuclear fuel in a geological repository requires quantification of time-dependent phenomena that may influence its behavior on a time-scale up to millions of years. A high-level waste repository environment will be a dynamic redox system because of the time-dependent generation of radiolytic oxidants and reductants and the corrosion of Fe-bearing canister materials. One major difference between used fuel and natural analogues, including unirradiated UO2, is the intense radiolytic field. The radiation emitted by used fuel can produce radiolysis products in the presence of water vapor or a thin film of water (including OH• and H• radicals, O2-, eaq, H2O2, H2, and O2) that may increase the waste form degradation rate and change radionuclide behavior. Because H2O2 is the dominant oxidant for spent nuclear fuel in an O2-depleted water environment, the most sensitive parameters have been identified with respect to predictions of a radiolysis model under typical conditions. Compared with the full model of about 100 reactions, it was found that only 30-40 of the reactions are required to determine [H2O2] to one part in 10^-5 and to preserve most of the predictions for major species. This allows a systematic approach for model simplification and offers guidance in designing experiments for validation.
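
    How such a reduced network is integrated can be illustrated with a deliberately tiny stand-in: a constant radiolytic source of H2O2 balanced against first- and second-order sinks. The two-reaction network and rate constants are invented for this sketch, not taken from the report.

        # Toy stand-in for a reduced radiolysis network: constant H2O2
        # production vs first- and second-order consumption. Illustrative only.
        from scipy.integrate import solve_ivp

        g_prod = 1.0e-9   # radiolytic H2O2 production [mol/L/s]
        k1 = 1.0e-5       # first-order consumption, e.g., at surfaces [1/s]
        k2 = 1.0e2        # second-order recombination [L/mol/s]

        def rhs(t, y):
            h2o2 = y[0]
            return [g_prod - k1 * h2o2 - k2 * h2o2**2]

        sol = solve_ivp(rhs, (0.0, 1.0e7), [0.0], rtol=1e-8, atol=1e-15)
        print(sol.y[0, -1])   # [H2O2] approaches a steady state at long times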

  18. XML-based product information processing method for product design

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen Yu

    2012-01-01

    Design knowledge of modern mechatronics products centers on information processing within knowledge-intensive engineering; product design innovation is thus essentially an innovation in knowledge and information processing. Based on an analysis of the role of mechatronics product design knowledge and of information management features, a unified XML-based product information processing model is proposed. The information processing model of product design includes functional knowledge, structural knowledge, and their relationships. XML-based representations are proposed for product function elements, product structure elements, and the mapping between function and structure. The information processing of a parallel friction roller is given as an example, which demonstrates that this method is helpful for knowledge-based design systems and product innovation.
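
    In the spirit of the proposed representation (the element and attribute names here are invented for illustration, not the paper's schema), a function-structure mapping might be expressed in XML and parsed as follows:

        # Illustrative function/structure mapping parsed with the standard
        # library; the XML vocabulary is hypothetical.
        import xml.etree.ElementTree as ET

        doc = """
        <product name="parallel-friction-roller">
          <functions>
            <function id="F1">transmit torque</function>
          </functions>
          <structures>
            <structure id="S1">roller shaft</structure>
            <structure id="S2">friction ring</structure>
          </structures>
          <mappings>
            <map function="F1" structure="S1"/>
            <map function="F1" structure="S2"/>
          </mappings>
        </product>
        """

        root = ET.fromstring(doc)
        names = {s.get("id"): s.text for s in root.iter("structure")}
        for m in root.iter("map"):
            print(m.get("function"), "->", names[m.get("structure")])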

  20. An Analysis of Algorithmic Processes and Instructional Design.

    ERIC Educational Resources Information Center

    Schmid, Richard F.; Gerlach, Vernon S.

    1986-01-01

    Describes algorithms and shows how they can be applied to the design of instructional systems by relating them to a standard information processing model. Two studies are briefly described which tested serial and parallel processing in learning and offered guidelines for designers. Future research needs are also discussed. (LRW)

  1. A design optimization process for Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Chamberlain, Robert G.; Fox, George; Duquette, William H.

    1990-01-01

    The Space Station Freedom Program is used to develop and implement a process for design optimization. Because the relative worth of arbitrary design concepts cannot be assessed directly, comparisons must be based on designs that provide the same performance from the point of view of station users; such designs can be compared in terms of life cycle cost. Since the technology required to produce a space station is widely dispersed, a decentralized optimization process is essential. A formulation of the optimization process is provided and the mathematical models designed to facilitate its implementation are described.

  2. Practicing universal design to actual hand tool design process.

    PubMed

    Lin, Kai-Chieh; Wu, Chih-Fu

    2015-09-01

    Universal design (UD) evaluation principles are difficult to implement in product design. This study proposes a methodology for implementing UD in the design process through user participation. The original UD principles and user experience are used to develop the evaluation items, and differences between product types are taken into account. Factor analysis and Quantification Theory Type I were used to eliminate evaluation items considered inappropriate and to examine the relationship between evaluation items and product design factors. Product design specifications were established for verification. The results showed that converting user evaluations into crucial design verification factors through a generalized evaluation scale based on product attributes, and applying the resulting design factors in product design, can improve users' UD evaluations. The design process of this study is expected to contribute to user-centered UD application.

  3. Model for vaccine design by prediction of B-epitopes of IEDB given perturbations in peptide sequence, in vivo process, experimental techniques, and source or host organisms.

    PubMed

    González-Díaz, Humberto; Pérez-Montoto, Lázaro G; Ubeira, Florencio M

    2014-01-01

    Perturbation methods add variation terms to a known experimental solution of one problem to approach the solution of a related problem without a known exact solution. One problem of this type in immunology is the prediction of the possible action of an epitope of one peptide after a perturbation or variation in the structure of a known peptide and/or other boundary conditions (host organism, biological process, and experimental assay). However, to the best of our knowledge, there are no reports of general-purpose perturbation models to solve this problem. In a recent work, we introduced a new quantitative structure-property relationship theory for the study of perturbations in complex biomolecular systems. In this work, we developed the first model able to classify more than 200,000 cases of perturbations with accuracy, sensitivity, and specificity >90% in both training and validation series. The perturbations include structural changes in >50,000 peptides determined in experimental assays with boundary conditions involving >500 source organisms, >50 host organisms, >10 biological processes, and >30 experimental techniques. The model may be useful for the prediction of new epitopes or the optimization of known peptides towards computational vaccine design.

  4. Process Design Manual for Nitrogen Control.

    ERIC Educational Resources Information Center

    Parker, Denny S.; And Others

    This manual presents theoretical and process design criteria for the implementation of nitrogen control technology in municipal wastewater treatment facilities. Design concepts are emphasized through examination of data from full-scale and pilot installations. Design data are included on biological nitrification and denitrification, breakpoint…

  5. New Vistas in Chemical Product and Process Design.

    PubMed

    Zhang, Lei; Babi, Deenesh K; Gani, Rafiqul

    2016-06-07

    Design of chemicals-based products is broadly classified into those that are process centered and those that are product centered. In this article, the designs of both classes of products are reviewed from a process systems point of view; developments related to the design of the chemical product, its corresponding process, and its integration are highlighted. Although significant advances have been made in the development of systematic model-based techniques for process design (also for optimization, operation, and control), much work is needed to reach the same level for product design. Timeline diagrams illustrating key contributions in product design, process design, and integrated product-process design are presented. The search for novel, innovative, and sustainable solutions must be matched by consideration of issues related to the multidisciplinary nature of problems, the lack of data needed for model development, solution strategies that incorporate multiscale options, and reliability versus predictive power. The need for an integrated model-experiment-based design approach is discussed together with benefits of employing a systematic computer-aided framework with built-in design templates.

  6. Modeling pellet impact drilling process

    NASA Astrophysics Data System (ADS)

    Kovalyov, A. V.; Ryabchikov, S. Ya; Isaev, Ye D.; Ulyanova, O. S.

    2016-03-01

    The paper describes pellet impact drilling, which could be used to increase the drilling speed and the rate of penetration when drilling hard rocks. Pellet impact drilling implies rock destruction by metal pellets with high kinetic energy in the immediate vicinity of the earth formation encountered. The pellets are circulated in the bottom hole by a high-velocity fluid jet, which is the principal component of the ejector pellet impact drill bit. The experiments conducted have allowed modeling the process of pellet impact drilling, which creates the scientific and methodological basis for engineering design of drilling operations under different geo-technical conditions.

  7. The Analytic Process Model for System Design and Measurement: A computer-Aided Tool for Analyzing Training Systems and other Human-Machine Systems

    DTIC Science & Technology

    1985-02-01

    Keywords: performance measurement; effectiveness measurement; system populations; Bradley Infantry Fighting Vehicle; BIFV; Analytic Process Model; APM. The analytic process model (APM) was developed from earlier models and applied in sample fashion to an existing system, the Bradley Infantry Fighting Vehicle (Carrier Team Subsystem).

  8. IMPLEMENTING THE SAFEGUARDS-BY-DESIGN PROCESS

    SciTech Connect

    Whitaker, J Michael; McGinnis, Brent; Laughter, Mark D; Morgan, Jim; Bjornard, Trond; Bean, Robert; Durst, Phillip; Hockert, John; DeMuth, Scott; Lockwood, Dunbar

    2010-01-01

    The Safeguards-by-Design (SBD) approach incorporates safeguards into the design and construction of nuclear facilities at the very beginning of the design process. It is a systematic and structured approach for fully integrating international and national safeguards for material control and accountability (MC&A), physical protection, and other proliferation barriers into the design and construction process for nuclear facilities. Implementing SBD is primarily a project management or project coordination challenge. This paper focuses specifically on the design process: the planning, definition, organization, coordination, scheduling, and interaction of the safeguards experts and stakeholders as they participate in the design and construction of a nuclear facility. It delineates the steps in a nuclear facility design and construction project in order to provide the project context within which the safeguards design activities take place, describes the involvement of the safeguards experts in the design process and the nature of their analyses, interactions, and decisions, and describes the documents created and how they are used. This report highlights the project context of safeguards activities and identifies what the safeguards community (nuclear facility operator, designer/builder, state regulator, SSAC, and IAEA) must accomplish in order to implement SBD within the project.

  9. Design, processing, and testing of LSI arrays for space station

    NASA Technical Reports Server (NTRS)

    Ipri, A. C.

    1976-01-01

    The applicability of a particular process for the fabrication of large scale integrated circuits is described. Test arrays were designed, built, and tested, and then utilized. A set of optimum dimensions for LSI arrays was generated. The arrays were applied to yield improvement through process innovation, and additional applications were suggested in the areas of yield prediction, yield modeling, and process reliability.

  10. Ligand modeling and design

    SciTech Connect

    Hay, B.

    1996-10-01

    The purpose of this work is to develop and implement a molecular design basis for selecting organic ligands that would be used in applications for the cost-effective removal of specific radionuclides from nuclear waste streams.

  11. Launch Vehicle Design Process Description and Training Formulation

    NASA Technical Reports Server (NTRS)

    Atherton, James; Morris, Charles; Settle, Gray; Teal, Marion; Schuerer, Paul; Blair, James; Ryan, Robert; Schutzenhofer, Luke

    1999-01-01

    A primary NASA priority is to reduce the cost and improve the effectiveness of launching payloads into space. As a consequence, significant improvements are being sought in the effectiveness, cost, and schedule of the launch vehicle design process. In order to provide a basis for understanding and improving the current design process, a model has been developed for this complex, interactive process, as reported in the references. This model requires further expansion in some specific design functions. Also, a training course for less-experienced engineers is needed to provide understanding of the process, to provide guidance for its effective implementation, and to provide a basis for major improvements in launch vehicle design process technology. The objective of this activity is to expand the description of the design process to include all pertinent design functions, and to develop a detailed outline of a training course on the design process for launch vehicles for use in educating engineers whose experience with the process has been minimal. Building on a previously-developed partial design process description, parallel sections have been written for the Avionics Design Function, the Materials Design Function, and the Manufacturing Design Function. Upon inclusion of these results, the total process description will be released as a NASA TP. The design function sections herein include descriptions of the design function responsibilities, interfaces, interactive processes, decisions (gates), and tasks. Associated figures include design function planes, gates, and tasks, along with other pertinent graphics. Also included is an expanded discussion of how the design process is divided, or compartmentalized, into manageable parts to achieve efficient and effective design. A detailed outline for an intensive two-day course on the launch vehicle design process has been developed herein, and is available for further expansion. The course is in an interactive lecture

  12. Diagnostic study, design and implementation of an integrated model of care in France: a bottom-up process with continuous leadership

    PubMed Central

    de Stampa, Matthieu; Vedel, Isabelle; Mauriat, Claire; Bagaragaza, Emmanuel; Routelous, Christelle; Bergman, Howard; Lapointe, Liette; Cassou, Bernard; Ankri, Joel; Henrard, Jean-Claude

    2010-01-01

    Background Sustaining integrated care is difficult, in large part because of problems encountered securing the participation of health care and social service professionals and, in particular, general practitioners (GPs). Purpose To present an innovative bottom-up and pragmatic strategy used to implement a new integrated care model in France for community-dwelling elderly people with complex needs. Results In the first step, a diagnostic study was conducted with face-to-face interviews to gather data on current practices from a sample of health and social stakeholders working with elderly people. In the second step, an integrated care model called Coordination Personnes Agées (COPA) was designed by the same major stakeholders in order to define its detailed characteristics based on the local context. In the third step, the model was implemented in two phases: adoption and maintenance. This strategy was carried out by a continuous and flexible leadership throughout the process, initially with a mixed leadership (clinician and researcher) followed by a double one (clinician and managers of services) in the implementation phase. Conclusion The implementation of this bottom-up and pragmatic strategy relied on establishing a collaborative dynamic among health and social stakeholders. This enhanced their involvement throughout the implementation phase, particularly among the GPs, and allowed them to support the change practices and services arrangements. PMID:20216954

  13. An Integrated Course and Design Project in Chemical Process Design.

    ERIC Educational Resources Information Center

    Rockstraw, David A.; And Others

    1997-01-01

    Describes a chemical engineering course curriculum on process design, analysis, and simulation. Includes information regarding the sequencing of engineering design classes and the location of the classes within the degree program at New Mexico State University. Details of course content are provided. (DDR)

  14. Semantic Service Design for Collaborative Business Processes in Internetworked Enterprises

    NASA Astrophysics Data System (ADS)

    Bianchini, Devis; Cappiello, Cinzia; de Antonellis, Valeria; Pernici, Barbara

    Modern collaborating enterprises can be seen as borderless organizations whose processes are dynamically transformed and integrated with the ones of their partners (Internetworked Enterprises, IE), thus enabling the design of collaborative business processes. The adoption of Semantic Web and service-oriented technologies for implementing collaboration in such distributed and heterogeneous environments promises significant benefits. IE can model their own processes independently by using the Software as a Service paradigm (SaaS). Each enterprise maintains a catalog of available services, and these can be shared across IE and reused to build up complex collaborative processes. Moreover, each enterprise can adopt its own terminology and concepts to describe business processes and component services. This brings requirements to manage semantic heterogeneity in process descriptions which are distributed across different enterprise systems. To enable effective service-based collaboration, IEs have to standardize their process descriptions and model them through component services using the same approach and principles. For enabling collaborative business processes across IE, services should be designed following a homogeneous approach, possibly maintaining a uniform level of granularity. In the paper we propose an ontology-based semantic modeling approach suited to enriching and reconciling the semantics of process descriptions to facilitate process knowledge management and to enable semantic service design (by discovery, reuse and integration of process elements/constructs). The approach brings together Semantic Web technologies, techniques in process modeling, ontology building and semantic matching in order to provide a comprehensive semantic modeling framework.

  15. Molecular Thermodynamics for Chemical Process Design

    ERIC Educational Resources Information Center

    Prausnitz, J. M.

    1976-01-01

    Discusses that aspect of thermodynamics which is particularly important in chemical process design: the calculation of the equilibrium properties of fluid mixtures, especially as required in phase-separation operations. (MLH)

  16. Chemical Process Design: An Integrated Teaching Approach.

    ERIC Educational Resources Information Center

    Debelak, Kenneth A.; Roth, John A.

    1982-01-01

    Reviews a one-semester senior plant design/laboratory course, focusing on course structure, student projects, laboratory assignments, and course evaluation. Includes discussion of laboratory exercises related to process waste water and sludge. (SK)

  17. The Engineering Process in Construction & Design

    ERIC Educational Resources Information Center

    Stoner, Melissa A.; Stuby, Kristin T.; Szczepanski, Susan

    2013-01-01

    Recent research suggests that high-impact activities in science and math classes promote positive attitudinal shifts in students. By implementing high-impact activities, such as designing a school and a skate park, mathematical thinking can be linked to the engineering design process. This hands-on approach, when possible, to demonstrate or…

  18. Design of Nanomaterial Synthesis by Aerosol Processes

    PubMed Central

    Buesser, Beat; Pratsinis, Sotiris E.

    2013-01-01

    Aerosol synthesis of materials is a vibrant field of particle technology and chemical reaction engineering. Examples include the manufacture of carbon blacks, fumed SiO2, pigmentary TiO2, ZnO vulcanizing catalysts, filamentary Ni, and optical fibers, materials that impact transportation, construction, pharmaceuticals, energy, and communications. Parallel to this, development of novel, scalable aerosol processes has enabled synthesis of new functional nanomaterials (e.g., catalysts, biomaterials, electroceramics) and devices (e.g., gas sensors). This review provides an access point for engineers to the multiscale design of aerosol reactors for the synthesis of nanomaterials using continuum, mesoscale, molecular dynamics, and quantum mechanics models spanning 10 and 15 orders of magnitude in length and time, respectively. Key design features are the rapid chemistry; the high particle concentrations but low volume fractions; the attainment of a self-preserving particle size distribution by coagulation; the ratio of the characteristic times of coagulation and sintering, which controls the extent of particle aggregation; and the narrowing of the aggregate primary particle size distribution by sintering. PMID:22468598
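
    The stated design rule about aggregation can be made concrete with a back-of-envelope comparison of characteristic times; the property values and the Arrhenius-type sintering scaling below are illustrative assumptions, not data from the review.

        # Compare characteristic collision (coagulation) and sintering times:
        # when particles collide faster than they coalesce, aggregates form.
        # Constants below are illustrative assumptions.
        import math

        def collision_time(number_conc, beta=1e-15):
            """Monodisperse Smoluchowski estimate: tau_c ~ 2/(beta*N) [s]."""
            return 2.0 / (beta * number_conc)

        def sintering_time(d_p, A=8.3e24, T=1800.0, Ea=3.1e5):
            """Illustrative d_p^4 Arrhenius-type scaling for sintering [s]."""
            return A * d_p**4 * math.exp(Ea / (8.314 * T))

        N = 1e18       # particle number concentration [1/m^3]
        d_p = 20e-9    # primary particle diameter [m]
        tau_c, tau_s = collision_time(N), sintering_time(d_p)
        print("aggregates expected" if tau_s > tau_c else "compact spheres expected")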

  19. Model-based software design

    NASA Technical Reports Server (NTRS)

    Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui; Yenne, Britt; Vansickle, Larry; Ballantyne, Michael

    1992-01-01

    Domain-specific knowledge is required to create specifications, generate code, and understand existing systems. Our approach to automating software design is based on instantiating an application domain model with industry-specific knowledge and then using that model to achieve the operational goals of specification elicitation and verification, reverse engineering, and code generation. Although many different specification models can be created from any particular domain model, each specification model is consistent and correct with respect to the domain model.

  20. HYNOL PROCESS ENGINEERING: PROCESS CONFIGURATION, SITE PLAN, AND EQUIPMENT DESIGN

    EPA Science Inventory

    The report describes the design of the hydropyrolysis reactor system of the Hynol process. (NOTE: A bench scale methanol production facility is being constructed to demonstrate the technical feasibility of producing methanol from biomass using the Hynol process. The plant is bein...

  1. Process characterization and Design Space definition.

    PubMed

    Hakemeyer, Christian; McKnight, Nathan; St John, Rick; Meier, Steven; Trexler-Schmidt, Melody; Kelley, Brian; Zettl, Frank; Puskeiler, Robert; Kleinjans, Annika; Lim, Fred; Wurth, Christine

    2016-09-01

    Quality by design (QbD) is a global regulatory initiative with the goal of enhancing pharmaceutical development through the proactive design of pharmaceutical manufacturing processes and controls to consistently deliver the intended performance of the product. The principles of pharmaceutical development relevant to QbD are described in the ICH guidance documents (ICH Q8-11). An integrated set of risk assessments and their related elements developed at Roche/Genentech were designed to provide an overview of product and process knowledge for the production of a recombinant monoclonal antibody (MAb). This chapter describes the tools used for the characterization and validation of MAb manufacturing processes under the QbD paradigm. This comprises risk assessments for the identification of potential Critical Process Parameters (pCPPs), statistically designed experimental studies, as well as studies assessing the linkage of the unit operations. The outcome of the studies is the classification of process parameters according to their criticality and the definition of appropriate acceptable ranges of operation. The process and product knowledge gained in these studies can lead to the approval of a Design Space. Additionally, the information gained in these studies is used to define the 'impact' which the manufacturing process can have on the variability of the CQAs, which in turn is used to define the testing and monitoring strategy.
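
    The statistically designed studies mentioned here can be sketched as a two-level full factorial with main-effect estimation; the parameter names and the simulated response below are assumptions for illustration only.

        # Two-level full factorial over three process parameters and
        # main-effect estimates on a response. Synthetic data only.
        import itertools
        import numpy as np

        factors = ["pH", "temperature", "feed_rate"]
        design = np.array(list(itertools.product([-1, 1], repeat=len(factors))))

        # Stand-in response (e.g., titer): pH strong, feed rate mild.
        rng = np.random.default_rng(7)
        response = (10 + 2.0 * design[:, 0] + 0.2 * design[:, 1]
                    + 0.8 * design[:, 2] + rng.normal(0, 0.1, len(design)))

        for j, name in enumerate(factors):
            effect = (response[design[:, j] == 1].mean()
                      - response[design[:, j] == -1].mean())
            print(f"{name}: main effect = {effect:.2f}")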

  2. Visual Modelling of Learning Processes

    ERIC Educational Resources Information Center

    Copperman, Elana; Beeri, Catriel; Ben-Zvi, Nava

    2007-01-01

    This paper introduces various visual models for the analysis and description of learning processes. The models analyse learning on two levels: the dynamic level (as a process over time) and the functional level. Two types of model for dynamic modelling are proposed: the session trace, which documents a specific learner in a particular learning…

  3. Computer Applications in the Design Process.

    ERIC Educational Resources Information Center

    Winchip, Susan

    Computer Assisted Design (CAD) and Computer Assisted Manufacturing (CAM) are emerging technologies now being used in home economics and interior design applications. A microcomputer in a computer network system is capable of executing computer graphic functions such as three-dimensional modeling, as well as utilizing office automation packages to…

  4. Chemical kinetics and oil shale process design

    SciTech Connect

    Burnham, A.K.

    1993-07-01

    Oil shale processes are reviewed with the goal of showing how chemical kinetics influences the design and operation of different processes for different types of oil shale. Reaction kinetics are presented for organic pyrolysis, carbon combustion, carbonate decomposition, and sulfur and nitrogen reactions.

  5. Process-based design of dynamical biological systems

    PubMed Central

    Tanevski, Jovan; Todorovski, Ljupčo; Džeroski, Sašo

    2016-01-01

    The computational design of dynamical systems is an important emerging task in synthetic biology. Given desired properties of the behaviour of a dynamical system, the task of design is to build an in-silico model of a system whose simulated behaviour meets these properties. We introduce a new, process-based, design methodology for addressing this task. The new methodology combines a flexible process-based formalism for specifying the space of candidate designs with multi-objective optimization approaches for selecting the most appropriate among these candidates. We demonstrate that the methodology is general enough to both formulate and solve tasks of designing deterministic and stochastic systems, successfully reproducing plausible designs reported in previous studies and proposing new designs that meet the design criteria, but have not been previously considered. PMID:27686219
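
    The multi-objective selection step can be illustrated with a simple Pareto filter over candidate design scores; the scores below are made up for this sketch.

        # Keep Pareto-efficient candidates when minimizing two objectives,
        # e.g., deviation from desired behaviour and model complexity.
        import numpy as np

        scores = np.array([   # columns: behaviour error, complexity
            [0.10, 12], [0.08, 20], [0.30, 5], [0.12, 14], [0.25, 6], [0.40, 4],
        ])

        def pareto_front(points):
            """Indices of points not dominated in all objectives."""
            keep = []
            for i, p in enumerate(points):
                dominated = any(np.all(q <= p) and np.any(q < p) for q in points)
                if not dominated:
                    keep.append(i)
            return keep

        print(pareto_front(scores))   # candidate [0.12, 14] is dominated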

  7. Optimal designs for copula models

    PubMed Central

    Perrone, E.; Müller, W.G.

    2016-01-01

    Copula modelling has in the past decade become a standard tool in many areas of applied statistics. However, a largely neglected aspect concerns the design of related experiments. Particular issues are whether the estimation of copula parameters can be enhanced by optimizing experimental conditions and how robust the parameter estimates for the model are with respect to the type of copula employed. In this paper an equivalence theorem for (bivariate) copula models is provided that allows formulation of efficient design algorithms and quick checks of whether designs are optimal or at least efficient. Some examples illustrate that in practical situations considerable gains in design efficiency can be achieved. A natural comparison between different copula models with respect to design efficiency is provided as well. PMID:27453616

  8. Instructional Design and Directed Cognitive Processing.

    ERIC Educational Resources Information Center

    Bovy, Ruth Colvin

    This paper argues that the information processing model provides a promising basis on which to build a comprehensive theory of instruction. Characteristics of the major information processing constructs are outlined including attention, encoding and rehearsal, working memory, long term memory, retrieval, and metacognitive processes, and a unifying…

  9. Rates of reaction and process design data for the Hydrocarb Process

    SciTech Connect

    Steinberg, M.; Kobayashi, Atsushi; Tung, Yuanki

    1992-08-01

    In support of studies for developing the coprocessing of fossil fuels with biomass by the Hydrocarb Process, experimental and process design data are reported. The experimental work includes the hydropyrolysis of biomass and the thermal decomposition of methane in a tubular reactor. The rates of reaction and conversion were obtained at temperature and pressure conditions pertaining to a Hydrocarb Process design. A Process Simulation Computer Model was used to design the process and obtain complete energy and mass balances. Multiple feedstocks including biomass with natural gas and biomass with coal were evaluated. Additional feedstocks including green waste, sewage sludge and digester gas were also evaluated for a pilot plant unit.

  10. The concepts of energy, environment, and cost for process design

    SciTech Connect

    Abu-Khader, M.M.; Speight, J.G.

    2004-05-01

    The process industries (specifically, energy and chemicals) are characterized by a variety of reactors and reactions to bring about successful process operations. The design of energy-related and chemical processes and their evolution is a complex activity that determines the competitiveness of these industries, as well as their environmental impact. Thus, we have developed an Enviro-Energy Concept designed to facilitate sustainable industrial development. The Complete Onion Model represents a complete methodology for chemical process design and illustrates all of the requirements to achieve the best possible design within the accepted environmental standards. Currently, NOx emissions from industrial processes continue to receive maximum attention; therefore, the problem of NOx emissions from industrial sources such as power stations and nitric acid plants is considered. Selective Catalytic Reduction (SCR) is one of the most promising and effective commercial technologies and is considered the Best Available Control Technology (BACT) for NOx reduction. The solution to the NOx emissions problem lies in modifying the chemical process design and/or installing an end-of-pipe technology. The degree of integration between the process design and the installed technology plays a critical role in the capital cost evaluation. Therefore, integrating process units and then optimizing the design has a vital effect on the total cost. Both the environmental regulations and the cost evaluation are the boundary constraints of the optimum solution.

  11. Knowledge modeling for software design

    NASA Technical Reports Server (NTRS)

    Shaw, Mildred L. G.; Gaines, Brian R.

    1992-01-01

    This paper develops a modeling framework for systems engineering that encompasses systems modeling, task modeling, and knowledge modeling, and allows knowledge engineering and software engineering to be seen as part of a unified developmental process. This framework is used to evaluate what novel contributions the 'knowledge engineering' paradigm has made and how these impact software engineering.

  12. An investigation of radiometer design using digital processing techniques

    NASA Technical Reports Server (NTRS)

    Lawrence, R. W.

    1981-01-01

    The use of digital signal processing techniques in Dicke switching radiometer design was investigated. The general approach was to develop an analytical model of the existing analog radiometer and identify factors which adversely affect its performance. A digital processor was then proposed to verify the feasibility of using digital techniques to minimize these adverse effects and improve the radiometer performance. Analysis and preliminary test results comparing the digital and analog processing approaches in radiometer design are presented.

  13. Defining process design space for monoclonal antibody cell culture.

    PubMed

    Abu-Absi, Susan Fugett; Yang, LiYing; Thompson, Patrick; Jiang, Canping; Kandula, Sunitha; Schilling, Bernhard; Shukla, Abhinav A

    2010-08-15

    The concept of design space has been taking root as a foundation of in-process control strategies for biopharmaceutical manufacturing processes. During mapping of the process design space, the multidimensional combination of operational variables is studied to quantify the impact on process performance in terms of productivity and product quality. An efficient methodology to map the design space for a monoclonal antibody cell culture process is described. A failure modes and effects analysis (FMEA) was used as the basis for the process characterization exercise. This was followed by an integrated study of the inoculum stage of the process which includes progressive shake flask and seed bioreactor steps. The operating conditions for the seed bioreactor were studied in an integrated fashion with the production bioreactor using a two stage design of experiments (DOE) methodology to enable optimization of operating conditions. A two level Resolution IV design was followed by a central composite design (CCD). These experiments enabled identification of the edge of failure and classification of the operational parameters as non-key, key or critical. In addition, the models generated from the data provide further insight into balancing productivity of the cell culture process with product quality considerations. Finally, process and product-related impurity clearance was evaluated by studies linking the upstream process with downstream purification. Production bioreactor parameters that directly influence antibody charge variants and glycosylation in CHO systems were identified.

  14. Ensuring competitive advantage with semantic design process management

    SciTech Connect

    Quazzani, A.; Bernard, A.; Bocquet, J.C.

    1996-12-31

    In the field of design assistance, it is important to improve the recording of design history and the management of the design process. We therefore propose a modelling approach for the design process that focuses on the representation of semantic actions. We have identified two types of actions: physical design actions focusing on the product (e.g., parameter creation, shaft dimensioning) and management actions that allow management of the process from the planning and control viewpoint (e.g., synchronization actions, resource allocation for a task). A taxonomy of these actions has been established according to several criteria (granularity, fields of action ... ) selected in consideration of our process management interests. Linkage with design objectives and rationale is also discussed.

  15. Designing Competitive Service Models

    NASA Astrophysics Data System (ADS)

    Martinez, Veronica; Turner, Trevor

    The explosives developed in Europe in the late nineteenth and early twentieth century by the famous Swede and patron of the world peace prize, Alfred Nobel, were extremely durable and, apart from the introduction of the electric detonator, have remained in use with minor modifications for almost a century (Fig. 5.1a). In the 1970s a new invention started a process of change that has transformed the explosives business from being a supplier of products to a provider of a service. Survival very much depended on the agility of ICI Explosives UK, hereinafter referred to as "ICI Explosives," in adapting to the new competitive environment. Manufacturing excellence was not a solution. Innovative thinking was required to sustain the business as changes in technology reduced the complexity that had protected the business from serious competition for over a century.

  16. Design, Control and in Situ Visualization of Gas Nitriding Processes

    PubMed Central

    Ratajski, Jerzy; Olik, Roman; Suszko, Tomasz; Dobrodziej, Jerzy; Michalski, Jerzy

    2010-01-01

    The article presents a complex system for the design, in situ visualization and control of a commonly used surface treatment: the gas nitriding process. The computer-aided design concept employs analytical mathematical models and artificial intelligence methods. As a result, it became possible to perform poly-optimization and poly-parametric simulations of the course of the process, combined with visualization of how the process parameter values change over time, and to predict the properties of nitrided layers. For in situ visualization of the growth of the nitrided layer, computer procedures were developed which correlate the direct and differential voltage-time traces of the process result sensor (a magnetic sensor) with the corresponding layer growth stage. These procedures make it possible to combine, during the process, the registered voltage-time traces with the models of the process. PMID:22315536
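
    Diffusion-controlled layer growth of this kind is often first approximated by a parabolic law, d(t) = k·sqrt(t), with a temperature-dependent rate constant. The snippet below is a generic illustration of that textbook relation, not the authors' models; the Arrhenius parameters are assumed values.

```python
import numpy as np

# Generic parabolic (diffusion-controlled) growth law for a nitrided layer:
# d(t) = k * sqrt(t), with an assumed Arrhenius rate constant k(T).
def layer_thickness_um(t_hours, T_kelvin, k0=5.0e7, Q=1.1e5, R=8.314):
    k = k0 * np.exp(-Q / (R * T_kelvin))     # um / sqrt(h), illustrative
    return k * np.sqrt(t_hours)

# Layer growth at an assumed 520 C (793 K) nitriding temperature.
for t in (2, 4, 8, 16):
    print(f"t = {t:2d} h -> layer thickness ~ {layer_thickness_um(t, 793.0):.1f} um")
```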

  17. Design of penicillin fermentation process simulation system

    NASA Astrophysics Data System (ADS)

    Qi, Xiaoyu; Yuan, Zhonghu; Qi, Xiaoxuan; Zhang, Wenqi

    2011-10-01

    Real-time monitoring of batch processes attracts increasing attention because it ensures safety and provides products of consistent quality, so the design of a simulation system for batch-process fault diagnosis is of great significance. In this paper, penicillin fermentation, a typical non-linear, dynamic, multi-stage batch production process, is taken as the research object. A visual human-machine interactive simulation software system based on the Windows operating system is developed. The simulation system can provide an effective platform for research on batch-process fault diagnosis.
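
    A common starting point for simulators of this kind is an unstructured kinetic model, for example logistic biomass growth with Luedeking-Piret product formation. The sketch below integrates such a model; it is a generic textbook form with invented parameter values, not the authors' software.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Unstructured batch fermentation model: logistic biomass growth plus
# Luedeking-Piret product formation. Parameter values are illustrative.
mu_max, X_max = 0.11, 12.0      # 1/h, g/L
alpha, beta = 0.08, 0.003       # growth- and non-growth-associated yields

def rhs(t, y):
    X, P = y
    dX = mu_max * X * (1 - X / X_max)      # logistic growth
    dP = alpha * dX + beta * X             # Luedeking-Piret kinetics
    return [dX, dP]

sol = solve_ivp(rhs, (0, 120), [0.1, 0.0])
X_end, P_end = sol.y[:, -1]
print(f"after 120 h: biomass ~ {X_end:.1f} g/L, product ~ {P_end:.2f} g/L")
```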

  18. Modeling of fluidized bed silicon deposition process

    NASA Technical Reports Server (NTRS)

    Kim, K.; Hsu, G.; Lutwack, R.; Praturi, A. K.

    1977-01-01

    The model is intended for use as a means of improving fluidized bed reactor design and for formulating the research program in support of the Silicon Material Task contracts for the development of the fluidized bed silicon deposition process. A computer program derived from the simple model is also described. Results of some sample calculations using the computer program are shown.

  19. Conceptual design of industrial process displays.

    PubMed

    Pedersen, C R; Lind, M

    1999-11-01

    Today, process displays used in industry are often designed on the basis of piping and instrumentation diagrams without any method of ensuring that the needs of the operators are fulfilled. Therefore, a method for a systematic approach to the design of process displays is needed. This paper discusses aspects of process display design taking into account both the designer's and the operator's points of view. Three aspects are emphasized: the operator tasks, the display content and the display form. The distinction between these three aspects is the basis for proposing an outline for a display design method that matches the industrial practice of modular plant design and satisfies the needs of reusability of display design solutions. The main considerations in display design in the industry are to specify the operator's activities in detail, to extract the information the operators need from the plant design specification and documentation, and finally to present this information. The form of the display is selected from existing standardized display elements such as trend curves, mimic diagrams, ecological interfaces, etc. Further knowledge is required to invent new display elements: knowledge about basic visual means of presenting information and about how humans perceive and interpret these means and their combinations. This knowledge is required in the systematic selection of graphical items for a given display content. The industrial part of the method is first illustrated in the paper by a simple example from a plant with batch processes. Later the method is applied to develop a supervisory display for a condenser system in a nuclear power plant. The differences between the continuous plant domain of power production and the batch processes from the example are analysed and broad categories of display types are proposed. The problems involved in specification and invention of a supervisory display are analysed, and conclusions are drawn from these problems.

  20. Modeling Primary Atomization Processes

    DTIC Science & Technology

    2007-11-02

    I., "Generation of Ripples by Wind Blowing Over a Viscous Fluid", The Scientific Papers of Sir Geoffrey Ingram Taylor, 1963. 2. A. A. Amsden, P. J...92, 1983. 28. Jin, Xiaoshi, "Boundary Element Study on Particle Orientation Caused by the Fountain Flow in Injection Molding ", Polymer Engineering...HTPB, PE is a thermoplastic which is commonly produced via extrusion from a die in a continuous process. Hence, PE grains could be produced using

  1. Process design for Al backside contacts

    SciTech Connect

    Chalfoun, L.L.; Kimerling, L.C.

    1995-08-01

    It is known that properly alloyed aluminum backside contacts can improve silicon solar cell efficiency. To use this knowledge to the fullest advantage, we have studied the gettering process that occurs during contact formation and the microstructure of the contact and backside junction region. With an understanding of the alloying step, optimized fabrication processes can be designed. To study gettering, single crystal silicon wafers were coated with aluminum on both sides and subjected to heat treatments. Results are described.

  2. Modeling Languages Refine Vehicle Design

    NASA Technical Reports Server (NTRS)

    2009-01-01

    Cincinnati, Ohio's TechnoSoft Inc. is a leading provider of object-oriented modeling and simulation technology used for commercial and defense applications. With funding from Small Business Innovation Research (SBIR) contracts issued by Langley Research Center, the company continued development on its adaptive modeling language, or AML, originally created for the U.S. Air Force. TechnoSoft then created what is now known as its Integrated Design and Engineering Analysis Environment, or IDEA, which can be used to design a variety of vehicles and machinery. IDEA's customers include clients in green industries, such as designers for power plant exhaust filtration systems and wind turbines.

  3. Aerospace Materials Process Modelling

    DTIC Science & Technology

    1988-08-01

    The scanned record text is garbled. The recoverable fragments, in French, concern physico-chemical phenomena occurring within the solidification interval, the thermal, mechanical, physico-chemical and metallurgical data required for the cast parts and their moulds, and a paper on a numerical model of directional solidification of cast turbine blades.

  4. Fuel ethanol production: process design trends and integration opportunities.

    PubMed

    Cardona, Carlos A; Sánchez, Oscar J

    2007-09-01

    Current fuel ethanol research and development deals with process engineering trends for improving biotechnological production of ethanol. In this work, the key role that process design plays during the development of cost-effective technologies is recognized through the analysis of major trends in process synthesis, modeling, simulation and optimization related to ethanol production. Main directions in techno-economical evaluation of fuel ethanol processes are described as well as some prospecting configurations. The most promising alternatives for compensating ethanol production costs by the generation of valuable co-products are analyzed. Opportunities for integration of fuel ethanol production processes and their implications are underlined. Main ways of process intensification through reaction-reaction, reaction-separation and separation-separation processes are analyzed in the case of bioethanol production. Some examples of energy integration during ethanol production are also highlighted. Finally, some concluding considerations on current and future research tendencies in fuel ethanol production regarding process design and integration are presented.

  5. Product and Process Improvement Using Mixture-Process Variable Designs and Robust Optimization Techniques

    SciTech Connect

    Sahni, Narinder S.; Piepel, Gregory F.; Naes, Tormod

    2009-04-01

    The quality of an industrial product depends on the raw material proportions and the process variable levels, both of which need to be taken into account in designing a product. This article presents a case study from the food industry in which both kinds of variables were studied by combining a constrained mixture experiment design and a central composite process variable design. Based on the natural structure of the situation, a split-plot experiment was designed and models involving the raw material proportions and process variable levels (separately and combined) were fitted. Combined models were used to study: (i) the robustness of the process to variations in raw material proportions, and (ii) the robustness of the raw material recipes with respect to fluctuations in the process variable levels. Further, the expected variability in the robust settings was studied using the bootstrap.
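
    As a schematic of how such combined models are fitted, the sketch below generates synthetic data for a three-component mixture crossed with one coded process variable and fits Scheffé blending terms plus mixture-by-process interaction terms by least squares. The component roles, levels and response are invented for illustration and do not correspond to the study's food-industry data.

```python
import numpy as np

# Synthetic data: three-component mixture (x1 + x2 + x3 = 1) crossed with
# one coded process variable z. All quantities are illustrative.
rng = np.random.default_rng(2)
x = rng.dirichlet([2, 2, 2], size=60)          # mixture proportions
z = rng.choice([-1.0, 0.0, 1.0], size=60)      # process variable level

# "True" Scheffe linear blending terms plus mixture-by-process interactions.
y = (3*x[:, 0] + 5*x[:, 1] + 4*x[:, 2]
     + 1.2*x[:, 0]*z - 0.8*x[:, 1]*z + rng.normal(0, 0.1, 60))

# Combined model: Scheffe terms (no intercept) and their products with z.
A = np.column_stack([x, x * z[:, None]])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print("blending terms:", np.round(coef[:3], 2))
print("mixture-by-process terms:", np.round(coef[3:], 2))
```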

  6. Flexible Processing and the Design of Grammar

    ERIC Educational Resources Information Center

    Sag, Ivan A.; Wasow, Thomas

    2015-01-01

    We explore the consequences of letting the incremental and integrative nature of language processing inform the design of competence grammar. What emerges is a view of grammar as a system of local monotonic constraints that provide a direct characterization of the signs (the form-meaning correspondences) of a given language. This…

  7. Molecular thermodynamics for chemical process design.

    PubMed

    Prausnitz, J M

    1979-08-24

    Chemical process design requires quantitative information on the equilibrium properties of a variety of fluid mixtures. Since the experimental effort needed to provide this information is often prohibitive in cost and time, chemical engineers must utilize rational estimation techniques based on limited experimental data. The basis for such techniques is molecular thermodynamics, a synthesis of classical and statistical thermodynamics, molecular physics, and physical chemistry.

  8. An Exploration of Design Students' Inspiration Process

    ERIC Educational Resources Information Center

    Dazkir, Sibel S.; Mower, Jennifer M.; Reddy-Best, Kelly L.; Pedersen, Elaine L.

    2013-01-01

    Our purpose was to explore how different sources of inspiration influenced two groups of students' inspiration process and their attitudes toward their design projects. Assigned sources of inspiration and instructor's assistance in the search for inspiration varied for two groups of students completing a small culture inspired product design…

  9. Software-Engineering Process Simulation (SEPS) model

    NASA Technical Reports Server (NTRS)

    Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.

    1992-01-01

    The Software Engineering Process Simulation (SEPS) model, developed at JPL, is described. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among various software life-cycle development activities and management decision-making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and to perform postmortem assessments.
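
    The feedback flavour of such a model can be conveyed in a few lines: undiscovered defects feed rework back into the remaining workload, stretching the schedule. The loop below is a toy illustration of that system-dynamics principle with assumed rates; it is not the SEPS model itself.

```python
# Minimal system-dynamics-flavoured project loop: latent errors re-enter the
# workload as rework when discovered. Stocks and rates are illustrative only.
tasks_remaining, errors_latent, done = 500.0, 0.0, 0.0
productivity, error_rate, detect_rate = 4.0, 0.15, 0.10   # per week

week = 0
while tasks_remaining > 1 and week < 400:
    completed = min(productivity, tasks_remaining)
    new_errors = error_rate * completed          # some completed work is flawed
    found = detect_rate * errors_latent          # rework discovered later
    tasks_remaining += found - completed         # rework re-enters the queue
    errors_latent += new_errors - found
    done += completed
    week += 1

print(f"finished in ~{week} weeks; total work performed: {done:.0f} tasks")
```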

  10. Object Oriented Modeling and Design

    NASA Technical Reports Server (NTRS)

    Shaykhian, Gholam Ali

    2007-01-01

    The Object Oriented Modeling and Design seminar is intended for software professionals and students; it covers the concepts and a language-independent graphical notation that can be used to analyze problem requirements and design a solution to the problem. The seminar discusses the three kinds of object-oriented models: class, state, and interaction. The class model represents the static structure of a system; the state model describes the aspects of a system that change over time, as well as control behavior; and the interaction model describes how objects collaborate to achieve overall results. Existing knowledge of object-oriented programming may benefit the learning of modeling and good design. Specific expectations are: create a class model; read, recognize, and describe a class model; describe association and link; show abstract classes used with multiple inheritance; explain metadata, reification and constraints; group classes into a package; read, recognize, and describe a state model; explain states and transitions; read, recognize, and describe an interaction model; explain use cases and use case relationships; show concurrency in an activity diagram; and show object interactions in a sequence diagram.

  11. Case study: Lockheed-Georgia Company integrated design process

    NASA Technical Reports Server (NTRS)

    Waldrop, C. T.

    1980-01-01

    A case study of the development of an Integrated Design Process is presented. The approach taken in preparing for the development of an integrated design process includes some of the IPAD approaches such as developing a Design Process Model, cataloging Technical Program Elements (TPE's), and examining data characteristics and interfaces between contiguous TPE's. The implementation plan is based on an incremental development of capabilities over a period of time with each step directed toward, and consistent with, the final architecture of a total integrated system. Because of time schedules and different computer hardware, this system will not be the same as the final IPAD release; however, many IPAD concepts will no doubt prove applicable as the best approach. Full advantage will be taken of the IPAD development experience. A scenario that could be typical for many companies, even outside the aerospace industry, in developing an integrated design process for an IPAD-type environment is represented.

  12. Protein designs in HP models

    NASA Astrophysics Data System (ADS)

    Gupta, Arvind; Khodabakhshi, Alireza Hadj; Maňuch, Ján; Rafiey, Arash; Stacho, Ladislav

    2009-07-01

    The inverse protein folding problem is that of designing an amino acid sequence which folds into a prescribed shape. This problem arises in drug design where a particular structure is necessary to ensure proper protein-protein interactions and could have applications in nanotechnology. A major challenge in designing proteins with native folds that attain a specific shape is to avoid proteins that have multiple native folds (unstable proteins). In this technical note we present our results on protein designs in the variant of the Hydrophobic-Polar (HP) model introduced by Dill [6] on the 2D square lattice. The HP model distinguishes only polar and hydrophobic monomers and only counts the number of hydrophobic contacts in the energy function. To achieve better stability of our designs we use the Hydrophobic-Polar-Cysteine (HPC) model, which distinguishes a third type of monomer called "cysteine" and also incorporates disulfide bridges (SS-bridges) into the energy function. We present stable designs in the 2D square lattice and 3D hexagonal prism lattice in the HPC model.
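
    For readers unfamiliar with the HP model, its energy function simply counts hydrophobic contacts between residues that are lattice neighbours but not chain neighbours. The sketch below evaluates that energy for a toy 2D square-lattice conformation; it is a generic illustration, not the authors' design or stability-checking algorithm.

```python
# Energy of a 2D square-lattice conformation in the HP model: count
# hydrophobic (H) pairs that are lattice neighbours but not chain neighbours.
def hp_energy(sequence, coords):
    occupied = {pos: i for i, pos in enumerate(coords)}
    contacts = 0
    for i, (x, y) in enumerate(coords):
        if sequence[i] != 'H':
            continue
        for nb in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            j = occupied.get(nb)
            # j > i + 1 skips chain neighbours and counts each pair once
            if j is not None and j > i + 1 and sequence[j] == 'H':
                contacts += 1
    return -contacts   # each H-H contact contributes -1

# A square fold of HPPH: the two H monomers touch across the turn.
print(hp_energy("HPPH", [(0, 0), (0, 1), (1, 1), (1, 0)]))   # -> -1
```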

  13. DESIGNING ENVIRONMENTAL, ECONOMIC AND ENERGY EFFICIENT CHEMICAL PROCESSES

    EPA Science Inventory

    The design and improvement of chemical processes can be very challenging. The earlier energy conservation, process economics and environmental aspects are incorporated into the process development, the easier and less expensive it is to alter the process design. Process emissio...

  14. Business process modeling in healthcare.

    PubMed

    Ruiz, Francisco; Garcia, Felix; Calahorra, Luis; Llorente, César; Gonçalves, Luis; Daniel, Christel; Blobel, Bernd

    2012-01-01

    The importance of the process point of view is not restricted to a specific enterprise sector. In the field of health, as a result of the nature of the service offered, health institutions' processes are also the basis for decision making which is focused on achieving their objective of providing quality medical assistance. In this chapter, the application of business process modelling using the Business Process Modelling Notation (BPMN) standard is described. The main challenges of business process modelling in healthcare are the definition of healthcare processes, the multi-disciplinary nature of healthcare, the flexibility and variability of the activities involved in health care processes, the need for interoperability between multiple information systems, and the continuous updating of scientific knowledge in healthcare.

  15. A Design Methodology for Medical Processes

    PubMed Central

    Bonacina, Stefano; Pozzi, Giuseppe; Pinciroli, Francesco; Marceglia, Sara

    2016-01-01

    Background Healthcare processes, especially those belonging to the clinical domain, are acknowledged as complex and characterized by the dynamic nature of the diagnosis, the variability of the decisions made by experts driven by their experiences, the local constraints, the patient’s needs, the uncertainty of the patient’s response, and the indeterminacy of the patient’s compliance with treatment. Also, the multiple actors involved in patient’s care need clear and transparent communication to ensure care coordination. Objectives In this paper, we propose a methodology to model healthcare processes in order to break down complexity and provide transparency. Methods The model is grounded on a set of requirements that make the healthcare domain unique with respect to other knowledge domains. The modeling methodology is based on three main phases: the study of the environmental context, the conceptual modeling, and the logical modeling. Results The proposed methodology was validated by applying it to the case study of the rehabilitation process of stroke patients in the specific setting of a specialized rehabilitation center. The resulting model was used to define the specifications of a software artifact for the digital administration and collection of assessment tests that was also implemented. Conclusions Despite being only an example, our case study showed the ability of process modeling to answer the actual needs in healthcare practices. Independently of the medical domain in which the modeling effort is done, the proposed methodology is useful to create high-quality models, and to detect and take into account relevant and tricky situations that can occur during process execution. PMID:27081415

  16. Non-Linear Instructional Design Model: Eternal, Synergistic Design and Development

    ERIC Educational Resources Information Center

    Crawford, Caroline

    2004-01-01

    Instructional design is at the heart of each educational endeavour. This process revolves around the steps through which the thoughtful productions of superior products are created. The ADDIE generic instructional design model emphasises five basic steps within the instructional design process: analyse, design, develop, implement and evaluate. The…

  17. The design of a nanolithographic process

    NASA Astrophysics Data System (ADS)

    Johannes, Matthew Steven

    This research delineates the design of a nanolithographic process for nanometer scale surface patterning. The process involves the combination of serial atomic force microscope (AFM) based nanolithography with the parallel patterning capabilities of soft lithography. The union of these two techniques provides for a unique approach to nanoscale patterning that establishes a research knowledge base and tools for future research and prototyping. To successfully design this process a number of separate research investigations were undertaken. A custom 3-axis AFM with feedback control on three positioning axes of nanometer precision was designed in order to execute nanolithographic research. This AFM system integrates a computer aided design/computer aided manufacturing (CAD/CAM) environment to allow for the direct synthesis of nanostructures and patterns using a virtual design interface. This AFM instrument was leveraged primarily to study anodization nanolithography (ANL), a nanoscale patterning technique used to generate local surface oxide layers on metals and semiconductors. Defining research focused on the automated generation of complex oxide nanoscale patterns as directed by CAD/CAM design as well as the implementation of tip-sample current feedback control during ANL to increase oxide uniformity. Concurrently, research was conducted concerning soft lithography, primarily in microcontact printing (muCP), and pertinent experimental and analytic techniques and procedures were investigated. Due to the masking abilities of the resulting oxide patterns from ANL, the results of AFM based patterning experiments are coupled with micromachining techniques to create higher aspect ratio structures at the nanoscale. These relief structures are used as master pattern molds for polymeric stamp formation to reproduce the original in a parallel fashion using muCP stamp formation and patterning. This new method of master fabrication provides for a useful alternative to

  18. Information Flow in the Launch Vehicle Design/Analysis Process

    NASA Technical Reports Server (NTRS)

    Humphries, W. R., Sr.; Holland, W.; Bishop, R.

    1999-01-01

    This paper describes the results of a team effort aimed at defining the information flow between disciplines at the Marshall Space Flight Center (MSFC) engaged in the design of space launch vehicles. The information flow is modeled at a first level and is described using three types of templates: an N x N diagram, discipline flow diagrams, and discipline task descriptions. It is intended to provide engineers with an understanding of the connections between what they do and where it fits in the overall design process of the project. It is also intended to provide design managers with a better understanding of information flow in the launch vehicle design cycle.

  19. Plant design: Integrating Plant and Equipment Models

    SciTech Connect

    Sloan, David; Fiveland, Woody; Zitney, S.E.; Osawe, Maxwell

    2007-08-01

    Like power plant engineers, process plant engineers must design generating units to operate efficiently, cleanly, and profitably despite fluctuating costs for raw materials and fuels. To do so, they increasingly create virtual plants to enable evaluation of design concepts without the expense of building pilot-scale or demonstration facilities. Existing computational models describe an entire plant either as a network of simplified equipment models or as a single, very detailed equipment model. The Advanced Process Engineering Co-Simulator (APECS) project (Figure 5) sponsored by the U.S. Department of Energy's National Energy Technology Laboratory (NETL) seeks to bridge the gap between models by integrating plant modeling and equipment modeling software. The goal of the effort is to provide greater insight into the performance of proposed plant designs. The software integration was done using the process-industry standard CAPE-OPEN (Computer Aided Process Engineering–Open), or CO interface. Several demonstration cases based on operating power plants confirm the viability of this co-simulation approach.

  20. Optimal Experimental Design for Model Discrimination

    PubMed Central

    Myung, Jay I.; Pitt, Mark A.

    2009-01-01

    Models of a psychological process can be difficult to discriminate experimentally because it is not easy to determine the values of the critical design variables (e.g., presentation schedule, stimulus structure) that will be most informative in differentiating them. Recent developments in sampling-based search methods in statistics make it possible to determine these values, and thereby identify an optimal experimental design. After describing the method, it is demonstrated in two content areas in cognitive psychology in which models are highly competitive: retention (i.e., forgetting) and categorization. The optimal design is compared with the quality of designs used in the literature. The findings demonstrate that design optimization has the potential to increase the informativeness of the experimental method. PMID:19618983
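
    The logic of design optimization for model discrimination can be caricatured in a few lines: enumerate candidate designs and keep the one under which the competing models' predictions diverge most. The sketch below does this for two stock forgetting curves over retention intervals; the models, parameter values and the naive squared-divergence utility are illustrative stand-ins for the sampling-based search and statistical utilities the authors describe.

```python
import itertools
import numpy as np

# Two competing retention (forgetting) models with fixed example parameters.
power = lambda t, a=0.9, b=0.4: a * (t + 1) ** (-b)
expon = lambda t, a=0.9, b=0.1: a * np.exp(-b * t)

# Candidate designs: choose 3 retention intervals from a grid; score each
# design by how far apart the two models' predictions are.
grid = [1, 2, 5, 10, 20, 40, 80]

def score(design):
    t = np.array(design, float)
    return np.sum((power(t) - expon(t)) ** 2)

best = max(itertools.combinations(grid, 3), key=score)
print("most discriminating intervals:", best)
```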

  1. Optimal Experimental Design for Model Discrimination

    ERIC Educational Resources Information Center

    Myung, Jay I.; Pitt, Mark A.

    2009-01-01

    Models of a psychological process can be difficult to discriminate experimentally because it is not easy to determine the values of the critical design variables (e.g., presentation schedule, stimulus structure) that will be most informative in differentiating them. Recent developments in sampling-based search methods in statistics make it…

  2. Component Latent Trait Models for Test Design.

    ERIC Educational Resources Information Center

    Embretson, Susan Whitely

    Latent trait models are presented that can be used for test design in the context of a theory about the variables that underlie task performance. Examples of methods for decomposing and testing hypotheses about the theoretical variables in task performance are given. The methods can be used to determine the processing components that are involved…

  3. Optimization of Forming Processes in Microstructure Sensitive Design

    NASA Astrophysics Data System (ADS)

    Garmestani, H.; Li, D. S.

    2004-06-01

    Optimization of forming processes, from the initial microstructures of raw materials to the desired microstructures of final products, is an important topic in materials design. The processing path model proposed in this study gives an explicit mathematical description of how the microstructure evolves during thermomechanical processing. Based on a conservation principle in orientation space (originally proposed by Bunge), the methodology is independent of the underlying deformation mechanisms. The evolution of the texture coefficients is modeled using a texture evolution matrix calculated from experimental results. For the same material and the same processing method, the texture evolution matrix is the same; it does not change with the initial texture. The processing path model thus provides functions describing processing paths and streamlines.
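
    In miniature, the idea is that a vector of texture coefficients is propagated through each processing increment by a fixed, experimentally calibrated matrix, regardless of the starting texture. The sketch below applies an assumed 3x3 evolution matrix to an assumed initial coefficient vector; both are invented for illustration.

```python
import numpy as np

# Processing-path idea in miniature: texture coefficients c evolve through
# each processing increment via a fixed evolution matrix A, calibrated once
# from experiments for a given material/process pair. Values are illustrative.
A = np.array([[0.95, 0.03, 0.00],
              [0.02, 0.90, 0.05],
              [0.03, 0.07, 0.95]])

c = np.array([1.0, 0.2, 0.1])     # initial texture coefficients (assumed)
for step in range(5):             # five deformation increments
    c = A @ c
print("texture coefficients after 5 increments:", np.round(c, 3))
```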

  4. Composting process design criteria. II. Detention time

    SciTech Connect

    Haug, R.T.

    1986-09-01

    Attention has always been directed to detention time as a criterion for design and operation of composting systems. Perhaps this is a logical outgrowth of work on liquid phase systems, where detention time is a fundamental parameter of design. Unlike liquid phase systems, however, the interpretation of detention time and the actual values required for design have not been universally accepted in the case of composting. As a case in point, most compost systems incorporate facilities for curing the compost product. However, curing is often considered after the fact, or as an add-on with little relationship to the first-stage, high-rate phase, whether reactor (in-vessel), static pile, or windrow. Design criteria for curing, and the relationships between the first-stage, high-rate and second-stage, curing phases of a composting system, have been unclear. In Part 2 of this paper, the concepts of hydraulic retention time (HRT) and solids residence time (SRT) are applied to the composting process. Definitions and design criteria for each are proposed. Based on these criteria, the first and second stages can be designed and integrated into a complete composting system.
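
    Carrying the liquid-phase definitions over, HRT is working volume divided by volumetric feed rate, and SRT is the solids mass held in the system divided by the solids removal rate. The snippet below simply evaluates those two ratios for assumed, illustrative numbers.

```python
# Detention-time bookkeeping carried over from liquid-phase practice:
#   HRT = working volume / volumetric feed rate
#   SRT = solids mass in the system / solids removal rate
volume_m3 = 1200.0           # reactor (or pile) working volume, assumed
feed_m3_per_day = 80.0       # incoming mixed feedstock, assumed
solids_mass_t = 450.0        # dry solids held in the system, assumed
solids_out_t_per_day = 18.0  # dry solids leaving per day, assumed

print(f"HRT = {volume_m3 / feed_m3_per_day:.1f} days")           # 15.0 days
print(f"SRT = {solids_mass_t / solids_out_t_per_day:.1f} days")  # 25.0 days
```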

  5. Penetrator reliability investigation and design exploration : from conventional design processes to innovative uncertainty-capturing algorithms.

    SciTech Connect

    Martinez-Canales, Monica L.; Heaphy, Robert; Gramacy, Robert B.; Taddy, Matt; Chiesa, Michael L.; Thomas, Stephen W.; Swiler, Laura Painton; Hough, Patricia Diane; Lee, Herbert K. H.; Trucano, Timothy Guy; Gray, Genetha Anne

    2006-11-01

    This project focused on research and algorithmic development in optimization under uncertainty (OUU) problems driven by earth penetrator (EP) designs. While taking into account uncertainty, we addressed three challenges in current simulation-based engineering design and analysis processes. The first challenge required leveraging small local samples, already constructed by optimization algorithms, to build effective surrogate models. We used Gaussian Process (GP) models to construct these surrogates. We developed two OUU algorithms using 'local' GPs (OUU-LGP) and one OUU algorithm using 'global' GPs (OUU-GGP) that appear competitive with or better than current methods. The second challenge was to develop a methodical design process based on multi-resolution, multi-fidelity models. We developed a Multi-Fidelity Bayesian Auto-regressive process (MF-BAP). The third challenge involved the development of tools that are computationally feasible and accessible. We created MATLAB® and initial DAKOTA implementations of our algorithms.
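
    To indicate what a surrogate of this kind looks like, the sketch below fits a minimal Gaussian-process regressor (squared-exponential kernel, fixed hyperparameters) to a small "local" sample of a stand-in objective. It is a bare-bones illustration of GP surrogate modelling, not the OUU-LGP/OUU-GGP algorithms or their DAKOTA implementations.

```python
import numpy as np

# Minimal GP surrogate over a small local sample, of the kind an optimizer
# could reuse; squared-exponential kernel with a fixed length scale.
def k(a, b, ell=0.3):
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

rng = np.random.default_rng(3)
x = rng.uniform(0, 1, 8)                     # local sample locations
y = np.sin(6 * x) + rng.normal(0, 0.05, 8)   # expensive-simulation stand-in

K = k(x, x) + 1e-4 * np.eye(len(x))          # jitter for numerical stability
alpha = np.linalg.solve(K, y)

xs = np.linspace(0, 1, 5)
mean = k(xs, x) @ alpha                      # GP posterior mean at new points
print("surrogate:", np.round(mean, 2))
print("truth:    ", np.round(np.sin(6 * xs), 2))
```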

  6. Generic Model Host System Design

    SciTech Connect

    Chu, Chungming; Wu, Juhao; Qiang, Ji; Shen, Guobao; /Brookhaven

    2012-06-22

    There are many simulation codes for accelerator modelling; each has particular strengths, but none covers all needs. A platform which can host multiple modelling tools would therefore be ideal for various purposes. The model platform, along with infrastructure support, can be used not only for online applications but also for offline purposes, and a collaboration has been formed to provide such a platform. To achieve it, a common set of physics data structures has to be defined, and an Application Programming Interface (API) for physics applications should be defined within a model data provider. A preliminary platform design and prototype are discussed.

  7. MODEL OF DIFFUSERS / PERMEATORS FOR HYDROGEN PROCESSING

    SciTech Connect

    Hang, T; William Jacobs, W

    2007-08-27

    Palladium-silver (Pd-Ag) diffusers are mainstays of hydrogen processing. Diffusers separate hydrogen from inert species such as nitrogen, argon or helium. The tubing becomes permeable to hydrogen when heated to more than 250 °C and a differential pressure is created across the membrane. Hydrogen diffuses faster at higher temperatures. Experimental or experiential results have been the basis for determining or predicting a diffuser's performance. However, the process can be mathematically modeled, and comparison to experimental or other operating data can be utilized to improve the fit of the model. A reliable model-based diffuser system design is the goal, which will have impacts on tritium and hydrogen processing. A computer model has been developed to solve the differential equations for diffusion given the operating boundary conditions. The model was compared to operating data for a low pressure diffuser system. The modeling approach and the results are presented in this paper.
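
    The underlying physics is commonly reduced to Sieverts' law, in which the hydrogen flux through the membrane scales with the difference of the square roots of the upstream and downstream partial pressures. The numbers below (permeability, wall thickness, pressures, area) are assumed for illustration and are not from the paper's model.

```python
import numpy as np

# Sieverts'-law estimate of hydrogen flux through a Pd-Ag membrane:
#   J = (Phi / t) * (sqrt(p_feed) - sqrt(p_permeate))
# Permeability and geometry below are assumed, illustrative numbers.
Phi = 1.0e-8        # permeability, mol m^-1 s^-1 Pa^-0.5
thickness = 1.0e-4  # wall thickness, m (100 um)
p_feed, p_perm = 2.0e5, 1.0e3   # partial pressures, Pa

J = (Phi / thickness) * (np.sqrt(p_feed) - np.sqrt(p_perm))
area = 0.05         # m^2 of tubing, assumed
print(f"hydrogen flux ~ {J:.3f} mol m^-2 s^-1, "
      f"~ {J * area * 3600:.1f} mol/h through {area} m^2")
```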

  8. Mimicry of natural material designs and processes

    SciTech Connect

    Bond, G.M.; Richman, R.H.; McNaughton, W.P.

    1995-06-01

    Biological structural materials, although composed of unremarkable substances synthesized at low temperatures, often exhibit superior mechanical properties. In particular, the quality in which nearly all biologically derived materials excel is toughness. The advantageous mechanical properties are attributable to the hierarchical, composite, structural arrangements common to biological systems. Materials scientists and engineers have increasingly recognized that biological designs or processing approaches applied to man-made materials (biomimesis) may offer improvements in performance over conventional designs and fabrication methods. In this survey, the structures and processing routes of marine shells, avian eggshells, wood, bone, and insect cuticle are briefly reviewed, and biomimesis research inspired by these materials is discussed. In addition, this paper describes and summarizes the applications of biomineralization, self-assembly, and templating with proteins to the fabrication of thin ceramic films and nanostructure devices.

  9. Modelling of CWS combustion process

    NASA Astrophysics Data System (ADS)

    Rybenko, I. A.; Ermakova, L. A.

    2016-10-01

    The paper considers the combustion of coal-water slurry (CWS) droplets. A physico-chemical process scheme consisting of several independent parallel-sequential stages is proposed; this scheme is supported by particle size distribution tests and stereomicroscopic analysis of the combustion products. Results of mathematical modelling and optimization of stationary CWS combustion regimes are provided. The modelling solves the problem of determining the possible equilibrium composition of the products obtainable from CWS combustion at different temperatures.

  10. Design of Training Systems Utility Assessment: The Training Process Flow and System Capabilities/Requirements and Resources Models Operating in the TRAPAC Environment.

    ERIC Educational Resources Information Center

    Duffy, Larry R.

    The report summarizes the results of a field test conducted for the purpose of determining the utility to Naval training of the Systems Capabilities/Requirements and Resources (SCRR) and the Training Process Flow (TPF) computer-based mathematical models. Basic descriptions of the SCRR and the TPF and their development are given. Training…

  11. Design flow for implementing image processing in FPGAs

    NASA Astrophysics Data System (ADS)

    Trakalo, M.; Giles, G.

    2007-04-01

    A design flow for implementing a dynamic gamma algorithm in an FPGA is described. Real-time video processing makes enormous demands on processing resources. An FPGA solution offers some advantages over commercial video chip and DSP implementation alternatives. The traditional approach to FPGA development involves a system engineer designing, modeling and verifying an algorithm and writing a specification. A hardware engineer uses the specification as a basis for coding in VHDL and testing the algorithm in the FPGA with supporting electronics. This process is work-intensive, and verification of the image processing algorithm executing on the FPGA does not occur until late in the program. The described design process allows the system engineer to design and verify a true VHDL version of the algorithm, executing in an FPGA. This process reduces risk and development time. The process is achieved by using Xilinx System Generator in conjunction with Simulink® from The MathWorks. System Generator is a tool that bridges the gap between the high level modeling environment and the digital world of the FPGA. System Generator is used to develop the dynamic gamma algorithm for the contrast enhancement of a candidate display product. The result of this effort is an increased dynamic range in the displayed video, yielding a more useful image for the user.
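
    Before such an algorithm is cast into FPGA form, it is usually prototyped as a bit-exact software reference. The sketch below shows one plausible core for a frame-adaptive ("dynamic") gamma: derive an exponent from frame statistics, build a 256-entry lookup table, and map pixels through it, as block RAM would. The chosen statistic, exponent schedule and sizes are assumptions, not the paper's algorithm.

```python
import numpy as np

# Frame-adaptive gamma lookup table: pick the exponent from the frame's mean
# brightness, then map 8-bit input to 8-bit output through a LUT, as one
# would in FPGA block RAM. Parameters are illustrative assumptions.
def dynamic_gamma_lut(frame):
    mean = frame.mean() / 255.0
    gamma = np.clip(0.5 + mean, 0.5, 1.5)   # dark frame -> gamma < 1 (boost)
    lut = (255 * (np.arange(256) / 255.0) ** gamma).astype(np.uint8)
    return lut[frame]                        # per-pixel table lookup

dark = np.random.default_rng(4).integers(0, 64, (120, 160), dtype=np.uint8)
print(dark.mean(), dynamic_gamma_lut(dark).mean())   # output mean is lifted
```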

  12. DECISION SUPPORT SYSTEM TO ENHANCE AND ENCOURAGE SUSTAINABLE CHEMICAL PROCESS DESIGN

    EPA Science Inventory

    There is an opportunity to minimize the potential environmental impacts (PEIs) of industrial chemical processes by providing process designers with timely data and models elucidating environmentally favorable design options. The second generation of the Waste Reduction (WAR) algo...

  13. Social Models: Blueprints or Processes?

    ERIC Educational Resources Information Center

    Little, Graham R.

    1981-01-01

    Discusses the nature and implications of two different models for societal planning: (1) the problem-solving process approach based on Karl Popper; and (2) the goal-setting "blueprint" approach based on Karl Marx. (DC)

  14. Chip Design Process Optimization Based on Design Quality Assessment

    NASA Astrophysics Data System (ADS)

    Häusler, Stefan; Blaschke, Jana; Sebeke, Christian; Rosenstiel, Wolfgang; Hahn, Axel

    2010-06-01

    Nowadays, the managing of product development projects is increasingly challenging. Especially the IC design of ASICs with both analog and digital components (mixed-signal design) is becoming more and more complex, while the time-to-market window narrows at the same time. Still, high quality standards must be fulfilled. Projects and their status are becoming less transparent due to this complexity. This makes the planning and execution of projects rather difficult. Therefore, there is a need for efficient project control. A main challenge is the objective evaluation of the current development status. Are all requirements successfully verified? Are all intermediate goals achieved? Companies often develop special solutions that are not reusable in other projects. This makes the quality measurement process itself less efficient and produces too much overhead. The method proposed in this paper is a contribution to solve these issues. It is applied at a German design house for analog mixed-signal IC design. This paper presents the results of a case study and introduces an optimized project scheduling on the basis of quality assessment results.

  15. A FRAMEWORK TO DESIGN AND OPTIMIZE CHEMICAL FLOODING PROCESSES

    SciTech Connect

    Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori

    2005-07-01

    The goal of this proposed research is to provide an efficient and user friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user friendly interface to identify the variables that have the most impact on oil recovery using the concept of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objective of Task 1 is to develop three primary modules representing reservoir, chemical, and well data; the modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables and to develop the response surface maps that identify the significant variables from each module. The objective of Task 3 is to develop an economic model designed specifically for the chemical processes targeted in this proposal and to interface it with UTCHEM production output. Task 4 covers validation of the framework and simulations of oil reservoirs to screen, design and optimize the chemical processes.

  16. A Framework to Design and Optimize Chemical Flooding Processes

    SciTech Connect

    Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori

    2006-08-31

    The goal of this proposed research is to provide an efficient and user friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user friendly interface to identify the variables that have the most impact on oil recovery using the concept of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objective of Task 1 is to develop three primary modules representing reservoir, chemical, and well data; the modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables and to develop the response surface maps that identify the significant variables from each module. The objective of Task 3 is to develop an economic model designed specifically for the chemical processes targeted in this proposal and to interface it with UTCHEM production output. Task 4 covers validation of the framework and simulations of oil reservoirs to screen, design and optimize the chemical processes.

  17. A FRAMEWORK TO DESIGN AND OPTIMIZE CHEMICAL FLOODING PROCESSES

    SciTech Connect

    Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori

    2004-11-01

    The goal of this proposed research is to provide an efficient and user friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user friendly interface to identify the variables that have the most impact on oil recovery using the concept of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objective of Task 1 is to develop three primary modules representing reservoir, chemical, and well data; the modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables and to develop the response surface maps that identify the significant variables from each module. The objective of Task 3 is to develop an economic model designed specifically for the chemical processes targeted in this proposal and to interface it with UTCHEM production output. Task 4 covers validation of the framework and simulations of oil reservoirs to screen, design and optimize the chemical processes.

  18. Design and Processing of Electret Structures

    DTIC Science & Technology

    2009-10-31

    The available record text is fragmentary: it mentions corrosion-rate measurements in specially designed model systems (Fig. 1), with spatial resolution determined by the optical properties of the setup; molecules containing a chain of atoms forming a conjugated π-electron system with electron donor and acceptor groups; and electrets made from monodisperse polystyrene microspheres, whose charge comes from an imbalance of ions rather than from the transfer of electrons.

  19. Reliability Methods for Shield Design Process

    NASA Technical Reports Server (NTRS)

    Tripathi, R. K.; Wilson, J. W.

    2002-01-01

    Providing protection against the hazards of space radiation is a major challenge to the exploration and development of space. The great cost of added radiation shielding is a potential limiting factor in deep space operations. In this enabling technology, we have developed methods for optimized shield design over multi-segmented missions involving multiple work and living areas in the transport and duty phase of space missions. The total shield mass over all pieces of equipment and habitats is optimized subject to career dose and dose rate constraints. An important component of this technology is the estimation of two most commonly identified uncertainties in radiation shield design, the shielding properties of materials used and the understanding of the biological response of the astronaut to the radiation leaking through the materials into the living space. The largest uncertainty, of course, is in the biological response to especially high charge and energy (HZE) ions of the galactic cosmic rays. These uncertainties are blended with the optimization design procedure to formulate reliability-based methods for shield design processes. The details of the methods will be discussed.

  20. Command Process Modeling & Risk Analysis

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila

    2011-01-01

    Commanding errors may be caused by a variety of root causes, and it is important to understand the relative significance of each of these causes when making institutional investment decisions. One such cause is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within the command and control process. These models include simulation analyses and probabilistic risk assessment models.

  1. Bates solar industrial process-steam application: preliminary design review

    SciTech Connect

    Not Available

    1980-01-07

    The design is analyzed for a parabolic trough solar process heat system for a cardboard corrugation fabrication facility in Texas. The program is briefly reviewed, including an analysis of the plant and process. The performance modeling for the system is discussed, and the solar system structural design, collector subsystem, heat transport and distribution subsystem are analyzed. The selection of the heat transfer fluid, and ullage and fluid maintenance are discussed, and the master control system and data acquisition system are described. Testing of environmental degradation of materials is briefly discussed. A brief preliminary cost analysis is included. (LEW)

  2. Saving Material with Systematic Process Designs

    NASA Astrophysics Data System (ADS)

    Kerausch, M.

    2011-08-01

    Global competition is forcing the stamping industry to further increase quality, to shorten time-to-market and to reduce total cost. Continuous balancing between these classical time-cost-quality targets throughout the product development cycle is required to ensure future economical success. In today's industrial practice, die layout standards are typically assumed to implicitly ensure the balancing of company-specific time-cost-quality targets. Although die layout standards are a very successful approach, there are two methodical disadvantages. First, the capabilities for tool design have to be continuously adapted to technological innovations, e.g. to take advantage of the full forming capability of new materials. Secondly, the great variety of die design aspects has to be reduced to a generic rule or guideline, e.g. binder shape, draw-in conditions or the use of drawbeads. Therefore, it is important to not overlook cost or quality opportunities when applying die design standards. This paper describes a systematic workflow with a focus on minimizing material consumption. The starting point of the investigation is a full process plan for a typical structural part. All requirements are defined so that a predefined set of die design standards with industrial relevance is fulfilled. In a first step, binder and addendum geometry is systematically checked for material saving potentials. In a second step, blank shape and draw-in are adjusted to meet thinning, wrinkling and springback targets for a minimum blank solution. Finally the identified die layout is validated with respect to production robustness versus splits, wrinkles and springback. For all three steps the applied methodology is based on finite element simulation combined with a stochastic variation of input variables. With the proposed workflow a well-balanced (time-cost-quality) production process assuring minimal material consumption can be achieved.

  3. Process Improvement Through Tool Integration in Aero-Mechanical Design

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2010-01-01

    Emerging capabilities in commercial design tools promise to significantly improve the multi-disciplinary and inter-disciplinary design and analysis coverage for aerospace mechanical engineers. This paper explores the analysis process for two example problems of a wing and flap mechanical drive system and an aircraft landing gear door panel. The examples begin with the design solid models and include various analysis disciplines such as structural stress and aerodynamic loads. Analytical methods include CFD, multi-body dynamics with flexible bodies and structural analysis. Elements of analysis data management, data visualization and collaboration are also included.

  4. Advanced computational research in materials processing for design and manufacturing

    SciTech Connect

    Zacharia, T.

    1994-12-31

    The computational requirements for design and manufacture of automotive components have increased dramatically in the drive to produce automobiles with three times the mileage. Automotive component design systems are becoming increasingly reliant on structural analysis, requiring larger and more complex analyses: more three-dimensional analyses, larger model sizes, and routine consideration of transient and non-linear effects. Such analyses must be performed rapidly to minimize delays in the design and development process, which drives the need for parallel computing. This paper briefly describes advanced computational research in superplastic forming and automotive crashworthiness.

  5. Flexible processing and the design of grammar.

    PubMed

    Sag, Ivan A; Wasow, Thomas

    2015-02-01

    We explore the consequences of letting the incremental and integrative nature of language processing inform the design of competence grammar. What emerges is a view of grammar as a system of local monotonic constraints that provide a direct characterization of the signs (the form-meaning correspondences) of a given language. This "sign-based" conception of grammar has provided precise solutions to the key problems long thought to motivate movement-based analyses, has supported three decades of computational research developing large-scale grammar implementations, and is now beginning to play a role in computational psycholinguistics research that explores the use of underspecification in the incremental computation of partial meanings.

  6. Cost Models for MMC Manufacturing Processes

    NASA Technical Reports Server (NTRS)

    Elzey, Dana M.; Wadley, Haydn N. G.

    1996-01-01

    The quality cost modeling (QCM) tool is intended to be a relatively simple-to-use device for obtaining a first-order assessment of the quality-cost relationship for a given process-material combination. The QCM curve is a plot of cost versus quality (an index indicating microstructural quality), which is unique for a given process-material combination. The QCM curve indicates the tradeoff between cost and performance, thus enabling one to evaluate affordability. Additionally, the effect of changes in process design, raw materials, and process conditions on the cost-quality relationship can be evaluated. Such results might indicate the most efficient means to obtain improved quality at reduced cost by process design refinements, the implementation of sensors and models for closed-loop process control, or improvement in the properties of raw materials being fed into the process. QCM also allows alternative processes for producing the same or similar material to be compared in terms of their potential for producing competitively priced, high quality material. Aside from demonstrating the usefulness of the QCM concept, one of the main foci of the present research program is to compare processes for making continuous-fiber-reinforced metal matrix composites (MMCs). Two processes, low-pressure plasma spray deposition and tape casting, are considered for QCM development. This document consists of a detailed look at the design of the QCM approach, followed by discussion of the application of QCM to each of the selected MMC manufacturing processes along with results, comparison of processes, and finally, a summary of findings and recommendations.

  7. Prodrugs design based on inter- and intramolecular chemical processes.

    PubMed

    Karaman, Rafik

    2013-12-01

    This review provides the reader with a concise overview of the majority of prodrug approaches, with emphasis on modern approaches to prodrug design. The chemical approach based on catalysis by metabolic enzymes, widely considered the most used approach for minimizing undesirable drug physicochemical properties, is discussed. Part of this review sheds light on the use of molecular orbital methods such as DFT, semiempirical and ab initio for the design of novel prodrugs. This novel prodrug approach implies prodrug design based on enzyme models that were utilized for mimicking enzyme catalysis. The computational approach exploited for prodrug design involves molecular orbital and molecular mechanics (DFT, ab initio, and MM2) calculations and correlations between experimental and calculated values of intramolecular processes that were experimentally studied to assign the factors determining the reaction rates in certain processes, for a better understanding of how enzymes might exert their extraordinary catalysis.

  8. Implementation of the Business Process Modelling Notation (BPMN) in the modelling of anatomic pathology processes

    PubMed Central

    Rojo, Marcial García; Rolón, Elvira; Calahorra, Luis; García, Felix Óscar; Sánchez, Rosario Paloma; Ruiz, Francisco; Ballester, Nieves; Armenteros, María; Rodríguez, Teresa; Espartero, Rafael Martín

    2008-01-01

    Background Process orientation is one of the essential elements of quality management systems, including those in use in healthcare. Business processes in hospitals are very complex and variable. BPMN (Business Process Modelling Notation) is a user-oriented language specifically designed for the modelling of business (organizational) processes. We are not aware of previous experience with this notation for modelling Pathology processes, in Spain or elsewhere. We present our experience in the elaboration of conceptual models of Pathology processes, as part of a global programmed surgical patient process, using BPMN. Methods With the objective of analyzing the use of BPMN notation in real cases, a multidisciplinary work group was created, including software engineers from the Dep. of Technologies and Information Systems of the University of Castilla-La Mancha and health professionals and administrative staff from the Hospital General de Ciudad Real. The collaborative work was carried out in six phases: informative meetings, intensive training, process selection, definition of the work method, process description by hospital experts, and process modelling. Results The modelling of the processes of Anatomic Pathology is presented using BPMN. The presented subprocesses are those corresponding to the surgical pathology examination of the samples coming from the operating theatre, including the planning and realization of frozen studies. Conclusion The modelling of Anatomic Pathology subprocesses has allowed the creation of an understandable graphical model, where management and improvements are more easily implemented by health professionals. PMID:18673511

  9. Designer cell signal processing circuits for biotechnology.

    PubMed

    Bradley, Robert W; Wang, Baojun

    2015-12-25

    Microorganisms are able to respond effectively to diverse signals from their environment and internal metabolism owing to their inherent sophisticated information processing capacity. A central aim of synthetic biology is to control and reprogramme the signal processing pathways within living cells so as to realise repurposed, beneficial applications ranging from disease diagnosis and environmental sensing to chemical bioproduction. To date most examples of synthetic biological signal processing have been built based on digital information flow, though analogue computing is being developed to cope with more complex operations and larger sets of variables. Great progress has been made in expanding the categories of characterised biological components that can be used for cellular signal manipulation, thereby allowing synthetic biologists to more rationally programme increasingly complex behaviours into living cells. Here we present a current overview of the components and strategies that exist for designer cell signal processing and decision making, discuss how these have been implemented in prototype systems for therapeutic, environmental, and industrial biotechnological applications, and examine emerging challenges in this promising field.
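
    As a minimal sketch of the kind of component characterization the review describes, the following models a transcriptional AND gate as the product of two Hill activation functions, a common phenomenological form in synthetic biology; all parameter values here are illustrative assumptions, not measurements from the paper.

```python
import numpy as np

def hill(x, k, n):
    """Hill activation: fraction of maximal promoter activity at input x."""
    return x**n / (k**n + x**n)

def and_gate(x1, x2, k1=10.0, k2=5.0, n=2.0):
    """Toy transcriptional AND gate: output requires both inputs.
    Multiplying the two activation terms is a common phenomenological
    model for an AND gate; thresholds k1, k2 and cooperativity n are
    illustrative assumptions."""
    return hill(x1, k1, n) * hill(x2, k2, n)

# Truth-table-like behavior at low/high inducer levels.
for a in (0.0, 100.0):
    for b in (0.0, 100.0):
        print(f"in1={a:5.1f} in2={b:5.1f} -> out={and_gate(a, b):.2f}")
```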

  10. Designer cell signal processing circuits for biotechnology

    PubMed Central

    Bradley, Robert W.; Wang, Baojun

    2015-01-01

    Microorganisms are able to respond effectively to diverse signals from their environment and internal metabolism owing to their inherent sophisticated information processing capacity. A central aim of synthetic biology is to control and reprogramme the signal processing pathways within living cells so as to realise repurposed, beneficial applications ranging from disease diagnosis and environmental sensing to chemical bioproduction. To date most examples of synthetic biological signal processing have been built based on digital information flow, though analogue computing is being developed to cope with more complex operations and larger sets of variables. Great progress has been made in expanding the categories of characterised biological components that can be used for cellular signal manipulation, thereby allowing synthetic biologists to more rationally programme increasingly complex behaviours into living cells. Here we present a current overview of the components and strategies that exist for designer cell signal processing and decision making, discuss how these have been implemented in prototype systems for therapeutic, environmental, and industrial biotechnological applications, and examine emerging challenges in this promising field. PMID:25579192

  11. Optimal design of upstream processes in biotransformation technologies.

    PubMed

    Dheskali, Endrit; Michailidi, Katerina; de Castro, Aline Machado; Koutinas, Apostolis A; Kookos, Ioannis K

    2017-01-01

    In this work a mathematical programming model for the optimal design of the bioreaction section of biotechnological processes is presented. Equations for the estimation of equipment cost, derived from a recent publication by the US National Renewable Energy Laboratory (NREL), are also summarized. The cost-optimal design of process units and the optimal scheduling of their operation can be obtained using the proposed formulation, which has been implemented in software available from the journal web page or from the corresponding author. The proposed optimization model can be used to quantify the effects of decisions taken at lab scale on industrial-scale process economics. It is of paramount importance to note that this can be achieved at an early stage of the development of a biotechnological project. Two case studies are presented that demonstrate the usefulness and potential of the proposed methodology.
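
    Equipment cost correlations of the kind summarized above typically follow power-law scaling. The sketch below uses the generic six-tenths rule, with invented reference costs and sizes (not the paper's NREL correlations), to enumerate bioreactor configurations and pick the cheapest one that meets a required total volume.

```python
def vessel_cost(volume_m3, ref_cost=5.0e5, ref_volume=100.0, exponent=0.6):
    """Installed cost via the six-tenths scaling rule (illustrative numbers)."""
    return ref_cost * (volume_m3 / ref_volume) ** exponent

def cheapest_configuration(total_volume_m3, max_vessel_m3=300.0):
    """Enumerate reactor counts; economies of scale favor few large vessels,
    subject to a maximum practical vessel size."""
    best = None
    for n in range(1, 21):
        v = total_volume_m3 / n
        if v > max_vessel_m3:
            continue  # vessel too large to fabricate/operate
        cost = n * vessel_cost(v)
        if best is None or cost < best[2]:
            best = (n, v, cost)
    return best

n, v, cost = cheapest_configuration(total_volume_m3=800.0)
print(f"{n} vessels of {v:.0f} m3, installed cost ~ ${cost:,.0f}")
```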

  12. Neuroscientific model of motivational process.

    PubMed

    Kim, Sung-Il

    2013-01-01

    Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three subprocesses: a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous subprocesses, namely reward-driven approach, value-based decision-making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area), in which basic stimulus-action associations are formed, and is classified as an automatic motivation to which relatively little attention is assigned. By contrast, value-based decision-making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to the regulation of motivation. These three subprocesses interact with each other by sending reward prediction error signals through the dopaminergic pathway from the striatum to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment.
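
    The reward prediction error signal invoked in this model is commonly formalized as a temporal-difference update. The sketch below is a standard TD(0) illustration on a made-up two-state task, not the author's model.

```python
# Minimal TD(0) sketch of a reward-prediction-error signal on an
# illustrative two-state task (cue always followed by reward = 1).
alpha, gamma = 0.1, 0.9           # learning rate, discount factor
V = {"cue": 0.0, "outcome": 0.0}  # learned state values

for trial in range(100):
    delta = 0.0 + gamma * V["outcome"] - V["cue"]  # prediction error at cue
    V["cue"] += alpha * delta
    delta = 1.0 + gamma * 0.0 - V["outcome"]       # error at reward delivery
    V["outcome"] += alpha * delta

# As learning proceeds, the error migrates from the reward to the cue.
print(f"V(cue)={V['cue']:.2f}, V(outcome)={V['outcome']:.2f}")
```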

  13. Neuroscientific Model of Motivational Process

    PubMed Central

    Kim, Sung-il

    2013-01-01

    Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three subprocesses: a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous subprocesses, namely reward-driven approach, value-based decision-making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area), in which basic stimulus-action associations are formed, and is classified as an automatic motivation to which relatively little attention is assigned. By contrast, value-based decision-making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to the regulation of motivation. These three subprocesses interact with each other by sending reward prediction error signals through the dopaminergic pathway from the striatum to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment. PMID:23459598

  14. Using the ARCS Model To Design Multimedia College Engineering Courses.

    ERIC Educational Resources Information Center

    Shellnut, Bonnie; Savage, Timothy; Knowlton, Allie

    This paper describes how a Wayne State University (Michigan) multimedia design team is applying Keller's ARCS (Attention, Relevance, Confidence, and Satisfaction) Model of Motivational Design to the entire process of design, development, and evaluation of multimedia courseware. The ARCS Model has been applied to the prototype module and is being…

  15. Process-induced bias: a study of resist design and process implications

    NASA Astrophysics Data System (ADS)

    Fonseca, Carlos; Scheer, Steven; Carcasi, Michael; Shibata, Tsuyoshi; Otsuka, Takahisa

    2008-03-01

    Critical dimension uniformity (CDU) has both across-field and across-wafer components. CD error generated by across-wafer etching non-uniformity and other process variations can have a significant impact on CDU. To correct these across-wafer variations, compensation by exposure dose and/or PEB temperature has been proposed. These compensation strategies often focus on a specific structure without evaluating how process compensation impacts the CDU of all structures to be printed in a given design. In a previous study, the authors evaluated the relative merits of across-wafer dose and PEB temperature compensation on process-induced CD bias and CDU. For the process studied, both metrics demonstrated that using PEB temperature to control across-wafer CD variation was preferable to using dose compensation. The previous study was limited to a single resist, and variations to track and scanner processing were kept to a minimum. Further examination of additional resist materials has indicated that significant variation in dose- and PEB-temperature-induced CD biases exists from material to material. It is the goal of this work to understand how resist design, as well as track and scanner processing, impacts process-induced bias (PIB). This is accomplished by analyzing full resist models for a range of resists that exhibit different dose and PEB temperature PIB behavior. From these models, the primary resist design contributors to PIB are isolated. A sensitivity analysis of the primary resist design as well as track and scanner processing effects will also be simulated and presented.

  16. Virtual Welded - Joint Design Integrating Advanced Materials and Processing Technology

    SciTech Connect

    Yang, Zhishang; Ludewig, Howard W.; Babu, S. Suresh

    2005-06-30

    Virtual Welded-Joint Design, a systematic modeling approach, has been developed in this project to predict the relationship of welding process, microstructure, properties, residual stress, and the ultimate weld fatigue strength. This systematic modeling approach was applied to the welding of high strength steel. A special welding wire was developed in this project to introduce compressive residual stress at the weld toe. The results from both modeling and experiments demonstrated that a more than 10x fatigue life improvement can be achieved in high strength steel welds by the combination of compressive residual stress from the special welding wire and the desired weld bead shape from a unique welding process. The results indicate a technology breakthrough in the design of lightweight and high fatigue performance welded structures using high strength steels.

  17. Optimal model-based design of the twin-column CaptureSMB process improves capacity utilization and productivity in protein A affinity capture.

    PubMed

    Baur, Daniel; Angarita, Monica; Müller-Späth, Thomas; Morbidelli, Massimo

    2016-01-01

    Multi-column chromatographic processes have recently been developed for protein A affinity chromatography to efficiently capture monoclonal antibodies from cell culture supernatant. In this work, the novel twin-column CaptureSMB process was compared to a batch capture process with dual loading flow rate to identify performance gains. As a case study, the isolation of a monoclonal antibody with the Amsphere JWT-203 protein A resin was investigated. Using model-based optimization, both processes were optimized and compared over a wide range of operating conditions. A trade-off between productivity and capacity utilization was found, and the resulting Pareto curves showed that CaptureSMB dominates batch operation, except at very low productivity values. With a feed titer of 1.2 mg mL⁻¹, CaptureSMB could reach a productivity of up to 19.5 mg mL⁻¹ h⁻¹ experimentally, while maintaining a relatively high capacity utilization of 63.8%. On the other hand, at the maximum capacity utilization of 95.5%, a productivity of 10.2 mg mL⁻¹ h⁻¹ could be reached. This corresponds to a performance improvement with respect to batch operation of about 25% in capacity utilization and 40% in productivity, for given yield and purity. CaptureSMB therefore offers greatly increased performance over batch capture.
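
    The productivity/capacity-utilization trade-off described above is a Pareto front. A minimal sketch of extracting the non-dominated points from a sweep of operating conditions follows; the sample points are invented, apart from the two end points quoted in the abstract.

```python
import numpy as np

# Hypothetical (productivity, capacity utilization) pairs from a sweep of
# operating conditions; only the first two rows echo values quoted above.
points = np.array([
    [19.5, 63.8], [10.2, 95.5], [15.0, 80.0],
    [12.0, 70.0], [17.0, 75.0], [8.0, 90.0],
])

def pareto_front(pts):
    """Keep points not dominated in both objectives (maximize both)."""
    keep = []
    for i, p in enumerate(pts):
        dominated = any(
            j != i and q[0] >= p[0] and q[1] >= p[1]
            and (q[0] > p[0] or q[1] > p[1])
            for j, q in enumerate(pts)
        )
        if not dominated:
            keep.append(p)
    return np.array(sorted(keep, key=lambda r: r[0]))

for prod, cu in pareto_front(points):
    print(f"productivity {prod:4.1f} mg/mL/h  capacity utilization {cu:4.1f} %")
```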

  18. Quality-by-Design (QbD): An integrated process analytical technology (PAT) approach for a dynamic pharmaceutical co-precipitation process characterization and process design space development.

    PubMed

    Wu, Huiquan; White, Maury; Khan, Mansoor A

    2011-02-28

    The aim of this work was to develop an integrated process analytical technology (PAT) approach for dynamic pharmaceutical co-precipitation process characterization and design space development. A dynamic co-precipitation process, created by gradually introducing water to the ternary system of naproxen-Eudragit L100-alcohol, was monitored in real time in situ via Lasentec FBRM and PVM. A 3D map of count-time-chord length revealed three distinguishable process stages: incubation, transition, and steady state. The effects of high-risk process variables (slurry temperature, stirring rate, and water addition rate) on both the derived co-precipitation process rates and the final chord length distribution were evaluated systematically using a 3³ full factorial design. Critical process variables were identified via ANOVA for both the transition and steady states. General linear models (GLM) were then used for parameter estimation for each critical variable. Clear trends in the effects of each critical variable during transition and steady state were found by GLM and were interpreted using fundamental process principles and Nyvlt's transfer model. Neural network models were able to link process variables with response variables at transition and steady state with R² of 0.88-0.98. PVM images evidenced nucleation and crystal growth. Contour plots illustrated the design space via the ranges of the critical process variables. This work demonstrated the utility of an integrated PAT approach for QbD development.
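
    A minimal sketch of the experimental design and GLM step follows: it generates the 27 runs of a 3³ full factorial in coded levels and fits a main-effects linear model by least squares. The response values are synthetic; the variable names follow the abstract.

```python
import itertools
import numpy as np

# 3^3 full factorial in coded levels (-1, 0, +1) for the three high-risk
# variables named in the abstract; the response below is synthetic.
levels = (-1, 0, 1)
X = np.array(list(itertools.product(levels, repeat=3)), dtype=float)  # 27 runs

rng = np.random.default_rng(0)
# Synthetic steady-state response: main effects plus noise (illustrative).
y = 50 + 4.0 * X[:, 0] - 2.5 * X[:, 1] + 1.0 * X[:, 2] + rng.normal(0, 0.5, 27)

# General linear model with intercept and main effects, fit by least squares.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
for name, c in zip(["intercept", "temperature", "stirring", "water rate"], coef):
    print(f"{name:12s} {c:+.2f}")
```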

  19. Verifying and Validating Proposed Models for FSW Process Optimization

    NASA Technical Reports Server (NTRS)

    Schneider, Judith

    2008-01-01

    This slide presentation reviews Friction Stir Welding (FSW) and the attempts to model the process in order to optimize and improve the process. The studies are ongoing to validate and refine the model of metal flow in the FSW process. There are slides showing the conventional FSW process, a couple of weld tool designs and how the design interacts with the metal flow path. The two basic components of the weld tool are shown, along with geometries of the shoulder design. Modeling of the FSW process is reviewed. Other topics include (1) Microstructure features, (2) Flow Streamlines, (3) Steady-state Nature, and (4) Grain Refinement Mechanisms

  20. Modeling Tool Advances Rotorcraft Design

    NASA Technical Reports Server (NTRS)

    2007-01-01

    Continuum Dynamics Inc. (CDI), founded in 1979, specializes in advanced engineering services, including fluid dynamic modeling and analysis for aeronautics research. The company has completed a number of SBIR research projects with NASA, including early rotorcraft work done through Langley Research Center, but more recently, out of Ames Research Center. NASA Small Business Innovation Research (SBIR) grants on helicopter wake modeling resulted in the Comprehensive Hierarchical Aeromechanics Rotorcraft Model (CHARM), a tool for studying helicopter and tiltrotor unsteady free wake modeling, including distributed and integrated loads, and performance prediction. Application of the software code in a blade redesign program for Carson Helicopters, of Perkasie, Pennsylvania, increased the payload and cruise speeds of its S-61 helicopter. Follow-on development resulted in a $24 million revenue increase for Sikorsky Aircraft Corporation, of Stratford, Connecticut, as part of the company's rotor design efforts. Now under continuous development for more than 25 years, CHARM models the complete aerodynamics and dynamics of rotorcraft in general flight conditions. CHARM has been used to model a broad spectrum of rotorcraft attributes, including performance, blade loading, blade-vortex interaction noise, air flow fields, and hub loads. The highly accurate software is currently in use by all major rotorcraft manufacturers, NASA, the U.S. Army, and the U.S. Navy.

  1. Design Exploration of Engineered Materials, Products, and Associated Manufacturing Processes

    NASA Astrophysics Data System (ADS)

    Shukla, Rishabh; Kulkarni, Nagesh H.; Gautham, B. P.; Singh, Amarendra K.; Mistree, Farrokh; Allen, Janet K.; Panchal, Jitesh H.

    2015-01-01

    In the past few years, ICME-related research has been directed towards the study of multi-scale materials design. However, relatively little has been reported on model-based methods that are of relevance to industry for the realization of engineered materials, products, and associated industrial manufacturing processes. Computational models used in the realization of engineered materials and products are fraught with uncertainty, have different levels of fidelity, are incomplete and are even likely to be inaccurate. In light of this, we adopt a robust design strategy that facilitates the exploration of the solution space thereby providing decision support to a design engineer. In this paper, we describe a foundational construct embodied in our method for design exploration, namely, the compromise Decision Support Problem. We introduce a problem that we are using to establish the efficacy of our method. It involves the integrated design of steel and gears, traversing the chain of steel making, mill production, and evolution of the material during these processes, and linking this to the mechanical design and manufacture of the gear. We provide an overview of our method to determine the operating set points for the ladle, tundish and caster operations necessary to manufacture steel of a desired set of properties. Finally, we highlight the efficacy of our method.

  2. Process simulation and modeling for gas processing plant

    NASA Astrophysics Data System (ADS)

    Alhameli, Falah Obaid Kenish Mubarak

    Natural gas is one of the major energy sources and its demand is increasing rapidly due to its environmental and economic advantages over other fuels. Gas processing is an essential component of the natural gas system. In this work, a gas processing plant is introduced with the objective of meeting pipeline gas quality. It consists of separation, sweetening and dehydration units. The separation unit contains phase separators along with a stabilizer (a conventional distillation column). The sweetening unit is an amine process with MDEA (Methyl DiEthanol Amine) solvent. The dehydration unit is glycol absorption with TEG (TriEthyleneGlycol) solvent. ProMax™ 3.2 was used to simulate the plant. A Box-Behnken design was applied to build a black-box model using design of experiments (DoE). Minitab™ 15 was used to generate and analyse the design. Ten variables were chosen for the model, representing the gas feed conditions and unit parameters. The design comprised 170 runs in total, which were successfully implemented and analysed. Models for the total energy of the plant and the water content of the product gas were obtained. A case study was conducted to investigate the impact of an H2S composition increase in the feed gas. The models were used for the case study with the objective of total energy minimization and a constraint of 4 lb/MMscf on the water content of the product gas. Lingo 13 was used for the optimization. It was observed that the feed pressure had the highest influence among the parameters. Finally, some recommendations were pointed out for future work.
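
    A sketch of the final optimization step follows, assuming (as a stand-in for the thesis's black-box DoE models) invented quadratic surrogates in two scaled variables: minimize total energy subject to the 4 lb/MMscf water content constraint.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative quadratic surrogates in two scaled variables (feed pressure,
# glycol circulation); the coefficients are invented, not the thesis models.
def total_energy(x):
    p, g = x
    return 100.0 - 8.0 * p + 3.0 * g + 0.5 * p**2 + 0.2 * g**2

def water_content(x):  # lb H2O / MMscf in product gas (surrogate)
    p, g = x
    return 7.0 - 1.5 * p - 2.0 * g + 0.1 * p * g

res = minimize(
    total_energy,
    x0=np.array([1.0, 1.0]),
    method="SLSQP",
    bounds=[(0.0, 3.0), (0.0, 3.0)],
    # Pipeline spec: water content must not exceed 4 lb/MMscf.
    constraints=[{"type": "ineq", "fun": lambda x: 4.0 - water_content(x)}],
)
print(f"p={res.x[0]:.2f}, g={res.x[1]:.2f}, "
      f"energy={res.fun:.1f}, water={water_content(res.x):.2f} lb/MMscf")
```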

  3. Using Mechanistic Understanding of Streambank Processes and a Deterministic Bank-Stability Model to Design and Evaluate a Reach-Scale Restoration Project

    NASA Astrophysics Data System (ADS)

    Simon, Andrew; Derrick, David; Bankhead, Natasha

    2010-05-01

    Sediment is one of the leading contributors to water-quality impairment in the United States and streambank erosion has been found to be the dominant source of sediment in many disturbed watersheds. Goodwin Creek is a typical incised channel in northeastern Mississippi, USA (4.7 m-deep) that yields about an order of magnitude more suspended sediment than stable, "reference" streams in the region. Periodic channel surveys with dating of woody vegetation in an actively eroding meander disclosed a migration rate of 0.5 m/y. Because of continued land loss by mass failure of the streambanks, a restoration project was designed to stabilize the banks. Bank retreat occurs by interactions between hydraulic forces acting at the bed and bank toe and gravitational forces acting on in situ bank material. In fact, bank-toe protection which inhibits steepening of the bank, has been found to be one of the most effective means of stabilizing the upper part of the bank. To provide a stable alternative, analysis of the restored configuration needed to mechanistically address both hydraulic erosion and geotechnical stability. This was accomplished using the Bank-Stability and Toe-Erosion Model (BSTEM). The proposed design was limited to 1:1 bank slopes due to the proximity of a road and included longitudinal stone-toe protection and bendway weirs to counter basal erosion by hydraulic shear. Worst-case conditions under the proposed design were simulated by modeling (1) typical, annual high flows (3 m-deep) to evaluate the amount of bank-toe erosion that would occur, and (2) geotechnical stability where groundwater levels were high and flow had receded to low-flow conditions in the channel (drawdown case). Stone size was selected based on a 1-D hydraulic analysis such that the stone would not be mobilized at peak flows where average boundary shear stresses can reach 60-80 N/m². Calculations were made for a 3 m-deep flow at slopes between 0.002 and 0.003, resulting in recommended stone
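
    The geotechnical half of the analysis hinges on a factor of safety, the ratio of resisting to driving forces on the failure surface. The sketch below uses a textbook infinite-slope limit-equilibrium expression with illustrative soil parameters; it is a simplification, not BSTEM itself, which additionally treats layered banks, tension cracks, and toe erosion.

```python
import math

def factor_of_safety(c_kpa, phi_deg, gamma_kn_m3, depth_m, beta_deg, u_kpa=0.0):
    """Simplified infinite-slope factor of safety (resisting/driving forces).
    A textbook limit-equilibrium expression, not BSTEM. c: effective
    cohesion, phi: friction angle, gamma: soil unit weight, u: pore-water
    pressure on the failure plane, beta: bank slope angle."""
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    normal = gamma_kn_m3 * depth_m * math.cos(beta) ** 2 - u_kpa
    resisting = c_kpa + normal * math.tan(phi)
    driving = gamma_kn_m3 * depth_m * math.sin(beta) * math.cos(beta)
    return resisting / driving

# Drawdown-like case: elevated pore pressure on a 1:1 (45-degree) bank,
# 4.7 m deep; all parameter values are illustrative assumptions.
fs = factor_of_safety(c_kpa=5.0, phi_deg=30.0, gamma_kn_m3=18.0,
                      depth_m=4.7, beta_deg=45.0, u_kpa=20.0)
print(f"FS = {fs:.2f}  (FS < 1 indicates failure)")
```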

  4. Coupled process modeling and waste package performance

    SciTech Connect

    McGrail, B.P.; Engel, D.W.

    1992-11-01

    The interaction of borosilicate waste glasses with water has been studied extensively and reasonably good models are available that describe the reaction kinetics and solution chemical effects. Unfortunately, these models have not been utilized in performance assessment analyses, except in estimating radionuclide solubilities at the waste form surface. A geochemical model has been incorporated in the AREST code to examine the coupled processes of glass dissolution and transport within the engineered barrier system. Our calculations show that the typical assumptions used in performance assessment analyses, such as fixed solubilities or constant reaction rate at the waste form surface, do not always give conservative or realistic predictions of radionuclide release. Varying the transport properties of the waste package materials is shown to give counterintuitive effects on the release rates of some radionuclides. The use of noncoupled performance assessment models could lead a repository designer to an erroneous conclusion regarding the relative benefit of one waste package design or host rock setting over another.

  5. Planning: The Participatory Process Model.

    ERIC Educational Resources Information Center

    McDowell, Elizabeth V.

    The participatory planning process model developed by Peirce Junior College is described in this paper. First, the rationale for shifting from a traditional authoritarian style of institutional leadership to a participatory style which encourages a broader concern for the institution and lessens morale problems is offered. The development of a new…

  6. Integrating Surface Modeling into the Engineering Design Graphics Curriculum

    ERIC Educational Resources Information Center

    Hartman, Nathan W.

    2006-01-01

    It has been suggested there is a knowledge base that surrounds the use of 3D modeling within the engineering design process and correspondingly within engineering design graphics education. While solid modeling receives a great deal of attention and discussion relative to curriculum efforts, and rightly so, surface modeling is an equally viable 3D…

  7. Thermoplastic matrix composite processing model

    NASA Technical Reports Server (NTRS)

    Dara, P. H.; Loos, A. C.

    1985-01-01

    The effects of the processing parameters pressure, temperature, and time on the quality of continuous graphite fiber reinforced thermoplastic matrix composites were quantitatively assessed by defining the extent to which intimate contact and bond formation have occurred at successive ply interfaces. Two models are presented predicting the extents to which the ply interfaces have achieved intimate contact and cohesive strength. The models are based on experimental observation of compression molded laminates and neat resin conditions, respectively. The theory of autohesion (or self-diffusion) is identified as the mechanism explaining the phenomenon by which the plies bond to themselves. Theoretical predictions from reptation theory relating autohesive strength to contact time are used to explain the effects of the processing parameters on the observed experimental strengths. The application of a time-temperature relationship for autohesive strength predictions is evaluated. A viscoelastic compression molding model of a tow was developed to explain the phenomenon by which the prepreg ply interfaces develop intimate contact.
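
    Reptation theory predicts that autohesive bond strength grows as the fourth root of contact time up to the reptation time, which shifts with temperature. A minimal sketch follows; the Arrhenius activation energy and reference reptation time are assumed values, not those of the report.

```python
import math

R = 8.314                     # J/(mol K)
E_A = 9.0e4                   # activation energy, J/mol (assumed)
T_REF, TR_REF = 623.0, 30.0   # reference temperature (K) and reptation time (s)

def reptation_time(T):
    """Time-temperature shift of the reptation time (Arrhenius form, assumed)."""
    return TR_REF * math.exp(E_A / R * (1.0 / T - 1.0 / T_REF))

def degree_of_autohesion(t, T):
    """Bond strength fraction: (t / t_r)^(1/4), capped at full strength."""
    return min(1.0, (t / reptation_time(T)) ** 0.25)

for t in (1.0, 10.0, 30.0, 120.0):
    print(f"t={t:6.1f} s at 653 K -> D_a = {degree_of_autohesion(t, 653.0):.2f}")
```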

  8. Generic process design and control strategies used to develop a dynamic model and training software for an IGCC plant with CO2 sequestration

    SciTech Connect

    Provost, G.; Stone, H.; McClintock, M.; Erbes, M.; Zitney, S.; Turton, R.; Phillips, J.; Quintrell, M.; Marasigan, J.

    2008-01-01

    To meet the growing demand for education and experience with the analysis, operation, and control of commercial-scale Integrated Gasification Combined Cycle (IGCC) plants, the Department of Energy's (DOE) National Energy Technology Laboratory (NETL) is leading a collaborative R&D project with participants from government, academia, and industry. One of the goals of this project is to develop a generic, full-scope, real-time IGCC dynamic plant simulator for use in establishing a world-class research and training center, as well as to promote and demonstrate the technology to power industry personnel. The NETL IGCC dynamic plant simulator will combine for the first time a process/gasification simulator and a power/combined-cycle simulator in a single dynamic simulation framework for use in training applications as well as engineering studies. As envisioned, the simulator will have the following features and capabilities: a high-fidelity, real-time dynamic model of the process side (gasification and gas cleaning with CO2 capture) and the power-block side (combined cycle) of a generic IGCC plant fueled by coal and/or petroleum coke; full-scope training simulator capabilities including startup, shutdown, load following and shedding, response to fuel and ambient condition variations, control strategy analysis (turbine vs. gasifier lead, etc.), representative malfunctions/trips, alarms, scenarios, trending, snapshots, a data historian, and trainee performance monitoring; and the ability to enhance and modify the plant model to facilitate studies of changes in plant configuration and equipment and to support future R&D efforts. To support this effort, process descriptions and control strategies were developed for key sections of the plant as part of the detailed functional specification, which will form the basis of the simulator development. These plant sections include: slurry preparation, air separation unit, gasifiers, syngas scrubbers, shift reactors, gas cooling

  9. Human visual performance model for crewstation design

    NASA Astrophysics Data System (ADS)

    Larimer, James O.; Prevost, Michael P.; Arditi, Aries R.; Azueta, Steven; Bergen, James R.; Lubin, Jeffrey

    1991-08-01

    In a cockpit, the crewstation of an airplane, the ability of the pilot to unambiguously perceive rapidly changing information both internal and external to the crewstation is critical. To assess the impact of crewstation design decisions on the pilot's ability to perceive information, the designer needs a means of evaluating the trade-offs that result from different designs. The Visibility Modeling Tool (VMT) provides the designer with a CAD tool for assessing these trade-offs. It combines the technologies of computer graphics, computational geometry, human performance modeling and equipment modeling into a computer-based interactive design tool. Through a simple interactive interface, a designer can manipulate design parameters such as the geometry of the cockpit, environmental factors such as ambient lighting, pilot parameters such as point of regard and adaptation state, and equipment parameters such as the location of displays, their size and the contrast of displayed symbology. VMT provides an end-to-end analysis that answers questions such as 'Will the pilot be able to read the display?' Performance data can be projected, in the form of 3D contours, into the crewstation graphic model, providing the designer with a footprint of the operator's visual capabilities, defining, for example, the regions in which fonts of a particular type, size and contrast can be read without error. Geometrical data such as the pilot's volume field of view, occlusions caused by facial geometry, helmet margins, and objects in the crewstation can also be projected into the crewstation graphic model with respect to the coordinates of the aviator's eyes and fixation point. The intersections of the projections with objects in the crewstation delineate the area of coverage, masking, or occlusion associated with the objects. Objects in the crewstation space can be projected onto models of the operator's retinas. These projections can be used to provide the designer with the

  10. Welding process modelling and control

    NASA Technical Reports Server (NTRS)

    Romine, Peter L.; Adenwala, Jinen A.

    1993-01-01

    The research and analysis performed, software developed, and hardware/software recommendations made during 1992 in the development of the PC-based data acquisition system for support of Welding Process Modeling and Control are reported. A need was identified by the Metals Processing Branch of NASA Marshall Space Flight Center for a mobile data acquisition and analysis system, customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC-based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument, strictly for the use of data acquisition and analysis. Although the WMS supports many of the functions associated with process control, it is not the intention for this system to be used for welding process control.

  11. High Lifetime Solar Cell Processing and Design

    NASA Technical Reports Server (NTRS)

    Swanson, R. M.

    1985-01-01

    In order to maximize efficiency, a solar cell must: (1) absorb as much light as possible in electron-hole production, (2) transport as large a fraction as possible of the electrons to the n-type terminal and holes to the p-type terminal without their first recombining, and (3) produce as high as possible terminal voltage. Step (1) is largely fixed by the spectrum of sunlight and the fundamental absorption characteristics of silicon, although some improvements are possible through texturizing-induced light trapping and back surface reflectors. Steps (2) and (3) are, however, dependent on the recombination mechanisms of the cell. Recombination, in turn, is strongly influenced by cell processing and design. Some of the lessons learned during the development of the point-contact cell are discussed. Cell dependence on recombination, surface recombination, and contact recombination are discussed. Results show the overwhelming influence of contact recombination on the operation of the cell when the other sources of recombination are reduced by careful processing.

  12. Clutter suppression interferometry system design and processing

    NASA Astrophysics Data System (ADS)

    Knight, Chad; Deming, Ross; Gunther, Jake

    2015-05-01

    Clutter suppression interferometry (CSI) has received extensive attention due to its multi-modal capability to detect slow-moving targets, and concurrently form high-resolution synthetic aperture radar (SAR) images from the same data. The ability to continuously augment SAR images with geo-located ground moving target indicators (GMTI) provides valuable real-time situational awareness that is important for many applications. CSI can be accomplished with minimal hardware and processing resources. This makes CSI a natural candidate for applications where size, weight and power (SWaP) are constrained, such as unmanned aerial vehicles (UAVs) and small satellites. This paper will discuss the theory for optimal CSI system configuration, focusing on a sparse time-varying transmit and receive array manifold due to SWaP considerations. The underlying signal model will be presented and discussed, as well as the potential benefits that a sparse time-varying transmit/receive manifold provides. The high-level processing objectives will be detailed and examined on simulated data. Then actual SAR data collected with the Space Dynamics Laboratory (SDL) FlexSAR radar system will be analyzed. Contrasting the simulated data with actual SAR data helps illustrate the challenges and limitations found in practice vs. theory. A novel approach incorporating sparse signal processing is discussed that has the potential to reduce false-alarm rates and improve detections.
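
    One classical formulation of the clutter-cancellation idea behind CSI is displaced phase center antenna (DPCA) processing: two along-track phase centers sample the same spatial position one pulse apart, so subtraction nulls stationary clutter while a mover's Doppler phase progression survives. The sketch below is an idealized, synthetic illustration, not the FlexSAR processing chain.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 256
clutter = rng.normal(size=n + 1)   # stationary return vs along-track position
doppler = 0.05                     # mover's normalized Doppler frequency
mover = 0.5 * np.exp(1j * 2 * np.pi * doppler * np.arange(n + 1))

# Front channel at pulse k and rear channel at pulse k+1 occupy the same
# spatial position, so they see identical clutter but a time-shifted mover.
front = clutter[:n] + mover[:n]
rear = clutter[:n] + mover[1:n + 1]
d = front - rear                   # clutter cancels; mover residue remains

print(f"channel power before: {np.mean(np.abs(front) ** 2):.3f}")
print(f"residue power after : {np.mean(np.abs(d) ** 2):.3f} (mover survives)")
```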

  13. The Application of the NASA Advanced Concepts Office, Launch Vehicle Team Design Process and Tools for Modeling Small Responsive Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Threet, Grady E.; Waters, Eric D.; Creech, Dennis M.

    2012-01-01

    The Advanced Concepts Office (ACO) Launch Vehicle Team at the NASA Marshall Space Flight Center (MSFC) is recognized throughout NASA for launch vehicle conceptual definition and pre-phase A concept design evaluation. The Launch Vehicle Team has been instrumental in defining the vehicle trade space for many of NASA's high-level launch system studies, from the Exploration Systems Architecture Study (ESAS) through the Augustine Report, Constellation, and now the Space Launch System (SLS). The Launch Vehicle Team's approach to rapid turn-around and comparative analysis of multiple launch vehicle architectures has played a large role in narrowing the design options for future vehicle development. Recently the Launch Vehicle Team has been developing versions of their vetted tools used on large launch vehicles and repackaging the process and capability to apply to smaller, more responsive launch vehicles. Along this development path the LV Team has evaluated trajectory tools and assumptions against sounding rocket trajectories and air launch systems, begun altering subsystem mass estimating relationships to handle smaller vehicle components, and, as an additional development driver, begun an in-house small launch vehicle study. With the recent interest in small responsive launch systems and the known capability and response time of the ACO LV Team, ACO's launch vehicle assessment capability can be utilized to rapidly evaluate the vast and opportune trade space that small launch vehicles currently encompass. This would provide a great benefit to the customer by reducing that large trade space to a select few alternatives that best fit the customer's payload needs.

  14. Conceptual design of clean processes: Tools and methods

    SciTech Connect

    Hurme, M.

    1996-12-31

    Design tools available for implementing clean design into practice are discussed. The application areas, together with the methods of comparison of clean process alternatives, are presented. Environmental principles are becoming increasingly important in the whole life cycle of products, from design, manufacturing and marketing to disposal. The hindrance to implementing clean technology in design has been the necessity of applying it in all phases of design, starting from the beginning, since it deals with the major selections made in conceptual process design. Therefore both a modified design approach and new tools are needed for process design to make the application of clean technology practical. The first item, extended process design methodologies, has been presented by Hurme, Douglas, Rossiter and Klee, and Hilaly and Sikdar. The aim of this paper is to discuss the latter topic: the process design tools which assist in implementing clean principles into process design. 22 refs., 2 tabs.

  15. Mechanical Design Support System Based on Thinking Process Development Diagram

    NASA Astrophysics Data System (ADS)

    Mase, Hisao; Kinukawa, Hiroshi; Morii, Hiroshi; Nakao, Masayuki; Hatamura, Yotaro

    This paper describes a system that directly supports a design process in a mechanical domain. This system is based on a thinking process development diagram that draws distinctions between requirements, tasks, solutions, and implementation, which enables designers to expand and deepen their design thinking. The system provides five main functions that designers require in each phase of the proposed design process: (1) thinking process description support, which enables designers to describe their thoughts; (2) creativity support by term association with thesauri; (3) timely display of design knowledge, including know-how obtained through earlier failures, general design theories, standard-parts data, and past designs; (4) design problem solving support using 46 kinds of thinking operations; and (5) proper technology transfer support, which accumulates not only design conclusions but also the design process. Though this system is applied to mechanical engineering as the first target domain, it can easily be extended to many other domains such as architecture and electrical engineering.

  16. Computational Modeling in Structural Materials Processing

    NASA Technical Reports Server (NTRS)

    Meyyappan, Meyya; Arnold, James O. (Technical Monitor)

    1997-01-01

    High temperature materials such as silicon carbide, a variety of nitrides, and ceramic matrix composites find use in the aerospace, automotive, and machine tool industries and in high speed civil transport applications. Chemical vapor deposition (CVD) is widely used in processing such structural materials. Variations of CVD include deposition on substrates, coating of fibers, inside cavities and on complex objects, and infiltration within preforms, called chemical vapor infiltration (CVI). Our current knowledge of the process mechanisms, ability to optimize processes, and scale-up for large scale manufacturing is limited. In this regard, computational modeling of the processes is valuable, since a validated model can be used as a design tool. The effort is similar to traditional chemically reacting flow modeling, with emphasis on multicomponent diffusion, thermal diffusion, large sets of homogeneous reactions, and surface chemistry. In the case of CVI, models for pore infiltration are needed. In the present talk, examples of SiC, nitride, and boron deposition from the author's past work will be used to illustrate the utility of computational process modeling.

  17. Universal Design: Process, Principles, and Applications

    ERIC Educational Resources Information Center

    Burgstahler, Sheryl

    2009-01-01

    Designing any product or environment involves the consideration of many factors, including aesthetics, engineering options, environmental issues, safety concerns, industry standards, and cost. Typically, designers focus their attention on the average user. In contrast, universal design (UD), according to the Center for Universal Design, is…

  18. A symbolic methodology to improve disassembly process design.

    PubMed

    Rios, Pedro; Blyler, Leslie; Tieman, Lisa; Stuart, Julie Ann; Grant, Ed

    2003-12-01

    Millions of end-of-life electronic components are retired annually due to the proliferation of new models and their rapid obsolescence. The recovery of resources such as plastics from these goods requires their disassembly. The time required for each disassembly and its associated cost is defined by the operator's familiarity with the product design and its complexity. Since model proliferation serves to complicate an operator's learning curve, it is worthwhile to investigate the benefits to be gained in a disassembly operator's preplanning process. Effective disassembly process design demands the application of green engineering principles, such as those developed by Anastas and Zimmerman (Environ. Sci. Technol. 2003, 37, 94A-101A), which include regard for product complexity, structural commonality, separation energy, material value, and waste prevention. This paper introduces the concept of design symbols to help the operator more efficiently survey product complexity with respect to the location and number of fasteners to remove a structure that is common to all electronics: the housing. With a sample of 71 different computers, printers, and monitors, we demonstrate that appropriate symbols reduce the total disassembly planning time by 13.2 min. Such an improvement could well make efficient the separation of plastic that would otherwise be destined for waste-to-energy or landfill. The symbolic methodology presented may also improve Design for Recycling and Design for Maintenance and Support.

  19. Animal models and conserved processes

    PubMed Central

    2012-01-01

    Background The concept of conserved processes presents unique opportunities for using nonhuman animal models in biomedical research. However, the concept must be examined in the context that humans and nonhuman animals are evolved, complex, adaptive systems. Given that nonhuman animals are examples of living systems that are differently complex from humans, what does the existence of a conserved gene or process imply for inter-species extrapolation? Methods We surveyed the literature including philosophy of science, biological complexity, conserved processes, evolutionary biology, comparative medicine, anti-neoplastic agents, inhalational anesthetics, and drug development journals in order to determine the value of nonhuman animal models when studying conserved processes. Results Evolution through natural selection has employed components and processes both to produce the same outcomes among species but also to generate different functions and traits. Many genes and processes are conserved, but new combinations of these processes or different regulation of the genes involved in these processes have resulted in unique organisms. Further, there is a hierarchy of organization in complex living systems. At some levels, the components are simple systems that can be analyzed by mathematics or the physical sciences, while at other levels the system cannot be fully analyzed by reducing it to a physical system. The study of complex living systems must alternate between focusing on the parts and examining the intact whole organism while taking into account the connections between the two. Systems biology aims for this holism. We examined the actions of inhalational anesthetic agents and anti-neoplastic agents in order to address what the characteristics of complex living systems imply for inter-species extrapolation of traits and responses related to conserved processes. Conclusion We conclude that even the presence of conserved processes is insufficient for inter

  20. Model for amorphous aggregation processes

    NASA Astrophysics Data System (ADS)

    Stranks, Samuel D.; Ecroyd, Heath; van Sluyter, Steven; Waters, Elizabeth J.; Carver, John A.; von Smekal, Lorenz

    2009-11-01

    The amorphous aggregation of proteins is associated with many phenomena, ranging from the formation of protein wine haze to the development of cataract in the eye lens and the precipitation of recombinant proteins during their expression and purification. While much literature exists describing models for linear protein aggregation, such as amyloid fibril formation, there are few reports of models which address amorphous aggregation. Here, we propose a model to describe the amorphous aggregation of proteins which is also more widely applicable to other situations where a similar process occurs, such as in the formation of colloids and nanoclusters. As first applications of the model, we have tested it against experimental turbidimetry data of three proteins relevant to the wine industry and biochemistry, namely thaumatin, a thaumatin-like protein, and α-lactalbumin. The model is very robust and describes the experimental amorphous aggregation data to a high degree of accuracy. Details about the aggregation process, such as shape parameters of the aggregates and rate constants, can also be extracted.
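
    The published model itself is not reproduced here; as a generic illustration of extracting kinetic parameters from turbidimetry data, the sketch below fits a sigmoidal turbidity curve to synthetic measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def turbidity(t, a_max, k, t_half):
    """Generic sigmoidal turbidity curve (not the published model):
    plateau a_max, apparent rate k, midpoint time t_half."""
    return a_max / (1.0 + np.exp(-k * (t - t_half)))

# Synthetic "measurements" standing in for experimental turbidimetry data.
rng = np.random.default_rng(2)
t = np.linspace(0, 60, 40)  # minutes
data = turbidity(t, 1.2, 0.25, 20.0) + rng.normal(0, 0.02, t.size)

popt, pcov = curve_fit(turbidity, t, data, p0=(1.0, 0.1, 15.0))
a_max, k, t_half = popt
print(f"plateau={a_max:.2f} AU, rate={k:.3f} /min, midpoint={t_half:.1f} min")
```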

  1. Design issues for population growth models

    PubMed Central

    López Fidalgo, J.; Ortiz Rodríguez, I.M.

    2010-01-01

    We briefly review and discuss design issues for population growth and decline models. We then use a flexible growth and decline model as an illustrative example and apply optimal design theory to find optimal sampling times for estimating model parameters, specific parameters and interesting functions of the model parameters for the model with two real applications. Robustness properties of the optimal designs are investigated when nominal values or the model is mis-specified, and also under a different optimality criterion. To facilitate use of optimal design ideas in practice, we also introduce a website for generating a variety of optimal designs for popular models from different disciplines. PMID:21647244
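
    As a small illustration of the optimal design ideas discussed, the sketch below finds a D-optimal two-point sampling design for an exponential growth model by maximizing the determinant of the Fisher information built from parameter sensitivities; the nominal parameter values and time grid are assumed for illustration.

```python
import itertools
import numpy as np

# D-optimal sampling times for y = a * exp(b * t) with i.i.d. Gaussian error:
# maximize det of the Fisher information built from parameter sensitivities.
a0, b0 = 2.0, 0.3                  # nominal (assumed) parameter values
grid = np.linspace(0.5, 10.0, 20)  # candidate sampling times

def information(times):
    # Sensitivities dy/da and dy/db evaluated at the nominal parameters.
    J = np.column_stack([np.exp(b0 * times), a0 * times * np.exp(b0 * times)])
    return J.T @ J

best = max(
    itertools.combinations(grid, 2),
    key=lambda ts: np.linalg.det(information(np.array(ts))),
)
print(f"D-optimal 2-point design: t = {best[0]:.2f}, {best[1]:.2f}")
```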

  2. Kinetic and Modeling Investigation to Provide Design Guidelines for the NREL Dilute-Acid Process Aimed at Total Hydrolysis/Fractionation of Lignocellulosic Biomass: July 1998

    SciTech Connect

    Lee, Y. Y.; Iyer, P.; Xiang, Q.; Hayes, J.

    2004-08-01

    Following up on previous work, subcontractor investigated three aspects of using NREL "pretreatment" technology for total hydrolysis (cellulose as well as hemicellulose) of biomass. Whereas historic hydrolysis of biomass used either dilute acid or concentrated acid technology for hydrolysis of both hemicellulose and cellulose, NREL has been pursuing very dilute acid hydrolysis of hemicellulose followed by enzymatic hydrolysis of cellulose. NREL's countercurrent shrinking-bed reactor design for hemicellulose hydrolysis (pretreatment) has, however, shown promise for total hydrolysis. For the first task, subcontractor developed a mathematical model of countercurrent shrinking-bed reactor operation and, using yellow poplar sawdust as a feedstock, analyzed the effects of initial solid feeding rate, temperature, acid concentration, acid flow rate, Peclet number (a measure of backmixing in liquid flow), and bed shrinking. For the second task, subcontractor used laboratory trials, with yellow poplar sawdust and 0.07 wt% sulfuric acid at various temperatures, to verify the hydrolysis of cellulose to glucose (desired) and the decomposition of glucose (undesired) and to determine appropriate parameters for use in kinetic models. Unlike cellulose and hemicellulose, lignins, the third major component of biomass, are not carbohydrates that can be broken down into component sugars. They are, however, complex, amorphous, aromatic phenolic polymers that can likely be converted into low-molecular-weight compounds suitable for production of fuels and chemicals. Oxidative degradation is one pathway for such conversion, and hydrogen peroxide would be an attractive reagent for this, as it would leave no residuals. For the third task, subcontractor reacted lignin with hydrogen peroxide under various conditions and analyzed the resulting product mix.
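
    The cellulose-hydrolysis kinetics of the second task are classically described by series first-order (Saeman-type) reactions, for which the glucose profile and the time of maximum yield have closed forms. The rate constants below are placeholders, not the subcontractor's fitted values.

```python
import math

# Series first-order (Saeman-type) kinetics for dilute-acid hydrolysis:
# cellulose --k1--> glucose --k2--> decomposition products.
def glucose(t, c0, k1, k2):
    """Analytic glucose concentration for the series reaction."""
    return c0 * k1 / (k2 - k1) * (math.exp(-k1 * t) - math.exp(-k2 * t))

def optimal_time(k1, k2):
    """Time of maximum glucose yield, where dG/dt = 0."""
    return math.log(k2 / k1) / (k2 - k1)

k1, k2, c0 = 0.08, 0.03, 100.0  # 1/min, 1/min, g/L (illustrative placeholders)
t_star = optimal_time(k1, k2)
print(f"max glucose {glucose(t_star, c0, k1, k2):.1f} g/L at t = {t_star:.1f} min")
```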

  3. Process modeling - Its history, current status, and future

    NASA Astrophysics Data System (ADS)

    Duttweiler, Russell E.; Griffith, Walter M.; Jain, Sulekh C.

    1991-04-01

    The development of process modeling is reviewed to examine the potential of process applications to prevent and solve problems in the aerospace industry. The business and global environments are assessed, and the traditional approach to product/process design is argued to be obsolete. A revised engineering process is described which involves planning and prediction before production by means of process simulation. Process simulation can permit simultaneous engineering of unit processes and complex processes, and examples are given in the cross-coupling of forging-process variance. The implementation of process modeling, CAE, and computer simulations is found to reduce the costs and time associated with technological development when incorporated judiciously.

  4. A Generic Modeling Process to Support Functional Fault Model Development

    NASA Technical Reports Server (NTRS)

    Maul, William A.; Hemminger, Joseph A.; Oostdyk, Rebecca; Bis, Rachael A.

    2016-01-01

    Functional fault models (FFMs) are qualitative representations of a system's failure space that are used to provide a diagnostic of the modeled system. An FFM simulates the failure effect propagation paths within a system between failure modes and observation points. These models contain a significant amount of information about the system, including the design, operation and off-nominal behavior. The development and verification of the models can be costly in both time and resources. In addition, models depicting similar components can be distinct, both in appearance and function, when created individually, because there are numerous ways of representing the failure space within each component. Generic application of FFMs has the advantages of software code reuse: reduction of time and resources in both development and verification, and a standard set of component models from which future system models can be generated with common appearance and diagnostic performance. This paper outlines the motivation to develop a generic modeling process for FFMs at the component level and the effort to implement that process through modeling conventions and a software tool. The implementation of this generic modeling process within a fault isolation demonstration for NASA's Advanced Ground System Maintenance (AGSM) Integrated Health Management (IHM) project is presented and the impact discussed.
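
    A minimal sketch of the qualitative propagation an FFM encodes follows: failure effects travel along directed edges from failure modes toward observation points, so a breadth-first traversal yields the observations affected by a given failure mode. The graph below is a made-up example, not a NASA model.

```python
from collections import deque

# Directed failure-effect propagation graph: failure modes and intermediate
# effects point toward downstream effects and observation points (sensors).
edges = {
    "valve_stuck": ["low_flow"],
    "pump_degraded": ["low_flow", "high_current"],
    "low_flow": ["low_pressure_sensor"],
    "high_current": ["current_sensor"],
}

def affected_observations(failure_mode, graph):
    """Breadth-first propagation from one failure mode to all reachable nodes."""
    seen, queue = set(), deque([failure_mode])
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return sorted(n for n in seen if n.endswith("_sensor"))

print(affected_observations("pump_degraded", edges))
# -> ['current_sensor', 'low_pressure_sensor']
```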

  5. Sensory processing and world modeling for an active ranging device

    NASA Technical Reports Server (NTRS)

    Hong, Tsai-Hong; Wu, Angela Y.

    1991-01-01

    In this project, we studied world modeling and sensory processing for laser range data. World model data representation and operations were defined. Sensory processing algorithms for point processing and linear feature detection were designed and implemented. The interface between world modeling and sensory processing at the Servo and Primitive levels was investigated and implemented. At the Primitive level, linear feature detectors for edges were also implemented, analyzed, and compared. Existing world model representations are surveyed. Also presented is the design and implementation of the Y-frame model, a hierarchical world model. The interfaces between the world model module and the sensory processing module are discussed, as well as the linear feature detectors that were designed and implemented.

  6. Using Instructional Design Process to Improve Design and Development of Internet Interventions

    PubMed Central

    Hilgart, Michelle M; Thorndike, Frances P; Kinzie, Mable B

    2012-01-01

    Given the wide reach and extensive capabilities of the Internet, it is increasingly being used to deliver comprehensive behavioral and mental health intervention and prevention programs. Their goals are to change user behavior, reduce unwanted complications or symptoms, and improve health status and health-related quality of life. Internet interventions have been found efficacious in addressing a wide range of behavioral and mental health problems, including insomnia, nicotine dependence, obesity, diabetes, depression, and anxiety. Despite the existence of many Internet-based interventions, there is little research to inform their design and development. A model for behavior change in Internet interventions has been published to help guide future Internet intervention development and to help predict and explain behavior changes and symptom improvement outcomes through the use of Internet interventions. An argument is made for grounding the development of Internet interventions within a scientific framework. To that end, the model highlights a multitude of design-related components, areas, and elements, including user characteristics, environment, intervention content, level of intervention support, and targeted outcomes. However, more discussion is needed regarding how the design of the program should be developed to address these issues. While there is little research on the design and development of Internet interventions, there is a rich, related literature in the field of instructional design (ID) that can be used to inform Internet intervention development. ID models are prescriptive models that describe a set of activities involved in the planning, implementation, and evaluation of instructional programs. Using ID process models has been shown to increase the effectiveness of learning programs in a broad range of contexts. ID models specify a systematic method for assessing the needs of learners (intervention users) to determine the gaps between current

  7. POLLUTION PREVENTION IN THE EARLY STAGES OF HIERARCHICAL PROCESS DESIGN

    EPA Science Inventory

    Hierarchical methods are often used in the conceptual stages of process design to synthesize and evaluate process alternatives. In this work, the methods of hierarchical process design will be focused on environmental aspects. In particular, the design methods will be coupled to ...

  8. Developing a 3D Game Design Authoring Package to Assist Students' Visualization Process in Design Thinking

    ERIC Educational Resources Information Center

    Kuo, Ming-Shiou; Chuang, Tsung-Yen

    2013-01-01

    The teaching of 3D digital game design requires the development of students' meta-skills, from story creativity to 3D model construction, and even the visualization process in design thinking. The characteristics a good game designer should possess have been identified as including the ability to redesign things, creative thinking, and the ability to…

  9. Model reduction in integrated controls-structures design

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.

    1993-01-01

    It is the objective of this paper to present a model reduction technique developed for the integrated controls-structures design of flexible structures. Integrated controls-structures design problems are typically posed as nonlinear mathematical programming problems, where the design variables consist of both structural and control parameters. In the solution process, both structural and control design variables are constantly changing; therefore, the dynamic characteristics of the structure are also changing. This presents a problem in obtaining a reduced-order model for active control design and analysis which will be valid for all design points within the design space. In other words, the frequency and number of the significant modes of the structure (modes that should be included) may vary considerably throughout the design process. This is also true as the locations and/or masses of the sensors and actuators change. Moreover, since the number of design evaluations in the integrated design process could easily run into thousands, any feasible order-reduction method should not require model reduction analysis at every design iteration. In this paper a novel and efficient technique for model reduction in the integrated controls-structures design process, which addresses these issues, is presented.

  10. Model based control of polymer composite manufacturing processes

    NASA Astrophysics Data System (ADS)

    Potaraju, Sairam

    2000-10-01

    The objective of this research is to develop tools that help process engineers design, analyze and control polymeric composite manufacturing processes to achieve higher productivity and cost reduction. Current techniques for process design and control of composite manufacturing suffer from the paucity of good process models that can accurately represent these non-linear systems. Existing models developed by researchers in the past are designed to be process- and operation-specific, hence generating new simulation models is time consuming and requires significant effort. To address this issue, an Object Oriented Design (OOD) approach is used to develop a component-based model building framework. Process models for two commonly used industrial processes (Injected Pultrusion and Autoclave Curing) are developed using this framework to demonstrate its flexibility. Steady state and dynamic validation of this simulator is performed using a bench scale injected pultrusion process. This simulator could not be implemented online for control due to computational constraints. Models that are fast enough for online implementation, with nearly the same degree of accuracy, are developed using a two-tier scheme. First, lower dimensional models that capture essential resin flow, heat transfer and cure kinetics important from a process monitoring and control standpoint are formulated. The second step is to reduce these low dimensional models to Reduced Order Models (ROM) suited for online model based estimation, control and optimization. Model reduction is carried out using the Proper Orthogonal Decomposition (POD) technique in conjunction with a Galerkin formulation procedure. Subsequently, a nonlinear model-based estimation and inferential control scheme based on the ROM is implemented. In particular, this research work contributes in the following general areas: (1) Design and implementation of versatile frameworks for modeling and simulation of manufacturing processes using object
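    The POD-plus-Galerkin route described above follows a standard recipe: collect snapshots from the full-order simulator, extract dominant modes via the SVD, and project the dynamics onto them. A minimal sketch of the POD step under those assumptions, with toy snapshot data standing in for output of the simulator described in the abstract:

```python
import numpy as np

def pod_basis(snapshots, energy=0.999):
    """POD basis from a snapshot matrix (columns = saved solution states)."""
    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    cum = np.cumsum(s**2) / np.sum(s**2)       # captured 'energy' fraction
    r = int(np.searchsorted(cum, energy)) + 1  # smallest sufficient rank
    return U[:, :r]

# Toy snapshots: a decaying two-mode field on a 1D grid.
x = np.linspace(0.0, 1.0, 200)
t = np.linspace(0.0, 1.0, 50)
snaps = np.array([np.exp(-ti) * np.sin(np.pi * x)
                  + 0.1 * np.exp(-4.0 * ti) * np.sin(3.0 * np.pi * x)
                  for ti in t]).T

Phi = pod_basis(snaps)        # here two modes capture >99.9% of the energy
a = Phi.T @ snaps             # reduced coordinates for a Galerkin ROM
print(Phi.shape, a.shape)
```

    A Galerkin ROM then evolves only the reduced coordinates by projecting the governing equations onto the span of `Phi`.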

  11. Modeling climate related feedback processes

    SciTech Connect

    Elzen, M.G.J. den; Rotmans, J. )

    1993-11-01

    In order to assess their impact, the feedbacks which at present can be quantified reasonably are built into the Integrated Model to Assess the Greenhouse Effect (IMAGE). Unlike previous studies, this study describes the scenario- and time-dependent role of biogeochemical feedbacks. A number of simulation experiments are performed with IMAGE to project climate changes. Besides estimates of their absolute importance, the relative importance of individual biogeochemical feedbacks is considered by calculating the gain for each feedback process. This study focuses on feedback processes in the carbon cycle and the methane (semi-)cycle. Modeled feedbacks are then used to balance the past and present carbon budget. This results in substantially lower projections for atmospheric carbon dioxide than the Intergovernmental Panel on Climate Change (IPCC) estimates. The difference is approximately 18% from the 1990 level for the IPCC "Business-as-Usual" scenario. Furthermore, the IPCC's "best guess" value of the CO2 concentration in the year 2100 falls outside the uncertainty range estimated with our balanced modeling approach. For the IPCC "Business-as-Usual" scenario, the calculated total gain of the feedbacks within the carbon cycle appears to be negative, a result of the dominant role of the fertilization feedback. This study also shows that if temperature feedbacks on methane emissions from wetlands, rice paddies, and hydrates do materialize, methane concentrations might be increased by 30% by 2100. 70 refs., 17 figs., 7 tabs.
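    The gain bookkeeping behind such statements has a compact standard form in the climate-feedback literature (quoted here for orientation; the exact IMAGE formulation may differ). With a no-feedback response ΔT₀ and individual feedback gains gᵢ,

```latex
\Delta T = \frac{\Delta T_0}{1 - g}, \qquad g = \sum_i g_i .
```

    A negative total gain, as reported above for the carbon cycle, thus damps the response rather than amplifying it.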

  12. Development of an Integrated Process, Modeling and Simulation Platform for Performance-Based Design of Low-Energy and High IEQ Buildings

    ERIC Educational Resources Information Center

    Chen, Yixing

    2013-01-01

    The objective of this study was to develop a "Virtual Design Studio (VDS)": a software platform for integrated, coordinated and optimized design of green building systems with low energy consumption, high indoor environmental quality (IEQ), and high level of sustainability. The VDS is intended to assist collaborating architects,…

  13. Modeling Stem Cell Induction Processes

    PubMed Central

    Grácio, Filipe; Cabral, Joaquim; Tidor, Bruce

    2013-01-01

    Technology for converting human cells to pluripotent stem cells using induction processes has the potential to revolutionize regenerative medicine. However, the production of these so-called iPS cells is still quite inefficient and may be dominated by stochastic effects. In this work we build mass-action models of the core regulatory elements controlling stem cell induction and maintenance. The models include not only the network of transcription factors NANOG, OCT4, and SOX2, but also important epigenetic regulatory features of DNA methylation and histone modification. We show that the network topology reported in the literature is consistent with the observed experimental behavior of bistability and inducibility. Based on simulations of stem cell generation protocols, and in particular focusing on changes in epigenetic cellular states, we show that cooperative and independent reaction mechanisms have experimentally identifiable differences in the dynamics of reprogramming, and we analyze such differences and their biological basis. It had been argued that stochastic and elite models of stem cell generation represent distinct fundamental mechanisms. Work presented here suggests an alternative possibility: that they represent differences in the amount of information we have about the distribution of cellular states before and during reprogramming protocols. We show further that unpredictability and variation in reprogramming decreases as the cell progresses along the induction process, and that identifiable groups of cells with elite-seeming behavior can come about by a stochastic process. Finally we show how different mechanisms and kinetic properties impact the prospects of improving the efficiency of iPS cell generation protocols. PMID:23667423
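    To make the bistability claim concrete, the sketch below integrates a minimal, hypothetical pair of mutually activating factors with Hill kinetics; it is a toy stand-in for the NANOG/OCT4/SOX2 core, not the authors' network or parameters:

```python
import numpy as np
from scipy.integrate import solve_ivp

def core(t, y, b=0.05, k=2.0, n=4, d=1.0):
    """Toy mutual activation: each factor promotes the other's production."""
    a, c = y
    da = b + k * c**n / (1.0 + c**n) - d * a
    dc = b + k * a**n / (1.0 + a**n) - d * c
    return [da, dc]

# Low and high initial conditions relax to different fixed points: bistability.
for y0 in ([0.02, 0.02], [1.5, 1.5]):
    sol = solve_ivp(core, (0.0, 60.0), y0, rtol=1e-8)
    print(y0, "->", np.round(sol.y[:, -1], 3))
```

    Transiently forcing the low state across the unstable threshold (e.g., by simulated overexpression) flips it to the high state, which is the inducibility property the abstract refers to.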

  14. Multimedia Learning Design Pedagogy: A Hybrid Learning Model

    ERIC Educational Resources Information Center

    Tsoi, Mun Fie; Goh, Ngoh Khang; Chia, Lian Sai

    2005-01-01

    This paper provides insights on a hybrid learning model for multimedia learning design conceptualized from the Piagetian science learning cycle model and the Kolb's experiential learning model. This model represents learning as a cognitive process in a cycle of four phases, namely, Translating, Sculpting, Operationalizing, and Integrating and is…

  15. Model-Based Design of Biochemical Microreactors

    PubMed Central

    Elbinger, Tobias; Gahn, Markus; Neuss-Radu, Maria; Hante, Falk M.; Voll, Lars M.; Leugering, Günter; Knabner, Peter

    2016-01-01

    Mathematical modeling of biochemical pathways is an important resource in Synthetic Biology, as the predictive power of simulating synthetic pathways represents an important step in the design of synthetic metabolons. In this paper, we are concerned with the mathematical modeling, simulation, and optimization of metabolic processes in biochemical microreactors able to carry out enzymatic reactions and to exchange metabolites with their surrounding medium. The results of the reported modeling approach are incorporated in the design of the first microreactor prototypes that are under construction. These microreactors consist of compartments separated by membranes carrying specific transporters for the input of substrates and export of products. Inside the compartments of the reactor multienzyme complexes assembled on nano-beads by peptide adapters are used to carry out metabolic reactions. The spatially resolved mathematical model describing the ongoing processes consists of a system of diffusion equations together with boundary and initial conditions. The boundary conditions model the exchange of metabolites with the neighboring compartments and the reactions at the surface of the nano-beads carrying the multienzyme complexes. Efficient and accurate approaches for numerical simulation of the mathematical model and for optimal design of the microreactor are developed. As a proof-of-concept scenario, a synthetic pathway for the conversion of sucrose to glucose-6-phosphate (G6P) was chosen. In this context, the mathematical model is employed to compute the spatio-temporal distributions of the metabolite concentrations, as well as application relevant quantities like the outflow rate of G6P. These computations are performed for different scenarios, where the number of beads as well as their loading capacity are varied. The computed metabolite distributions show spatial patterns, which differ for different experimental arrangements. Furthermore, the total output of G6P

  16. Model-Based Design of Biochemical Microreactors.

    PubMed

    Elbinger, Tobias; Gahn, Markus; Neuss-Radu, Maria; Hante, Falk M; Voll, Lars M; Leugering, Günter; Knabner, Peter

    2016-01-01

    Mathematical modeling of biochemical pathways is an important resource in Synthetic Biology, as the predictive power of simulating synthetic pathways represents an important step in the design of synthetic metabolons. In this paper, we are concerned with the mathematical modeling, simulation, and optimization of metabolic processes in biochemical microreactors able to carry out enzymatic reactions and to exchange metabolites with their surrounding medium. The results of the reported modeling approach are incorporated in the design of the first microreactor prototypes that are under construction. These microreactors consist of compartments separated by membranes carrying specific transporters for the input of substrates and export of products. Inside the compartments of the reactor multienzyme complexes assembled on nano-beads by peptide adapters are used to carry out metabolic reactions. The spatially resolved mathematical model describing the ongoing processes consists of a system of diffusion equations together with boundary and initial conditions. The boundary conditions model the exchange of metabolites with the neighboring compartments and the reactions at the surface of the nano-beads carrying the multienzyme complexes. Efficient and accurate approaches for numerical simulation of the mathematical model and for optimal design of the microreactor are developed. As a proof-of-concept scenario, a synthetic pathway for the conversion of sucrose to glucose-6-phosphate (G6P) was chosen. In this context, the mathematical model is employed to compute the spatio-temporal distributions of the metabolite concentrations, as well as application relevant quantities like the outflow rate of G6P. These computations are performed for different scenarios, where the number of beads as well as their loading capacity are varied. The computed metabolite distributions show spatial patterns, which differ for different experimental arrangements. Furthermore, the total output of G6P
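    Both records describe the same structure: bulk diffusion with the reactions folded into the boundary conditions. A one-dimensional explicit finite-difference sketch of that structure, with invented geometry and kinetics standing in for the paper's microreactor and enzyme parameters:

```python
import numpy as np

# Substrate enters at x=0 (fixed concentration) and is consumed at x=L by
# a Michaelis-Menten surface reaction (a stand-in for the bead-bound enzymes).
L, N, D = 1e-3, 100, 1e-9      # length [m], grid points, diffusivity [m^2/s]
Vmax, Km = 1e-6, 0.1           # assumed kinetics [mol/m^2/s], [mol/m^3]
dx = L / (N - 1)
dt = 0.4 * dx**2 / D           # explicit stability limit
c = np.zeros(N)
c[0] = 1.0                     # supply concentration [mol/m^3]

for _ in range(20000):
    c[1:-1] += dt * D * (c[2:] - 2.0 * c[1:-1] + c[:-2]) / dx**2
    # Robin-type condition at x=L: diffusive flux balances enzymatic uptake.
    uptake = Vmax * c[-1] / (Km + c[-1])
    c[-1] = c[-2] - uptake * dx / D

print(c[::20].round(3))        # quasi-steady concentration profile
```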

  17. Hynol Process Engineering: Process Configuration, Site Plan, and Equipment Design

    DTIC Science & Technology

    1996-02-01

    wood, and natural gas is used as a co-feedstock. Compared with other methanol production processes, direct emissions of carbon dioxide (CO2) can be substantially reduced by using the...gas provides for reduced CO2 emissions per unit of fossil fuel carbon processed compared with separate natural gas and biomass processes. In accordance

  18. Lunar fiberglass: Properties and process design

    NASA Technical Reports Server (NTRS)

    Dalton, Robert; Nichols, Todd

    1987-01-01

    A Clemson University ceramic engineering design for a lunar fiberglass plant is presented. The properties of glass fibers and metal-matrix composites are examined. Lunar geology is also discussed. A raw material and site are selected based on this information. A detailed plant design is presented, and summer experiments to be carried out at Johnson Space Center are reviewed.

  19. Making designer mutants in model organisms.

    PubMed

    Peng, Ying; Clark, Karl J; Campbell, Jarryd M; Panetta, Magdalena R; Guo, Yi; Ekker, Stephen C

    2014-11-01

    Recent advances in the targeted modification of complex eukaryotic genomes have unlocked a new era of genome engineering. From the pioneering work using zinc-finger nucleases (ZFNs), to the advent of the versatile and specific TALEN systems, and most recently the highly accessible CRISPR/Cas9 systems, we now possess an unprecedented ability to analyze developmental processes using sophisticated designer genetic tools. In this Review, we summarize the common approaches and applications of these still-evolving tools as they are being used in the most popular model developmental systems. Excitingly, these robust and simple genomic engineering tools also promise to revolutionize developmental studies using less well established experimental organisms.

  20. The PIC [Process Individualization Curriculum] Model: Structure with Humanistic Goals.

    ERIC Educational Resources Information Center

    Gow, Doris T.

    This paper describes a curriculum design model to train research and development personnel under USOE-NIE funding. This design model, called PIC (Process Individualization Curriculum), was chosen for converting on-campus courses to extra-mural self-instructional courses. The curriculum specialists who work with professors to individualize their…

  1. NUMERICAL MODELING OF FINE SEDIMENT PHYSICAL PROCESSES.

    USGS Publications Warehouse

    Schoellhamer, David H.

    1985-01-01

    Fine sediment in channels, rivers, estuaries, and coastal waters undergoes several physical processes including flocculation, floc disruption, deposition, bed consolidation, and resuspension. This paper presents a conceptual model and reviews mathematical models of these physical processes. Several general fine sediment models that simulate some of these processes are reviewed. These general models do not directly simulate flocculation and floc disruption, but the conceptual model and existing functions are shown to adequately model these two processes for one set of laboratory data.
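    The deposition and erosion terms in such models are commonly closed with the Krone and Partheniades formulations; these standard forms are shown for orientation (the specific functions reviewed in the paper may differ):

```latex
D = w_s\,C\left(1 - \frac{\tau_b}{\tau_d}\right),\ \ \tau_b < \tau_d;
\qquad
E = M\left(\frac{\tau_b}{\tau_e} - 1\right),\ \ \tau_b > \tau_e,
```

    where w_s is the floc settling velocity, C the near-bed concentration, tau_b the bed shear stress, tau_d and tau_e the critical stresses for deposition and erosion, and M an erodibility constant.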

  2. Study on Product Innovative Design Process Driven by Ideal Solution

    NASA Astrophysics Data System (ADS)

    Zhang, Fuying; Lu, Ximei; Wang, Ping; Liu, Hui

    Product innovative design in companies today relies heavily on individual members' experience and creative ideation, as well as their skill in agilely integrating creativity and innovation tools with design methods. Creative ideation and inventive idea generation are two crucial stages in the product innovative design process. The ideal solution is the desired final idea for a given problem and the target toward which product design strives. In this paper, a product innovative design process driven by the ideal solution is proposed. This design process encourages designers to overcome their psychological inertia and to foster creativity in a systematic way, acquiring breakthrough creative and innovative solutions within a narrowing sphere of solution-seeking, and it results in rapid, effective product innovative design. A case study example is also presented to illustrate the effectiveness of the proposed design process.

  3. Course Design Using an Authentic Studio Model

    ERIC Educational Resources Information Center

    Wilson, Jay R.

    2013-01-01

    Educational Technology and Design 879 is a graduate course that introduces students to the basics of video design and production. In an attempt to improve the learning experience for students, a redesign of the course was implemented for the summer of 2011 that incorporated an authentic design studio model. The design studio approach is based on…

  4. Modeling of pulverized coal combustion processes in a vortex furnace of improved design. Part 2: Combustion of brown coal from the Kansk-Achinsk Basin in a vortex furnace

    NASA Astrophysics Data System (ADS)

    Krasinsky, D. V.; Salomatov, V. V.; Anufriev, I. S.; Sharypov, O. V.; Shadrin, E. Yu.; Anikin, Yu. A.

    2015-03-01

    This paper continues the description of study results for an improved-design steam boiler vortex furnace. For its full-scale configuration, numerical modeling of a three-dimensional turbulent two-phase reacting flow has been performed, with allowance for all the principal heat and mass transfer processes in the torch combustion of pulverized Berezovsk brown coal from the Kansk-Achinsk Basin. The detailed distributions of velocity, temperature, concentration, and heat flux fields in different cross sections of the improved vortex furnace have been obtained. The principal thermoengineering and environmental characteristics of this furnace are given.

  5. Designing Educative Curriculum Materials: A Theoretically and Empirically Driven Process

    ERIC Educational Resources Information Center

    Davis, Elizabeth A.; Palincsar, Annemarie Sullivan; Arias, Anna Maria; Bismack, Amber Schultz; Marulis, Loren M.; Iwashyna, Stefanie K.

    2014-01-01

    In this article, the authors argue for a design process in the development of educative curriculum materials that is theoretically and empirically driven. Using a design-based research approach, they describe their design process for incorporating educative features intended to promote teacher learning into existing, high-quality curriculum…

  6. VCM Process Design: An ABET 2000 Fully Compliant Project

    ERIC Educational Resources Information Center

    Benyahia, Farid

    2005-01-01

    A long experience in undergraduate vinyl chloride monomer (VCM) process design projects is shared in this paper. The VCM process design is shown to be fully compliant with ABET 2000 criteria by virtue of its abundance in chemical engineering principles, integration of interpersonal and interdisciplinary skills in design, safety, economics, and…

  7. Learning Objects: A User-Centered Design Process

    ERIC Educational Resources Information Center

    Branon, Rovy F., III

    2011-01-01

    Design research systematically creates or improves processes, products, and programs through an iterative progression connecting practice and theory (Reinking, 2008; van den Akker, 2006). Developing new instructional systems design (ISD) processes through design research is necessary when new technologies emerge that challenge existing practices…

  8. Development and validation of a building design waste reduction model.

    PubMed

    Llatas, C; Osmani, M

    2016-10-01

    Reduction in construction waste is a pressing need in many countries. The design of building elements is considered a pivotal process for achieving waste reduction at source, as it enables an informed prediction of wastage reduction levels. However, the lack of quantitative methods linking design strategies to waste reduction hinders designing-out-waste practice in building projects. Therefore, this paper addresses this knowledge gap through the design and validation of a Building Design Waste Reduction Strategies (Waste ReSt) model that aims to investigate the relationships between design variables and their impact on onsite waste reduction. The Waste ReSt model was validated in a real-world case study involving 20 residential buildings in Spain. The validation process comprised three stages. First, design waste causes were analyzed. Second, design strategies were applied, leading to several alternative low-waste building elements. Finally, their potential source reduction levels were quantified and discussed within the context of the literature. The Waste ReSt model could serve as an instrumental tool to simulate designing-out strategies in building projects. The knowledge provided by the model could help project stakeholders to better understand the correlation between the design process and waste sources and subsequently implement design practices for low-waste buildings.

  9. Model medication management process in Australian nursing homes using business process modeling.

    PubMed

    Qian, Siyu; Yu, Ping

    2013-01-01

    One of the reasons for end-user avoidance or rejection of health information systems is poor alignment of the system with healthcare workflow, likely caused by system designers' lack of thorough understanding of the healthcare process. Therefore, understanding the healthcare workflow is the essential first step in the design of optimal technologies that will enable care staff to complete the intended tasks faster and better. The frequent use of multiple or "high-risk" medicines by older people in nursing homes has the potential to increase the medication error rate. To facilitate the design of information systems with the most potential to improve patient safety, this study aims to understand the medication management process in nursing homes using a business process modeling method. The paper presents the study design and preliminary findings from interviewing two registered nurses, who were team leaders in two nursing homes. Although there were subtle differences in medication management between the two homes, major medication management activities were similar. Further field observation will be conducted. Based on the data collected from observations, an as-is process model for medication management will be developed.

  10. Understanding the Processes behind Student Designing: Cases from Singapore

    ERIC Educational Resources Information Center

    Lim, Susan Siok Hiang; Lim-Ratnam, Christina; Atencio, Matthew

    2013-01-01

    A common perception of designing is that it represents a highly complex activity that is manageable by only a few. However it has also been argued that all individuals are innately capable of designing. Taking up this latter view, we explored the processes behind student designing in the context of Design and Technology (D&T), a subject taught…

  11. Instructional Design Models: What a Revolution!

    ERIC Educational Resources Information Center

    Faryadi, Qais

    2007-01-01

    This review examines instructional design models and the construction of knowledge. It further explores the benefits of these models for the inputs and outputs of knowledge transfer. This assessment also attempts to define instructional design models through the eyes and the minds of renowned scholars as well as the most…

  12. Hotspot detection and design recommendation using silicon calibrated CMP model

    NASA Astrophysics Data System (ADS)

    Hui, Colin; Wang, Xian Bin; Huang, Haigou; Katakamsetty, Ushasree; Economikos, Laertis; Fayaz, Mohammed; Greco, Stephen; Hua, Xiang; Jayathi, Subramanian; Yuan, Chi-Min; Li, Song; Mehrotra, Vikas; Chen, Kuang Han; Gbondo-Tugbawa, Tamba; Smith, Taber

    2009-03-01

    Chemical Mechanical Polishing (CMP) has been used in manufacturing for the copper (Cu) damascene process. It is well known that dishing and erosion occur during the CMP process, and they depend strongly on metal density and line width. The inherent thickness and topography variations become an increasing concern for today's designs running through advanced process nodes (sub-65nm). Excessive thickness and topography variations can have major impacts on chip yield and performance; as such, they need to be accounted for during the design stage. In this paper, we will demonstrate an accurate physics-based CMP model and its application for CMP-related hotspot detection. Model-based checking capability is most useful for identifying highly environment-sensitive layouts that are prone to early process window limitation and hence failure. Model-based checking, as opposed to rule-based checking, can identify the weak points in a design more accurately and enable designers to provide improved layout for the areas with the highest leverage for manufacturability improvement. Further, CMP modeling has the ability to provide information on interlevel effects such as copper puddling from underlying topography that cannot be captured in Design-for-Manufacturing (DfM) recommended rules. The model has been calibrated against the silicon produced with the 45nm process from Common Platform (IBM-Chartered-Samsung) technology. It is one of the earliest 45nm CMP models available today. We will show that the CMP-related hotspots can often occur around the spaces between analog macros and digital blocks in SoC designs. With the help of the CMP model-based prediction, the design, the dummy fill, or the placement of the blocks can be modified to improve planarity and eliminate CMP-related hotspots. The CMP model can be used to pass design recommendations to designers to improve chip yield and performance.

  13. Processes and Knowledge in Designing Instruction.

    ERIC Educational Resources Information Center

    Greeno, James G.; And Others

    Results from a study of problem solving in the domain of instructional design are presented. Subjects were eight teacher trainees who were recent graduates of or were enrolled in the Stanford Teacher Education Program at Stanford University (California). Subjects studied a computer-based tutorial--the VST2000--about a fictitious vehicle. The…

  14. User-Centered Design (UCD) Process Description

    DTIC Science & Technology

    2014-12-01

    mockups and prototypes. UCD provides guidance for improving total system performance by considering the real-world...against essential story scenarios, eventually leading to the development of high-fidelity mockups and prototypes.

  15. Reducing Design Cycle Time and Cost Through Process Resequencing

    NASA Technical Reports Server (NTRS)

    Rogers, James L.

    2004-01-01

    In today's competitive environment, companies are under enormous pressure to reduce the time and cost of their design cycle. One method for reducing both time and cost is to develop an understanding of the flow of the design processes and the effects of the iterative subcycles that are found in complex design projects. Once these aspects are understood, the design manager can make decisions that take advantage of decomposition, concurrent engineering, and parallel processing techniques to reduce the total time and the total cost of the design cycle. One software tool that can aid in this decision-making process is the Design Manager's Aid for Intelligent Decomposition (DeMAID). The DeMAID software minimizes the feedback couplings that create iterative subcycles, groups processes into iterative subcycles, and decomposes the subcycles into a hierarchical structure. The real benefits of producing the best design in the least time and at a minimum cost are obtained from sequencing the processes in the subcycles.
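    The sequencing problem DeMAID addresses can be pictured on a design structure matrix (DSM): reorder the processes so that as few couplings as possible land above the diagonal, since those are the feedbacks that create iterative subcycles. A toy brute-force sketch (illustrative only; DeMAID itself uses knowledge-based heuristics rather than exhaustive search):

```python
import numpy as np
from itertools import permutations

def feedback_count(dsm, order):
    """Couplings above the diagonal (feedbacks) under a given ordering."""
    m = dsm[np.ix_(order, order)]
    return int(np.triu(m, 1).sum())

# dsm[i, j] = 1 means process i needs an output of process j.
dsm = np.array([[0, 1, 0, 0],
                [0, 0, 0, 1],
                [1, 0, 0, 0],
                [0, 0, 1, 0]])

# Exhaustive search is fine at this size; real tools must use heuristics.
best = min(permutations(range(len(dsm))),
           key=lambda p: feedback_count(dsm, list(p)))
print(best, feedback_count(dsm, list(best)))  # one feedback is unavoidable here
```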

  16. NASA Now: Engineering Design Process: Hubble Space Telescope

    NASA Video Gallery

    In this episode of NASA Now, NASA engineer Russ Werneth discusses the continuous nature of the engineering design process and shares what it was like to design and plan the spacewalks that were key...

  17. The Processes Involved in Designing Software.

    DTIC Science & Technology

    1980-08-01

    compiler's task was to derive the sequence of IPL instructions that brought about that transformation. The other form of definitions was in terms of...used suitably generalized forms of means-ends analysis to generate sequences of IPL instructions that would meet the input specifications. One branch...processes called critics are used to reorganize this more detailed plan into an internally consistent and efficient sequence of actions. The process

  18. Integrating Thermal Tools Into the Mechanical Design Process

    NASA Technical Reports Server (NTRS)

    Tsuyuki, Glenn T.; Siebes, Georg; Novak, Keith S.; Kinsella, Gary M.

    1999-01-01

    The intent of mechanical design is to deliver a hardware product that meets or exceeds customer expectations, while reducing cycle time and cost. To this end, an integrated mechanical design process enables the idea of parallel development (concurrent engineering). This represents a shift from the traditional mechanical design process. With such a concurrent process, there are significant issues that have to be identified and addressed before re-engineering the mechanical design process to facilitate concurrent engineering. These issues also assist in the integration and re-engineering of the thermal design sub-process since it resides within the entire mechanical design process. With these issues in mind, a thermal design sub-process can be re-defined in a manner that has a higher probability of acceptance, thus enabling an integrated mechanical design process. However, the actual implementation is not always problem-free. Experience in applying the thermal design sub-process to actual situations provides the evidence for improvement, but more importantly, for judging the viability and feasibility of the sub-process.

  19. Designing user models in a virtual cave environment

    SciTech Connect

    Brown-VanHoozer, S.; Hudson, R.; Gokhale, N.

    1995-12-31

    In this paper, the results of a first study into the use of virtual reality for human factor studies and design of simple and complex models of control systems, components, and processes are described. The objective was to design a model in a virtual environment that would reflect more characteristics of the user's mental model of a system and fewer of the designer's. The technology of a CAVE(TM) virtual environment and the methodology of Neuro Linguistic Programming were employed in this study.

  20. Conceptual design of distillation-based hybrid separation processes.

    PubMed

    Skiborowski, Mirko; Harwardt, Andreas; Marquardt, Wolfgang

    2013-01-01

    Hybrid separation processes combine different separation principles and constitute a promising design option for the separation of complex mixtures. Particularly, the integration of distillation with other unit operations can significantly improve the separation of close-boiling or azeotropic mixtures. Although the design of single-unit operations is well understood and supported by computational methods, the optimal design of flowsheets of hybrid separation processes is still a challenging task. The large number of operational and design degrees of freedom requires a systematic and optimization-based design approach. To this end, a structured approach, the so-called process synthesis framework, is proposed. This article reviews available computational methods for the conceptual design of distillation-based hybrid processes for the separation of liquid mixtures. Open problems are identified that must be addressed to finally establish a structured process synthesis framework for such processes.

  1. The application of image processing software: Photoshop in environmental design

    NASA Astrophysics Data System (ADS)

    Dong, Baohua; Zhang, Chunmi; Zhuo, Chen

    2011-02-01

    In the process of environmental design and creation, the design sketch holds a very important position in that it not only illuminates the design's idea and concept but also shows the design's visual effects to the client. In the field of environmental design, computer-aided design has made significant improvements. Many types of specialized design software for rendering environmental design drawings and for artistic post-processing have been implemented. Additionally, with the use of this software, working efficiency has greatly increased and drawings have become more specific and more specialized. By analyzing the application of Photoshop image processing software in environmental design, and by comparing and contrasting traditional hand drawing with drawing using modern technology, this essay will further explore the way for computer technology to play a bigger role in environmental design.

  2. Process-Response Modeling and the Scientific Process.

    ERIC Educational Resources Information Center

    Fichter, Lynn S.

    1988-01-01

    Discusses the process-response model (PRM) in its theoretical and practical forms. Describes how geologists attempt to reconstruct the process from the response (the geologic phenomenon) being studied. (TW)

  3. The Process of Soviet Weapons Design

    DTIC Science & Technology

    1978-03-01

    system on the BMP from an early 1940s German design. But the validity and usefulness of a theory, especially one that makes predictions about the future...when the 1940 publication of a highly significant Soviet discovery of spontaneous fission resulted in a complete lack of an American response, the...taken from I. N. Golovin, I. V. Kurchatov, Atomizdat, Moscow, 1973, and from Herbert York, The Advisors: Oppenheimer, Teller, and the Superbomb, W. H

  4. Cupola Furnace Computer Process Model

    SciTech Connect

    Seymour Katz

    2004-12-31

    The cupola furnace generates more than 50% of the liquid iron used to produce the 9+ million tons of castings annually. The cupola converts iron and steel into cast iron. The main advantages of the cupola furnace are lower energy costs than those of competing furnaces (electric) and the ability to melt less expensive metallic scrap than the competing furnaces. However, the chemical and physical processes that take place in the cupola furnace are highly complex, making it difficult to operate the furnace in optimal fashion. The results are low energy efficiency and poor recovery of important and expensive alloy elements due to oxidation. Between 1990 and 2004, under the auspices of the Department of Energy, the American Foundry Society and General Motors Corp., a computer simulation of the cupola furnace was developed that accurately describes the complex behavior of the furnace. When provided with the furnace input conditions, the model provides accurate values of the output conditions in a matter of seconds. It also provides key diagnostics. Using clues from the diagnostics, a trained specialist can infer changes in the operation that will move the system toward higher efficiency. Repeating the process in an iterative fashion leads to near-optimum operating conditions with just a few iterations. More advanced uses of the program have been examined. The program is currently being combined with an "Expert System" to permit optimization in real time. The program has been combined with "neural network" programs to effect very easy scanning of a wide range of furnace operation. Rudimentary efforts were successfully made to operate the furnace using a computer. References to these more advanced systems will be found in the "Cupola Handbook," Chapter 27, American Foundry Society, Des Plaines, IL (1999).

  5. The Computer-Aided Analytic Process Model. Operations Handbook for the APM (Analytic Process Model) Demonstration Package. Appendix

    DTIC Science & Technology

    1986-01-01

    The Analytic Process Model for System Design and Measurement: A Computer-Aided Tool for Analyzing Training Systems and Other Human-Machine Systems. A...separate companion volume--The Computer-Aided Analytic Process Model: Operations Handbook for the APM Demonstration Package is also available under

  6. Process modeling and industrial energy use

    SciTech Connect

    Howe, S O; Pilati, D A; Sparrow, F T

    1980-11-01

    How the process models developed at BNL are used to analyze industrial energy use is described and illustrated. Following a brief overview of the industry modeling program, the general methodology of process modeling is discussed. The discussion highlights the important concepts, contents, inputs, and outputs of a typical process model. A model of the US pulp and paper industry is then discussed as a specific application of process modeling methodology. Case study results from the pulp and paper model illustrate how process models can be used to analyze a variety of issues. Applications addressed with the case study results include projections of energy demand, conservation technology assessment, energy-related tax policies, and sensitivity analysis. A subsequent discussion of these results supports the conclusion that industry process models are versatile and powerful tools for energy end-use modeling and conservation analysis. Information on the current status of industry models at BNL is tabulated.

  7. Estimation of Environment-Related Properties of Chemicals for Design of Sustainable Processes: Development of Group-Contribution+ (GC+) Property Models and Uncertainty Analysis

    EPA Science Inventory

    The aim of this work is to develop group-contribution+ (GC+) method (combined group-contribution (GC) method and atom connectivity index (CI) method) based property models to provide reliable estimations of environment-related properties of organic chemicals together with uncert...

  8. Designing control system information models

    NASA Technical Reports Server (NTRS)

    Panin, K. I.; Zinchenko, V. P.

    1973-01-01

    Problems encountered in modeling information models are discussed. Data cover the condition and functioning of the object of control, and the environment involved in the control. Other parameters needed for the model include: (1) information for forming an image of the real situation, (2) data for analyzing and evaluating an evolving situation, (3) planning actions, and (4) data for observing and evaluating the results of model realization.

  9. Open source clinical portals: a model for healthcare information systems to support care processes and feed clinical research. An Italian case of design, development, reuse, and exploitation.

    PubMed

    Locatelli, Paolo; Baj, Emanuele; Restifo, Nicola; Origgi, Gianni; Bragagia, Silvia

    2011-01-01

    Open source is a still-unexploited opportunity for healthcare organizations and technology providers to answer a growing demand for innovation and to combine economic benefits with a new way of managing hospital information systems. This chapter will present the case of the web enterprise clinical portal developed in Italy by Niguarda Hospital in Milan with the support of Fondazione Politecnico di Milano, to enable a paperless environment for clinical and administrative activities in the ward. This also represents one of the rare cases of open source technology development and reuse in the healthcare sector, as the system's porting is now taking place at the Besta Neurological Institute in Milan. This institute is customizing the portal to feed researchers with structured clinical data collected in its portal's patient records, so that they can be analyzed, e.g., through business intelligence tools. Both organizational and clinical advantages are investigated, from process monitoring, to semantic data structuring, to recognition of common patterns in care processes.

  10. Design of Training Systems Utility Assessment. The Training Process Flow and System Capabilities/Requirements and Resources Models Operating in the TRAPAC Environment

    DTIC Science & Technology

    1976-05-01

    TAEG Report No. 33, Training Analysis and Evaluation Group: DOTS Utility Assessment, The Training Process Flow and System... (front-matter and list-of-figures fragments only)

  11. Lessons from nature: computational design of biomimetic compounds and processes.

    PubMed

    Bozkurt, Esra; Ashari, Negar; Browning, Nicholas; Brunk, Elizabeth; Campomanes, Pablo; Perez, Marta A S; Rothlisberger, Ursula

    2014-09-01

    Through millions of years of evolution, Nature has accomplished the development of highly efficient and sustainable processes, and the idea of understanding and copying natural strategies is therefore very appealing. However, in spite of intense experimental and computational research, it has turned out to be a difficult task to design efficient biomimetic systems. Here we discuss a novel strategy for the computational design of biomimetic compounds and processes that consists of: i) target selection; ii) atomistic and electronic characterization of the wild-type system and the biomimetic compounds; iii) identification of key descriptors through feature selection; iv) choice of a biomimetic template; and v) efficient search of chemical and sequence space for optimization of the biomimetic system. As a proof-of-principle study, this general approach is illustrated for the computational design of a 'green' catalyst mimicking the action of the zinc metalloenzyme Human Carbonic Anhydrase (HCA). HCA is a natural model for CO2 fixation since the enzyme is able to convert CO2 into bicarbonate. Very recently, a weakly active HCA mimic based on a trihelical peptide bundle was synthesized. We have used quantum mechanical/molecular mechanical (QM/MM) Car-Parrinello simulations to study the mechanisms of action of HCA and its peptidic mimic and employed the obtained information to guide the design of improved biomimetic analogues. Applying a genetic-algorithm-based optimization procedure, we were able to re-engineer and optimize the biomimetic system towards its natural counterpart. In a second example, we discuss a similar strategy for the design of biomimetic sensitizers for use in dye-sensitized solar cells.
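    Step v) is a search over sequence space, in its simplest form a genetic algorithm over amino-acid strings. The sketch below uses a trivial placeholder fitness where a QM/MM-derived score would go; the target string and all parameters are invented for illustration:

```python
import random

ALPHABET = "ACDEFGHIKLMNPQRSTVWY"      # amino-acid one-letter codes

def fitness(seq, target="HEKHAW"):
    # Placeholder objective; in practice a QM/MM-derived score goes here.
    return sum(a == b for a, b in zip(seq, target))

def evolve(length=6, pop_size=40, gens=60, mut=0.1):
    pop = ["".join(random.choices(ALPHABET, k=length)) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]             # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, length)      # one-point crossover
            child = [random.choice(ALPHABET) if random.random() < mut else ch
                     for ch in a[:cut] + b[cut:]]  # point mutations
            children.append("".join(child))
        pop = parents + children
    return max(pop, key=fitness)

print(evolve())
```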

  12. Computer modeling of complete IC fabrication process

    NASA Astrophysics Data System (ADS)

    Dutton, Robert W.

    1987-05-01

    The development of fundamental algorithms for process and device modeling as well as novel integration of the tools for advanced Integrated Circuit (IC) technology design is discussed. The development of the first complete 2D process simulator, SUPREM 4, is reported. The algorithms are discussed as well as applications to local-oxidation and extrinsic diffusion conditions which occur in CMOS and BiCMOS technologies. The evolution of 1D (SEDAN) and 2D (PISCES) device analysis is discussed. The application of SEDAN to a variety of non-silicon technologies (GaAs and HgCdTe) is considered. A new multi-window analysis capability for PISCES which exploits Monte Carlo analysis of hot carriers has been demonstrated and used to characterize a variety of silicon MOSFET and GaAs MESFET effects. A parallel computer implementation of PISCES has been achieved using a Hypercube architecture. The PISCES program has been used for a range of important device studies including: latchup, analog switch analysis, MOSFET capacitance studies, and bipolar transient analysis for ECL gates. The program is broadly applicable to RAM and BiCMOS technology analysis and design. In the analog switch technology area this research effort has produced a variety of important modeling advances.

  13. Adding Users to the Website Design Process

    ERIC Educational Resources Information Center

    Tomeo, Megan L.

    2012-01-01

    Alden Library began redesigning its website over a year ago. Throughout the redesign process the students, faculty, and staff that make up the user base were added to the conversation by utilizing several usability test methods. This article focuses on the usability testing conducted at Alden Library and delves into future usability testing, which…

  14. Thermal Modeling of A Friction Bonding Process

    SciTech Connect

    John Dixon; Douglas Burkes; Pavel Medvedev

    2007-10-01

    A COMSOL model capable of predicting temperature evolution during nuclear fuel fabrication is being developed at the Idaho National Laboratory (INL). Fuel plates are fabricated by friction bonding (FB) uranium-molybdenum (U-Mo) alloy foils positioned between two aluminum plates. The ability to predict temperature distribution during fabrication is imperative to ensure good quality bonding without inducing an undesirable chemical reaction between U-Mo and aluminum. A three-dimensional heat transfer model of the FB process implementing shallow pin penetration for cladding monolithic nuclear fuel foils is presented. Temperature distribution during the FB process as a function of fabrication parameters such as weld speed, tool load, and tool rotational frequency are predicted. Model assumptions, settings, and equations are described in relation to standard friction stir welding. Current experimental design for validation and calibration of the model is also demonstrated. Resulting experimental data reveal the accuracy in describing asymmetrical temperature distributions about the tool face. Temperature of the bonded plate drops beneath the pin and is higher on the advancing side than the retreating side of the tool.
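    Such models solve transient heat conduction with a moving frictional source at the tool-workpiece interface. In generic form, shown here for orientation only (the INL model's exact equations and boundary conditions are not reproduced):

```latex
\rho\,c_p\,\frac{\partial T}{\partial t} = \nabla\cdot\left(k\,\nabla T\right) + \dot{q},
\qquad
Q_{\text{shoulder}} \approx \tfrac{2}{3}\,\pi\,\mu\,P\,\omega\,R^{3},
```

    where the second expression is the frictional heating commonly assumed under a flat circular tool shoulder of radius R, at contact pressure P, rotational speed omega, and friction coefficient mu.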

  15. DESIGNING ENVIRONMENTALLY FRIENDLY CHEMICAL PROCESSES WITH FUGITIVE AND OPEN EMISSIONS

    EPA Science Inventory

    Designing a chemical process normally includes aspects of economic and environmental disciplines. In this work we describe methods to quickly and easily evaluate the economics and potential environmental impacts of a process, with the hydrodealkylation of toluene as an example. ...

  16. Integrating ergonomics in design processes: a case study within an engineering consultancy firm.

    PubMed

    Sørensen, Lene Bjerg; Broberg, Ole

    2012-01-01

    This paper reports on a case study within an engineering consultancy firm, where engineering designers and ergonomists were working together on the design of a new hospital sterile processing plant. The objective of the paper is to gain a better understanding of the premises for integrating ergonomics into engineering design processes and of how different factors either promote or limit the integration. Based on a grounded theory approach, a model illustrating these factors is developed, and different hypotheses about how these factors promote and/or limit the integration of ergonomics into design processes are presented along with the model.

  17. Experimental design for dynamics identification of cellular processes.

    PubMed

    Dinh, Vu; Rundell, Ann E; Buzzard, Gregery T

    2014-03-01

    We address the problem of using nonlinear models to design experiments to characterize the dynamics of cellular processes by using the approach of the Maximally Informative Next Experiment (MINE), which was introduced in W. Dong et al. (PLoS ONE 3(8):e3105, 2008) and independently in M.M. Donahue et al. (IET Syst. Biol. 4:249-262, 2010). In this approach, existing data are used to define a probability distribution on the parameters; the next measurement point is the one that yields the largest model output variance under this distribution. Building upon this approach, we introduce the Expected Dynamics Estimator (EDE), which is the expected value, under this distribution, of the output as a function of time. We prove the consistency of this estimator (uniform convergence to the true dynamics) even when the chosen experiments cluster in a finite set of points. We extend this proof of consistency to various practical assumptions on noisy data and moderate levels of model mismatch. Through the derivation and proof, we develop a relaxed version of MINE that is more computationally tractable and robust than the original formulation. The results are illustrated with numerical examples on two nonlinear ordinary differential equation models of biomolecular and cellular processes.
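    A compact reading of the MINE criterion: sample parameters from the current distribution, simulate, and measure next where the predictions disagree most. A sketch under those assumptions (a toy exponential-decay model, not the authors' cellular models):

```python
import numpy as np

rng = np.random.default_rng(0)

def model(t, rate):
    """Toy dynamics: known amplitude, uncertain decay rate."""
    return 2.0 * np.exp(-rate * t)

# Ensemble representing the current, data-informed parameter distribution.
rates = rng.normal(0.5, 0.1, size=500)

# Candidate measurement times; MINE picks the time of maximal output variance.
t_grid = np.linspace(0.0, 10.0, 201)
outputs = np.array([model(t_grid, r) for r in rates])    # (500, 201)

ede = outputs.mean(axis=0)           # Expected Dynamics Estimator (EDE)
idx = outputs.var(axis=0).argmax()   # MINE: largest predictive variance
print("next measurement at t =", t_grid[idx], "; EDE there =", ede[idx].round(3))
```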

  18. In-design process hotspot repair using pattern matching

    NASA Astrophysics Data System (ADS)

    Jang, Daehyun; Ha, Naya; Jeon, Junsu; Kang, Jae-Hyun; Paek, Seung Weon; Choi, Hungbok; Kim, Kee Sup; Lai, Ya-Chieh; Hurat, Philippe; Luo, Wilbur

    2012-03-01

    As patterning for advanced processes becomes more challenging, designs must become more process-aware. The conventional approach of running lithography simulation on designs to detect process hotspots is prohibitive in terms of runtime for designers, and also requires the release of highly confidential process information. Therefore, a more practical approach is required to make the In-Design process-aware methodology more affordable in terms of maintenance, confidentiality, and runtime. In this study, a pattern-based approach is chosen for Process Hotspot Repair (PHR) because it accurately captures the manufacturability challenges without releasing sensitive process information. Moreover, the pattern-based approach is fast and well integrated in the design flow. Further, this type of approach is very easy to maintain and extend. Once a new process weak pattern has been discovered (caused by Chemical Mechanical Polishing (CMP), etch, lithography, and other process steps), the pattern library can be quickly and easily updated and released to check and fix subsequent designs. This paper presents the pattern matching flow and discusses its advantages. It explains how a pattern library is created from the process weak patterns found on silicon wafers. The paper also discusses the PHR flow that fixes process hotspots in a design, specifically through the use of pattern matching and routing repair.
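    At its simplest, the pattern-library lookup reduces to exact matching of canonicalized layout windows; production pattern matchers are far more elaborate, but a toy dictionary-based sketch conveys the flow:

```python
import numpy as np

def canonical(window):
    """One key per symmetry class, so a single library entry covers all
    rotations and mirrors of a weak pattern."""
    variants, w = [], window
    for _ in range(4):
        w = np.rot90(w)
        variants += [w.tobytes(), np.fliplr(w).tobytes()]
    return min(variants)

# Hypothetical library of known process-weak patterns (3x3 binary windows).
weak = {canonical(np.array([[1, 1, 1],
                            [0, 1, 0],
                            [1, 1, 1]]))}

layout = np.random.default_rng(1).integers(0, 2, size=(64, 64))
hotspots = [(i, j)
            for i in range(layout.shape[0] - 2)
            for j in range(layout.shape[1] - 2)
            if canonical(layout[i:i + 3, j:j + 3]) in weak]
print(len(hotspots), "candidate hotspots to repair")
```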

  19. Process correlation analysis model for process improvement identification.

    PubMed

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by process assessment identifying strengths and weaknesses and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what to be carried out in process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of software development process. However, in the current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to significant efforts and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.

  20. Multidimensional Data Modeling for Business Process Analysis

    NASA Astrophysics Data System (ADS)

    Mansmann, Svetlana; Neumuth, Thomas; Scholl, Marc H.

    The emerging area of business process intelligence attempts to enhance the analytical capabilities of business process management systems by employing data warehousing and mining technologies. This paper presents an approach to re-engineering the business process modeling in conformity with the multidimensional data model. Since the business process and the multidimensional model are driven by rather different objectives and assumptions, there is no straightforward solution to converging these models.

  1. The FEYNMAN tools for quantum information processing: Design and implementation

    NASA Astrophysics Data System (ADS)

    Fritzsche, S.

    2014-06-01

    The FEYNMAN tools have been re-designed with the goal of establishing and implementing a high-level (computer) language that is capable of dealing with the physics of finite, n-qubit systems, from frequently required computations to mathematically advanced tasks in quantum information processing. In particular, emphasis has been placed on introducing a small but powerful set of keystring-driven commands in order to support both symbolic and numerical computations. Though the current design is implemented again within the framework of MAPLE, it is general and flexible enough to be utilized and combined with other languages and computational environments. The present implementation facilitates a large number of computational tasks, including the definition, manipulation and parametrization of quantum states, the evaluation of quantum measures and quantum operations, the evolution of quantum noise in discrete models, quantum measurements and state estimation, and several others. The design is based on a few high-level commands, with a syntax close to the mathematical notation and its use in the literature, which can be generalized quite readily in order to solve computational tasks of even higher complexity. In this work, I present and discuss the re-design of the FEYNMAN tools and make major parts of the code available for public use. Moreover, a few selected examples are shown to demonstrate possible applications of this toolbox. The FEYNMAN tools are provided as a MAPLE library and can hence be used on all platforms on which this computer-algebra system is accessible.
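    The linear algebra underlying such n-qubit commands is compact. A minimal numpy rendering (illustrative only: the FEYNMAN tools themselves are a MAPLE library with a far richer command set):

```python
import numpy as np

I = np.eye(2)
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)   # Hadamard gate

def apply(gate, target, n):
    """Build the full 2^n operator applying `gate` to qubit `target`."""
    op = np.array([[1.0]])
    for q in range(n):
        op = np.kron(op, gate if q == target else I)
    return op

n = 3
state = np.zeros(2**n)
state[0] = 1.0                       # |000>
state = apply(H, 0, n) @ state       # Hadamard on the first qubit
print(np.round(state, 3))            # (|000> + |100>)/sqrt(2)
```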

  2. Ceramic processing: Experimental design and optimization

    NASA Technical Reports Server (NTRS)

    Weiser, Martin W.; Lauben, David N.; Madrid, Philip

    1992-01-01

    The objectives of this paper are to: (1) gain insight into the processing of ceramics and how green processing can affect the properties of ceramics; (2) investigate the technique of slip casting; (3) learn how heat treatment and temperature contribute to density, strength, and effects of under and over firing to ceramic properties; (4) experience some of the problems inherent in testing brittle materials and learn about the statistical nature of the strength of ceramics; (5) investigate orthogonal arrays as tools to examine the effect of many experimental parameters using a minimum number of experiments; (6) recognize appropriate uses for clay based ceramics; and (7) measure several different properties important to ceramic use and optimize them for a given application.

  3. Mathematical modelling in the computer-aided process planning

    NASA Astrophysics Data System (ADS)

    Mitin, S.; Bochkarev, P.

    2016-04-01

    This paper presents new approaches to the organization of manufacturing preparation and mathematical models related to the development of a computer-aided multi-product process planning (CAMPP) system. The CAMPP system has some peculiarities compared to existing computer-aided process planning (CAPP) systems: fully formalized development of the machining operations; the capacity to create and formalize the interrelationships among design, process planning, and process implementation; and procedures for consideration of real manufacturing conditions. The paper describes the structure of the CAMPP system and shows the mathematical models and methods used to formalize the design procedures.

  4. Affordable Design: A Methodology to Implement Process-Based Manufacturing Cost into the Traditional Performance-Focused Multidisciplinary Design Optimization

    NASA Technical Reports Server (NTRS)

    Bao, Han P.; Samareh, J. A.

    2000-01-01

    The primary objective of this paper is to demonstrate the use of process-based manufacturing and assembly cost models in a traditional performance-focused multidisciplinary design and optimization process. The use of automated cost-performance analysis is an enabling technology that could bring realistic process-based manufacturing and assembly cost into multidisciplinary design and optimization. In this paper, we present a new methodology for incorporating process costing into a standard multidisciplinary design optimization process. Material, manufacturing process, and assembly process costs could then be used as the objective function for the optimization method. A case study involving forty-six different configurations of a simple wing is presented, indicating that a design based on performance criteria alone may not necessarily be the most affordable as far as manufacturing and assembly cost is concerned.
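    The paper's key move, letting a process-based cost model serve as (part of) the optimization objective, can be sketched in a few lines. All surrogates and coefficients below are invented for illustration:

```python
from scipy.optimize import minimize

def performance_penalty(x):
    span, thickness = x
    return 100.0 / span + 5.0 * thickness       # toy drag-like surrogate

def manufacturing_cost(x):
    span, thickness = x
    return 2.0 * span * thickness + 1.5 * span  # toy material + assembly cost

def objective(x, w_cost=0.5):
    """Blend performance and process-based cost into one objective."""
    return (1.0 - w_cost) * performance_penalty(x) + w_cost * manufacturing_cost(x)

res = minimize(objective, x0=[10.0, 2.0],
               bounds=[(5.0, 30.0), (0.5, 4.0)])
print(res.x.round(2), round(res.fun, 2))
```

    Sweeping the weight w_cost traces the trade-off between a performance-optimal and a cost-optimal design, a trade-off of the kind the forty-six-configuration wing study examines.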

  5. H-Coal process and plant design

    DOEpatents

    Kydd, Paul H.; Chervenak, Michael C.; DeVaux, George R.

    1983-01-01

    A process for converting coal and other hydrocarbonaceous materials into useful and more valuable liquid products. The process comprises: feeding coal and/or other hydrocarbonaceous materials with a hydrogen-containing gas into an ebullated catalyst bed reactor; passing the reaction products from the reactor to a hot separator where the vaporous and distillate products are separated from the residuals; introducing the vaporous and distillate products from the separator directly into a hydrotreater where they are further hydrogenated; passing the residuals from the separator successively through flash vessels at reduced pressures where distillates are flashed off and combined with the vaporous and distillate products to be hydrogenated; transferring the unseparated residuals to a solids concentrating and removal means to remove a substantial portion of solids therefrom and recycling the remaining residual oil to the reactor; and passing the hydrogenated vaporous and distillate products to an atmospheric fractionator where the combined products are fractionated into separate valuable liquid products. The hydrogen-containing gas is generated from sources within the process.

  6. Organizational Learning and Product Design Management: Towards a Theoretical Model.

    ERIC Educational Resources Information Center

    Chiva-Gomez, Ricardo; Camison-Zornoza, Cesar; Lapiedra-Alcami, Rafael

    2003-01-01

    Case studies of four Spanish ceramics companies were used to construct a theoretical model of 14 factors essential to organizational learning. One set of factors is related to the conceptual-analytical phase of the product design process and the other to the creative-technical phase. All factors contributed to efficient product design management…

  7. Learning Design Based on Graphical Knowledge-Modelling

    ERIC Educational Resources Information Center

    Paquette, Gilbert; Leonard, Michel; Lundgren-Cayrol, Karin; Mihaila, Stefan; Gareau, Denis

    2006-01-01

    This chapter states and explains that a Learning Design is the result of a knowledge engineering process where knowledge and competencies, learning design and delivery models are constructed in an integrated framework. We present a general graphical language and a knowledge editor that has been adapted to support the construction of learning…

  8. Optimal Designs for the Rasch Model

    ERIC Educational Resources Information Center

    Grasshoff, Ulrike; Holling, Heinz; Schwabe, Rainer

    2012-01-01

    In this paper, optimal designs will be derived for estimating the ability parameters of the Rasch model when difficulty parameters are known. It is well established that a design is locally D-optimal if the ability and difficulty coincide. But locally optimal designs require that the ability parameters to be estimated are known. To attenuate this…

  9. The Use of Executive Control Processes in Engineering Design by Engineering Students and Professional Engineers

    ERIC Educational Resources Information Center

    Dixon, Raymond A.; Johnson, Scott D.

    2012-01-01

    A cognitive construct that is important when solving engineering design problems is executive control process, or metacognition. It is a central feature of human consciousness that enables one "to be aware of, monitor, and control mental processes." The framework for this study was conceptualized by integrating the model for creative design, which…

  10. Bioreactor and process design for biohydrogen production.

    PubMed

    Show, Kuan-Yeow; Lee, Duu-Jong; Chang, Jo-Shu

    2011-09-01

    Biohydrogen is regarded as an attractive future clean energy carrier due to its high energy content and environmentally friendly conversion. As a renewable biofuel, it has the potential to replace current hydrogen production, which relies heavily on fossil fuels. While biohydrogen production is still in the early stage of development, a variety of laboratory- and pilot-scale systems with promising potential have been developed. This work presents a review of advances in bioreactor and bioprocess design for biohydrogen production. The state of the art of biohydrogen production is discussed, with emphasis on production pathways, factors affecting biohydrogen production, and bioreactor configuration and operation. Challenges and prospects of biohydrogen production are also outlined.

  11. Development of Integrated Programs for Aerospace-vehicle design (IPAD): Reference design process

    NASA Technical Reports Server (NTRS)

    Meyer, D. D.

    1979-01-01

    The airplane design process and its interfaces with manufacturing and customer operations are documented to be used as criteria for the development of integrated programs for the analysis, design, and testing of aerospace vehicles. Topics cover: design process management, general purpose support requirements, design networks, and technical program elements. Design activity sequences are given for both supersonic and subsonic commercial transports, naval hydrofoils, and military aircraft.

  12. Computer-based creativity enhanced conceptual design model for non-routine design of mechanical systems

    NASA Astrophysics Data System (ADS)

    Li, Yutong; Wang, Yuxin; Duffy, Alex H. B.

    2014-11-01

    Computer-based conceptual design for routine design has made great strides, yet non-routine design has not been given due attention and is still poorly automated. Considering that the function-behavior-structure (FBS) model is widely used for modeling the conceptual design process, a computer-based creativity-enhanced conceptual design model (CECD) for non-routine design of mechanical systems is presented. In the model, the leaf functions in the FBS model are decomposed into and represented with fine-grain basic operation actions (BOA), and the corresponding BOA set in the function domain is then constructed. Choosing building blocks from the database and expressing their multiple functions with BOAs forms the BOA set in the structure domain. Through rule-based dynamic partition of the BOA set in the function domain, many variants of regenerated functional schemes are generated. To enhance the capability to introduce new design variables into the conceptual design process and to uncover more innovative physical structure schemes, an indirect function-structure matching strategy based on reconstructing combined structure schemes is adopted. By adjusting the tightness of the partition rules and the granularity of the divided BOA subsets, and by making full use of the main and secondary functions of each basic structure when reconstructing the physical structures, new design variables and variants are introduced into the reconstruction process, and a large number of simpler physical structure schemes that organically accomplish the overall function are obtained. The presented creativity-enhanced conceptual design model has a strong capability for introducing new design variables in the function domain and for finding simpler physical structures that accomplish the overall function; it can therefore be used to solve non-routine conceptual design problems.

  13. Computer Modeling and Visualization in Design Technology: An Instructional Model.

    ERIC Educational Resources Information Center

    Guidera, Stan

    2002-01-01

    Design visualization can increase awareness of issues related to perceptual and psychological aspects of design that computer-assisted design and computer modeling may not allow. A pilot university course developed core skills in modeling and simulation using visualization. Students were consistently able to meet course objectives. (Contains 16…

  14. Modeling interdependencies between business and communication processes in hospitals.

    PubMed

    Brigl, Birgit; Wendt, Thomas; Winter, Alfred

    2003-01-01

    The optimization and redesign of business processes in hospitals is an important challenge for hospital information management, which has to design and implement a suitable HIS architecture. Nevertheless, no tools are available that specialize in modeling information-driven business processes and their consequences for the communication between information processing tools. We therefore present an approach that facilitates the representation and analysis of business processes and the resulting communication processes between application components, together with their interdependencies. This approach aims not only to visualize those processes but also to evaluate whether there are weaknesses in the information processing infrastructure that hinder the smooth implementation of the business processes.

  15. Numerical modelling of circulation and dispersion processes in Boulogne-sur-Mer harbour (Eastern English Channel): sensitivity to physical forcing and harbour design

    NASA Astrophysics Data System (ADS)

    Jouanneau, Nicolas; Sentchev, Alexei; Dumas, Franck

    2013-12-01

    The MARS-3D model in conjunction with the particle tracking module Ichthyop is used to study circulation and tracer dynamics under a variety of forcing conditions in the eastern English Channel and in the Boulogne-sur-Mer harbour (referred to hereafter as BLH). Results of the hydrodynamic modelling are validated against tidal gauge data, VHF radar surface velocities and ADCP measurements. Lagrangian tracking experiments are performed with passive particles to study tracer dispersal along the northern French coast, with special emphasis on the BLH. Simulations revealed an anticyclonic eddy generated in the harbour at rising tide. Tracers released during flood tide at the Liane river mouth move northward with a powerful clockwise-rotating current. After high water, the current direction changes to westward, and tracers leave the harbour through the open boundary. During ebb tide, currents converge along the western open boundary but no eddy is formed; surface currents inside the harbour are much weaker and the tracer excursion length is small. After the current reversal at low water, particles are advected shoreward, resulting in a significant increase in the residence time of tracers released during ebb tide. The effect of wind on particle dispersion was found to be particularly strong. Under strong SW wind, the residence time of particles released during flood tide increases from 1.5 to 6 days. For release during ebb tide, SW wind weakens the southward tidally induced drift and thus the residence time decreases. Similar effects are observed when the freshwater inflow to the harbour is increased from 2 to 10 m3/s during the ebb tide flow. For flood tide conditions, the effect of freshwater inflow is less significant. We also demonstrate an example of innovative coastal management targeted at reducing the residence time of pathogenic material accidentally released in the harbour.

  16. Debating Professional Designations for Evaluators: Reflections on the Canadian Process

    ERIC Educational Resources Information Center

    Cousins, J. Bradley; Cullen, Jim; Malik, Sumbal; Maicher, Brigitte

    2009-01-01

    This paper provides a reflective account of a consultation process on professional designations for evaluators initiated and coordinated by the Canadian Evaluation Society (CES). Described are: (1) the forces leading CES to generate discussion and debate about professional designations for Canadian evaluators, (2) the process of developing and…

  17. Toward Understanding the Cognitive Processes of Software Design in Novice Programmers

    ERIC Educational Resources Information Center

    Yeh, Kuo-Chuan

    2009-01-01

    This study provides insights with regard to the types of cognitive processes that are involved in the formation of mental models and the way those models change over the course of a semester in novice programmers doing a design task. Eight novice programmers participated in this study for three distinct software design sessions, using the same…

  18. Solid propellant processing factor in rocket motor design

    NASA Technical Reports Server (NTRS)

    1971-01-01

    The ways are described by which propellant processing is affected by choices made in designing rocket engines. Tradeoff studies, design proof or scaleup studies, and special design features are presented that are required to obtain high product quality and optimum processing costs. Processing is considered to include the operational steps involved with the lining and preparation of the motor case for the grain; the procurement of propellant raw materials; and propellant mixing, casting or extrusion, curing, machining, and finishing. The design criteria, recommended practices, and propellant formulations are included.

  19. Programming-Free Form Conversion, Design, and Processing

    PubMed Central

    Fan, Ting-Jun; Machlin, Rona S.; Wang, Christopher P.; Chang, Ifay F.

    1990-01-01

    In this paper, we present the requirements and design considerations for programming-free form conversion, design, and processing. A set of object-oriented software tools is also presented to help users convert a paper form into an electronic form, design an electronic form, and fill in an electronic form directly on screen.

  20. The Use of Computer Graphics in the Design Process.

    ERIC Educational Resources Information Center

    Palazzi, Maria

    This master's thesis examines applications of computer technology to the field of industrial design and ways in which technology can transform the traditional process. Following a statement of the problem, the history and applications of the fields of computer graphics and industrial design are reviewed. The traditional industrial design process…

  1. Sampling design for spatially distributed hydrogeologic and environmental processes

    USGS Publications Warehouse

    Christakos, G.; Olea, R.A.

    1992-01-01

    A methodology for the design of sampling networks over space is proposed. The methodology is based on spatial random field representations of nonhomogeneous natural processes, and on optimal spatial estimation techniques. One of the most important results of random field theory for physical sciences is its rationalization of correlations in spatial variability of natural processes. This correlation is extremely important both for interpreting spatially distributed observations and for predictive performance. The extent of site sampling and the types of data to be collected will depend on the relationship of subsurface variability to predictive uncertainty. While hypothesis formulation and initial identification of spatial variability characteristics are based on scientific understanding (such as knowledge of the physics of the underlying phenomena, geological interpretations, intuition and experience), the support offered by field data is statistically modelled. This model is not limited by the geometric nature of sampling and covers a wide range in subsurface uncertainties. A factorization scheme of the sampling error variance is derived, which possesses certain attractive properties allowing significant savings in computations. By means of this scheme, a practical sampling design procedure providing suitable indices of the sampling error variance is established. These indices can be used by way of multiobjective decision criteria to obtain the best sampling strategy. Neither the actual implementation of the in-situ sampling nor the solution of the large spatial estimation systems of equations are necessary. The required values of the accuracy parameters involved in the network design are derived using reference charts (readily available for various combinations of data configurations and spatial variability parameters) and certain simple yet accurate analytical formulas. Insight is gained by applying the proposed sampling procedure to realistic examples related
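
    To make the idea concrete: under an assumed covariance model, candidate networks can be ranked by an estimation-variance index before any field sampling takes place. The one-dimensional sketch below uses the simple-kriging variance under an exponential covariance as such an index; the networks and covariance parameters are hypothetical stand-ins for the paper's more general factorized error-variance indices.

```python
# Minimal sketch: ranking candidate sampling networks by mean simple-kriging
# estimation variance under an assumed exponential covariance model.
import numpy as np

def cov(h, sill=1.0, rng=30.0):
    return sill * np.exp(-np.abs(h) / rng)      # exponential covariance

def mean_sk_variance(stations, targets):
    d = np.abs(stations[:, None] - stations[None, :])
    C = cov(d)                                  # station-to-station covariance
    variances = []
    for x0 in targets:
        c0 = cov(np.abs(stations - x0))         # station-to-target covariance
        variances.append(cov(0.0) - c0 @ np.linalg.solve(C, c0))
    return float(np.mean(variances))

targets = np.linspace(0.0, 100.0, 201)          # 1-D prediction grid
regular = np.linspace(0.0, 100.0, 6)            # candidate network A
clustered = np.array([0., 5., 10., 90., 95., 100.])  # candidate network B
for name, net in [("regular", regular), ("clustered", clustered)]:
    print(f"{name:9s} mean kriging variance: {mean_sk_variance(net, targets):.3f}")
```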

  2. Modeling and Simulation for Mission Operations Work System Design

    NASA Technical Reports Server (NTRS)

    Sierhuis, Maarten; Clancey, William J.; Seah, Chin; Trimble, Jay P.; Sims, Michael H.

    2003-01-01

    Work System analysis and design is complex and non-deterministic. In this paper we describe Brahms, a multiagent modeling and simulation environment for designing complex interactions in human-machine systems. Brahms was originally conceived as a business process design tool that simulates work practices, including social systems of work. We describe our modeling and simulation method for mission operations work systems design, based on a research case study in which we used Brahms to design mission operations for a proposed discovery mission to the Moon. We then describe the results of an actual method application project: the Brahms Mars Exploration Rover. Space mission operations are similar to operations of traditional organizations; we show that the application of Brahms for space mission operations design is relevant and transferable to other types of business processes in organizations.

  3. Modeling of an Active Tablet Coating Process.

    PubMed

    Toschkoff, Gregor; Just, Sarah; Knop, Klaus; Kleinebudde, Peter; Funke, Adrian; Djuric, Dejan; Scharrer, Georg; Khinast, Johannes G

    2015-12-01

    Tablet coating is a common unit operation in the pharmaceutical industry, during which a coating layer is applied to tablet cores. The coating uniformity of tablets in a batch is especially critical for active coating, that is, coating that contains an active pharmaceutical ingredient. In recent years, discrete element method (DEM) simulations became increasingly common for investigating tablet coating. In this work, DEM was applied to model an active coating process as closely as possible, using measured model parameters and non-spherical particles. We studied how operational conditions (rotation speed, fill level, number of nozzles, and spray rate) influence the coating uniformity. To this end, simulation runs were planned and interpreted according to a statistical design of (simulation) experiments. Our general goal was to achieve a deeper understanding of the process in terms of residence times and dimensionless scaling laws. With that regard, the results were interpreted in light of analytical models. The results were presented at various detail levels, ranging from an overview of all variations to in-depth considerations. It was determined that the biggest uniformity improvement in a realistic setting was achieved by increasing the number of spray nozzles, followed by increasing the rotation speed and decreasing the fill level.
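
    A statistical design of (simulation) experiments of this kind is straightforward to set up and analyze. The sketch below builds a two-level full-factorial design in the four operating parameters and fits a first-order effects model by least squares; the coded levels and coating-uniformity responses are placeholders, not the study's DEM results.

```python
# Minimal sketch: two-level full-factorial design of simulation experiments
# with a first-order effects fit. The responses (coefficient of variation of
# coating mass) are placeholders, not DEM results.
import itertools
import numpy as np

factors = ["rpm", "fill_level", "n_nozzles", "spray_rate"]
design = np.array(list(itertools.product((-1, 1), repeat=4)))  # 16 coded runs

# Placeholder coating-uniformity responses, one per simulated run.
cv = np.array([9.1, 9.2, 6.0, 6.4, 9.6, 10.1, 6.7, 7.1,
               8.0, 8.2, 5.1, 5.4, 8.6, 9.0, 5.8, 6.1])

# Least-squares fit of an intercept plus one linear term per factor.
X = np.hstack([np.ones((len(design), 1)), design])
beta, *_ = np.linalg.lstsq(X, cv, rcond=None)
for name, b in zip(factors, beta[1:]):
    print(f"{name:11s} low-to-high effect on CV: {2.0 * b:+.2f}")
```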

  4. An Inorganic Microsphere Composite for the Selective Removal of 137 Cesium from Acidic Nuclear Waste Solutions 2: Bench-Scale Column Experiments, Modeling, and Preliminary Process Design

    SciTech Connect

    Troy J. Tranter; T. A. Vereschagina; V. Utgikar

    2009-03-01

    A new inorganic ion exchange composite for removing radioactive cesium from acidic waste streams has been developed. The new material consists of ammonium molybdophosphate, (NH4)3P(Mo3O10)4·3H2O (AMP), synthesized within hollow aluminosilicate microspheres (AMP-C), which are produced as a by-product from coal combustion. The selective cesium exchange capacity of this inorganic composite was evaluated in bench-scale column tests using simulated sodium-bearing waste solution as a surrogate for the acidic tank waste currently stored at the Idaho National Laboratory (INL). Total cesium loading on the columns at saturation agreed very well with equilibrium values predicted from isotherm experiments performed previously. A numerical algorithm for solving the governing partial differential equations (PDE) for cesium uptake was developed using the intraparticle mass transfer coefficient obtained from previous batch kinetic experiments. Solutions to the governing equations were generated to obtain the cesium concentration at the column effluent as a function of throughput volume, using the same conditions as those used for the actual column experiments. The numerical solutions of the PDE fit the column breakthrough data quite well for all the experimental conditions in the study. The model should therefore provide a reliable prediction of column performance at larger scales.
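
    For illustration, a fixed-bed uptake model of the general kind described, advection along the column plus a linear-driving-force mass-transfer term, can be integrated with an explicit upwind scheme. The sketch below is a minimal version of such a solver; every parameter value is a placeholder, not a fitted AMP-C value.

```python
# Minimal sketch: 1-D fixed-bed breakthrough (advection plus linear-driving-
# force uptake) solved by explicit upwind differences. All parameters are
# illustrative placeholders.
import numpy as np

L, v = 0.1, 1e-3              # bed length [m], interstitial velocity [m/s]
k, K, F = 1e-3, 100.0, 1.5    # LDF coefficient [1/s], linear isotherm, phase ratio
c_feed = 1.0                  # feed concentration (normalized)

nz = 100
dz = L / nz
dt = 0.5 * dz / v             # CFL-stable time step [s]
c = np.zeros(nz)              # fluid-phase concentration profile
q = np.zeros(nz)              # adsorbed-phase loading profile

effluent = []
for step in range(60000):                       # 30,000 s of operation
    dqdt = k * (K * c - q)                      # linear driving force uptake
    upwind = np.concatenate(([c_feed], c[:-1])) # inlet boundary condition
    c = c + dt * (-v * (c - upwind) / dz - F * dqdt)
    q = q + dt * dqdt
    if step % 2000 == 0:
        effluent.append(round(float(c[-1]), 3)) # sample the breakthrough curve

print("effluent c/c0:", effluent)
```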

  5. Design Oriented Structural Modeling for Airplane Conceptual Design Optimization

    NASA Technical Reports Server (NTRS)

    Livne, Eli

    1999-01-01

    The main goal for research conducted with the support of this grant was to develop design-oriented structural optimization methods for the conceptual design of airplanes. Traditionally, in conceptual design, airframe weight is estimated based on statistical equations developed over years of fitting airplane weight data in databases of similar existing airplanes. Utilization of such regression equations for the design of new airplanes can be justified only if the new airplanes use structural technology similar to the technology of the airplanes in those weight databases. If any new structural technology is to be pursued, or any new unconventional configurations designed, the statistical weight equations cannot be used. In such cases any structural weight estimation must be based on rigorous "physics based" structural analysis and optimization of the airframes under consideration. Work under this grant progressed to explore airframe design-oriented structural optimization techniques along two lines of research: methods based on "fast" design-oriented finite element technology, and methods based on equivalent plate / equivalent shell models of airframes, in which the vehicle is modelled as an assembly of plate and shell components, each simulating a lifting surface or nacelle / fuselage piece. Since responses to changes in geometry are essential in conceptual design of airplanes, as well as the capability to optimize the shape itself, research supported by this grant sought to develop efficient techniques for parametrization of airplane shape and sensitivity analysis with respect to shape design variables. Towards the end of the grant period a prototype automated structural analysis code designed to work with the NASA Aircraft Synthesis conceptual design code ACSYNT was delivered to NASA Ames.

  6. Design of a framework for modeling, integration and simulation of physiological models.

    PubMed

    Erson, E; Cavusoglu, M

    2010-01-01

    Modeling and simulation of physiological processes must deal with the challenges of multiscale models, in which coupling is very high within and among scales. Information technology approaches, together with related analytical and computational tools, help address these challenges. The Physiological Model Simulation, Integration and Modeling Framework, Phy-SIM, provides a modeling environment that helps cultivate various approaches to the inherent problem of multiscale modeling of physiological systems. In this paper, we present the modular design of Phy-SIM. The proposed layered design of Phy-SIM separates structure from function in physiological processes, advocating modular thinking in developing and integrating physiological models. Moreover, the ontology-based architecture improves the modeling process through mechanisms that attach anatomical and physiological ontological information to the models. The ultimate aim of the proposed approaches is to enhance the physiological model development and integration processes by providing the tools and mechanisms in Phy-SIM.

  7. Using Storyboards to Integrate Models and Informal Design Knowledge

    NASA Astrophysics Data System (ADS)

    Haesen, Mieke; van den Bergh, Jan; Meskens, Jan; Luyten, Kris; Degrandsart, Sylvain; Demeyer, Serge; Coninx, Karin

    Model-driven development of user interfaces has become increasingly powerful in recent years. Unfortunately, model-driven approaches have the inherent limitation that they cannot handle the informal nature of some of the artifacts used in truly multidisciplinary user interface development such as storyboards, sketches, scenarios and personas. In this chapter, we present an approach and tool support for multidisciplinary user interface development bridging informal and formal artifacts in the design and development process. Key features of the approach are the usage of annotated storyboards, which can be connected to other models through an underlying meta-model, and cross-toolkit design support based on an abstract user interface model.

  8. Composing Models of Geographic Physical Processes

    NASA Astrophysics Data System (ADS)

    Hofer, Barbara; Frank, Andrew U.

    Processes are central for geographic information science; yet geographic information systems (GIS) lack capabilities to represent process related information. A prerequisite to including processes in GIS software is a general method to describe geographic processes independently of application disciplines. This paper presents such a method, namely a process description language. The vocabulary of the process description language is derived formally from mathematical models. Physical processes in geography can be described in two equivalent languages: partial differential equations or partial difference equations, where the latter can be shown graphically and used as a method for application specialists to enter their process models. The vocabulary of the process description language comprises components for describing the general behavior of prototypical geographic physical processes. These process components can be composed into basic models of geographic physical processes, as shown by means of an example.

  9. Laser processing with specially designed laser beam

    NASA Astrophysics Data System (ADS)

    Asratyan, A. A.; Bulychev, N. A.; Feofanov, I. N.; Kazaryan, M. A.; Krasovskii, V. I.; Lyabin, N. A.; Pogosyan, L. A.; Sachkov, V. I.; Zakharyan, R. A.

    2016-04-01

    The possibility of using laser systems to form beams with special spatial configurations has been studied. The laser systems applied had a self-conjugate cavity based on the elements of copper vapor lasers (LT-5Cu, LT-10Cu, LT-30Cu) with an average power of 5, 10, or 30 W. The active elements were pumped by current pulses of duration 80-100 ns. The duration of laser generation pulses was up to 25 ns. The generator unit included an unstable cavity, where one reflector was a special mirror with a reflecting coating. The various original optical schemes used made it possible to explore the spatial configurations and energy characteristics of the output laser beams in their interaction with micro- and nanoparticles fabricated from various materials. In these experiments, the beam dimensions of the obtained zones varied from 0.3 to 5 µm, which is comparable with the minimum permissible dimensions determined by the optical elements applied. This method is useful in transforming a large amount of information at the laser pulse repetition rate of 10-30 kHz. It was possible to realize high-precision micromachining and microfabrication of microscale details by direct writing, cutting and drilling (with the cutting width and through-hole diameters ranging from 3 to 100 µm) and to produce microscale, deep, intricate and narrow grooves on substrate surfaces of metals and nonmetal materials. This system is used for producing high-quality microscale details without moving the object under treatment. It can also be used for microcutting and microdrilling in a variety of metals such as molybdenum, copper and stainless steel, with a thickness of up to 300 µm, and in nonmetals such as silicon, sapphire and diamond with a thickness ranging from 10 µm to 1 mm with different thermal parameters and a specially designed laser beam.

  10. Using experimental design modules for process characterization in manufacturing/materials processes laboratories

    NASA Technical Reports Server (NTRS)

    Ankenman, Bruce; Ermer, Donald; Clum, James A.

    1994-01-01

    Modules dealing with statistical experimental design (SED), process modeling and improvement, and response surface methods have been developed and tested in two laboratory courses. One course was a manufacturing processes course in Mechanical Engineering and the other was a materials processing course in Materials Science and Engineering. Each module is used as an 'experiment' in the course, with the intent that subsequent course experiments will use SED methods for analysis and interpretation of data. The modules' effectiveness has been evaluated through survey questionnaires and through inclusion of the module methodology in course examination questions. Results of the evaluation have been very positive. Those evaluation results and details of the modules' content and implementation are presented. The modules represent an important component for updating laboratory instruction and for providing training in quality for improved engineering practice.

  11. Articulating the Resources for Business Process Analysis and Design

    ERIC Educational Resources Information Center

    Jin, Yulong

    2012-01-01

    Effective process analysis and modeling are important phases of the business process management lifecycle. When many activities and multiple resources are involved, it is very difficult to build a correct business process specification. This dissertation provides a resource perspective of business processes. It aims at a better process analysis…

  12. Xylose fermentation: Analysis, modelling, and design

    SciTech Connect

    Slininger, P.J.W.

    1988-01-01

    Ethanolic fermentation is a means of utilizing xylose-rich industrial wastes, but an optimized bioprocess is lacking. Pachysolen tannophilus NRRL Y-7124 was the first yeast discovered capable of significant ethanol production from xylose and has served as a model for studies of other yeasts mediating this conversion. However, a comparative evaluation of strains led the authors to focus on Pichia stipitis NRRL Y-7124 as the yeast with highest potential for application. Given 150 g/l xylose in complex medium, strain Y-7124 functioned optimally at 25-26 C and pH 4-7 to accumulate 56 g/l ethanol with negligible xylitol production. Dissolved oxygen concentration was critical to cell growth, and in order to measure it accurately, a colorimetric assay was developed to allow calibration of electrodes based on oxygen solubility in media of varying composition. Specific growth rate was a Monod function of limiting substrate concentration (oxygen and/or xylose). Both specific ethanol productivity and oxygen uptake rate were growth-associated, but only the former was maintenance-associated. Both growth and fermentation were inhibited by high xylose and ethanol concentrations. Carbon and cofactor balances supported modelling xylose metabolism as a combination of four processes: assimilation, pentose phosphate oxidation, respiration, and ethanolic fermentation. A mathematical model describing the stoichiometry and kinetics was constructed, and its predictive capacity was confirmed by comparing simulated and experimental batch cultures. Consideration of example processes indicated that this model constitutes an important tool for designing the optimum bioprocess for utilizing xylose-rich wastes.
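
    A minimal sketch of the modelling approach described, Monod growth with product inhibition and growth- plus maintenance-associated (Luedeking-Piret) ethanol production, is given below; the kinetic constants are illustrative and are not the fitted values for strain Y-7124.

```python
# Minimal sketch: Monod growth with ethanol inhibition and Luedeking-Piret
# production kinetics. All constants are illustrative placeholders.
from scipy.integrate import solve_ivp

mu_max, Ks = 0.25, 2.0        # max growth rate [1/h], Monod constant [g/L]
Yxs, Yps = 0.10, 0.45         # biomass and ethanol yields on xylose [g/g]
alpha, beta = 3.0, 0.03       # growth- and maintenance-associated coefficients
P_max = 60.0                  # ethanol concentration that halts growth [g/L]

def rhs(t, y):
    X, S, P = y               # biomass, xylose, ethanol [g/L]
    if S <= 0.0:
        return [0.0, 0.0, 0.0]
    mu = mu_max * S / (Ks + S) * max(0.0, 1.0 - P / P_max)  # Monod + inhibition
    qp = alpha * mu + beta    # Luedeking-Piret specific productivity
    dX = mu * X
    dP = qp * X
    dS = -(dX / Yxs + dP / Yps)   # substrate consumed for growth and product
    return [dX, dS, dP]

sol = solve_ivp(rhs, (0.0, 120.0), [0.1, 150.0, 0.0], max_step=0.5)
print(f"final ethanol {sol.y[2, -1]:.1f} g/L, "
      f"residual xylose {sol.y[1, -1]:.1f} g/L")
```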

  13. Model-based software process improvement

    NASA Technical Reports Server (NTRS)

    Zettervall, Brenda T.

    1994-01-01

    The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.

  14. Perspectives on the design of safer nanomaterials and manufacturing processes

    PubMed Central

    Geraci, Charles; Heidel, Donna; Sayes, Christie; Hodson, Laura; Schulte, Paul; Eastlake, Adrienne; Brenner, Sara

    2015-01-01

    A concerted effort is being made to insert Prevention through Design principles into discussions of sustainability, occupational safety and health, and green chemistry related to nanotechnology. Prevention through Design is a set of principles that includes solutions to design out potential hazards in nanomanufacturing including the design of nanomaterials, and strategies to eliminate exposures and minimize risks that may be related to the manufacturing processes and equipment at various stages of the lifecycle of an engineered nanomaterial. PMID:26435688

  15. Perspectives on the design of safer nanomaterials and manufacturing processes

    NASA Astrophysics Data System (ADS)

    Geraci, Charles; Heidel, Donna; Sayes, Christie; Hodson, Laura; Schulte, Paul; Eastlake, Adrienne; Brenner, Sara

    2015-09-01

    A concerted effort is being made to insert Prevention through Design principles into discussions of sustainability, occupational safety and health, and green chemistry related to nanotechnology. Prevention through Design is a set of principles, which includes solutions to design out potential hazards in nanomanufacturing including the design of nanomaterials, and strategies to eliminate exposures and minimize risks that may be related to the manufacturing processes and equipment at various stages of the lifecycle of an engineered nanomaterial.

  16. Design and Implementation of a Multimedia DBMS: Complex Query Processing

    DTIC Science & Technology

    1991-09-01

    This master's thesis, "Design and Implementation of a Multimedia DBMS: Complex Query Processing," by Huseyin Aygun (September 1991; thesis advisor: Vincent Y. Lum), is approved for public release. Chapter II of the thesis describes the general architecture of the MDBMS, and Chapter IV presents the design of complex query processing.

  17. Model Design for Military Advisors

    DTIC Science & Technology

    2013-05-02

    are highly prized, the Stoic Acquaintance model was a poor choice. Approach to Counterpart's Development: In any advising role, it is crucial...dilemma that could lead to non-specific results, superficiality, and failure. Practically, these advisors can fall into the category of "eye candy

  18. Cognitive Design for Learning: Cognition and Emotion in the Design Process

    ERIC Educational Resources Information Center

    Hasebrook, Joachim

    2016-01-01

    We are so used to accepting new technologies as the driver of change and innovation in human-computer interfaces (HCI). In our research we focus on the development of innovations as a design process--or design, for short. We also refer to the entire process of creating innovations and putting them to use as "cognitive processes"--or…

  19. Freshman Interest Groups: Designing a Model for Success

    ERIC Educational Resources Information Center

    Ratliff, Gerald Lee

    2008-01-01

    Freshman Interest Groups (FIGS) have become a popular model for academic and student affairs colleagues who are concerned that first-year students learn to reflect on life experiences and daily events as part of the learning process. A well-designed FIG model meets the academic, social and career concerns for first-year students by providing an…

  20. GREENER CHEMICAL PROCESS DESIGN ALTERNATIVES ARE REVEALED USING THE WASTE REDUCTION DECISION SUPPORT SYSTEM (WAR DSS)

    EPA Science Inventory

    The Waste Reduction Decision Support System (WAR DSS) is a Java-based software product providing comprehensive modeling of potential adverse environmental impacts (PEI) predicted to result from newly designed or redesigned chemical manufacturing processes. The purpose of this so...

  1. Birth/death process model

    NASA Technical Reports Server (NTRS)

    Solloway, C. B.; Wakeland, W.

    1976-01-01

    A first-order Markov model was developed on a digital computer for a population with specific characteristics. The system is user interactive and self-documenting, and it does not require the user to have a complete understanding of the underlying model details. It contains thorough error-checking algorithms on input, as well as default capabilities.
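
    A continuous-time birth/death process of this kind can be simulated exactly with the Gillespie algorithm, as in the sketch below; the rates and initial population are illustrative placeholders.

```python
# Minimal sketch: exact (Gillespie) simulation of a linear birth/death
# process. Per-capita rates and initial population are placeholders.
import random

def simulate(n0=50, birth=0.11, death=0.10, t_end=100.0, seed=1):
    random.seed(seed)
    t, n, path = 0.0, n0, [(0.0, n0)]
    while t < t_end and n > 0:
        rate = (birth + death) * n          # total event rate for n individuals
        t += random.expovariate(rate)       # exponential waiting time to next event
        n += 1 if random.random() < birth / (birth + death) else -1
        path.append((t, n))
    return path

path = simulate()
print("final time, population:", path[-1])
```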

  2. Mathematical Modeling: A Structured Process

    ERIC Educational Resources Information Center

    Anhalt, Cynthia Oropesa; Cortez, Ricardo

    2015-01-01

    Mathematical modeling, in which students use mathematics to explain or interpret physical, social, or scientific phenomena, is an essential component of the high school curriculum. The Common Core State Standards for Mathematics (CCSSM) classify modeling as a K-12 standard for mathematical practice and as a conceptual category for high school…

  3. Cascaded process model based control: packed absorption column application.

    PubMed

    Govindarajan, Anand; Jayaraman, Suresh Kumar; Sethuraman, Vijayalakshmi; Raul, Pramod R; Rhinehart, R Russell

    2014-03-01

    Nonlinear, adaptive, process-model based control is demonstrated in a cascaded single-input-single-output mode for pressure drop control in a pilot-scale packed absorption column. The process is shown to be nonlinear. Control is demonstrated in both servo and regulatory modes, for no wind-up in a constrained situation, and for bumpless transfer. Model adaptation is demonstrated and shown to provide process insight. The application procedure is revealed as a design guide to aid others in implementing process-model based control.

  4. Evaluating two process scale chromatography column header designs using CFD.

    PubMed

    Johnson, Chris; Natarajan, Venkatesh; Antoniou, Chris

    2014-01-01

    Chromatography is an indispensable unit operation in the downstream processing of biomolecules. Scaling of chromatographic operations typically involves a significant increase in the column diameter. At this scale, the flow distribution within a packed bed could be severely affected by the distributor design in process scale columns. Different vendors offer process scale columns with varying design features. The effect of these design features on the flow distribution in packed beds and the resultant effect on column efficiency and cleanability needs to be properly understood in order to prevent unpleasant surprises on scale-up. Computational Fluid Dynamics (CFD) provides a cost-effective means to explore the effect of various distributor designs on process scale performance. In this work, we present a CFD tool that was developed and validated against experimental dye traces and tracer injections. Subsequently, the tool was employed to compare and contrast two commercially available header designs.

  5. Designing a process for executing projects under an international agreement

    NASA Technical Reports Server (NTRS)

    Mohan, S. N.

    2003-01-01

    Projects executed under an international agreement require special arrangements in order to operate within the confines of regulations issued by the State Department and the Commerce Department. A process was therefore developed to communicate enterprise-level guidance and procedural information uniformly to projects, based on interpretations that carry the weight of institutional authority. This paper provides a script for designing processes in general, using this particular process for context. While the context is incidental, the method described is applicable to any process. The paper also expounds on novel features used to disseminate the procedural details over the Internet once such a process has been designed.

  6. Geometry Modeling and Grid Generation for Design and Optimization

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.

    1998-01-01

    Geometry modeling and grid generation (GMGG) have played and will continue to play an important role in computational aerosciences. During the past two decades, tremendous progress has occurred in GMGG; however, GMGG is still the biggest bottleneck to routine applications for complicated Computational Fluid Dynamics (CFD) and Computational Structures Mechanics (CSM) models for analysis, design, and optimization. We are still far from incorporating GMGG tools in a design and optimization environment for complicated configurations. It is still a challenging task to parameterize an existing model in today's Computer-Aided Design (CAD) systems, and the models created are not always good enough for automatic grid generation tools. Designers may believe their models are complete and accurate, but unseen imperfections (e.g., gaps, unwanted wiggles, free edges, slivers, and transition cracks) often cause problems in gridding for CSM and CFD. Despite many advances in grid generation, the process is still the most labor-intensive and time-consuming part of the computational aerosciences for analysis, design, and optimization. In an ideal design environment, a design engineer would use a parametric model to evaluate alternative designs effortlessly and optimize an existing design for a new set of design objectives and constraints. For this ideal environment to be realized, the GMGG tools must have the following characteristics: (1) be automated, (2) provide consistent geometry across all disciplines, (3) be parametric, and (4) provide sensitivity derivatives. This paper will review the status of GMGG for analysis, design, and optimization processes, and it will focus on some emerging ideas that will advance the GMGG toward the ideal design environment.
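
    Two of the characteristics asked of GMGG tools, parametric geometry and sensitivity derivatives, can be illustrated with a toy parametrization: the sketch below evaluates a trapezoidal wing planform area and differentiates it with respect to the shape variables by central finite differences. The parametrization and values are hypothetical.

```python
# Minimal sketch: a parametric geometry with finite-difference shape
# sensitivities. The trapezoidal-wing parametrization is a toy example.
import numpy as np

def wing_area(p):
    span, c_root, taper = p
    return 0.5 * span * c_root * (1.0 + taper)   # trapezoidal planform area

def sensitivities(f, p, h=1e-6):
    p = np.asarray(p, dtype=float)
    grads = np.empty_like(p)
    for i in range(p.size):
        dp = np.zeros_like(p)
        dp[i] = h
        grads[i] = (f(p + dp) - f(p - dp)) / (2.0 * h)  # central differences
    return grads

p0 = [30.0, 5.0, 0.4]        # span [m], root chord [m], taper ratio
print("area:", wing_area(p0), "dA/dp:", sensitivities(wing_area, p0))
```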

  7. A Comparative of business process modelling techniques

    NASA Astrophysics Data System (ADS)

    Tangkawarow, I. R. H. T.; Waworuntu, J.

    2016-04-01

    Many business process modelling techniques are in use today. This article reports research on the differences among them: for each technique, the definition and the structure are explained. The paper presents a comparative analysis of some popular business process modelling techniques, using a framework based on two criteria: the notation, and how the technique works when applied to Somerleyton Animal Park. The discussion of each technique ends with its advantages and disadvantages. The conclusion recommends business process modelling techniques that are easy to use and serves as a basis for evaluating further modelling techniques.

  8. Context-Aware Design for Process Flexibility and Adaptation

    ERIC Educational Resources Information Center

    Yao, Wen

    2012-01-01

    Today's organizations face continuous and unprecedented changes in their business environment. Traditional process design tools tend to be inflexible and can only support rigidly defined processes (e.g., order processing in the supply chain). This considerably restricts their real-world application value, especially in the dynamic and…

  9. Materials, design and processing of air encapsulated MEMS packaging

    NASA Astrophysics Data System (ADS)

    Fritz, Nathan T.

    integrity. The development of mechanical models complemented the experimental studies. A model of the overcoat materials used the film properties and elastic deformations to study the stress-strain behavior of the suspended dielectric films under external forces. The experimental molding tests and mechanical models were used to establish processing conditions and physical designs for the cavities as a function of cavity size. A novel, metal-free chip package was investigated combining the in-situ thermal decomposition of the sacrificial material during post-mold curing of the lead frame molding compound. Sacrificial materials were characterized for their degree of decomposition during the molding cure to provide a chip package with improved mechanical support and no size restrictions. Improvements to the air cavities for MEMS packaging led to investigations and refinements of other microfabrication processes. The sacrificial polycarbonate materials were shown to be useful as temporary bonding materials for wafer-level bonding. The release temperature and conditions of the processed wafer can be changed based on the polycarbonate formulation. The electroless deposition of metal was investigated as an alternative process for metalizing the air cavities. The deposition of silver and copper using a Sn/Ag catalyst as a replacement for costly palladium activation was demonstrated. The electroless deposition was tested on polymer and silicon dioxide surfaces for organic boards and through-silicon vias.

  10. The Computer-Aided Analytic Process Model. Operations Handbook for the Analytic Process Model Demonstration Package

    DTIC Science & Technology

    1986-01-01

    Research Note 86-06, "The Computer-Aided Analytic Process Model: Operations Handbook for the Analytic Process Model Demonstration Package," by Ronald G... Keywords: Analytic Process Model; Operations Handbook; Tutorial; Apple; Systems Taxonomy Model; Training System; Bradley Infantry Fighting Vehicle; BIFV. The abstract continues in the companion volume, "The Analytic Process Model for

  11. A Hierarchical Process-Dissociation Model

    ERIC Educational Resources Information Center

    Rouder, Jeffrey N.; Lu, Jun; Morey, Richard D.; Sun, Dongchu; Speckman, Paul L.

    2008-01-01

    In fitting the process-dissociation model (L. L. Jacoby, 1991) to observed data, researchers aggregate outcomes across participant, items, or both. T. Curran and D. L. Hintzman (1995) demonstrated how biases from aggregation may lead to artifactual support for the model. The authors develop a hierarchical process-dissociation model that does not…
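
    For context, the aggregate estimates that the hierarchical model is designed to improve upon follow from Jacoby's two equations: inclusion I = R + (1 - R)F and exclusion E = (1 - R)F, so R = I - E and F = E / (1 - R). The sketch below solves them; the hit rates are placeholder numbers.

```python
# Minimal sketch: the standard (aggregated) process-dissociation estimates
# (Jacoby, 1991). Inclusion I = R + (1-R)F, exclusion E = (1-R)F.
def process_dissociation(p_inclusion, p_exclusion):
    R = p_inclusion - p_exclusion                 # recollection estimate
    F = p_exclusion / (1.0 - R) if R < 1.0 else float("nan")  # familiarity
    return R, F

# Illustrative aggregate hit rates (placeholder numbers).
R, F = process_dissociation(0.70, 0.25)
print(f"recollection R = {R:.2f}, familiarity F = {F:.2f}")
```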

  12. Design and development of a layer-based additive manufacturing process for the realization of metal parts of designed mesostructure

    NASA Astrophysics Data System (ADS)

    Williams, Christopher Bryant

    Low-density cellular materials, metallic bodies with gaseous voids, are a unique class of materials that are characterized by their high strength, low mass, good energy absorption characteristics, and good thermal and acoustic insulation properties. In an effort to take advantage of this entire suite of positive mechanical traits, designers are tailoring the cellular mesostructure for multiple design objectives. Unfortunately, existing cellular material manufacturing technologies limit the design space as they are limited to certain part mesostructure, material type, and macrostructure. The opportunity that exists to improve the design of existing products, and the ability to reap the benefits of cellular materials in new applications is the driving force behind this research. As such, the primary research goal of this work is to design, embody, and analyze a manufacturing process that provides a designer the ability to specify the material type, material composition, void morphology, and mesostructure topology for any conceivable part geometry. The accomplishment of this goal is achieved in three phases of research: (1) Design---Following a systematic design process and a rigorous selection exercise, a layer-based additive manufacturing process is designed that is capable of meeting the unique requirements of fabricating cellular material geometry. Specifically, metal parts of designed mesostructure are fabricated via three-dimensional printing of metal oxide ceramic powder followed by post-processing in a reducing atmosphere. (2) Embodiment ---The primary research hypothesis is verified through the use of the designed manufacturing process chain to successfully realize metal parts of designed mesostructure. (3) Modeling & Evaluation ---The designed manufacturing process is modeled in this final research phase so as to increase understanding of experimental results and to establish a foundation for future analytical modeling research. In addition to an analysis of

  13. Design of a tomato packing system by image processing and optimization processing

    NASA Astrophysics Data System (ADS)

    Li, K.; Kumazaki, T.; Saigusa, M.

    2016-02-01

    In recent years, with the development of environmental control systems in plant factories, tomato production has rapidly increased in Japan. However, with the decline in the availability of agricultural labor, there is a need to automate grading, sorting and packing operations. In this research, we designed an automatic packing program in which tomato weight is estimated by image processing and the tomatoes are packed in an optimized configuration. The weight was estimated from the pixel area properties after an L*a*b* color model conversion, noise rejection, hole filling and boundary preprocessing. The packing optimization was designed around a 0-1 knapsack algorithm for dynamic combinatorial optimization.
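
    A minimal sketch of the knapsack step: given per-tomato weights estimated upstream by the image-processing stage, 0-1 knapsack dynamic programming selects the subset that best fills a box. The weights and box capacity below are placeholder values.

```python
# Minimal sketch: 0-1 knapsack dynamic programming that fills a box as close
# to its capacity as possible. Weights are placeholder grams.
def pack(weights, capacity):
    n = len(weights)
    reachable = [True] + [False] * capacity   # reachable[w]: subset sums to w
    choice = [[False] * (capacity + 1) for _ in range(n)]
    for i, wt in enumerate(weights):
        # Iterate downward so each tomato is used at most once.
        for w in range(capacity, wt - 1, -1):
            if not reachable[w] and reachable[w - wt]:
                reachable[w] = True
                choice[i][w] = True           # item i first made sum w reachable
    w = max(w for w in range(capacity + 1) if reachable[w])
    picked = []
    for i in range(n - 1, -1, -1):            # backtrack the chosen subset
        if choice[i][w]:
            picked.append(i)
            w -= weights[i]
    return picked

weights = [152, 141, 178, 160, 149, 187, 133]   # estimated tomato masses [g]
print("tomatoes picked:", pack(weights, capacity=600))
```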

  14. Computational Process Modeling for Additive Manufacturing

    NASA Technical Reports Server (NTRS)

    Bagg, Stacey; Zhang, Wei

    2014-01-01

    This presentation covers computational process and material modeling of powder-bed additive manufacturing of IN 718. The goals are to optimize material build parameters with reduced time and cost through modeling, to increase understanding of build properties and the reliability of builds, and to decrease the time to adoption of the process for critical hardware, with the potential to decrease post-build heat treatments. The approach is to conduct single-track and coupon builds at various build parameters; record build-parameter information and QM Meltpool data; refine the Applied Optimization powder-bed AM process model using these data; report thermal modeling results; conduct metallography of the build samples; calibrate STK models using the metallography findings; run the STK models using AO thermal profiles and report the results; and validate the modeling with an additional build. Initial findings: photodiode intensity measurements are highly linear with power input; melt-pool intensity is highly correlated with melt-pool size; and melt-pool size and intensity increase with power. Applied Optimization will use the data to develop a powder-bed additive manufacturing process model.

  15. On the optimal design of the disassembly and recovery processes

    SciTech Connect

    Xanthopoulos, A.; Iakovou, E.

    2009-05-15

    This paper tackles the problem of the optimal design of the recovery processes of the end-of-life (EOL) electric and electronic products, with a special focus on the disassembly issues. The objective is to recover as much ecological and economic value as possible, and to reduce the overall produced quantities of waste. In this context, a medium-range tactical problem is defined and a novel two-phased algorithm is presented for a remanufacturing-driven reverse supply chain. In the first phase, we propose a multicriteria/goal-programming analysis for the identification and the optimal selection of the most 'desirable' subassemblies and components to be disassembled for recovery, from a set of different types of EOL products. In the second phase, a multi-product, multi-period mixed-integer linear programming (MILP) model is presented, which addresses the optimization of the recovery processes, while taking into account explicitly the lead times of the disassembly and recovery processes. Moreover, a simulation-based solution approach is proposed for capturing the uncertainties in reverse logistics. The overall approach leads to an easy-to-use methodology that could support effectively middle level management decisions. Finally, the applicability of the developed methodology is illustrated by its application on a specific case study.

  16. On the optimal design of the disassembly and recovery processes.

    PubMed

    Xanthopoulos, A; Iakovou, E

    2009-05-01

    This paper tackles the problem of the optimal design of the recovery processes of the end-of-life (EOL) electric and electronic products, with a special focus on the disassembly issues. The objective is to recover as much ecological and economic value as possible, and to reduce the overall produced quantities of waste. In this context, a medium-range tactical problem is defined and a novel two-phased algorithm is presented for a remanufacturing-driven reverse supply chain. In the first phase, we propose a multicriteria/goal-programming analysis for the identification and the optimal selection of the most 'desirable' subassemblies and components to be disassembled for recovery, from a set of different types of EOL products. In the second phase, a multi-product, multi-period mixed-integer linear programming (MILP) model is presented, which addresses the optimization of the recovery processes, while taking into account explicitly the lead times of the disassembly and recovery processes. Moreover, a simulation-based solution approach is proposed for capturing the uncertainties in reverse logistics. The overall approach leads to an easy-to-use methodology that could support effectively middle level management decisions. Finally, the applicability of the developed methodology is illustrated by its application on a specific case study.

  17. Work system design for patient safety: the SEIPS model

    PubMed Central

    Carayon, P; Hundt, A Schoofs; Karsh, B‐T; Gurses, A P; Alvarado, C J; Smith, M; Brennan, P Flatley

    2006-01-01

    Models and methods of work system design need to be developed and implemented to advance research in and design for patient safety. In this paper we describe how the Systems Engineering Initiative for Patient Safety (SEIPS) model of work system and patient safety, which provides a framework for understanding the structures, processes and outcomes in health care and their relationships, can be used toward these ends. An application of the SEIPS model in one particular care setting (outpatient surgery) is presented and other practical and research applications of the model are described. PMID:17142610

  18. Modeling of thermal processes in waveguide tracts induction soldering

    NASA Astrophysics Data System (ADS)

    Murygin, A. V.; Tynchenko, V. S.; Laptenok, V. D.; Emilova, O. A.; Seregin, Yu N.

    2017-02-01

    This paper addresses the development of induction heating models that describe the heating of individual structural components of the waveguide path and of the product as a whole. The proposed mathematical models are based on the thermodynamic equations and on the heat balance law. A system of mathematical models of the heating process, covering surge tube and flange heating, is presented, together with a mathematical model of the energy distribution. Modelling these equations in the Matlab system produced heating curves for the tube, flange, and coupling, and these design charts are confirmed by the results of an experimental study in which pyrometers were used for temperature control and a video camera for visual monitoring of the process parameters. On the basis of the obtained models, the features of the induction soldering process are analysed, and the need to automate it through information control systems for thermal management of the joined elements is identified.
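
    A heat-balance model of the general kind described can be sketched as two lumped-capacitance energy balances for the joined parts, coupled by contact conduction; all thermal parameters below are illustrative placeholders, not values from the paper.

```python
# Minimal sketch: lumped heat balances for two induction-heated parts
# (tube and flange) coupled by contact conduction. Parameters are placeholders.
from scipy.integrate import solve_ivp

def rhs(t, T, P=(900.0, 600.0), mcp=(180.0, 420.0), hA=(1.2, 2.0),
        k_contact=3.0, T_amb=293.0):
    T_tube, T_flange = T
    q = k_contact * (T_tube - T_flange)   # conduction across the joint [W]
    dT_tube = (P[0] - hA[0] * (T_tube - T_amb) - q) / mcp[0]
    dT_flange = (P[1] - hA[1] * (T_flange - T_amb) + q) / mcp[1]
    return [dT_tube, dT_flange]

sol = solve_ivp(rhs, (0.0, 120.0), [293.0, 293.0], max_step=0.5)
print(f"after 120 s: tube {sol.y[0, -1]:.0f} K, flange {sol.y[1, -1]:.0f} K")
```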

  19. Integrating rock mechanics issues with repository design through design process principles and methodology

    SciTech Connect

    Bieniawski, Z.T.

    1996-04-01

    A good designer needs not only knowledge for designing (technical know-how that is used to generate alternative design solutions) but also must have knowledge about designing (appropriate principles and systematic methodology to follow). Concepts such as "design for manufacture" or "concurrent engineering" are widely used in the industry. In the field of rock engineering, only limited attention has been paid to the design process, because design of structures in rock masses presents unique challenges to the designers as a result of the uncertainties inherent in characterization of geologic media. However, a stage has now been reached where we are able to sufficiently characterize rock masses for engineering purposes and identify the rock mechanics issues involved, but we still lack engineering design principles and methodology to maximize our design performance. This paper discusses the principles and methodology of the engineering design process directed to integrating site characterization activities with design, construction and performance of an underground repository. Using the latest information from the Yucca Mountain Project on geology, rock mechanics and starter tunnel design, the current lack of integration is pointed out, and it is shown how rock mechanics issues can be effectively interwoven with repository design through a systematic design process methodology leading to improved repository performance. In essence, the design process is seen as the use of design principles within an integrating design methodology, leading to innovative problem solving. In particular, a new concept of "Design for Constructibility and Performance" is introduced. This is discussed with respect to ten rock mechanics issues identified for repository design and performance.

  20. Simulation Modeling of Software Development Processes

    NASA Technical Reports Server (NTRS)

    Calavaro, G. F.; Basili, V. R.; Iazeolla, G.

    1996-01-01

    A simulation modeling approach is proposed for the prediction of software process productivity indices, such as cost and time-to-market, and the sensitivity analysis of such indices to changes in the organization parameters and user requirements. The approach uses a timed Petri Net and Object Oriented top-down model specification. Results demonstrate the model representativeness, and its usefulness in verifying process conformance to expectations, and in performing continuous process improvement and optimization.
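
    As a minimal sketch of the timed-Petri-net idea, the toy simulator below fires the transitions of a two-step software process and advances a global clock through completion events; the net structure and durations are hypothetical, and unlimited resources are assumed (every enabled transition fires immediately).

```python
# Minimal sketch: a toy timed Petri net for a two-step software process
# (design -> review). Structure and durations are illustrative only.
import heapq

places = {"backlog": 3, "designed": 0, "reviewed": 0}
transitions = [
    # (name, input place, output place, duration in days)
    ("design", "backlog", "designed", 5.0),
    ("review", "designed", "reviewed", 2.0),
]

clock, pending = 0.0, []          # pending: (finish_time, output place)
while True:
    # Fire every enabled transition: consume a token, schedule its completion.
    for name, src, dst, dur in transitions:
        while places[src] > 0:
            places[src] -= 1
            heapq.heappush(pending, (clock + dur, dst))
    if not pending:
        break
    clock, dst = heapq.heappop(pending)   # advance to the next completion event
    places[dst] += 1

print(f"all work reviewed after {clock:.1f} days; tokens: {places}")
```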

  1. Ada COCOMO and the Ada Process Model

    DTIC Science & Technology

    1989-01-01

    language, the use of incremental development, and the use of the Ada process model capitalizing on the strengths of Ada to improve the efficiency of software...development. This paper presents the portions of the revised Ada COCOMO dealing with the effects of Ada and the Ada process model . The remainder of...this section of the paper discusses the objectives of Ada COCOMO. Section 2 describes the Ada Process Model and its overall effects on software
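
    For context, the model family being revised here is the classic COCOMO 81 nominal-effort equations; a minimal sketch follows. The mode constants are Boehm's published values, but the Ada process-model adjustments the report describes are not reproduced.

    ```python
    # Classic COCOMO 81 nominal-effort equations (Boehm), shown only to
    # illustrate the model family the revised Ada COCOMO builds on; the Ada
    # process-model adjustments described in the report are not included.
    MODES = {                      # mode: (a, b, c, d)
        "organic":      (2.4, 1.05, 2.5, 0.38),
        "semidetached": (3.0, 1.12, 2.5, 0.35),
        "embedded":     (3.6, 1.20, 2.5, 0.32),
    }

    def cocomo81(kdsi: float, mode: str = "embedded"):
        """Return (effort in person-months, schedule in months) for a project
        of `kdsi` thousand delivered source instructions."""
        a, b, c, d = MODES[mode]
        effort = a * kdsi ** b
        schedule = c * effort ** d
        return effort, schedule

    effort, schedule = cocomo81(100.0)   # a 100-KDSI embedded project
    print(f"effort ~ {effort:.0f} PM, schedule ~ {schedule:.1f} months")
    ```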

  2. Computational Model Optimization for Enzyme Design Applications

    DTIC Science & Technology

    2007-11-02

    naturally occurring E. coli chorismate mutase (EcCM) enzyme through computational design. Although the stated milestone of creating a novel... chorismate mutase (CM) was not achieved, the enhancement of the underlying computational model through the development of the two-body PB method will facilitate the future design of novel protein catalysts.

  3. Models for Turbulent Transport Processes.

    ERIC Educational Resources Information Center

    Hill, James C.

    1979-01-01

    Since the statistical theories of turbulence that have developed over the last twenty or thirty years are too abstract and unreliable to be of much use to chemical engineers, this paper introduces the techniques of single point models and suggests some areas of needed research. (BB)

  4. Reload design process at Yankee Atomic Electric Company

    SciTech Connect

    Weader, R.J.

    1986-01-01

    Yankee Atomic Electric Company (YAEC) performs reload design and licensing for their nuclear power plants: Yankee Rowe, Maine Yankee, and Vermont Yankee. Significant savings in labor and computer costs have been achieved in the reload design process by the use of the SIMULATE nodal code using the CASMO assembly burnup code or LEOPARD pin cell burnup code inputs to replace the PDQ diffusion theory code in many required calculations for the Yankee Rowe and Maine Yankee pressurized water reactors (PWRs). An efficient process has evolved for the design of reloads for the Vermont Yankee boiling water reactor (BWR). Due to the major differences in the core design of the three plants, different reload design processes have evolved for each plant.

  5. Innovation Process Design: A Change Management and Innovation Dimension Perspective

    NASA Astrophysics Data System (ADS)

    Peisl, Thomas; Reger, Veronika; Schmied, Juergen

    The authors propose an innovative approach to the management of innovation that integrates business, process, and maturity dimensions. The core element of the concept is the adaptation of ISO/IEC 15504 to the innovation process, including 14 innovation drivers. Two managerial models, the Balanced Scorecard and a Barriers in Change Processes Model, are applied to conceptualize and visualize the respective innovation strategies. An illustrative case study shows a practical implementation process.

  6. Designers Workbench: Towards Real-Time Immersive Modeling

    SciTech Connect

    Kuester, F; Duchaineau, M A; Hamann, B; Joy, K I; Ma, K L

    2001-10-03

    This paper introduces the DesignersWorkbench, a semi-immersive virtual environment for two-handed modeling, sculpting and analysis tasks. The paper outlines the fundamental tools, design metaphors and hardware components required for an intuitive real-time modeling system. As companies focus on streamlining productivity to cope with global competition, the migration to computer-aided design (CAD), computer-aided manufacturing (CAM), and computer-aided engineering (CAE) systems has established a new backbone of modern industrial product development. Traditionally, however, a product design frequently originates from a clay model that, after digitization, forms the basis for the numerical description of CAD primitives. The DesignersWorkbench aims at closing this technology gap, or "digital gap", experienced by design and CAD engineers by transforming the classical design paradigm into its fully integrated digital and virtual analog, allowing collaborative development in a semi-immersive virtual environment. This project emphasizes two key components of the classical product design cycle: freeform modeling and analysis. In the freeform modeling stage, content creation in the form of two-handed sculpting of arbitrary objects using polygonal, volumetric or mathematically defined primitives is emphasized, whereas the analysis component provides the tools required for pre- and post-processing steps of finite element analysis tasks applied to the created models.

  7. A verification and validation process for model-driven engineering

    NASA Astrophysics Data System (ADS)

    Delmas, R.; Pires, A. F.; Polacsek, T.

    2013-12-01

    Model Driven Engineering practitioners already benefit from many well established verification tools, for Object Constraint Language (OCL), for instance. Recently, constraint satisfaction techniques have been brought to Model-Driven Engineering (MDE) and have shown promising results on model verification tasks. With all these tools, it becomes possible to provide users with formal support from early model design phases to model instantiation phases. In this paper, a selection of such tools and methods is presented, and an attempt is made to define a verification and validation process for model design and instance creation centered on UML (Unified Modeling Language) class diagrams and declarative constraints, and involving the selected tools. The suggested process is illustrated with a simple example.

  8. Using GREENSCOPE for Sustainable Process Design: An Educational Opportunity

    EPA Science Inventory

    Increasing sustainability can be approached through the education of those who design, construct, and operate facilities. As chemical engineers learn elements of process systems engineering, they can be introduced to sustainability concepts. The EPA’s GREENSCOPE methodology and...

  9. SYSTEMATIC PROCEDURE FOR DESIGNING PROCESSES WITH MULTIPLE ENVIRONMENTAL OBJECTIVES

    EPA Science Inventory

    Evaluation of multiple objectives is very important in designing environmentally benign processes. It requires a systematic procedure for solving multiobjective decision-making problems, due to the complex nature of the problems, the need for complex assessments, and complicated ...

  10. METHODS FOR INTEGRATING ENVIRONMENTAL CONSIDERATIONS INTO CHEMICAL PROCESS DESIGN DECISIONS

    EPA Science Inventory

    The objective of this cooperative agreement was to postulate a means by which an engineer could routinely include environmental considerations in day-to-day conceptual design problems; a means that could easily integrate with existing design processes, and thus avoid massive retr...

  11. Designing School Accountability Systems: Towards a Framework and Process.

    ERIC Educational Resources Information Center

    Gong, Brian

    This document presents three different views of accountability to address state needs as their departments of education design, improve, or review their state accountability and reporting systems. The first of three sections presents the system-design decision process as a linear sequence of ten steps from defining the purposes of the…

  12. Risk Informed Design as Part of the Systems Engineering Process

    NASA Technical Reports Server (NTRS)

    Deckert, George

    2010-01-01

    This slide presentation reviews the importance of Risk Informed Design (RID) as an important feature of the systems engineering process. RID is based on the principle that risk is a design commodity such as mass, volume, cost or power. It also reviews Probabilistic Risk Assessment (PRA) as it is used in the product life cycle in the development of NASA's Constellation Program.

  13. Relating Right Brain Studies to the Design Process.

    ERIC Educational Resources Information Center

    Hofland, John

    Intended for teachers of theatrical design who need to describe a design process for their students, this paper begins by giving a brief overview of recent research that has described the different functions of the right and left cerebral hemispheres. It then notes that although the left hemisphere tends to dominate the right hemisphere, it is the…

  14. Applying the ID Process to the Guided Design Teaching Strategy.

    ERIC Educational Resources Information Center

    Coscarelli, William C.; White, Gregory P.

    1982-01-01

    Describes the application of the instructional development process to a teaching technique called Guided Design in a Production-Operations Management course. In Guided Design, students learn course content through self-instruction and use class time to apply that knowledge; in-class problem solving is stressed. (JJD)

  15. PROCESS DESIGN MANUAL: LAND TREATMENT OF MUNICIPAL WASTEWATER

    EPA Science Inventory

    The manual presents a rational procedure for the design of land treatment systems. Slow rate, rapid infiltration, and overland flow processes for the treatment of municipal wastewaters are discussed in detail, and the design concepts and criteria are presented. A two-phased plann...

  16. Process Design Report for Stover Feedstock: Lignocellulosic Biomass to Ethanol Process Design and Economics Utilizing Co-Current Dilute Acid Prehydrolysis and Enzymatic Hydrolysis for Corn Stover

    SciTech Connect

    Aden, A.; Ruth, M.; Ibsen, K.; Jechura, J.; Neeves, K.; Sheehan, J.; Wallace, B.; Montague, L.; Slayton, A.; Lukas, J.

    2002-06-01

    The U.S. Department of Energy (DOE) is promoting the development of ethanol from lignocellulosic feedstocks as an alternative to conventional petroleum-based transportation fuels. DOE funds both fundamental and applied research in this area and needs a method for predicting cost benefits of many research proposals. To that end, the National Renewable Energy Laboratory (NREL) has modeled many potential process designs and estimated the economics of each process during the last 20 years. This report is an update of the ongoing process design and economic analyses at NREL.

  17. Designers' models of the human-computer interface

    NASA Technical Reports Server (NTRS)

    Gillan, Douglas J.; Breedin, Sarah D.

    1993-01-01

    Understanding design models of the human-computer interface (HCI) may produce two types of benefits. First, interface development often requires input from two different types of experts: human factors specialists and software developers. Given the differences in their backgrounds and roles, human factors specialists and software developers may have different cognitive models of the HCI. Yet, they have to communicate about the interface as part of the design process. If they have different models, their interactions are likely to involve a certain amount of miscommunication. Second, the design process in general is likely to be guided by designers' cognitive models of the HCI, as well as by their knowledge of the user, tasks, and system. Designers do not start with a blank slate; rather they begin with a general model of the object they are designing. The author's approach to a design model of the HCI was to have three groups make judgments of categorical similarity about the components of an interface: human factors specialists with HCI design experience, software developers with HCI design experience, and a baseline group of computer users with no experience in HCI design. The components of the user interface included both display components such as windows, text, and graphics, and user interaction concepts, such as command language, editing, and help. The judgments of the three groups were analyzed using hierarchical cluster analysis and Pathfinder. These methods indicated, respectively, how the groups categorized the concepts, and network representations of the concepts for each group. The Pathfinder analysis provides greater information about local, pairwise relations among concepts, whereas the cluster analysis shows global, categorical relations to a greater extent.
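
    The clustering step of such a study can be reproduced with standard tools. The sketch below applies average-linkage hierarchical clustering to a small dissimilarity matrix; the interface concepts and the matrix values are invented for illustration, not the study's data.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    # Hierarchical clustering of categorical-similarity judgments, in the
    # spirit of the study's analysis.  The concepts and the (symmetric)
    # dissimilarity matrix are invented.
    concepts = ["window", "text", "graphics", "command language", "editing", "help"]
    dissim = np.array([
        [0.0, 0.3, 0.2, 0.9, 0.8, 0.7],
        [0.3, 0.0, 0.4, 0.8, 0.6, 0.7],
        [0.2, 0.4, 0.0, 0.9, 0.8, 0.8],
        [0.9, 0.8, 0.9, 0.0, 0.3, 0.5],
        [0.8, 0.6, 0.8, 0.3, 0.0, 0.4],
        [0.7, 0.7, 0.8, 0.5, 0.4, 0.0],
    ])

    # scipy expects the condensed upper-triangle form of the matrix.
    iu = np.triu_indices(len(concepts), k=1)
    tree = linkage(dissim[iu], method="average")
    for concept, label in zip(concepts, fcluster(tree, t=2, criterion="maxclust")):
        print(f"{concept:>16}: cluster {label}")
    ```

    With these (made-up) values, display components and interaction concepts separate into two clusters, which is the kind of categorical structure the study compared across groups.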

  18. Geometric modeling for computer aided design

    NASA Technical Reports Server (NTRS)

    Schwing, James L.

    1993-01-01

    Over the past several years, it has been the primary goal of this grant to design and implement software to be used in the conceptual design of aerospace vehicles. The work carried out under this grant was performed jointly with members of the Vehicle Analysis Branch (VAB) of NASA LaRC, Computer Sciences Corp., and Vigyan Corp. This has resulted in the development of several packages and design studies. Primary among these are the interactive geometric modeling tool, the Solid Modeling Aerospace Research Tool (smart), and the integration and execution tools provided by the Environment for Application Software Integration and Execution (EASIE). In addition, it is the purpose of the personnel of this grant to provide consultation in the areas of structural design, algorithm development, and software development and implementation, particularly in the areas of computer aided design, geometric surface representation, and parallel algorithms.

  19. Model-based risk analysis of coupled process steps.

    PubMed

    Westerberg, Karin; Broberg-Hansen, Ernst; Sejergaard, Lars; Nilsson, Bernt

    2013-09-01

    A section of a biopharmaceutical manufacturing process involving the enzymatic coupling of a polymer to a therapeutic protein was characterized with regards to the process parameter sensitivity and design space. To minimize the formation of unwanted by-products in the enzymatic reaction, the substrate was added in small amounts and unreacted protein was separated using size-exclusion chromatography (SEC) and recycled to the reactor. The quality of the final recovered product was thus a result of the conditions in both the reactor and the SEC, and a design space had to be established for both processes together. This was achieved by developing mechanistic models of the reaction and SEC steps, establishing the causal links between process conditions and product quality. Model analysis was used to complement the qualitative risk assessment, and design space and critical process parameters were identified. The simulation results gave an experimental plan focusing on the "worst-case regions" in terms of product quality and yield. In this way, the experiments could be used to verify both the suggested process and the model results. This work demonstrates the necessary steps of model-assisted process analysis, from model development through experimental verification.

  20. Engineered Barrier System Degradation, Flow, and Transport Process Model Report

    SciTech Connect

    E.L. Hardin

    2000-07-17

    The Engineered Barrier System Degradation, Flow, and Transport Process Model Report (EBS PMR) is one of nine PMRs supporting the Total System Performance Assessment (TSPA) being developed by the Yucca Mountain Project for the Site Recommendation Report (SRR). The EBS PMR summarizes the development and abstraction of models for processes that govern the evolution of conditions within the emplacement drifts of a potential high-level nuclear waste repository at Yucca Mountain, Nye County, Nevada. Details of these individual models are documented in 23 supporting Analysis/Model Reports (AMRs). Nineteen of these AMRs are for process models, and the remaining 4 describe the abstraction of results for application in TSPA. The process models themselves cluster around four major topics: the Water Distribution and Removal Model, the Physical and Chemical Environment Model, the Radionuclide Transport Model, and the Multiscale Thermohydrologic Model. One AMR (Engineered Barrier System-Features, Events, and Processes/Degradation Modes Analysis) summarizes the formal screening analysis used to select the Features, Events, and Processes (FEPs) included in TSPA and those excluded from further consideration. Performance of a potential Yucca Mountain high-level radioactive waste repository depends on both the natural barrier system (NBS) and the engineered barrier system (EBS) and on their interactions. Although the waste packages are generally considered as components of the EBS, the EBS as defined in the EBS PMR includes all engineered components outside the waste packages. The principal function of the EBS is to complement the geologic system in limiting the amount of water contacting nuclear waste. A number of alternatives were considered by the Project for different EBS designs that could provide better performance than the design analyzed for the Viability Assessment. The design concept selected was Enhanced Design Alternative II (EDA II).

  1. Towards Semantic Modelling of Business Processes for Networked Enterprises

    NASA Astrophysics Data System (ADS)

    Furdík, Karol; Mach, Marián; Sabol, Tomáš

    The paper presents an approach to the semantic modelling and annotation of business processes and information resources, as it was designed within the FP7 ICT EU project SPIKE to support creation and maintenance of short-term business alliances and networked enterprises. A methodology for the development of the resource ontology, as a shareable knowledge model for semantic description of business processes, is proposed. Systematically collected user requirements, conceptual models implied by the selected implementation platform as well as available ontology resources and standards are employed in the ontology creation. The process of semantic annotation is described and illustrated using an example taken from a real application case.

  2. Sketching in Design Journals: An Analysis of Visual Representations in the Product Design Process

    ERIC Educational Resources Information Center

    Lau, Kimberly; Oehlberg, Lora; Agogino, Alice

    2009-01-01

    This paper explores the sketching behavior of designers and the role of sketching in the design process. Observations from a descriptive study of sketches provided in design journals, characterized by a protocol measuring sketching activities, are presented. A distinction is made between journals that are entirely tangible and those that contain…

  3. Reducing the complexity of the software design process with object-oriented design

    NASA Technical Reports Server (NTRS)

    Schuler, M. P.

    1991-01-01

    Designing software is a complex process. How object-oriented design (OOD), coupled with formalized documentation and tailored object diagraming techniques, can reduce the complexity of the software design process is described and illustrated. The described OOD methodology uses a hierarchical decomposition approach in which parent objects are decomposed into layers of lower level child objects. A method of tracking the assignment of requirements to design components is also included. Increases in the reusability, portability, and maintainability of the resulting products are also discussed. This method was built on a combination of existing technology, teaching experience, consulting experience, and feedback from design method users. The discussed concepts are applicable to hierarchical OOD processes in general. Emphasis is placed on improving the design process by documenting the details of the procedures involved and incorporating improvements into those procedures as they are developed.

  4. Theory and Practice Meets in Industrial Process Design -Educational Perspective-

    NASA Astrophysics Data System (ADS)

    Aramo-Immonen, Heli; Toikka, Tarja

    A software engineer should see himself as a business process designer in an enterprise resource planning (ERP) re-engineering project, and software engineers and managers should engage in design dialogue. The objective of this paper is to discuss the motives for studying design research in connection with management education, in order to envision and understand the soft human issues in the management context. A second goal is to develop means of practicing social skills between designers and managers. The article explores the affective components of design thinking in the industrial management domain. The conceptual part of the paper discusses the concepts of network and project economy, creativity, communication, the use of metaphors, and design thinking. Finally, an empirical research plan is introduced, along with first results from design-method experiments among multi-disciplined groups of master-level students of industrial engineering and management and of software engineering.

  5. The Development of an Illustration Design Model.

    ERIC Educational Resources Information Center

    Rankin, Robert O.

    1989-01-01

    Describes a method for isolating factors that influence the ability of a reader to comprehend the message of an illustration, and explains a model that is constructed from these factors to highlight potential problem stages in an author's design of an illustration. The model is based on data from comprehension trials with college students. (17…

  6. Model reduction methods for control design

    NASA Technical Reports Server (NTRS)

    Dunipace, K. R.

    1988-01-01

    Several different model reduction methods are developed and detailed implementation information is provided for those methods. Command files to implement the model reduction methods in a proprietary control law analysis and design package are presented. A comparison and discussion of the various reduction techniques is included.
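
    As one concrete instance of this technique family, the sketch below performs modal truncation: project the state-space model onto its slowest eigenmodes and compare DC gains. The example system is made up; the report's own methods and command files are not reproduced here.

    ```python
    import numpy as np

    # Modal truncation of a stable LTI system x' = Ax + Bu, y = Cx, keeping
    # the slowest eigenmodes.  The system below is illustrative only.
    def modal_truncation(A, B, C, keep):
        lam, V = np.linalg.eig(A)
        idx = np.argsort(-lam.real)[:keep]   # slowest (least negative) modes
        V = V[:, idx]
        W = np.linalg.pinv(V)                # left projection, W @ V = I
        return (W @ A @ V).real, (W @ B).real, (C @ V).real

    A = np.diag([-1.0, -2.0, -50.0, -80.0]) + 0.1 * np.ones((4, 4))
    B = np.ones((4, 1))
    C = np.ones((1, 4))
    Ar, Br, Cr = modal_truncation(A, B, C, keep=2)

    # DC gains of the full and reduced models should be close:
    print(-C @ np.linalg.solve(A, B), -Cr @ np.linalg.solve(Ar, Br))
    ```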

  7. Instructional Design in Education: New Model

    ERIC Educational Resources Information Center

    Isman, Aytekin

    2011-01-01

    The main goal of the new instructional design model is to organize long term and full learning activities. The new model is based on the theoretical foundation of behaviorism, cognitivism and constructivism. During teaching and learning activities, learners are active and use cognitive, constructivist, or behaviorist learning to construct new…

  8. Development of the multichannel data processing ASIC design flow

    NASA Astrophysics Data System (ADS)

    Ivanov, P. Y.; Atkin, E. V.; Normanov, D. D.; Shumkin, O. V.

    2017-01-01

    In modern multichannel digital data-processing systems the number of channels ranges from several hundred thousand to millions, and ASICs form the elemental basis of these systems. Their most important characteristics are performance, power consumption, and occupied area. ASIC design is a time- and labor-consuming process. In order to improve performance and reduce design time, it is proposed to supplement the standard design flow with a channel-parameter optimization stage based on the most efficient use of chip area and power consumption.

  9. Framework for Modeling the Cognitive Process

    DTIC Science & Technology

    2005-06-16

    Yaworsky, Air Force Research Laboratory/IFSB, Rome, NY. Keywords: Cognitive Process Modeling, Cognition, Conceptual Framework, Information...center of our conceptual framework and will distinguish our use of terms within the context of this framework. 3. A Conceptual Framework for...Modeling the Cognitive Process. We will describe our conceptual framework using graphical examples to help illustrate main points. We form the two

  10. An Extension to the Weibull Process Model

    DTIC Science & Technology

    1981-11-01

    An extension to the Weibull process model is presented, indicating its importance to applications. Recent papers by Bain and Engelhardt (1980) and Crow ...
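
    The Weibull (power-law) process referenced here is a nonhomogeneous Poisson process with cumulative intensity U(t) = (t/theta)^beta, common in reliability-growth work of the Bain-Engelhardt and Crow lineage. Because U(t) inverts in closed form, sample paths can be generated exactly; the parameter values below are illustrative.

    ```python
    import numpy as np

    # Weibull (power-law) process: NHPP with intensity
    #   u(t) = (beta/theta) * (t/theta)**(beta - 1),
    # so U(t) = (t/theta)**beta.  Transform unit-rate Poisson arrivals E_k
    # through the inverse of U to get exact event times.
    def simulate_weibull_process(beta, theta, t_max, rng):
        times, total = [], 0.0
        while True:
            total += rng.exponential(1.0)       # unit-rate Poisson arrival
            t = theta * total ** (1.0 / beta)   # invert U(t) = (t/theta)**beta
            if t > t_max:
                return np.array(times)
            times.append(t)

    rng = np.random.default_rng(0)
    failures = simulate_weibull_process(beta=0.7, theta=50.0, t_max=1000.0, rng=rng)
    # beta < 1 gives a decreasing failure intensity, i.e. reliability growth.
    print(f"{failures.size} failures; expected {(1000.0 / 50.0) ** 0.7:.1f}")
    ```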

  11. Geometric modeling for computer aided design

    NASA Technical Reports Server (NTRS)

    Schwing, James L.

    1992-01-01

    The goal was the design and implementation of software to be used in the conceptual design of aerospace vehicles. Several packages and design studies were completed, including two software tools currently used in the conceptual level design of aerospace vehicles. These tools are the Solid Modeling Aerospace Research Tool (SMART) and the Environment for Software Integration and Execution (EASIE). SMART provides conceptual designers with a rapid prototyping capability and additionally provides initial mass property analysis. EASIE provides a set of interactive utilities that simplify the task of building and executing computer aided design systems consisting of diverse, stand alone analysis codes that result in the streamlining of the exchange of data between programs, reducing errors and improving efficiency.

  12. DESIGNING CHEMICAL PROCESSES WITH OPEN AND FUGITIVE EMISSIONS

    EPA Science Inventory

    Designing a chemical process normally includes aspects of economic and environmental disciplines. In this work we describe methods to quickly and easily evaluate the conomics and potential environmental impacts of a process, with the hydrodealkylation of toluene as an example. Th...

  13. Design requirements for operational earth resources ground data processing

    NASA Technical Reports Server (NTRS)

    Baldwin, C. J.; Bradford, L. H.; Burnett, E. S.; Hutson, D. E.; Kinsler, B. A.; Kugle, D. R.; Webber, D. S.

    1972-01-01

    Realistic tradeoff data and evaluation techniques were studied that permit conceptual design of operational earth resources ground processing systems. Methodology for determining user requirements that utilize the limited information available from users is presented along with definitions of sensor capabilities projected into the shuttle/station era. A tentative method is presented for synthesizing candidate ground processing concepts.

  14. Propulsion System Models for Rotorcraft Conceptual Design

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne

    2014-01-01

    The conceptual design code NDARC (NASA Design and Analysis of Rotorcraft) was initially implemented to model conventional rotorcraft propulsion systems, consisting of turboshaft engines burning jet fuel, connected to one or more rotors through a mechanical transmission. The NDARC propulsion system representation has been extended to cover additional propulsion concepts, including electric motors and generators, rotor reaction drive, turbojet and turbofan engines, fuel cells and solar cells, batteries, and fuel (energy) used without weight change. The paper describes these propulsion system components, the architecture of their implementation in NDARC, and the form of the models for performance and weight. Requirements are defined for improved performance and weight models of the new propulsion system components. With these new propulsion models, NDARC can be used to develop environmentally-friendly rotorcraft designs.

  15. Hybrid modelling of anaerobic wastewater treatment processes.

    PubMed

    Karama, A; Bernard, O; Genovesi, A; Dochain, D; Benhammou, A; Steyer, J P

    2001-01-01

    This paper presents a hybrid approach for the modelling of an anaerobic digestion process. The hybrid model combines a feed-forward network, describing the bacterial kinetics, and the a priori knowledge based on the mass balances of the process components. We have considered an architecture which incorporates the neural network as a static model of unmeasured process parameters (kinetic growth rate) and an integrator for the dynamic representation of the process using a set of dynamic differential equations. The paper contains a description of the neural network component training procedure. The performance of this approach is illustrated with experimental data.
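
    The architecture described, a black-box kinetic rate embedded in first-principles mass balances, can be sketched compactly. Below, a tiny fixed-weight network stands in for the trained feed-forward net, and the chemostat-style balances and all parameter values are invented for illustration; they are not the paper's process model.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Hybrid-model sketch: mass-balance ODEs for a chemostat-like digester in
    # which the growth rate mu(S) comes from a black box (here a tiny
    # one-hidden-layer network with made-up weights, standing in for the
    # trained feed-forward net).  All parameters are illustrative.
    W1, b1 = np.array([[2.0], [-0.5]]), np.array([-1.0, 0.3])
    W2, b2 = np.array([0.25, 0.15]), 0.05

    def mu_net(s):                                   # "learned" kinetic rate, 1/day
        h = np.tanh(W1.ravel() * s + b1)
        return max(float(W2 @ h + b2), 0.0)

    def digester(t, y, d=0.1, s_in=10.0, yield_xs=0.4):
        x, s = y                                     # biomass, substrate (g/L)
        mu = mu_net(s)
        return [mu * x - d * x,                      # biomass balance
                d * (s_in - s) - mu * x / yield_xs]  # substrate balance

    sol = solve_ivp(digester, (0.0, 100.0), [0.5, 5.0], max_step=0.5)
    print(f"steady state: X = {sol.y[0, -1]:.2f} g/L, S = {sol.y[1, -1]:.2f} g/L")
    ```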

  16. Model development for naphthenic acids ozonation process.

    PubMed

    Al Jibouri, Ali Kamel H; Wu, Jiangning

    2015-02-01

    Naphthenic acids (NAs) are toxic constituents of oil sands process-affected water (OSPW), which is generated during the extraction of bitumen from oil sands. NAs consist mainly of carboxylic acids, which are generally biorefractory. For the treatment of OSPW, ozonation is a very beneficial method: it can significantly reduce the concentration of NAs and convert them from biorefractory to biodegradable. In this study, a two-level factorial design (2^4) was used for the ozonation of OSPW to study the influence of the operating parameters (ozone concentration, oxygen/ozone flow rate, pH, and mixing) on the removal of model NAs in a semi-batch reactor. Ozone concentration was found to have the most significant effect on the NAs concentration of all the parameters. An empirical model was developed to correlate the concentration of NAs with ozone concentration, oxygen/ozone flow rate, and pH. In addition, a theoretical analysis was conducted to gain insight into the relationship between the removal of NAs and the operating parameters.
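
    Fitting the empirical model from a two-level factorial design amounts to least-squares regression on coded factors. The sketch below uses synthetic response data with a dominant ozone effect, mirroring the study's qualitative finding; none of the numbers are the study's.

    ```python
    import numpy as np
    from itertools import product

    # Main-effects model fitted to a 2**4 factorial design over the four
    # factors named in the study (ozone concentration, gas flow rate, pH,
    # mixing).  The response values are synthetic.
    X = np.array(list(product([-1, 1], repeat=4)), dtype=float)   # 16 coded runs
    rng = np.random.default_rng(3)
    true_effects = np.array([-12.0, -3.0, -1.5, -0.5])            # per coded unit
    y = 40.0 + X @ true_effects + rng.normal(0.0, 1.0, 16)        # residual NAs

    A = np.hstack([np.ones((16, 1)), X])                          # intercept + mains
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    for name, c in zip(["mean", "ozone", "flow", "pH", "mixing"], coef):
        print(f"{name:>6}: {c:+.2f}")
    ```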

  17. Distillation modeling for a uranium refining process

    SciTech Connect

    Westphal, B.R.

    1996-03-01

    As part of the spent fuel treatment program at Argonne National Laboratory, a vacuum distillation process is being employed for the recovery of uranium following an electrorefining process. Distillation of a salt electrolyte, containing a eutectic mixture of lithium and potassium chlorides, from uranium is achieved by a simple batch operation and is termed "cathode processing". The incremental distillation of electrolyte salt will be modeled by an equilibrium expression and on a molecular basis since the operation is conducted under moderate vacuum conditions. As processing continues, the two models will be compared and analyzed for correlation with actual operating results. Possible factors that may contribute to aberrations from the models include impurities at the vapor-liquid boundary, distillate reflux, anomalous pressure gradients, and mass transport phenomena at the evaporating surface. Ultimately, the purpose of either process model is to enable the parametric optimization of the process.

  18. Development of a Comprehensive Weld Process Model

    SciTech Connect

    Radhakrishnan, B.; Zacharia, T.

    1997-05-01

    This cooperative research and development agreement (CRADA) between Concurrent Technologies Corporation (CTC) and Lockheed Martin Energy Systems (LMES) combines CTC's expertise in the welding area with that of LMES to develop computer models and simulation software for welding processes. This development is of significant impact to the industry, including materials producers and fabricators. The main thrust of the research effort was to develop a comprehensive welding simulation methodology. A substantial amount of work has been done by several researchers to numerically model several welding processes. The primary drawback of most of the existing models is the lack of sound linkages between the mechanistic aspects (e.g., heat transfer, fluid flow, and residual stress) and the metallurgical aspects (e.g., microstructure development and control). A comprehensive numerical model which can be used to elucidate the effect of welding parameters/conditions on the temperature distribution, weld pool shape and size, solidification behavior, and microstructure development, as well as stresses and distortion, did not exist. It was therefore imperative to develop a comprehensive model which would predict all of the above phenomena during welding. The CRADA built upon an existing three-dimensional (3-D) welding simulation model developed by LMES, which is capable of predicting weld pool shape and temperature history in 3-D single-pass welds. However, that model does not account for multipass welds, microstructural evolution, distortion, or residual stresses, and it requires large amounts of computing time, which limits its use for practical applications. To overcome this, CTC and LMES have developed through this CRADA the comprehensive welding simulation model described above. The following technical tasks have been accomplished as part of the CRADA. 1. The LMES welding code has been ported to the Intel Paragon parallel computer at ORNL

  19. The shielding design process--new plants to decommissioning.

    PubMed

    Jeffries, Graham; Cooper, Andrew; Hobson, John

    2005-01-01

    BNFL have over 25 years' experience of designing nuclear plant for the whole fuel cycle. In the UK, a Nuclear Decommissioning Authority (NDA) is to be set up to ensure that Britain's nuclear legacy is cleaned up safely, securely and cost effectively. The resulting challenges and opportunities for shielding design will be substantial, as the shielding design process was originally devised for the design of new plants. Although its underlying principles are equally applicable to the decommissioning and remediation of old plants, many aspects of detailed application need to adapt to this radically different operating environment. The paper describes both the common issues and the different challenges of shielding design at different operational phases. Sample applications are presented from both new-plant and decommissioning projects that illustrate not only the robust nature of the processes being used but also how they lead to cost-effective solutions making a substantive and appropriate contribution to radiological protection goals.

  20. System Design Support by Optimization Method Using Stochastic Process

    NASA Astrophysics Data System (ADS)

    Yoshida, Hiroaki; Yamaguchi, Katsuhito; Ishikawa, Yoshio

    We propose a new optimization method based on a stochastic process. Its characteristic feature is that it obtains an approximation of the optimal solution as an expected value. In the numerical calculation, a kind of Monte Carlo method is used to obtain the solution. The method also yields the probability distribution of the design variables, because candidate designs are generated with probability proportional to the value of the evaluation function. This probability distribution shows the influence of the design variables on the evaluation function value and is therefore very useful information for system design. In this paper it is shown that the proposed method is useful not only for optimization but also for system design. The flight trajectory optimization problem for a hang-glider is given as an example of the numerical calculation.
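
    The core idea can be sketched in a few lines: draw candidate designs at random, weight each in proportion to a fitness derived from the evaluation function, and read off the optimum as a weighted expectation together with the spread of the induced distribution. The objective function and all settings below are illustrative, not the hang-glider problem.

    ```python
    import numpy as np

    # Monte Carlo sketch: weight sampled designs in proportion to fitness and
    # report the weighted mean as the approximate optimum; the weighted spread
    # indicates how strongly the design variable influences the objective.
    def objective(x):                       # to be minimized; illustrative
        return (x - 1.3) ** 2 + 0.1 * np.sin(8.0 * x)

    rng = np.random.default_rng(7)
    x = rng.uniform(-2.0, 4.0, 200_000)     # samples of the design variable
    w = np.exp(-objective(x) / 0.05)        # higher fitness -> higher probability
    w /= w.sum()

    x_hat = float(w @ x)                    # optimum approximated as expected value
    spread = float(np.sqrt(w @ (x - x_hat) ** 2))
    print(f"approximate optimum x ~ {x_hat:.3f} (sensitivity spread {spread:.3f})")
    ```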

  1. Computer-aided design tools for economical MEMS fabrication processes

    NASA Astrophysics Data System (ADS)

    Schneider, Christian; Priebe, Andreas; Brueck, Rainer; Hahn, Kai

    1999-03-01

    Since the early 1970s, when microsystem technology (MST) was first introduced, an enormous market for MST products has developed: airbag sensors, micropumps, ink-jet nozzles, etc., and the market is just about to take off. Establishing these products at a reasonable price requires mass production. Meanwhile, computer-based design tools have also been developed in order to reduce the expense of MST design. In contrast to other physical design processes, such as in microelectronics, MEMS physical design is characterized by the fact that each product requires a tailored sequence of fabrication steps, usually selected from a variety of processing alternatives. The selection from these alternatives is based on economical constraints. Therefore, the design has a strong influence on the money and time spent to take an MST product to market.

  2. Declarative business process modelling: principles and modelling languages

    NASA Astrophysics Data System (ADS)

    Goedertier, Stijn; Vanthienen, Jan; Caron, Filip

    2015-02-01

    The business process literature has proposed a multitude of business process modelling approaches or paradigms, each in response to a different business process type with a unique set of requirements. Two polar paradigms, i.e. the imperative and the declarative paradigm, appear to define the extreme positions on the paradigm spectrum. While imperative approaches focus on explicitly defining how an organisational goal should be reached, the declarative approaches focus on the directives, policies and regulations restricting the potential ways to achieve the organisational goal. In between, a variety of hybrid-paradigms can be distinguished, e.g. the advanced and adaptive case management. This article focuses on the less-exposed declarative approach on process modelling. An outline of the declarative process modelling and the modelling approaches is presented, followed by an overview of the observed declarative process modelling principles and an evaluation of the declarative process modelling approaches.
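
    Two classic declarative (Declare-style) templates make the contrast with imperative flow concrete: a declarative model is a set of trace constraints rather than a control-flow graph. The checker below and the example trace are illustrative, not drawn from the article.

    ```python
    # Declare-style templates: a declarative process model states constraints
    # that any execution trace must satisfy, rather than an explicit flow.
    def response(trace, a, b):
        """Every occurrence of `a` must eventually be followed by `b`."""
        pending = False
        for event in trace:
            if event == a:
                pending = True
            elif event == b:
                pending = False
        return not pending

    def precedence(trace, a, b):
        """`b` may occur only after at least one `a` has occurred."""
        seen_a = False
        for event in trace:
            if event == a:
                seen_a = True
            elif event == b and not seen_a:
                return False
        return True

    trace = ["register", "check credit", "approve", "ship", "invoice"]
    print(response(trace, "approve", "invoice"))    # True
    print(precedence(trace, "register", "ship"))    # True
    ```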

  3. Learning from Experts: Fostering Extended Thinking in the Early Phases of the Design Process

    ERIC Educational Resources Information Center

    Haupt, Grietjie

    2015-01-01

    Empirical evidence on the way in which expert designers from different domains cognitively connect their internal processes with external resources is presented in the context of an extended cognition model. The article focuses briefly on the main trends in the extended design cognition theory and in particular on recent trends in information…

  4. Transition to an individual-room NICU design: process and outcome measures.

    PubMed

    Milford, Cheryl A; Zapalo, Barbara J; Davis, Glenda

    2008-01-01

    Redesign of a neonatal intensive care unit is a major budget undertaking, demanding accountability for its equipment and feasibility of design. It must be philosophically based and driven by research supporting best practice. The NICU at the Magee-Womens Hospital of the University of Pittsburgh Medical Center, a Level III, 74-bed unit, has made the change from a ward design to an individual-room design suitable for family-centered, developmentally supportive care. This article presents the design process as it occurred. Unique to this process are the involvement of NICU-graduate families and the use of transition teams. Guidelines and recommendations are offered to others interested in designing and practicing in an individual-room NICU. Outcome data demonstrate staff adjustment to the new design and practice model. A comparison of this NICU design is made with the Recommended Standards for Newborn ICU Design.

  5. VARTM Process Modeling of Aerospace Composite Structures

    NASA Technical Reports Server (NTRS)

    Song, Xiao-Lan; Grimsley, Brian W.; Hubert, Pascal; Cano, Roberto J.; Loos, Alfred C.

    2003-01-01

    A three-dimensional model was developed to simulate the VARTM composite manufacturing process. The model considers the two important mechanisms that occur during the process: resin flow, and compaction and relaxation of the preform. The model was used to simulate infiltration of a carbon preform with an epoxy resin by the VARTM process. The predicted flow patterns and preform thickness changes agreed qualitatively with the measured values. However, the predicted total infiltration times were much longer than those measured, most likely due to the inaccurate preform permeability values used in the simulation.

  6. A new design concept for an automated peanut processing facility

    SciTech Connect

    Ertas, A.; Tanju, B.T.; Fair, W.T.; Butts, C.

    1996-12-31

    Peanut quality is a major concern in all phases of the peanut industry, from production to manufacturing. Postharvest processing of peanuts can have profound effects on the quality and safety of peanut food products. Curing is a key step in postharvest processing; curing peanuts improperly can significantly reduce quality and result in significant losses to both farmers and processors. The conventional drying system designed in the 1960s is still being used in peanut processing today. The objective of this paper is to design and develop a new automated peanut drying system for dry climates capable of handling approximately 20 million lbm of peanuts per harvest season.

  7. Optimal design of the satellite constellation arrangement reconfiguration process

    NASA Astrophysics Data System (ADS)

    Fakoor, Mahdi; Bakhtiari, Majid; Soleymani, Mahshid

    2016-08-01

    In this article, a novel approach is introduced for satellite constellation reconfiguration based on Lambert's theorem. Several critical problems arise in the reconfiguration phase, such as minimizing overall fuel cost, avoiding collisions between the satellites on the final orbital pattern, and determining the maneuvers needed to deploy the satellites to their desired positions in the target constellation. To implement the reconfiguration phase of the satellite constellation arrangement at minimal cost, the hybrid Invasive Weed Optimization/Particle Swarm Optimization (IWO/PSO) algorithm is used to design sub-optimal transfer orbits for the satellites in the constellation. The dynamics of the problem are also modeled in such a way that optimal assignment of the satellites to the initial and target orbits and optimal orbital transfer are combined in one step. Finally, we claim that the presented idea, i.e., coupled non-simultaneous flight of satellites from the initial orbital pattern, leads to minimal cost. The obtained results show that employing the presented method clearly reduces the cost of the reconfiguration process.

  8. Geometric modeling for computer aided design

    NASA Technical Reports Server (NTRS)

    Schwing, James L.; Olariu, Stephen

    1995-01-01

    The primary goal of this grant has been the design and implementation of software to be used in the conceptual design of aerospace vehicles, particularly focused on the elements of geometric design, graphical user interfaces, and the interaction of the multitude of software typically used in this engineering environment. This has resulted in the development of several analysis packages and design studies, including two major software systems currently used in the conceptual-level design of aerospace vehicles: SMART, the Solid Modeling Aerospace Research Tool, and EASIE, the Environment for Software Integration and Execution. Additional software tools were designed and implemented to address the needs of the engineer working in the conceptual design environment. SMART provides conceptual designers with a rapid prototyping capability and several engineering analysis capabilities. In addition, SMART has a carefully engineered user interface that makes it easy to learn and use. Finally, a number of specialty characteristics have been built into SMART which allow it to be used efficiently as a front-end geometry processor for other analysis packages. EASIE provides a set of interactive utilities that simplify the task of building and executing computer aided design systems consisting of diverse, stand-alone analysis codes, resulting in streamlined data exchange between programs, reduced errors, and improved efficiency. EASIE provides both a methodology and a collection of software tools to ease the task of coordinating engineering design and analysis codes.

  9. Radiation Environment Modeling for Spacecraft Design: New Model Developments

    NASA Technical Reports Server (NTRS)

    Barth, Janet; Xapsos, Mike; Lauenstein, Jean-Marie; Ladbury, Ray

    2006-01-01

    A viewgraph presentation on various new space radiation environment models for spacecraft design is described. The topics include: 1) The Space Radiation Environment; 2) Effects of Space Environments on Systems; 3) Space Radiation Environment Model Use During Space Mission Development and Operations; 4) Space Radiation Hazards for Humans; 5) "Standard" Space Radiation Environment Models; 6) Concerns about Standard Models; 7) Inadequacies of Current Models; 8) Development of New Models; 9) New Model Developments: Proton Belt Models; 10) Coverage of New Proton Models; 11) Comparison of TPM-1, PSB97, AP-8; 12) New Model Developments: Electron Belt Models; 13) Coverage of New Electron Models; 14) Comparison of "Worst Case" POLE, CRESELE, and FLUMIC Models with the AE-8 Model; 15) New Model Developments: Galactic Cosmic Ray Model; 16) Comparison of NASA, MSU, CIT Models with ACE Instrument Data; 17) New Model Developments: Solar Proton Model; 18) Comparison of ESP, JPL91, King/Stassinopoulos, and PSYCHIC Models; 19) New Model Developments: Solar Heavy Ion Model; 20) Comparison of CREME96 to CREDO Measurements During 2000 and 2002; 21) PSYCHIC Heavy Ion Model; 22) Model Standardization; 23) Working Group Meeting on New Standard Radiation Belt and Space Plasma Models; and 24) Summary.

  10. Evolutionary objections to "alien design" models.

    NASA Astrophysics Data System (ADS)

    Coffey, E. J.

    A previous paper demonstrated that the principal supporters of SETI have ignored the biological and evolutionary consequences of a creature's body form. In fact, the supporting evidence they provide actually contradicts their view. The approach they employ is that of the engineer: the process of "designing" a hypothetical creature to a specification irrespective of biological or evolutionary considerations. The principal types of "alien designs" which have been employed shall be discussed, and the evolutionary objections to them given.

  11. Threat processing: models and mechanisms.

    PubMed

    Bentz, Dorothée; Schiller, Daniela

    2015-01-01

    The experience of fear is closely linked to the survival of species. Fear can be conceptualized as a brain state that orchestrates defense reactions to threats. To avoid harm, an organism must be equipped with neural circuits that allow learning, detecting, and rapidly responding to threats. Past experience with threat can transform neutral stimuli present at the time of experience into learned threat-related stimuli via associative learning. Pavlovian threat conditioning is the central experimental paradigm to study associative learning. Once learned, these stimulus-response associations are not always expressed depending on context or new experiences with the conditioned stimuli. Neural circuits mediating threat learning have the inherent plasticity to adapt to changing environmental threats. Encounters devoid of danger pave the way for extinction or reconsolidation to occur. Extinction and reconsolidation can both lead to changes in the expression of threat-induced defense responses, but differ in stability and have a different neural basis. This review presents the behavioral models and the system-level neural mechanisms in animals and humans of threat learning and modulation.

  12. Design of a distributed CORBA based image processing server.

    PubMed

    Giess, C; Evers, H; Heid, V; Meinzer, H P

    2000-01-01

    This paper presents the design and implementation of a distributed image processing server based on CORBA. Existing image processing tools were encapsulated in a common way with this server. Data exchange and conversion is done automatically inside the server, hiding these tasks from the user. The different image processing tools are visible as one large collection of algorithms and due to the use of CORBA are accessible via intra-/internet.

  13. Multidisciplinary Design Optimization (MDO) Methods: Their Synergy with Computer Technology in Design Process

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw

    1998-01-01

    The paper identifies speed, agility, human interface, generation of sensitivity information, task decomposition, and data transmission (including storage) as important attributes for a computer environment to have in order to support engineering design effectively. It is argued that when examined in terms of these attributes the presently available environment can be shown to be inadequate; a radical improvement is needed, and it may be achieved by combining new methods that have recently emerged from multidisciplinary design optimization (MDO) with massively parallel processing computer technology. The caveat is that, for successful use of that technology in engineering computing, new paradigms for computing will have to be developed - specifically, innovative algorithms that are intrinsically parallel so that their performance scales up linearly with the number of processors. It may be speculated that the idea of simulating a complex behavior by interaction of a large number of very simple models may be an inspiration for the above algorithms; the cellular automata are an example. Because of the long lead time needed to develop and mature new paradigms, development should begin now, even though the widespread availability of massively parallel processing is still a few years away.

  14. Multidisciplinary Design Optimisation (MDO) Methods: Their Synergy with Computer Technology in the Design Process

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw

    1999-01-01

    The paper identifies speed, agility, human interface, generation of sensitivity information, task decomposition, and data transmission (including storage) as important attributes for a computer environment to have in order to support engineering design effectively. It is argued that when examined in terms of these attributes the presently available environment can be shown to be inadequate. A radical improvement is needed, and it may be achieved by combining new methods that have recently emerged from multidisciplinary design optimisation (MDO) with massively parallel processing computer technology. The caveat is that, for successful use of that technology in engineering computing, new paradigms for computing will have to be developed - specifically, innovative algorithms that are intrinsically parallel so that their performance scales up linearly with the number of processors. It may be speculated that the idea of simulating a complex behaviour by interaction of a large number of very simple models may be an inspiration for the above algorithms; the cellular automata are an example. Because of the long lead time needed to develop and mature new paradigms, development should begin now, even though the widespread availability of massively parallel processing is still a few years away.

  15. An information processing model of anxiety: automatic and strategic processes.

    PubMed

    Beck, A T; Clark, D A

    1997-01-01

    A three-stage schema-based information processing model of anxiety is described that involves: (a) the initial registration of a threat stimulus; (b) the activation of a primal threat mode; and (c) the secondary activation of more elaborative and reflective modes of thinking. The defining elements of automatic and strategic processing are discussed with the cognitive bias in anxiety reconceptualized in terms of a mixture of automatic and strategic processing characteristics depending on which stage of the information processing model is under consideration. The goal in the treatment of anxiety is to deactivate the more automatic primal threat mode and to strengthen more constructive reflective modes of thinking. Arguments are presented for the inclusion of verbal mediation as a necessary but not sufficient component in the cognitive and behavioral treatment of anxiety.

  16. Critical Zone Experimental Design to Assess Soil Processes and Function

    NASA Astrophysics Data System (ADS)

    Banwart, Steve

    2010-05-01

    The experimental design studies soil processes across the temporal evolution of the soil profile, from its formation on bare bedrock, through managed use as productive land, to its degradation under longstanding pressures from intensive land use. To understand this conceptual life cycle of soil, we have selected four European field sites as Critical Zone Observatories. These are to provide data sets of soil parameters, processes and functions which will be incorporated into the mathematical models. The field sites are 1) the BigLink field station, located in the chronosequence of the Damma Glacier forefield in alpine Switzerland and established to study the initial stages of soil development on bedrock; 2) the Lysina Catchment in the Czech Republic, representative of productive soils managed for intensive forestry; 3) the Fuchsenbigl Field Station in Austria, an agricultural research site representative of productive soils managed as arable land; and 4) the Koiliaris Catchment in Crete, Greece, which represents degraded Mediterranean-region soils, heavily impacted by centuries of intensive grazing and farming and under severe risk of desertification.

  17. Space Station Freedom Program preliminary design review process

    NASA Technical Reports Server (NTRS)

    Carlise, R. F.; Adair, Billy

    1989-01-01

    To conduct the Program Requirements Review of the Space Station Freedom, a Preliminary Design Review Board (PDR) has been established. The PDR will assess the preliminary design of the assembled manned base including the assembly process, the launch, and on-orbit stage configuration, the design approach, the on-orbit verification plans, supportability, reliability, safety, interfaces with the NASA infrastructure (the NSTS, TDRSS, and Ground operations) and international partners. Issues such as the coordination of a common interpretation of design requirements, coordination of interfaces, and convergence of design perspectives vs. proper allocation of resources are discussed. The impact of the resolution of the secondary ripple effect of design decisions which may cause programmatic difficulties is also addressed.

  18. Panoramic imaging perimeter sensor design and modeling

    SciTech Connect

    Pritchard, D.A.

    1993-12-31

    This paper describes the conceptual design and preliminary performance modeling of a 360-degree imaging sensor. This sensor combines automatic perimeter intrusion detection with immediate visual assessment and is intended to be used for fast deployment around fixed or temporary high-value assets. The sensor requirements, compiled from various government agencies, are summarized. The conceptual design includes longwave infrared and visible linear array technology. An auxiliary millimeter-wave sensing technology is also considered for use during periods of infrared and visible obscuration. The infrared detectors proposed for the sensor design are similar to the Standard Advanced Dewar Assembly Types Three A and B (SADA-IIIA/B). An overview of the sensor and processor is highlighted. The infrared performance of this sensor design has been predicted using existing thermal imaging system models and is described in the paper. Future plans for developing a prototype are also presented.

  19. Three Models for the Curriculum Development Process

    ERIC Educational Resources Information Center

    O'Hanlon, James

    1973-01-01

    Presents descriptions of the management, systematic, and open-access curriculum development models to identify the decisionmaking bases, operational processes, evaluation requirements, and curriculum control methods of each model. A possible relationship among these models is then suggested. (Author/DN)

  20. A Biologically Inspired Network Design Model

    PubMed Central

    Zhang, Xiaoge; Adamatzky, Andrew; Chan, Felix T.S.; Deng, Yong; Yang, Hai; Yang, Xin-She; Tsompanas, Michail-Antisthenis I.; Sirakoulis, Georgios Ch.; Mahadevan, Sankaran

    2015-01-01

    A network design problem is to select a subset of links in a transport network that satisfy passengers or cargo transportation demands while minimizing the overall costs of the transportation. We propose a mathematical model of the foraging behaviour of slime mould P. polycephalum to solve the network design problem and construct optimal transport networks. In our algorithm, a traffic flow between any two cities is estimated using a gravity model. The flow is imitated by the model of the slime mould. The algorithm model converges to a steady state, which represents a solution of the problem. We validate our approach on examples of major transport networks in Mexico and China. By comparing networks developed in our approach with the man-made highways, networks developed by the slime mould, and a cellular automata model inspired by slime mould, we demonstrate the flexibility and efficiency of our approach. PMID:26041508
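
    The gravity-model step is easy to make concrete: demand between two cities is proportional to the product of their populations and inversely related to distance. The populations, coordinates, constant, and distance exponent below are invented for illustration.

    ```python
    import numpy as np

    # Gravity model of inter-city traffic demand: T_ij = k * P_i * P_j / d_ij**2.
    # Populations (millions), coordinates, and the constant k are made up.
    pop = np.array([8.9, 1.6, 5.1, 1.2])
    xy = np.array([[0.0, 0.0], [1.2, 0.4], [0.3, 2.1], [2.0, 1.5]])

    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)                        # no self-flow
    flow = 1.0 * pop[:, None] * pop[None, :] / d ** 2  # demand matrix

    i, j = np.unravel_index(np.argmax(flow), flow.shape)
    print(f"heaviest demand between city {i} and city {j}: {flow[i, j]:.1f}")
    ```

    In the paper's algorithm, a demand matrix of this kind drives the simulated slime-mould flows from which the network links are selected.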

  1. Physics and Process Modeling (PPM) and Other Propulsion R and T. Volume 1; Materials Processing, Characterization, and Modeling; Lifting Models

    NASA Technical Reports Server (NTRS)

    1997-01-01

    This CP contains the extended abstracts and presentation figures of 36 papers presented at the PPM and Other Propulsion R&T Conference. The focus of the research described in these presentations is on materials and structures technologies that are parts of the various projects within the NASA Aeronautics Propulsion Systems Research and Technology Base Program. These projects include Physics and Process Modeling; Smart, Green Engine; Fast, Quiet Engine; High Temperature Engine Materials Program; and Hybrid Hyperspeed Propulsion. Also presented were research results from the Rotorcraft Systems Program and work supported by the NASA Lewis Director's Discretionary Fund. Authors from NASA Lewis Research Center, industry, and universities conducted research in the following areas: material processing, material characterization, modeling, life, applied life models, design techniques, vibration control, mechanical components, and tribology. Key issues, research accomplishments, and future directions are summarized in this publication.

  2. POLLUTION PREVENTION IN THE DESIGN OF CHEMICAL PROCESSES USING HIERARCHICAL DESIGN AND SIMULATION

    EPA Science Inventory

    The design of chemical processes is normally an iterative process of synthesis and analysis. When one also desires or needs to limit the amount of pollution generated by the process, the difficulty of the task can increase substantially. In this work, we show how combining hier...

  3. Preform Characterization in VARTM Process Model Development

    NASA Technical Reports Server (NTRS)

    Grimsley, Brian W.; Cano, Roberto J.; Hubert, Pascal; Loos, Alfred C.; Kellen, Charles B.; Jensen, Brian J.

    2004-01-01

    Vacuum-Assisted Resin Transfer Molding (VARTM) is a Liquid Composite Molding (LCM) process where both resin injection and fiber compaction are achieved under pressures of 101.3 kPa or less. Originally developed over a decade ago for marine composite fabrication, VARTM is now considered a viable process for the fabrication of aerospace composites (1,2). In order to optimize and further improve the process, a finite element analysis (FEA) process model is being developed to include the coupled phenomenon of resin flow, preform compaction and resin cure. The model input parameters are obtained from resin and fiber-preform characterization tests. In this study, the compaction behavior and the Darcy permeability of a commercially available carbon fabric are characterized. The resulting empirical model equations are input to the 3-Dimensional Infiltration, version 5 (3DINFILv.5) process model to simulate infiltration of a composite panel.
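
    The measured permeability and compaction data feed Darcy's-law flow calculations. For one-dimensional infiltration under a constant vacuum pressure difference, the fill time follows in closed form; the sketch below shows this relation with property values that are illustrative assumptions, not the measured fabric data from the study.

    ```python
    # 1-D resin infiltration by Darcy's law:
    #   dx/dt = K * dP / (phi * mu * x)   =>   t_fill = phi * mu * L**2 / (2 * K * dP)
    # All property values below are illustrative assumptions.

    phi = 0.45      # preform porosity (-)
    mu = 0.25       # resin viscosity (Pa*s)
    K = 2.0e-10     # preform permeability (m^2)
    dP = 101.3e3    # driving pressure, ~1 atm vacuum (Pa)
    L = 0.5         # flow length (m)

    t_fill = phi * mu * L**2 / (2.0 * K * dP)
    print(f"Estimated 1-D fill time: {t_fill:.0f} s ({t_fill / 60:.1f} min)")
    ```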

  4. Expectation changes and team characteristics in a participatory design process.

    PubMed

    Bazley, Conne Mara; De Jong, Annelise; Vink, Peter

    2012-01-01

    A human factors specialist researched the expectations of a culturally and professionally diverse team throughout a year-long participatory design process for a large processing facility. For a deeper understanding of high-level team expectations and characteristics, the specialist collected data and information through in-situ ethnography and traditional case study methods, personal interviews, and a questionnaire that included a Likert-scale rating of expectation levels. Results showed that expectation levels were rated extremely satisfied for individual team members and for the team as a whole, both before and during the participatory process. In contrast, the team's expectations of upper management were satisfied before the participatory process but changed to uncertain, then unsatisfied, then extremely unsatisfied during the process. Additionally, the participatory design team exhibited high-level team characteristics, including honesty, competence, commitment, communication, creativity, and clear expectations.

  5. Subject Access in Online Catalogs: A Design Model.

    ERIC Educational Resources Information Center

    Bates, Marcia J.

    1986-01-01

    Presents a model for the design of online catalog subject access based on three principles: the uncertainty of subject indexing, the need for greater variety in searcher's queries, and the complexity of the search process. The proposed system is based on existing Library of Congress subject cataloging. (EM)

  6. Statistical Design Model (SDM) of satellite thermal control subsystem

    NASA Astrophysics Data System (ADS)

    Mirshams, Mehran; Zabihian, Ehsan; Aarabi Chamalishahi, Mahdi

    2016-07-01

    Satellite thermal control is the subsystem whose main task is keeping the satellite components within their survival and operating temperature ranges. The capability of the thermal control subsystem plays a key role in satisfying a satellite's operational requirements, and designing this subsystem is part of satellite design. On the other hand, owing to the scarcity of information published by companies, designers still lack a specific design process for this subsystem even though it is a fundamental one. The aim of this paper is to identify and extract statistical design models of the spacecraft thermal control subsystem using the SDM design method, which analyzes statistical data with a particular procedure. Implementing the SDM method requires a complete database; therefore, we first collect spacecraft data and create a database, then extract statistical graphs using Microsoft Excel, from which we derive mathematical models. The input parameters of the method are the mass, mission, and lifetime of the satellite. To this end, the thermal control subsystem is first introduced, and the hardware used in this subsystem and its variants is surveyed. Next, different statistical models are presented and briefly compared. Finally, a particular statistical model is extracted from the collected statistical data. The accuracy of the method is tested and verified with a case study: the comparison between the specifications of the thermal control subsystem of a fabricated satellite and the analysis results shows the methodology to be effective. Key Words: Thermal control subsystem design, Statistical design model (SDM), Satellite conceptual design, Thermal hardware
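
    The flavor of the statistical sizing relations such a method extracts can be illustrated with a power-law fit of thermal-control-subsystem mass against total satellite mass. The data and fitted coefficients below are invented, not taken from the paper's database.

    ```python
    import numpy as np

    # Invented data: total satellite mass vs thermal-subsystem mass, both in kg.
    sat_mass = np.array([50, 120, 300, 650, 1500, 3000], dtype=float)
    tcs_mass = np.array([1.8, 4.0, 9.5, 19.0, 42.0, 80.0], dtype=float)

    # Fit a power law tcs = a * sat**b by linear regression in log-log space.
    b, log_a = np.polyfit(np.log(sat_mass), np.log(tcs_mass), 1)
    a = np.exp(log_a)
    print(f"tcs_mass ~ {a:.3f} * sat_mass**{b:.3f}")

    # Use the fitted relation for a conceptual-design estimate.
    print(f"Estimated TCS mass for an 800 kg satellite: {a * 800**b:.1f} kg")
    ```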

  7. Design of freeze-drying processes for pharmaceuticals: practical advice.

    PubMed

    Tang, Xiaolin; Pikal, Michael J

    2004-02-01

    Design of freeze-drying processes is often approached with a "trial and error" experimental plan or, worse yet, the protocol used in the first laboratory run is adopted without further attempts at optimization. Consequently, commercial freeze-drying processes are often neither robust nor efficient. It is our thesis that design of an "optimized" freeze-drying process is not particularly difficult for most products, as long as some simple rules based on well-accepted scientific principles are followed. It is the purpose of this review to discuss the scientific foundations of the freeze-drying process design and then to consolidate these principles into a set of guidelines for rational process design and optimization. General advice is given concerning common stability issues with proteins, but unusual and difficult stability issues are beyond the scope of this review. Control of ice nucleation and crystallization during the freezing step is discussed, and the impact of freezing on the rest of the process and final product quality is reviewed. Representative freezing protocols are presented. The significance of the collapse temperature and the thermal transition, denoted Tg', are discussed, and procedures for the selection of the "target product temperature" for primary drying are presented. Furthermore, guidelines are given for selection of the optimal shelf temperature and chamber pressure settings required to achieve the target product temperature without thermal and/or mass transfer overload of the freeze dryer. Finally, guidelines and "rules" for optimization of secondary drying and representative secondary drying protocols are presented.
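
    A widely quoted rule of thumb from the freeze-drying literature associated with this review links the target product temperature to a reasonable chamber pressure for primary drying. The sketch below implements that empirical relation; the coefficients are as commonly cited and should be checked against the paper before use.

    ```python
    def chamber_pressure_torr(tp_celsius: float) -> float:
        """Empirical rule of thumb: chamber pressure (Torr) as a function of
        the target product temperature (degrees C) for primary drying."""
        return 0.29 * 10 ** (0.019 * tp_celsius)

    # Example: target product temperature a few degrees below a collapse
    # temperature of -30 degrees C.
    tp = -33.0
    p_torr = chamber_pressure_torr(tp)
    print(f"Suggested chamber pressure: {p_torr:.3f} Torr ({p_torr * 133.322:.1f} Pa)")
    ```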

  8. Nanowire growth process modeling and reliability models for nanodevices

    NASA Astrophysics Data System (ADS)

    Fathi Aghdam, Faranak

    This work is an early attempt that uses a physical-statistical modeling approach to studying selective nanowire growth for the improvement of process yield. In the second research work, the reliability of nano-dielectrics is investigated. As electronic devices get smaller, reliability issues pose new challenges due to unknown underlying physics of failure (i.e., failure mechanisms and modes). This necessitates new reliability analysis approaches related to nano-scale devices. One of the most important nano-devices is the transistor that is subject to various failure mechanisms. Dielectric breakdown is known to be the most critical one and has become a major barrier for reliable circuit design in nano-scale. Due to the need for aggressive downscaling of transistors, dielectric films are being made extremely thin, and this has led to adopting high permittivity (k) dielectrics as an alternative to widely used SiO2 in recent years. Since most time-dependent dielectric breakdown test data on bilayer stacks show significant deviations from a Weibull trend, we have proposed two new approaches to modeling the time to breakdown of bilayer high-k dielectrics. In the first approach, we have used a marked space-time self-exciting point process to model the defect generation rate. A simulation algorithm is used to generate defects within the dielectric space, and an optimization algorithm is employed to minimize the Kullback-Leibler divergence between the empirical distribution obtained from the real data and the one based on the simulated data to find the best parameter values and to predict the total time to failure. The novelty of the presented approach lies in using a conditional intensity for trap generation in dielectric that is a function of time, space and size of the previous defects. In addition, in the second approach, a k-out-of-n system framework is proposed to estimate the total failure time after the generation of more than one soft breakdown.
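
    As a rough illustration of the second approach, a k-out-of-n:F framing says the stack fails once k of its n conducting paths have suffered soft breakdown. The Monte Carlo sketch below is a toy under stated assumptions (Weibull-distributed path lifetimes, invented parameters), not the estimation procedure of the dissertation.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def kofn_failure_time(n, k, shape, scale, trials=10_000):
        """Monte Carlo estimate of the time of the k-th failure among n
        Weibull-distributed path lifetimes (k-out-of-n:F system)."""
        lifetimes = scale * rng.weibull(shape, size=(trials, n))
        kth = np.sort(lifetimes, axis=1)[:, k - 1]  # k-th smallest per trial
        return kth.mean(), kth.std()

    mean_t, std_t = kofn_failure_time(n=10, k=3, shape=1.5, scale=100.0)
    print(f"Mean time to 3rd soft breakdown of 10 paths: {mean_t:.1f} (std {std_t:.1f})")
    ```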

  9. Conceptual process designs: Lurgi-Ruhrgas and superior circular grate

    SciTech Connect

    Not Available

    1980-09-01

    Based on preliminary data on retort yields and previous conceptual designs, a Design Basis has been prepared, and an upgrading scheme developed against which all five retorting processes can be evaluated. Licensors for retorting technologies (American Lurgi Corporation for the Lurgi retort, Davy-McKee for the Superior Circular Grate retort, and Union Oil Company for the Union B retort); hydrotreaters (Gulf Oil and Chevron); wastewater treatment units (Chevron); and sulfur plants (Parsons) have been contacted for data related to their processes. Preliminary balances for the Lurgi and Superior processes have been developed and will be compared against the vendor information when received. A preliminary design basis is presented. This presents design assumptions and conditions to be used in developing the process designs, heat and material balances, and process flow diagrams for all cases. The shale oil upgrading scheme selected to be used in all evaluations consists of delayed coking the 850°F-plus fraction from the shale oil, and hydrotreating all virgin and coker naphthas and gas oils in separate hydrotreaters. This scheme was selected because it is simple and each of the units has proven to be reliable in refining conventional crude oils. Also, this upgrading scheme is not expected to penalize any specific retort system. The material and utility balances, along with process flow diagrams for Case I, the Lurgi-Ruhrgas process, are given. In this case, 46,500 bpsd of 29.4° API upgraded shale oil are produced. The Superior Circular Grate material and utility balances and process flow diagrams are also given. The liquid product from this case is 40,500 bpsd of 27.4° API upgraded shale oil.

  10. Modeling cellular processes in 3D.

    PubMed

    Mogilner, Alex; Odde, David

    2011-12-01

    Recent advances in photonic imaging and fluorescent protein technology offer unprecedented views of molecular space-time dynamics in living cells. At the same time, advances in computing hardware and software enable modeling of ever more complex systems, from global climate to cell division. As modeling and experiment become more closely integrated we must address the issue of modeling cellular processes in 3D. Here, we highlight recent advances related to 3D modeling in cell biology. While some processes require full 3D analysis, we suggest that others are more naturally described in 2D or 1D. Keeping the dimensionality as low as possible reduces computational time and makes models more intuitively comprehensible; however, the ability to test full 3D models will build greater confidence in models generally and remains an important emerging area of cell biological modeling.

  11. Job Aiding/Training Decision Process Model

    DTIC Science & Technology

    1992-09-01

    [Report documentation text garbled in extraction. Recoverable details: report AL-CR-1992-0004 (AD-A256 947), "Job Aiding/Training Decision Process Model"; authors John P. Zenyuh, Phillip C...; period March 1990 - April 1990; contract F33615-86-C-0545; PE 62205F; PR 1121. Contents fragments: "Components to Process Model Decision and Selection Points"; "Summary of Subject Recommendations for Aiding Approaches".]

  12. A Framework for Preliminary Design of Aircraft Structures Based on Process Information. Part 1

    NASA Technical Reports Server (NTRS)

    Rais-Rohani, Masoud

    1998-01-01

    This report discusses the general framework and development of a computational tool for preliminary design of aircraft structures based on process information. The described methodology is suitable for multidisciplinary design optimization (MDO) activities associated with integrated product and process development (IPPD). The framework consists of three parts: (1) product and process definitions; (2) engineering synthesis; and (3) optimization. The product and process definitions are part of the input information provided by the design team. The backbone of the system is its ability to analyze a given structural design for performance as well as for manufacturability and cost assessment. The system uses a database on material systems and manufacturing processes. Based on the identified set of design variables and an objective function, the system is capable of performing optimization subject to manufacturability, cost, and performance constraints. The accuracy of the manufacturability measures and cost models discussed here depends largely on the available data on specific methods of manufacture and assembly and the associated labor requirements. As such, our focus in this research has been on the methodology itself and not so much on its accurate implementation in an industrial setting. A three-tier approach is presented for an IPPD-MDO based design of aircraft structures. The variable-complexity cost estimation methodology and an approach for integrating manufacturing cost assessment into the design process are also discussed. This report is presented in two parts. In the first part, the design methodology is presented and the computational design tool is described. In the second part, a prototype model of the preliminary design Tool for Aircraft Structures based on Process Information (TASPI) is described. Part two also contains an example problem that applies the methodology described here to the evaluation of six different design concepts for a wing spar.

  13. Robust Design of Sheet Metal Forming Process Based on Kriging Metamodel

    NASA Astrophysics Data System (ADS)

    Xie, Yanmin

    2011-08-01

    Nowadays, sheet metal forming process design is not a trivial task due to the complex issues to be taken into account (conflicting design goals, forming of complex shapes, and so on). Optimization methods have been widely applied in sheet metal forming; proper design methods that reduce time and costs therefore have to be developed, mostly based on computer-aided procedures. At the same time, variations during manufacturing may significantly influence final product quality, rendering optimal solutions non-robust. In this paper, a small design of experiments is conducted to investigate how the stochastic behavior of noise factors affects drawing quality. The finite element software LS-DYNA is used to simulate the complex sheet metal stamping process. A Kriging metamodel is adopted to map the relation between input process parameters and part quality. The robust design model for the sheet metal forming process integrates adaptive importance sampling with the Kriging model, in order to minimize the impact of the variations and achieve reliable process parameters. In the adaptive sampling, an improved criterion is used to indicate the direction in which additional training samples should be added to improve the Kriging model. Nonlinear test functions and a square stamping example (NUMISHEET'93) are employed to verify the proposed method. Final results indicate the feasibility of applying the proposed method to multi-response robust design.
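
    A minimal stand-in for the Kriging-plus-sampling loop can be written with a Gaussian process regressor playing the role of the Kriging metamodel and a toy analytic function in place of the LS-DYNA stamping simulation. All functions and settings below are assumptions for illustration.

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, ConstantKernel

    rng = np.random.default_rng(1)

    # Stand-in for an expensive FE stamping run: "quality" vs. two process
    # parameters (e.g., blank-holder force, friction), purely illustrative.
    def simulate(x):
        return np.sin(3 * x[:, 0]) + 0.5 * (x[:, 1] - 0.5) ** 2

    # Small design of experiments in [0, 1]^2.
    X = rng.uniform(0, 1, size=(20, 2))
    y = simulate(X)

    # Kriging metamodel mapping process parameters to part quality.
    gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF([0.2, 0.2]),
                                  normalize_y=True).fit(X, y)

    # Robustness check: propagate noise-factor scatter through the metamodel
    # and pick the nominal setting with the smallest response variance.
    candidates = rng.uniform(0.1, 0.9, size=(50, 2))
    noise = rng.normal(0, 0.03, size=(200, 2))  # manufacturing variation
    variances = [gp.predict(c + noise).var() for c in candidates]
    best = candidates[int(np.argmin(variances))]
    print("Most robust nominal setting (x1, x2):", np.round(best, 3))
    ```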

  14. Process-induced bias: a study of resist design, device node, illumination conditions, and process implications

    NASA Astrophysics Data System (ADS)

    Carcasi, Michael; Scheer, Steven; Fonseca, Carlos; Shibata, Tsuyoshi; Kosugi, Hitoshi; Kondo, Yoshihiro; Saito, Takashi

    2009-03-01

    Critical dimension uniformity (CDU) has both across-field and across-wafer components. CD error generated by across-wafer etching non-uniformity and other process variations can have a significant impact on CDU. To correct these across-wafer systematic variations, compensation by exposure dose and/or post-exposure bake (PEB) temperature has been proposed. These compensation strategies often focus on a specific structure without evaluating how the process compensation impacts the CDU of all structures to be printed in a given design. In one previous study, limited to a single resist and minimal coater/developer and scanner variations, the authors evaluated the relative merits of across-wafer dose and PEB temperature compensation on process-induced CD bias and CDU. For the process studied, it was found that using PEB temperature to control CD across the wafer was preferable to using dose compensation. In another previous study, the impact of resist design was explored to understand how resist design, as well as coater/developer and scanner processing, impacts process-induced bias (PIB). The previous PIB studies were limited to a single illumination case and explored the effect of PIB only on line/space (L/S) structures. The goal of this work is to additionally understand how illumination design and mask design, as well as resist design and coater/developer and scanner processing, impact process-induced bias (PIB) and OPC integrity.

  15. The design of bearing processing technology and fixture

    NASA Astrophysics Data System (ADS)

    Liu, Sibo

    2017-03-01

    This paper presents the design of the machining process plan and fixtures for a bearing part. The main task is to work out the process plan (rules of procedure) covering the semi-finish milling of the 36 mm lower face and the machining of the Φ18 holes, and to write the process cards. The part is a casting, small and of simple structure. Because the accuracy requirements for the holes are higher than those for the surfaces, a surfaces-first machining order is adopted. A special-purpose fixture is used in each operation; for drilling the Φ18 holes, hydraulic clamping is adopted, which is simple, convenient, and meets the requirements.

  16. Design of launch systems using continuous improvement process

    NASA Technical Reports Server (NTRS)

    Brown, Richard W.

    1995-01-01

    The purpose of this paper is to identify a systematic process for improving ground operations for future launch systems. This approach is based on the Total Quality Management (TQM) continuous improvement process. While the continuous improvement process is normally identified with making incremental changes to an existing system, it can be used on new systems if they use past experience as a knowledge base. In the case of the Reusable Launch Vehicle (RLV), the Space Shuttle operations provide many lessons. The TQM methodology used for this paper is borrowed from the United States Air Force 'Quality Air Force' Program. There is a general overview of the continuous improvement process, with concentration on the formulation phase. During this phase, critical analyses are conducted to determine the strategy and goals for the remaining development process. These analyses include analyzing the mission from the customer's point of view, developing an operations concept for the future, assessing current capabilities, and determining the gap to be closed between current capabilities and future needs and requirements. A brief analysis of the RLV, relative to the Space Shuttle, is used to illustrate the concept. Using the continuous improvement design concept has many advantages. These include a customer-oriented process, which will produce a more marketable product, and better integration of operations and systems during the design phase. However, the use of TQM techniques will require changes, including more discipline in the design process and more emphasis on data gathering for operational systems. The benefits will far outweigh the additional effort.

  17. Biostereometric Data Processing In ERGODATA: Choice Of Human Body Models

    NASA Astrophysics Data System (ADS)

    Pineau, J. C.; Mollard, R.; Sauvignon, M.; Amphoux, M.

    1983-07-01

    The definition of the human body models was elaborated from anthropometric data in ERGODATA. The first model reduces the human body to a series of points and lines. The second model is well adapted to representing the volume of each segmentary element. The third is an original model built from conventional anatomical points. Each segment is defined in space by a triangular plane located by its 3-D coordinates. This new model can support all the processing possibilities in the field of computer-aided design (CAD) in ergonomics, as well as in biomechanics and orthopaedics.
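
    As an illustration of the third model, each segment's triangular plane can be located from three anatomical landmarks and oriented by the triangle's unit normal. The coordinates below are invented, not ERGODATA values.

    ```python
    import numpy as np

    # Three anatomical landmarks of one body segment (invented 3-D coordinates, m).
    p1 = np.array([0.10, 0.42, 1.05])
    p2 = np.array([0.18, 0.40, 1.02])
    p3 = np.array([0.12, 0.47, 0.85])

    # The triangular plane is located by its vertices; its orientation is
    # given by the unit normal of the triangle.
    normal = np.cross(p2 - p1, p3 - p1)
    normal /= np.linalg.norm(normal)
    centroid = (p1 + p2 + p3) / 3.0
    print("Segment plane centroid:", centroid)
    print("Segment plane unit normal:", normal)
    ```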

  18. Waste receiving and processing facility module 1, detailed design report

    SciTech Connect

    Not Available

    1993-10-01

    WRAP 1 baseline documents which guided the technical development of the Title design included: (a) A/E Statement of Work (SOW), Revision 4C: this DOE-RL contractual document specified the workscope, deliverables, schedule, method of performance, and reference criteria for the Title design preparation. (b) Functional Design Criteria (FDC), Revision 1: this DOE-RL technical criteria document specified the overall operational criteria for the facility. The document was at Revision 0 at the beginning of the design and advanced to Revision 1 during the tenure of the Title design. (c) Supplemental Design Requirements Document (SDRD), Revision 3: this baseline criteria document, prepared by WHC for DOE-RL, augments the FDC by providing further definition of the process, operational safety, and facility requirements to the A/E for guidance in preparing the design. The document was at a very preliminary stage at the onset of Title design and was revised in concert with the results of the engineering studies performed to resolve the numerous technical issues the project faced when Title I was initiated, as well as by requirements established during the course of the Title II design.

  19. Preliminary shuttle structural dynamics modeling design study

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The design and development of a structural dynamics model of the space shuttle are discussed. The model provides for early study of structural dynamics problems, permits evaluation of the accuracy of the structural and hydroelastic analysis methods used on test vehicles, and provides for efficiently evaluating potential cost savings in structural dynamic testing techniques. The discussion is developed around the modes in which major input forces and responses occur and the significant structural details in these modes.

  20. Using CASE to Exploit Process Modeling in Technology Transfer

    NASA Technical Reports Server (NTRS)

    Renz-Olar, Cheryl

    2003-01-01

    A successful business will be one that has processes in place to run that business. Creating processes, reengineering processes, and continually improving processes can be accomplished through extensive modeling. Casewise(R) Corporate Modeler(TM) CASE is a computer-aided software engineering tool that will enable the Technology Transfer Department (TT) at NASA Marshall Space Flight Center (MSFC) to capture these abilities. After successful implementation of CASE, it could then be applied in other departments at MSFC and at other NASA centers. The success of a business process is dependent upon the players working as a team and continuously improving the process. A good process fosters customer satisfaction as well as internal satisfaction in the organizational infrastructure. CASE provides a method for business process success through functions consisting of system and process business models; specialized diagrams; matrix management; simulation; report generation and publishing; and linking, importing, and exporting documents and files. The software has an underlying repository or database to support these functions. The Casewise manual states that dynamics modeling is a technique used in business design and analysis. Feedback is used as a tool for the end users and generates different ways of dealing with the process. Feedback on this project resulted from the collection of issues through a systems-analyst approach of interviews with process coordinators and Technical Points of Contact (TPOCs).

  1. Development of a reburning boiler process model

    SciTech Connect

    Wu, K.T.

    1992-01-30

    The overall objective of this program is to integrate EER's expertise in boiler reburning performance evaluation into a package of analytical computer tools. Specific objectives of the program are to develop a computational capability with the following features: (1) can be used to predict the impact of gas reburning application on thermal conditions in the boiler radiant furnace, and on overall boiler performance; (2) can estimate gas reburning NOx reduction effectiveness based on specific reburning configurations and furnace/boiler configurations; (3) can be used as an analytical tool to evaluate the impact of boiler process parameters (e.g., fuel switching and changes in boiler operating conditions) on boiler thermal performance; (4) is adaptable to most boiler designs (tangential and wall fire boilers) and a variety of fuels (solid, liquid, gaseous and slurried fuels); (5) is sufficiently user friendly to be exercisable by engineers with a reasonable knowledge of boilers, and with reasonable computer skills. Here, "user friendly" means that the user will be guided by the computer codes during the course of setting up individual input files for the boiler performance model.

  2. Rethinking behavioral health processes by using design for six sigma.

    PubMed

    Lucas, Anthony G; Primus, Kelly; Kovach, Jamison V; Fredendall, Lawrence D

    2015-02-01

    Clinical evidence-based practices are strongly encouraged and commonly utilized in the behavioral health community. However, evidence-based practices that are related to quality improvement processes, such as Design for Six Sigma, are often not used in behavioral health care. This column describes the unique partnership formed between a behavioral health care provider in the greater Pittsburgh area, a nonprofit oversight and monitoring agency for behavioral health services, and academic researchers. The authors detail how the partnership used the multistep process outlined in Design for Six Sigma to completely redesign the provider's intake process. Implementation of the redesigned process increased access to care, decreased bad debt and uncollected funds, and improved cash flow--while consumer satisfaction remained high.

  3. Visualization System Requirements for Data Processing Pipeline Design and Optimization.

    PubMed

    von Landesberger, Tatiana; Fellner, Dieter; Ruddle, Roy

    2016-08-25

    The rising quantity and complexity of data creates a need to design and optimize data processing pipelines - the set of data processing steps, parameters and algorithms that perform operations on the data. Visualization can support this process but, although there are many examples of systems for visual parameter analysis, there remains a need to systematically assess users' requirements and match those requirements to exemplar visualization methods. This article presents a new characterization of the requirements for pipeline design and optimization. This characterization is based on both a review of the literature and first-hand assessment of eight application case studies. We also match these requirements with exemplar functionality provided by existing visualization tools. Thus, we provide end-users and visualization developers with a way of identifying functionality that addresses data processing problems in an application. We also identify seven future challenges for visualization research that are not met by the capabilities of today's systems.

  4. Model for Simulating a Spiral Software-Development Process

    NASA Technical Reports Server (NTRS)

    Mizell, Carolyn; Curley, Charles; Nayak, Umanath

    2010-01-01

    A discrete-event simulation model, and a computer program that implements the model, have been developed as means of analyzing a spiral software-development process. This model can be tailored to specific development environments for use by software project managers in making quantitative cases for deciding among different software-development processes, courses of action, and cost estimates. A spiral process can be contrasted with a waterfall process, which is a traditional process that consists of a sequence of activities that include analysis of requirements, design, coding, testing, and support. A spiral process is an iterative process that can be regarded as a repeating modified waterfall process. Each iteration includes assessment of risk, analysis of requirements, design, coding, testing, delivery, and evaluation. A key difference between a spiral and a waterfall process is that a spiral process can accommodate changes in requirements at each iteration, whereas in a waterfall process, requirements are considered to be fixed from the beginning and, therefore, a waterfall process is not flexible enough for some projects, especially those in which requirements are not known at the beginning or may change during development. For a given project, a spiral process may cost more and take more time than does a waterfall process, but may better satisfy a customer's expectations and needs. Models for simulating various waterfall processes have been developed previously, but until now, there have been no models for simulating spiral processes. The present spiral-process-simulating model and the software that implements it were developed by extending a discrete-event simulation process model of the IEEE 12207 Software Development Process, which was built using commercially available software known as the Process Analysis Tradeoff Tool (PATT). Typical inputs to PATT models include industry-average values of product size (expressed as number of lines of code
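
    The contrast the abstract draws can be mocked up with a tiny iteration loop: a spiral process absorbs a late requirements change as rework within the iteration in which it lands, rather than as a broken plan. The phase efforts and change probability below are invented; this is a sketch, not the PATT model.

    ```python
    import random

    random.seed(42)

    # Effort per phase in person-weeks (invented illustration values).
    PHASES = {"risk": 2, "requirements": 5, "design": 8,
              "code": 10, "test": 6, "evaluate": 2}

    def spiral(iterations=4, change_prob=0.3):
        """Accumulate effort over spiral iterations; a requirements change
        adds design/code rework to the current iteration."""
        total = 0.0
        for i in range(1, iterations + 1):
            effort = sum(PHASES.values())
            if random.random() < change_prob:  # late-changing requirement
                effort += 0.5 * (PHASES["design"] + PHASES["code"])
            total += effort
            print(f"iteration {i}: {effort:.0f} pw, cumulative {total:.0f} pw")
        return total

    spiral()
    ```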

  5. COST ESTIMATION MODELS FOR DRINKING WATER TREATMENT UNIT PROCESSES

    EPA Science Inventory

    Cost models for unit processes typically utilized in a conventional water treatment plant and in package treatment plant technology are compiled in this paper. The cost curves are represented as a function of specified design parameters and are categorized into four major catego...
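
    Cost curves of this kind are typically power functions of a design parameter such as plant flow. The sketch below shows the generic form; the unit processes and coefficients are invented, not the values compiled in the paper.

    ```python
    # Illustrative power-law cost curves, cost = a * Q**b, with Q the design
    # flow in million gallons per day (MGD). Coefficients are invented.
    UNIT_PROCESSES = {
        "rapid_mix":    (48_000, 0.56),
        "flocculation": (65_000, 0.62),
        "filtration":   (310_000, 0.71),
    }

    def capital_cost(process, q_mgd):
        a, b = UNIT_PROCESSES[process]
        return a * q_mgd ** b

    for proc in UNIT_PROCESSES:
        print(f"{proc} at 10 MGD: ${capital_cost(proc, 10.0):,.0f}")
    ```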

  6. Computational Modeling as a Design Tool in Microelectronics Manufacturing

    NASA Technical Reports Server (NTRS)

    Meyyappan, Meyya; Arnold, James O. (Technical Monitor)

    1997-01-01

    Plans to introduce pilot lines or fabs for 300 mm processing are in progress. The IC technology is simultaneously moving towards 0.25/0.18 micron. The convergence of these two trends places unprecedented stringent demands on processes and equipment. More than ever, computational modeling is called upon to play a complementary role in equipment and process design. The pace in hardware/process development needs a matching pace in software development: an aggressive move towards developing "virtual reactors" is desirable and essential to reduce design cycle and costs. This goal has three elements: reactor scale model, feature level model, and database of physical/chemical properties. With these elements coupled, the complete model should function as a design aid in a CAD environment. This talk would aim at the description of various elements. At the reactor level, continuum, DSMC (or particle) and hybrid models will be discussed and compared using examples of plasma and thermal process simulations. In microtopography evolution, approaches such as level set methods compete with conventional geometric models. Regardless of the approach, the reliance on empiricism is to be eliminated through coupling to the reactor model and computational surface science. This coupling poses challenging issues of orders of magnitude variation in length and time scales. Finally, database development has fallen behind; the current situation is rapidly aggravated by the ever newer chemistries emerging to meet process metrics. The virtual reactor would be a useless concept without an accompanying reliable database that consists of: thermal reaction pathways and rate constants, electron-molecule cross sections, thermochemical properties, transport properties, and finally, surface data on the interaction of radicals, atoms and ions with various surfaces. Large scale computational chemistry efforts are critical as experiments alone cannot meet database needs due to the difficulties associated with such

  7. Computer Aided Modeling and Post Processing with NASTRAN Analysis

    NASA Technical Reports Server (NTRS)

    Boroughs, R. R.

    1984-01-01

    Computer aided engineering systems are invaluable tools in performing NASTRAN finite element analysis. These techniques are implemented in both the pre-processing and post-processing phases of the NASTRAN analysis. The finite element model development, or pre-processing phase, was automated with a computer aided modeling program called Supertabl, and the review and interpretation of the results of the NASTRAN analysis, or post-processing phase, was automated with a computer aided plotting program called Output Display. An intermediate program, Nasplot, which was developed in-house, has also helped to cut down on the model checkout time and reduce errors in the model. An interface has been established between the finite element computer aided engineering system and the Learjet computer aided design system whereby data can be transferred back and forth between the two. These systems have significantly improved productivity and the ability to perform NASTRAN analysis in response to product development requests.

  8. Simulation models and designs for advanced Fischer-Tropsch technology

    SciTech Connect

    Choi, G.N.; Kramer, S.J.; Tam, S.S.

    1995-12-31

    Process designs and economics were developed for three grass-roots indirect Fischer-Tropsch coal liquefaction facilities. A baseline and an alternate upgrading design were developed for a mine-mouth plant located in southern Illinois using Illinois No. 6 coal, and one for a mine-mouth plant located in Wyoming using Powder River Basin coal. The alternate design used close-coupled ZSM-5 reactors to upgrade the vapor stream leaving the Fischer-Tropsch reactor. ASPEN process simulation models were developed for all three designs. These results have been reported previously. In this study, the ASPEN process simulation model was enhanced to improve the vapor/liquid equilibrium calculations for the products leaving the slurry bed Fischer-Tropsch reactors. This significantly improved the predictions for the alternate ZSM-5 upgrading design. Another model was developed for the Wyoming coal case using ZSM-5 upgrading of the Fischer-Tropsch reactor vapors. To date, this is the best indirect coal liquefaction case. Sensitivity studies showed that additional cost reductions are possible.
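
    The enhancement concerns vapor/liquid equilibrium of the slurry-reactor effluent. As a minimal illustration of a VLE K-value calculation (ideal Raoult's law with Antoine vapor pressures, far simpler than the thermodynamics inside an ASPEN model), the sketch below uses textbook-order Antoine constants supplied as assumptions for two representative product cuts.

    ```python
    # Antoine constants: log10(P_sat [mmHg]) = A - B / (C + T [degC]).
    # Textbook-order values, used here only for illustration.
    ANTOINE = {
        "n-hexane": (6.87601, 1171.17, 224.41),
        "n-decane": (6.94365, 1495.17, 193.86),
    }

    def k_value(component, t_c, p_mmhg):
        """Ideal (Raoult's law) K-value: K_i = P_sat_i(T) / P."""
        a, b, c = ANTOINE[component]
        p_sat = 10 ** (a - b / (c + t_c))
        return p_sat / p_mmhg

    for comp in ANTOINE:
        print(f"K({comp}) at 200 degC, 2 atm: {k_value(comp, 200.0, 1520.0):.2f}")
    ```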

  9. Design of self-processing antimicrobial peptides for plant protection.

    PubMed

    Powell, W A; Catranis, C M; Maynard, C A

    2000-08-01

    Small antimicrobial peptides are excellent candidates for inclusion in self-processing proteins that could be used to confer pathogen resistance in transgenic plants. Antimicrobial peptides as small as 22 amino acids in length have been designed to incorporate the residual amino acids left from protein processing by the tobacco etch virus (TEV) NIa protease. Also, by minimizing the length of these peptides and the number of highly hydrophobic residues, haemolytic activity was reduced without affecting the peptides' antimicrobial activity.

  10. Cost Models for MMC Manufacturing Processes

    NASA Technical Reports Server (NTRS)

    Elzey, Dana M.; Wadley, Haydn N. G.

    1996-01-01

    Processes for the manufacture of advanced metal matrix composites are rapidly approaching maturity in the research laboratory, and there is growing interest in their transition to industrial production. However, research conducted to date has almost exclusively focused on overcoming the technical barriers to producing high-quality material, and little attention has been given to the economic feasibility of these laboratory approaches and process cost issues. A quantitative cost modeling (QCM) approach was developed to address these issues. QCMs are cost analysis tools based on predictive process models relating process conditions to the attributes of the final product. An important attribute of the QCM approach is the ability to predict the sensitivity of material production costs to product quality and to quantitatively explore trade-offs between cost and quality. Applications of the cost models allow more efficient direction of future MMC process technology development and a more accurate assessment of MMC market potential. Cost models were developed for two state-of-the-art metal matrix composite (MMC) manufacturing processes: tape casting and plasma spray deposition. Quality and cost models are presented for both processes, and the resulting predicted quality-cost curves are presented and discussed.
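
    The central QCM idea, trading production cost against product quality through a predictive process model, can be caricatured in a few lines: a higher deposition rate cuts machine time per kilogram but raises the scrap fraction. The functional forms and all numbers below are invented for illustration.

    ```python
    import numpy as np

    rates = np.linspace(0.5, 5.0, 10)    # deposition rate, kg/h
    machine_cost = 120.0 / rates         # $/kg of machine and labor time
    scrap_frac = 0.02 * rates ** 1.5     # quality penalty grows with rate
    material_cost = 80.0                 # $/kg of feedstock

    # Effective cost per good kilogram, inflated by scrap losses.
    effective_cost = (machine_cost + material_cost) / (1.0 - scrap_frac)
    best = rates[int(np.argmin(effective_cost))]
    print(f"Cost-minimizing rate ~ {best:.2f} kg/h at ${effective_cost.min():.0f}/kg")
    ```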

  11. Motivating the Notion of Generic Design within Information Processing Theory: The Design Problem Space.

    ERIC Educational Resources Information Center

    Goel, Vinod; Pirolli, Peter

    The notion of generic design, while it has been around for 25 years, is not often articulated, especially within Newell and Simon's (1972) Information Processing Theory framework. Design is merely lumped in with other forms of problem solving activity. Intuitively it is felt that there should be a level of description of the phenomenon which…

  12. Which Events Can Cause Iteration in Instructional Design? An Empirical Study of the Design Process

    ERIC Educational Resources Information Center

    Verstegen, D. M. L.; Barnard, Y. F.; Pilot, A.

    2006-01-01

    Instructional design is not a linear process: designers have to weigh the advantages and disadvantages of alternative solutions, taking into account different kinds of conflicting and changing constraints. To make sure that they eventually choose the most optimal one, they have to keep on collecting information, reconsidering continuously whether…

  13. Design Considerations for the Construction and Operation of Flour Milling Facilities. Part II: Process Design Considerations

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Flour milling facilities have been the cornerstone of agricultural processing for centuries. Like most agri-industrial production facilities, flour milling facilities have a number of unique design requirements. Design information, to date, has been limited. In an effort to summarize state of the ...

  14. Quality by design approach of a pharmaceutical gel manufacturing process, part 1: determination of the design space.

    PubMed

    Rosas, Juan G; Blanco, Marcel; González, Josep M; Alcalá, Manel

    2011-10-01

    This work was conducted in the framework of a quality by design project involving the production of a pharmaceutical gel. Preliminary work included the identification of the quality target product profiles (QTPPs) from historical values for previously manufactured batches, as well as the critical quality attributes for the process (viscosity and pH), which were used to construct a D-optimal experimental design. The experimental design comprised 13 gel batches, three of which were replicates at the domain center intended to assess the reproducibility of the target process. The viscosity and pH models established exhibited very high linearity and negligible lack of fit (LOF). Thus, R² was 0.996 for viscosity and 0.975 for pH, and LOF was 0.53 for the former parameter and 0.84 for the latter. The process proved reproducible at the domain center. Water content and temperature were the most influential factors for viscosity, and water content and acid neutralized fraction were the most influential factors for pH. A desirability function was used to find the best compromise to optimize the QTPPs. The body of information was used to identify and define the design space for the process. A model capable of combining the two response variables into a single one was constructed to facilitate monitoring of the process.
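
    A Derringer-type desirability function of the kind used to fold viscosity and pH into a single response can be sketched as follows. The specification windows and candidate responses are invented, not the paper's values.

    ```python
    import numpy as np

    def desirability(y, low, target, high):
        """Two-sided desirability: 1 at the target, falling to 0 at the
        edges of the acceptance window [low, high]."""
        y = np.asarray(y, dtype=float)
        d = np.where(y <= target,
                     (y - low) / (target - low),
                     (high - y) / (high - target))
        return np.clip(d, 0.0, 1.0)

    # Invented responses for three candidate factor settings.
    viscosity = [18_000, 22_500, 26_000]  # cP
    ph = [5.2, 5.6, 6.3]

    d_visc = desirability(viscosity, low=15_000, target=22_000, high=28_000)
    d_ph = desirability(ph, low=4.8, target=5.5, high=6.5)

    # Overall desirability: geometric mean of the individual desirabilities.
    overall = np.sqrt(d_visc * d_ph)
    print("Overall desirability per setting:", np.round(overall, 3))
    ```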

  15. Aspect-Oriented Design with Reusable Aspect Models

    NASA Astrophysics Data System (ADS)

    Kienzle, Jörg; Al Abed, Wisam; Fleurey, Franck; Jézéquel, Jean-Marc; Klein, Jacques

    The idea behind Aspect-Oriented Modeling (AOM) is to apply aspect-oriented techniques to (software) models with the aim of modularizing crosscutting concerns. This can be done within different modeling notations, at different levels of abstraction, and at different moments during the software development process. This paper demonstrates the applicability of AOM during the software design phase by presenting parts of an aspect-oriented design of a crisis management system. The design solution proposed in this paper is based on the Reusable Aspect Models (RAM) approach, which allows a modeler to express the structure and behavior of a complex system using class, state and sequence diagrams encapsulated in several aspect models. The paper describes how the model of the "create mission" functionality of the server backend can be decomposed into 23 inter-dependent aspect models. The presentation of the design is followed by a discussion on the lessons learned from the case study. Next, RAM is compared to 8 other AOM approaches according to 6 criteria: language, concern composition, asymmetric and symmetric composition, maturity, and tool support. To conclude the paper, a discussion section points out the features of RAM that specifically support reuse.

  16. Impact of Process Protocol Design on Virtual Team Effectiveness

    ERIC Educational Resources Information Center

    Cordes, Christofer Sean

    2013-01-01

    This dissertation examined the influence of action process dimensions on team decision performance, and attitudes toward team work environment and procedures given different degrees of collaborative technology affordance. Process models were used to provide context for understanding team behavior in the experimental task, and clarify understanding…

  17. Self-conscious robotic system design process--from analysis to implementation.

    PubMed

    Chella, Antonio; Cossentino, Massimo; Seidita, Valeria

    2011-01-01

    Developing robotic systems endowed with self-conscious capabilities means realizing complex subsystems that need ad-hoc software engineering techniques for their modelling, analysis, and implementation. In this chapter the whole process (from analysis to implementation) for modelling the development of self-conscious robotic systems is presented, and the newly created design process, PASSIC, which supports each part of it, is fully illustrated.

  18. The (Mathematical) Modeling Process in Biosciences.

    PubMed

    Torres, Nestor V; Santos, Guido

    2015-01-01

    In this communication, we introduce a general framework and discussion on the role of models and the modeling process in the field of biosciences. The objective is to sum up the common procedures during the formalization and analysis of a biological problem from the perspective of Systems Biology, which approaches the study of biological systems as a whole. We begin by presenting the definitions of (biological) system and model. Particular attention is given to the meaning of mathematical model within the context of biology. Then, we present the process of modeling and analysis of biological systems. Three stages are described in detail: conceptualization of the biological system into a model, mathematical formalization of the previous conceptual model and optimization and system management derived from the analysis of the mathematical model. All along this work the main features and shortcomings of the process are analyzed and a set of rules that could help in the task of modeling any biological system are presented. Special regard is given to the formative requirements and the interdisciplinary nature of this approach. We conclude with some general considerations on the challenges that modeling is posing to current biology.
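
    The three stages described can be made concrete with the smallest possible example, logistic population growth; this is a generic textbook illustration, not a system from the paper.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Stage 1, conceptualization: a population grows but is limited by resources.
    # Stage 2, formalization: the logistic ODE dx/dt = r * x * (1 - x / K).
    # Stage 3, analysis: simulate and inspect the approach to the steady state.
    r, K, x0 = 0.8, 100.0, 5.0   # invented illustration values

    sol = solve_ivp(lambda t, x: r * x * (1 - x / K), (0.0, 20.0), [x0],
                    dense_output=True)
    t = np.linspace(0.0, 20.0, 5)
    print("population:", np.round(sol.sol(t)[0], 1))  # tends to the capacity K
    ```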

  19. The (Mathematical) Modeling Process in Biosciences

    PubMed Central

    Torres, Nestor V.; Santos, Guido

    2015-01-01

    In this communication, we introduce a general framework and discussion on the role of models and the modeling process in the field of biosciences. The objective is to sum up the common procedures during the formalization and analysis of a biological problem from the perspective of Systems Biology, which approaches the study of biological systems as a whole. We begin by presenting the definitions of (biological) system and model. Particular attention is given to the meaning of mathematical model within the context of biology. Then, we present the process of modeling and analysis of biological systems. Three stages are described in detail: conceptualization of the biological system into a model, mathematical formalization of the previous conceptual model and optimization and system management derived from the analysis of the mathematical model. All along this work the main features and shortcomings of the process are analyzed and a set of rules that could help in the task of modeling any biological system are presented. Special regard is given to the formative requirements and the interdisciplinary nature of this approach. We conclude with some general considerations on the challenges that modeling is posing to current biology. PMID:26734063

  20. Noise control, sound, and the vehicle design process

    NASA Astrophysics Data System (ADS)

    Donavan, Paul

    2005-09-01

    For many products, noise and sound are viewed as necessary evils that need to be dealt with in order to bring the product successfully to market. They are generally not product "exciters," although some vehicle manufacturers do tune and advertise specific sounds to enhance the perception of their products. In this paper, influencing the design process for the "evils," such as wind noise and road noise, is considered in more detail. There are three ingredients to successfully dealing with the evils in the design process. The first of these is knowing how excess noise affects the end customer in a tangible manner, and how that affects customer satisfaction and, ultimately, sales. The second is having and delivering the knowledge of what is required of the design to achieve a satisfactory or even better level of noise performance. The third ingredient is having the commitment of the designers to incorporate that knowledge into their part, subsystem, or system. In this paper, the elements of each of these ingredients are discussed in some detail and the attributes of a successful design process are enumerated.