Science.gov

Sample records for process modeling design

  1. Managing Analysis Models in the Design Process

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2006-01-01

    Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.

  2. Information-Processing Models and Curriculum Design

    ERIC Educational Resources Information Center

    Calfee, Robert C.

    1970-01-01

    "This paper consists of three sections--(a) the relation of theoretical analyses of learning to curriculum design, (b) the role of information-processing models in analyses of learning processes, and (c) selected examples of the application of information-processing models to curriculum design problems." (Author)

  3. Reinventing The Design Process: Teams and Models

    NASA Technical Reports Server (NTRS)

    Wall, Stephen D.

    1999-01-01

The future of space mission designing will be dramatically different from the past. Formerly, performance-driven paradigms emphasized data return, with cost and schedule being secondary issues. Now and in the future, costs are capped and schedules are fixed; these two variables must be treated as independent in the design process. Accordingly, JPL has redesigned its design process. At the conceptual level, design times have been reduced by properly defining the required design depth, improving the linkages between tools, and managing team dynamics. In implementation-phase design, system requirements will be held in crosscutting models, linked to subsystem design tools through a central database that captures the design and supplies needed configuration management and control. Mission goals will then be captured in timelining software that drives the models, testing their capability to execute the goals. Metrics are used to measure and control both processes and to ensure that design parameters converge through the design process within schedule constraints. This methodology manages margins controlled by acceptable risk levels. Thus, teams can evolve risk tolerance (and cost) as they would any engineering parameter. This new approach allows more design freedom for a longer time, which tends to encourage revolutionary and unexpected improvements in design.

  4. Modeling Web-Based Educational Systems: Process Design Teaching Model

    ERIC Educational Resources Information Center

    Rokou, Franca Pantano; Rokou, Elena; Rokos, Yannis

    2004-01-01

    Using modeling languages is essential to the construction of educational systems based on software engineering principles and methods. Furthermore, the instructional design is undoubtedly the cornerstone of the design and development of educational systems. Although several methodologies and languages have been proposed for the specification of…

  5. Process Cost Modeling for Multi-Disciplinary Design Optimization

    NASA Technical Reports Server (NTRS)

    Bao, Han P.; Freeman, William (Technical Monitor)

    2002-01-01

    For early design concepts, the conventional approach to cost is normally some kind of parametric weight-based cost model. There is now ample evidence that this approach can be misleading and inaccurate. By the nature of its development, a parametric cost model requires historical data and is valid only if the new design is analogous to those for which the model was derived. Advanced aerospace vehicles have no historical production data and are nowhere near the vehicles of the past. Using an existing weight-based cost model would only lead to errors and distortions of the true production cost. This report outlines the development of a process-based cost model in which the physical elements of the vehicle are costed according to a first-order dynamics model. This theoretical cost model, first advocated by early work at MIT, has been expanded to cover the basic structures of an advanced aerospace vehicle. Elemental costs based on the geometry of the design can be summed up to provide an overall estimation of the total production cost for a design configuration. This capability to directly link any design configuration to realistic cost estimation is a key requirement for high payoff MDO problems. Another important consideration in this report is the handling of part or product complexity. Here the concept of cost modulus is introduced to take into account variability due to different materials, sizes, shapes, precision of fabrication, and equipment requirements. The most important implication of the development of the proposed process-based cost model is that different design configurations can now be quickly related to their cost estimates in a seamless calculation process easily implemented on any spreadsheet tool. In successive sections, the report addresses the issues of cost modeling as follows. First, an introduction is presented to provide the background for the research work. Next, a quick review of cost estimation techniques is made with the intention to
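    The cost roll-up described above can be sketched in a few lines: elemental costs derived from geometry are summed, with a cost modulus scaling each element for material, size, shape, and fabrication precision. This is a minimal illustration only; the element names, base rates, and modulus values below are invented, not figures from the report.

```python
# Sketch of a process-based cost roll-up with a "cost modulus" correction.
# All element names, rates, and modulus values are illustrative placeholders.

def element_cost(surface_area_m2, base_rate_per_m2, modulus):
    """First-order cost of one structural element from its geometry,
    scaled by a cost modulus for material/shape/precision effects."""
    return surface_area_m2 * base_rate_per_m2 * modulus

elements = [
    # (name, surface area [m^2], base rate [$/m^2], cost modulus)
    ("fuselage skin panel", 12.0, 1500.0, 1.3),  # composite, curved
    ("frame",                2.5, 1200.0, 1.1),
    ("bulkhead",             4.0, 1800.0, 1.6),  # tight tolerances
]

total = sum(element_cost(a, r, m) for _, a, r, m in elements)
print(f"estimated production cost: ${total:,.0f}")
```

    Because each element's cost is a direct function of its geometry, any change to the design configuration re-prices immediately, which is what makes this formulation attractive for MDO loops and spreadsheet implementation.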

  6. A Concept Transformation Learning Model for Architectural Design Learning Process

    ERIC Educational Resources Information Center

    Wu, Yun-Wu; Weng, Kuo-Hua; Young, Li-Ming

    2016-01-01

    Generally, in the foundation course of architectural design, much emphasis is placed on teaching of the basic design skills without focusing on teaching students to apply the basic design concepts in their architectural designs or promoting students' own creativity. Therefore, this study aims to propose a concept transformation learning model to…

  7. Coal conversion systems design and process modeling. Volume 1: Application of MPPR and Aspen computer models

    NASA Technical Reports Server (NTRS)

    1981-01-01

The development of a coal gasification system design and mass and energy balance simulation program for the TVA and other similar facilities is described. The materials-process-product model (MPPM) and the advanced system for process engineering (ASPEN) computer program were selected from available steady state and dynamic models. The MPPM was selected to serve as the basis for development of system level design model structure because it provided the capability for process block material and energy balance and high-level systems sizing and costing. The ASPEN simulation serves as the basis for assessing detailed component models for the system design modeling program. The ASPEN components were analyzed to identify particular process blocks and data packages (physical properties) which could be extracted and used in the system design modeling program. While ASPEN physical properties calculation routines are capable of generating physical properties required for process simulation, not all required physical property data are available; missing data must be user-entered.
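    The process-block material balance that MPPM-style models are built on can be illustrated with a toy mixer block: inlet streams are combined and the outlet flow and composition follow from conservation of mass. The stream names, flows, and compositions below are invented for illustration.

```python
# Minimal steady-state mass-balance sketch for one process block.
# Streams are (mass flow [kg/h], {species: mass fraction}); values invented.

def mix(streams):
    """Combine inlet streams; return outlet flow and mass fractions."""
    total = sum(flow for flow, _ in streams)
    masses = {}
    for flow, fracs in streams:
        for species, x in fracs.items():
            masses[species] = masses.get(species, 0.0) + flow * x
    return total, {s: m / total for s, m in masses.items()}

coal  = (100.0, {"C": 0.70, "ash": 0.10, "H2O": 0.20})  # kg/h
steam = ( 40.0, {"H2O": 1.00})

flow, fracs = mix([coal, steam])
print(flow, fracs)  # 140 kg/h; water fraction rises to (20 + 40) / 140
```

    An energy balance would add an enthalpy term per stream and close on the outlet temperature in the same way; ASPEN's physical-property packages exist precisely to supply those enthalpies.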

  8. The engineering design process as a model for STEM curriculum design

    NASA Astrophysics Data System (ADS)

    Corbett, Krystal Sno

    Engaging pedagogics have been proven to be effective in the promotion of deep learning for science, technology, engineering, and mathematics (STEM) students. In many cases, academic institutions have shown a desire to improve education by implementing more engaging techniques in the classroom. The research framework established in this dissertation has been governed by the axiom that students should obtain a deep understanding of fundamental topics while being motivated to learn through engaging techniques. This research lays a foundation for future analysis and modeling of the curriculum design process where specific educational research questions can be considered using standard techniques. Further, a clear curriculum design process is a key step towards establishing an axiomatic approach for engineering education. A danger is that poor implementation of engaging techniques will counteract the intended effects. Poor implementation might provide students with a "fun" project, but not the desired deep understanding of the fundamental STEM content. Knowing that proper implementation is essential, this dissertation establishes a model for STEM curriculum design, based on the well-established engineering design process. Using this process as a perspective to model curriculum design allows for a structured approach. Thus, the framework for STEM curriculum design, established here, provides a guided approach for seamless integration of fundamental topics and engaging pedagogics. The main steps, or phases, in engineering design are: Problem Formulation, Solution Generation, Solution Analysis, and Solution Implementation. Layering engineering design with education curriculum theory, this dissertation establishes a clear framework for curriculum design. Through ethnographic engagement by this researcher, several overarching themes are revealed through the creation of curricula using the design process. 
The application of the framework to specific curricula was part of this

  9. Predictive Modeling in Plasma Reactor and Process Design

    NASA Technical Reports Server (NTRS)

    Hash, D. B.; Bose, D.; Govindan, T. R.; Meyyappan, M.; Arnold, James O. (Technical Monitor)

    1997-01-01

Research continues toward the improvement and increased understanding of high-density plasma tools. Such reactor systems are lauded for their independent control of ion flux and energy, enabling high etch rates with low ion damage, and for their improved ion velocity anisotropy resulting from thin collisionless sheaths and low neutral pressures. Still, with the transition to 300 mm processing, achieving etch uniformity and high etch rates concurrently may be a formidable task for such large diameter wafers, for which computational modeling can play an important role in successful reactor and process design. The inductively coupled plasma (ICP) reactor is the focus of the present investigation. The present work attempts to understand the fundamental physical phenomena of such systems through computational modeling. Simulations will be presented using both computational fluid dynamics (CFD) techniques and the direct simulation Monte Carlo (DSMC) method for argon and chlorine discharges. ICP reactors generally operate at pressures on the order of 1 to 10 mTorr. At such low pressures, rarefaction can be significant to the degree that the constitutive relations used in typical CFD techniques become invalid and a particle simulation must be employed. This work will assess the extent to which CFD can be applied and evaluate the degree to which accuracy is lost in prediction of the phenomenon of interest, i.e., etch rate. If the CFD approach is found reasonably accurate and benchmarked against DSMC and experimental results, it has the potential to serve as a design tool because of its rapid turnaround relative to DSMC. The continuum CFD simulation solves the governing equations for plasma flow using a finite difference technique with an implicit Gauss-Seidel Line Relaxation method for time marching toward a converged solution. 
The equation set consists of mass conservation for each species, separate energy equations for the electrons and heavy species, and momentum equations for the gas
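    The Gauss-Seidel relaxation idea behind the solver described above can be shown on a toy problem: a steady 1-D diffusion equation swept point by point, with each update using the freshly computed neighbors. This is only a minimal stand-in for the implicit line-relaxation scheme applied to the coupled plasma equations; the grid size and tolerance are invented.

```python
# Toy Gauss-Seidel relaxation for steady 1-D diffusion u'' = f on [0, 1].

n = 21                       # grid points
u = [0.0] * n                # unknowns; Dirichlet BCs u[0] = 0, u[-1] = 1
u[-1] = 1.0
source = [0.0] * n           # source term (pre-scaled by h^2); zero here

for sweep in range(10000):   # Gauss-Seidel sweeps until converged
    max_change = 0.0
    for i in range(1, n - 1):
        new = 0.5 * (u[i - 1] + u[i + 1] - source[i])
        max_change = max(max_change, abs(new - u[i]))
        u[i] = new           # freshly updated value used immediately
    if max_change < 1e-10:
        break

# with zero source the exact solution is u(x) = x
print(u[n // 2])             # midpoint converges to ~0.5
```

    Line relaxation differs only in that an entire grid line is solved implicitly (a tridiagonal system) per update rather than a single point, which greatly accelerates convergence on stretched meshes.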

  10. Design, experimentation, and modeling of a novel continuous biodrying process

    NASA Astrophysics Data System (ADS)

    Navaee-Ardeh, Shahram

Massive production of sludge in the pulp and paper industry has made effective sludge management an increasingly critical issue for the industry due to high landfill and transportation costs, and complex regulatory frameworks for options such as sludge landspreading and composting. Sludge dewatering challenges are exacerbated at many mills due to improved in-plant fiber recovery coupled with increased production of secondary sludge, leading to a mixed sludge with a high proportion of biological matter which is difficult to dewater. In this thesis, a novel continuous biodrying reactor was designed and developed for drying pulp and paper mixed sludge to an economic dry solids level so that the dried sludge can be economically and safely combusted in a biomass boiler for energy recovery. In all experimental runs the economic dry solids level was achieved, proving the process successful. In the biodrying process, in addition to the forced aeration, the drying rates are enhanced by biological heat generated through the microbial activity of mesophilic and thermophilic microorganisms naturally present in the porous matrix of mixed sludge. This makes the biodrying process more attractive than conventional drying techniques because the reactor is a self-heating process. The reactor is divided into four nominal compartments, and the mixed sludge dries as it moves downward in the reactor. The residence times were 4-8 days, which are 2-3 times shorter than the residence times achieved in a batch biodrying reactor previously studied by our research group for mixed sludge drying. A process variable analysis was performed to determine the key variable(s) in the continuous biodrying reactor. Several variables were investigated, namely: type of biomass feed, pH of biomass, nutrition level (C/N ratio), residence times, recycle ratio of biodried sludge, and outlet relative humidity profile along the reactor height. 
The key variables that were identified in the continuous

  12. Rethinking Design Process: Using 3D Digital Models as an Interface in Collaborative Session

    ERIC Educational Resources Information Center

    Ding, Suining

    2008-01-01

    This paper describes a pilot study for an alternative design process by integrating a designer-user collaborative session with digital models. The collaborative session took place in a 3D AutoCAD class for a real world project. The 3D models served as an interface for designer-user collaboration during the design process. Students not only learned…

  13. Discussion: the design and analysis of the Gaussian process model

    SciTech Connect

    Williams, Brian J; Loeppky, Jason L

    2008-01-01

The investigation of complex physical systems utilizing sophisticated computer models has become commonplace with the advent of modern computational facilities. In many applications, experimental data on the physical systems of interest is extremely expensive to obtain and hence is available in limited quantities. The mathematical systems implemented by the computer models often include parameters having uncertain values. This article provides an overview of statistical methodology for calibrating uncertain parameters to experimental data. This approach assumes that prior knowledge about such parameters is represented as a probability distribution, and the experimental data is used to refine our knowledge about these parameters, expressed as a posterior distribution. Uncertainty quantification for computer model predictions of the physical system is based fundamentally on this posterior distribution. Computer models are generally not perfect representations of reality for a variety of reasons, such as inadequacies in the physical modeling of some processes in the dynamic system. The statistical model includes components that identify and adjust for such discrepancies. A standard approach to statistical modeling of computer model output for unsampled inputs is introduced for the common situation where limited computer model runs are available. Extensions of the statistical methods to functional outputs are available and discussed briefly.
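    The prior-to-posterior refinement described above can be sketched for a single uncertain parameter with a grid-based Bayes update. Everything here is invented for illustration: the "computer model" is a trivial linear function and the "experimental" data points are made up; in practice the simulator is expensive and is replaced by a Gaussian process emulator built from limited runs.

```python
# Toy grid-based Bayesian calibration of one uncertain model parameter.
import math

def model(x, theta):          # stand-in for an expensive simulator
    return theta * x

# invented "experimental" (x, y) data; measurement noise sigma assumed known
data = [(1.0, 2.05), (2.0, 3.95), (3.0, 6.10)]
sigma = 0.1

thetas = [1.0 + 0.01 * k for k in range(201)]   # grid over [1, 3]
prior = [1.0 / len(thetas)] * len(thetas)       # flat prior

def likelihood(theta):
    """Gaussian likelihood of the data given the model at theta."""
    sse = sum((y - model(x, theta)) ** 2 for x, y in data)
    return math.exp(-sse / (2.0 * sigma ** 2))

unnorm = [p * likelihood(t) for p, t in zip(prior, thetas)]
z = sum(unnorm)
posterior = [w / z for w in unnorm]             # refined knowledge of theta

theta_map = thetas[max(range(len(thetas)), key=posterior.__getitem__)]
print(theta_map)   # posterior mode lands near the least-squares value ~2.018
```

    Prediction uncertainty then follows by propagating draws from this posterior (plus a discrepancy term for model inadequacy) through the model.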

  14. Human performance model support for a human-centric design process

    NASA Astrophysics Data System (ADS)

    Campbell, Gwendolyn E.; Cannon-Bowers, Janis A.

    2000-11-01

For years, systems designers following a traditional design process have made use of models of hardware and software. A human-centric design process imposes additional requirements and analyses on the designer, and we believe that additional types of models -- models of human performance -- are necessary to support this approach to design. Fortunately, there have been recent technological advances in our ability to model all aspects of human performance. This paper will describe three specific applications of human performance modeling that we are exploring to support the design of human-centric systems, such as future Navy ships. Specifically, this technology can be used to generate team design concepts, to provide human-centric decision support for systems engineers, and to allow simulation-based evaluation of human performance. We believe that human performance modeling technology has matured to the point where it can play a significant role in the human-centric design process, reducing both cost and risk.

  15. A Digital Tool Set for Systematic Model Design in Process-Engineering Education

    ERIC Educational Resources Information Center

    van der Schaaf, Hylke; Tramper, Johannes; Hartog, Rob J.M.; Vermue, Marian

    2006-01-01

    One of the objectives of the process technology curriculum at Wageningen University is that students learn how to design mathematical models in the context of process engineering, using a systematic problem analysis approach. Students find it difficult to learn to design a model and little material exists to meet this learning objective. For these…

  16. Behavioral modeling and simulation for the design process of aerospatial micro-instrumentation based on MEMS

    NASA Astrophysics Data System (ADS)

    Barrachina, L.; Lorente, B.; Ferrer, C.

    2006-05-01

The extended use of microelectromechanical systems (MEMS) in the development of new microinstrumentation for aerospatial applications, which combines extreme sensitivity, accuracy, and compactness, has introduced the need to simplify the design process in order to reduce design time and cost. The recent appearance of analogue and mixed-signal extensions of hardware description languages (VHDL-AMS, Verilog-AMS, and SystemC-AMS) makes it possible to co-simulate the HDL (VHDL and Verilog) design models for the digital signal processing and communication circuitry with behavioral models for the non-digital parts (analog and mixed-signal processing, RF circuitry, and MEMS components). From the beginning of the microinstrumentation design process, modeling and simulation can help to better define the specifications, support architecture selection, and ground the SoC design process in a more realistic environment. We will present our experience in applying these languages to the design of microinstruments using behavioral modeling of MEMS.

  17. Cost model relationships between textile manufacturing processes and design details for transport fuselage elements

    NASA Technical Reports Server (NTRS)

    Metschan, Stephen L.; Wilden, Kurtis S.; Sharpless, Garrett C.; Andelman, Rich M.

    1993-01-01

    Textile manufacturing processes offer potential cost and weight advantages over traditional composite materials and processes for transport fuselage elements. In the current study, design cost modeling relationships between textile processes and element design details were developed. Such relationships are expected to help future aircraft designers to make timely decisions on the effect of design details and overall configurations on textile fabrication costs. The fundamental advantage of a design cost model is to insure that the element design is cost effective for the intended process. Trade studies on the effects of processing parameters also help to optimize the manufacturing steps for a particular structural element. Two methods of analyzing design detail/process cost relationships developed for the design cost model were pursued in the current study. The first makes use of existing databases and alternative cost modeling methods (e.g. detailed estimating). The second compares design cost model predictions with data collected during the fabrication of seven foot circumferential frames for ATCAS crown test panels. The process used in this case involves 2D dry braiding and resin transfer molding of curved 'J' cross section frame members having design details characteristic of the baseline ATCAS crown design.

  18. High School Student Modeling in the Engineering Design Process

    ERIC Educational Resources Information Center

    Mentzer, Nathan; Huffman, Tanner; Thayer, Hilde

    2014-01-01

    A diverse group of 20 high school students from four states in the US were individually provided with an engineering design challenge. Students chosen were in capstone engineering courses and had taken multiple engineering courses. As students considered the problem and developed a solution, observational data were recorded and artifacts…

  19. Business Process Design Method Based on Business Event Model for Enterprise Information System Integration

    NASA Astrophysics Data System (ADS)

    Kobayashi, Takashi; Komoda, Norihisa

The traditional business process design methods, of which the use case is the most typical, provide no useful framework for designing the activity sequence. Therefore, the design efficiency and quality vary widely according to the designer’s experience and skill. In this paper, to solve this problem, we propose a model of business events and their state transitions (a basic business event model) based on the language/action perspective, a result from the cognitive science domain. In business process design using this model, we decide event occurrence conditions so that all events synchronize with one another. We also propose a design pattern for deciding the event occurrence condition (a business event improvement strategy). Lastly, we apply the business process design method based on the business event model and the business event improvement strategy to the credit card issue process and evaluate its effect.
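    The event/state-transition idea can be sketched as a tiny model in which each business event fires only when its occurrence condition over the current state is satisfied, keeping events synchronized. The event names and conditions below are invented for illustration, loosely following a card-issue flow; they are not the authors' actual model.

```python
# Minimal business-event state-transition sketch. Names are hypothetical.

state = {"application_received": False,
         "credit_checked": False,
         "card_issued": False}

# event -> occurrence condition evaluated over the current state
conditions = {
    "application_received": lambda s: True,
    "credit_checked":       lambda s: s["application_received"],
    "card_issued":          lambda s: s["credit_checked"],
}

def fire(event):
    """Fire an event only if its occurrence condition holds."""
    if not conditions[event](state):
        raise RuntimeError(f"{event}: occurrence condition not met")
    state[event] = True

for ev in ("application_received", "credit_checked", "card_issued"):
    fire(ev)
print(state)  # every event has fired, in a synchronized order
```

    Designing the occurrence conditions is exactly where the proposed improvement strategy operates: loosening or tightening a condition reorders or parallelizes the activity sequence.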

  20. SmartWeld/SmartProcess - intelligent model based system for the design and validation of welding processes

    SciTech Connect

    Mitchner, J.

    1996-04-01

Diagrams are presented on an intelligent model based system for the design and validation of welding processes. Key capabilities identified include "right the first time" manufacturing, continuous improvement, and on-line quality assurance.

  1. Elementary teachers' mental models of engineering design processes: A comparison of two communities of practice

    NASA Astrophysics Data System (ADS)

    McMahon, Ann P.

    Educating K-12 students in the processes of design engineering is gaining popularity in public schools. Several states have adopted standards for engineering design despite the fact that no common agreement exists on what should be included in the K-12 engineering design process. Furthermore, little pre-service and in-service professional development exists that will prepare teachers to teach a design process that is fundamentally different from the science teaching process found in typical public schools. This study provides a glimpse into what teachers think happens in engineering design compared to articulated best practices in engineering design. Wenger's communities of practice work and van Dijk's multidisciplinary theory of mental models provide the theoretical bases for comparing the mental models of two groups of elementary teachers (one group that teaches engineering and one that does not) to the mental models of design engineers (including this engineer/researcher/educator and professionals described elsewhere). The elementary school teachers and this engineer/researcher/educator observed the design engineering process enacted by professionals, then answered questions designed to elicit their mental models of the process they saw in terms of how they would teach it to elementary students. The key finding is this: Both groups of teachers embedded the cognitive steps of the design process into the matrix of the social and emotional roles and skills of students. Conversely, the engineers embedded the social and emotional aspects of the design process into the matrix of the cognitive steps of the design process. In other words, teachers' mental models show that they perceive that students' social and emotional communicative roles and skills in the classroom drive their cognitive understandings of the engineering process, while the mental models of this engineer/researcher/educator and the engineers in the video show that we perceive that cognitive

  2. Modeling and Analysis of Power Processing Systems. [use of a digital computer for designing power plants

    NASA Technical Reports Server (NTRS)

    Fegley, K. A.; Hayden, J. H.; Rehmann, D. W.

    1974-01-01

    The feasibility of formulating a methodology for the modeling and analysis of aerospace electrical power processing systems is investigated. It is shown that a digital computer may be used in an interactive mode for the design, modeling, analysis, and comparison of power processing systems.

  3. Improving the Aircraft Design Process Using Web-based Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.

    2003-01-01

Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.

  4. Improving the Aircraft Design Process Using Web-Based Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.; Follen, Gregory J. (Technical Monitor)

    2000-01-01

    Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.

  5. The Impact of Building Information Modeling on the Architectural Design Process

    NASA Astrophysics Data System (ADS)

    Moreira, P. F.; Silva, Neander F.; Lima, Ecilamar M.

Many benefits of Building Information Modeling, BIM, have been suggested by several authors and by software vendors. In this paper we describe an experiment in which two groups of designers were observed developing an assigned design task. One of the groups used a BIM system, while the other used a standard computer-aided drafting system. The results show that some of the promises of BIM hold true, such as consistency maintenance and error avoidance in the design documentation process. Other promises, such as changing the design process itself, also seemed promising, but more research is needed to determine the depth of such changes.

  6. A novel double loop control model design for chemical unstable processes.

    PubMed

    Cong, Er-Ding; Hu, Ming-Hui; Tu, Shan-Tung; Xuan, Fu-Zhen; Shao, Hui-He

    2014-03-01

In this manuscript, based on the Smith predictor control scheme for unstable industrial processes, an improved double loop control model is proposed for unstable chemical processes. The inner loop stabilizes the unstable process and transforms the original process into a stable first-order plus pure dead-time process. The outer loop enhances the performance of the set-point response, and a disturbance controller is designed to enhance the performance of the disturbance response. The improved control system is simple and has a clear physical meaning, and its characteristic equation is easy to stabilize. The three controllers in the improved scheme are designed separately, so each controller is easy to design and gives good control performance for its respective closed-loop transfer function. The robust stability of the proposed control scheme is analyzed. Finally, case studies illustrate that the improved method can give better system performance than existing design methods. PMID:24309506
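    The role of the inner loop (stabilize first, shape the response afterwards) can be illustrated with a toy simulation: an unstable first-order process dx/dt = a·x + b·u is stabilized by proportional feedback, moving the pole from +a to a - b·kc. This is only a sketch of the stabilization idea, not the authors' full double-loop/Smith-predictor design; the gains and Euler step are invented.

```python
# Inner-loop stabilization sketch for an unstable first-order process.

a, b = 0.5, 1.0          # unstable open-loop pole at +0.5
kc = 2.0                 # proportional gain; closed-loop pole a - b*kc = -1.5
r = 1.0                  # set point
dt, steps = 0.01, 2000   # explicit Euler integration over a 20 s horizon

x = 0.0
for _ in range(steps):
    u = kc * (r - x)              # inner-loop control law
    x += dt * (a * x + b * u)     # Euler step of dx/dt = a*x + b*u

# steady state of the stabilized loop is b*kc*r / (b*kc - a) = 4/3
print(round(x, 3))
```

    With the loop stabilized, the outer loop and disturbance controller of the proposed scheme would then be tuned independently against the now-stable dynamics, which is what makes the separate design of the three controllers tractable.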

  7. Studies in process modeling, design, monitoring, and control, with applications to polymer composites manufacturing

    NASA Astrophysics Data System (ADS)

    Srinivasagupta, Deepak

    2002-01-01

    High material and manufacturing costs have hindered the introduction of advanced polymer composite materials into mainstream civilian applications such as automotive. Even though high-fidelity models for several polymer composite manufacturing processes have become available over the past several years and offer significant benefits in manufacturing cost reduction, concerns about their inflexibility and maintenance have adversely affected their widespread usage. This research seeks to advance process modeling and design in polymer composites manufacturing to address these concerns. Other, more general issues in measurement validation and distributed control are also addressed. Using a rigorous 3-D model of the injected pultrusion (IP) process validated recently, an algorithm was developed for process and equipment design with integrated economic, operability, and environmental considerations. The optimum design promised enhanced throughput as well as a reduction in the time and expense of current, purely experimental approaches. Scale-up issues in IP were analyzed, and refinements to overcome some drawbacks in the model were suggested. The process model was then extended to simulate the co-injection resin transfer molding (CIRTM) process used for the manufacture of foam-core sandwich composites. A 1-D isothermal model for real-time control was also developed. Process optimization using these models and experimental parametric studies increased the debond fracture toughness of sandwiches by 78% over current technology. To ensure the availability of validated measurements from process instrumentation, a novel in-situ sensor modeling approach to sensor validation was proposed. Active and passive techniques, in both the time and frequency domains, were developed and experimentally verified using temperature and flow sensors. A model-based dynamic estimator to predict the true measurement online was also validated. 
The effect of network communication delay on stability and control

  8. A frequency response model matching method for PID controller design for processes with dead-time.

    PubMed

    Anwar, Md Nishat; Pan, Somnath

    2015-03-01

    In this paper, a PID controller design method for integrating processes, based on frequency response matching, is presented. Two approaches are proposed for the controller design. In the first approach, a double feedback loop configuration is considered, where the inner loop is designed with a stabilizing gain. In the outer loop, the parameters of the PID controller are obtained by frequency response matching between the closed-loop system with the PID controller and a reference model with the desired specifications. In the second approach, the design is carried out directly, considering a desired load-disturbance rejection model of the system. In both approaches, two low-frequency points are considered for matching the frequency response, which yields linear algebraic equations whose solution gives the controller parameters. Several examples are taken from the literature to demonstrate the effectiveness of the method and to compare it with some well-known design methods. PMID:25441218
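
A rough sketch of the matching step (not the paper's exact formulation; the plant, reference model, and matching frequencies below are illustrative assumptions): the controller response needed to make the unity-feedback closed loop equal a reference model M is C(jw) = M(jw) / (P(jw)(1 - M(jw))), which is linear in the PID gains, so matching at a few low-frequency points reduces to a small linear least-squares problem:

```python
import numpy as np

def pid_by_freq_matching(plant, reference, freqs=(0.05, 0.1)):
    """Solve for (kp, ki, kd) so that the unity-feedback closed loop
    approximates the reference model at the given frequencies.
    C(jw) = kp + ki/(jw) + kd*(jw) has real part kp and imaginary part
    (-ki/w + kd*w), so each frequency contributes two linear equations.
    `plant` and `reference` are callables returning complex responses."""
    rows, rhs = [], []
    for w in freqs:
        s = 1j * w
        c_target = reference(s) / (plant(s) * (1.0 - reference(s)))
        rows.append([1.0, 0.0, 0.0]);    rhs.append(c_target.real)
        rows.append([0.0, -1.0 / w, w]); rhs.append(c_target.imag)
    gains, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return tuple(gains)  # (kp, ki, kd)

# Illustrative use: plant P(s) = 1/(s(s+1)) with reference model
# M(s) = 1/(2s+1)^2; here the exact matching controller happens to be
# the pure gain C = 0.25, so the solve recovers kp=0.25, ki=kd=0.
kp, ki, kd = pid_by_freq_matching(lambda s: 1.0 / (s * (s + 1)),
                                  lambda s: 1.0 / (2 * s + 1) ** 2)
```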

  9. Process Design of Cryogenic Distribution System for CFETR CS Model Coil

    NASA Astrophysics Data System (ADS)

    Cheng, Anyi; Zhang, Qiyong; Fu, Bao; Lu, Xiaofei

    2016-02-01

    The superconducting magnet of the Central Solenoid (CS) model coil of the China Fusion Engineering Test Reactor (CFETR) is made of Nb3Sn/NbTi cable-in-conduit conductor (CICC) and is operated with forced-flow cooling using a large amount of supercritical helium (SHe). A cryogenic circulation pump is analyzed and considered effective for achieving SHe circulation in the forced-flow-cooled (FFC) CICC magnet. A cryogenic distribution system will be constructed for cooling the CFETR CS model coil. This paper presents the design of the FFC process for the CFETR CS model coil. The equipment configuration, quench protection in the magnet, and the process control are presented.

  10. Model of Values-Based Management Process in Schools: A Mixed Design Study

    ERIC Educational Resources Information Center

    Dogan, Soner

    2016-01-01

    The aim of this paper is to evaluate school administrators' values-based management behaviours according to teachers' perceptions and opinions and, accordingly, to build a model of the values-based management process in schools. The study was conducted using an explanatory design inclusive of both quantitative and qualitative methods.…

  11. A Problem-Based Learning Model for Teaching the Instructional Design Business Acquisition Process.

    ERIC Educational Resources Information Center

    Kapp, Karl M.; Phillips, Timothy L.; Wanner, Janice H.

    2002-01-01

    Outlines a conceptual framework for using a problem-based learning model for teaching the Instructional Design Business Acquisition Process. Discusses writing a response to a request for proposal, developing a working prototype, orally presenting the solution, and the impact of problem-based learning on students' perception of their confidence in…

  12. Letter Report. Defense Waste Processing Facility Pour Spout Heaters - Conceptual Designs and Modeling

    SciTech Connect

    SK Sundaram; JM Perez, Jr.

    2000-09-06

    The Tanks Focus Area (TFA) identified a major task to address performance limitations and deficiencies of the Defense Waste Processing Facility (DWPF), now in its sixth year of operation. Design, installation, testing, monitoring, operability, and a number of other characteristics were studied collaboratively by research personnel at a number of facilities: Savannah River Technology Center (SRTC), Clemson Environmental Technologies Laboratory (CETL), Pacific Northwest National Laboratory (PNNL), and the Idaho National Engineering and Environmental Laboratory (INEEL). Because the pour spout/riser heater was identified as a potential limiting feature of the DWPF, research on alternative design concepts originally proposed in the past was revisited. In the original work, finite element modeling was performed to evaluate the temperature distribution and stress of the design currently used at the DWPF. Studies were also made to define the requirements of the design and to consider approaches for remote removal/replacement. The five proposed alternative designs were characterized by their heater type and location, remotely replaceable thermocouples, and remote-handling capabilities. Review comments on the alternative designs indicated a relatively wide range of advantages and disadvantages. The present report provides an overview of the design criteria, modeling results, and alternative designs. Based on a review of past design optimization activities and an assessment of recent experience, recommendations are proposed for future consideration and improvement.

  13. A Conceptual Aerospace Vehicle Structural System Modeling, Analysis and Design Process

    NASA Technical Reports Server (NTRS)

    Mukhopadhyay, Vivek

    2007-01-01

    A process for aerospace structural concept analysis and design is presented, with examples of a blended-wing-body fuselage, a multi-bubble fuselage concept, a notional crew exploration vehicle, and a high-altitude long-endurance aircraft. Aerospace vehicle structures must withstand all anticipated mission loads, yet must be designed to have optimal structural weight with the required safety margins. For a viable systems study of advanced concepts, these conflicting requirements must be imposed and analyzed early in the conceptual design cycle, preferably with a high degree of fidelity. In this design process, integrated multidisciplinary analysis tools are used in a collaborative engineering environment. First, parametric solid and surface models, including the internal structural layout, are developed for detailed finite element analyses. Multiple design scenarios are generated for analyzing several structural configurations and material alternatives. The structural stress, deflection, strain, and margin-of-safety distributions are visualized and the design is improved. Over several design cycles, the refined vehicle parts and assembly models are generated. The accumulated design data are used for structural mass comparison and concept ranking. The present application focuses on the blended-wing-body vehicle structure; advanced composite materials are also discussed.

  14. Designing geo-spatial interfaces to scale process models: the GeoWEPP approach

    NASA Astrophysics Data System (ADS)

    Renschler, Chris S.

    2003-04-01

    Practical decision making in spatially distributed environmental assessment and management is increasingly based on environmental process models linked to geographical information systems. Powerful personal computers and Internet-accessible assessment tools are providing much greater public access to, and use of, environmental models and geo-spatial data. However, traditional process models, such as the water erosion prediction project (WEPP), were not typically developed with a flexible graphical user interface (GUI) for applications across a wide range of spatial and temporal scales, for utilizing readily available geo-spatial data of highly variable precision and accuracy, or for communicating with a diverse spectrum of users with different levels of expertise. As the development of the geo-spatial interface for WEPP (GeoWEPP) demonstrates, the GUI plays a key role in facilitating effective communication between the tool developer and the user about data and model scales. The GeoWEPP approach illustrates that it is critical to develop a scientific and functional framework for the design, implementation, and use of such geo-spatial model assessment tools. The way GeoWEPP was developed and implemented suggests a framework and scaling theory leading to a practical approach for developing geo-spatial interfaces for process models. GeoWEPP accounts for fundamental water erosion processes, the model, and users' needs; most importantly, it also matches realistic data availability and environmental settings by enabling even non-GIS-literate users to assemble the available geo-spatial data quickly and start soil and water conservation planning. In general, it is potential users' spatial and temporal scales of interest, and the scales of readily available data, that should drive model design or selection, as opposed to using or designing the most sophisticated process model as the starting point and then determining data needs and result scales.

  15. Adapting Rational Unified Process (RUP) approach in designing a secure e-Tendering model

    NASA Astrophysics Data System (ADS)

    Mohd, Haslina; Robie, Muhammad Afdhal Muhammad; Baharom, Fauziah; Darus, Norida Muhd; Saip, Mohamed Ali; Yasin, Azman

    2016-08-01

    e-Tendering is the electronic processing of tender documents via the Internet, allowing tenderers to publish, communicate, access, receive, and submit all tender-related information and documentation online. This study aims to design an e-Tendering system using the Rational Unified Process (RUP) approach. RUP provides a disciplined approach to assigning tasks and responsibilities within the software development process. RUP has four phases that can assist researchers in adjusting the requirements of projects with different scopes, problems, and sizes. RUP is characterized as a use-case-driven, architecture-centered, iterative, and incremental process model. However, the scope of this study covers only the Inception and Elaboration phases as steps toward developing the model, and performs only three of the nine workflows (business modeling, requirements, and analysis and design). RUP has a strong focus on documents, and the activities in the Inception and Elaboration phases mainly concern the creation of diagrams and the writing of textual descriptions. The UML notation and the software program StarUML are used to support the design of e-Tendering. The e-Tendering design based on the RUP approach can benefit e-Tendering developers and researchers in the e-Tendering domain. In addition, this study also shows that RUP is one of the best system development methodologies and can be used as a research methodology in the Software Engineering domain for the secure design of any observed application. This methodology has been tested in various studies in certain domains, such as simulation-based decision support, security requirements engineering, and business modeling and secure system requirements. In conclusion, these studies showed that RUP is a good research methodology that can be adapted in any Software Engineering (SE) research domain that requires a few artifacts to be generated, such as use case modeling, misuse case modeling, activity

  16. Error detection process - Model, design, and its impact on computer performance

    NASA Technical Reports Server (NTRS)

    Shin, K. G.; Lee, Y.-H.

    1984-01-01

    An analytical model is developed for computer error detection processes and applied to estimate their influence on system performance. Faults in the hardware, not in the design, are assumed to be the potential cause of transition to erroneous states during normal operations. The classification properties and associated recovery methods of error detection are discussed. The probability of obtaining an unreliable result is evaluated, along with the resulting computational loss. Error detection during design is considered and a feasible design space is outlined. Extension of the methods to account for the effects of extant multiple faults is indicated.

  17. Parameter-Level Data Flow Modeling Oriented to Product Design Process

    NASA Astrophysics Data System (ADS)

    Li, Shen; Shao, Xiao Dong; Zhang, Zhi Hua; Ge, Xiao Bo

    2015-12-01

    In this paper, a method of data flow modeling for the product design process, oriented to data parameters, is proposed. Data parameters are defined and classified as either basic or complex. A mechanism mapping different forms of documents onto basic data parameters, together with parameter-based data transmission, is constructed. To address the iterative nature of the design process, a parameter version mechanism that records node modification and iteration information is proposed. The data parameter transmission relationships are represented by a parameters network model (PNM) based on a directed graph. Finally, from the table mapping data parameters onto workflow nodes and from the PNM, the data ports and data links in the data flow model are generated automatically by the program. Validation on the "Reflector, Back frame and Center part design" data flow model from the 15-meter-diameter S/Ka-band antenna design process shows that the method can effectively shorten data flow modeling time and improve data transmission efficiency.
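
The directed-graph propagation behind a parameters network model can be sketched in a few lines. The parameter names and rules below are hypothetical (a diameter-to-area-to-mass chain loosely inspired by the antenna example), and Kahn's topological sort is one standard way to guarantee that every parameter is computed after its predecessors:

```python
from collections import defaultdict, deque

def propagate(pnm, rules, inputs):
    """Push values through a toy parameters network model (PNM).
    `pnm` maps a parameter to the parameters it feeds, `rules` maps each
    derived parameter to a function of its predecessor values, and
    `inputs` holds the known source values.  Kahn's algorithm processes
    nodes in topological order, so predecessors are always ready."""
    indeg, preds = defaultdict(int), defaultdict(list)
    nodes = set(pnm)
    for src, targets in pnm.items():
        for t in targets:
            nodes.add(t)
            indeg[t] += 1
            preds[t].append(src)
    values = dict(inputs)
    queue = deque(n for n in nodes if indeg[n] == 0)
    while queue:
        n = queue.popleft()
        if n in rules:
            values[n] = rules[n](*(values[p] for p in preds[n]))
        for t in pnm.get(n, ()):
            indeg[t] -= 1
            if indeg[t] == 0:
                queue.append(t)
    return values

# Hypothetical parameter chain: diameter -> area -> mass.
result = propagate(
    pnm={"diameter": ["area"], "area": ["mass"]},
    rules={"area": lambda d: 0.785398 * d * d,   # ~ (pi/4) * d^2
           "mass": lambda a: 12.0 * a},          # assumed areal density
    inputs={"diameter": 15.0},
)
```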

  18. Ares Upper Stage Processes to Implement Model Based Design - Going Paperless

    NASA Technical Reports Server (NTRS)

    Gregory, Melanie

    2012-01-01

    Computer-Aided Design (CAD) has all but replaced the drafting board for design work. Increased productivity and accuracy should be natural outcomes of using CAD. Going from paper drawings only, to paper drawings based on CAD models, to CAD models and no drawings, or Model Based Design (MBD), is a natural progression in today's world. There are many advantages to MBD over traditional design methods. To make the most of those advantages, standards should be in place and the proper foundation should be laid prior to transitioning to MBD. However, without a full understanding of the implications of MBD and proper control of the data, the advantages are greatly diminished. Transitioning from a paper design world to an electronic design world means re-thinking how information is controlled at its origin and distributed from one point to another. It means design methodology is critical, especially for large projects. It means preparing standardized parts and processes, as well as maintaining strong communication between all parties, in order to maximize the benefits of MBD.

  19. Future integrated design process

    NASA Technical Reports Server (NTRS)

    Meyer, D. D.

    1980-01-01

    The design process is one of the sources used to produce requirements for a computer system to integrate and manage product design data, program management information, and technical computation and engineering data management activities of the aerospace design process. Design activities were grouped chronologically and explored for activity type, activity interface, data quantity, and data flow. The work was based on analysis of the design process of several typical aerospace products, including both conventional and supersonic airplanes and a hydrofoil design. Activities examined included research, preliminary design, detail design, manufacturing interface, product verification, and product support. The design process was then described in an IPAD environment--the future.

  20. A fuzzy model based adaptive PID controller design for nonlinear and uncertain processes.

    PubMed

    Savran, Aydogan; Kahraman, Gokalp

    2014-03-01

    We develop a novel adaptive tuning method that adjusts the gains of a classical proportional-integral-derivative (PID) controller for nonlinear processes, a problem that is very difficult to overcome with classical PID controllers. By incorporating classical PID control, which is well known in industry, into the control of nonlinear processes, we introduce a method that can readily be used by industry. In this method, the controller design does not require a first-principles model of the process, which is usually very difficult to obtain; instead, it depends on a fuzzy process model constructed from measured input-output data of the process. A soft limiter is used to impose industrial limits on the control input. The performance of the system is successfully tested on a bioreactor, a highly nonlinear process involving instabilities. Several tests showed the method's success in tracking, robustness to noise, and adaptation. We also compared our system's performance on a plant with altered parameters and measurement noise, and obtained less ringing and better tracking. To conclude, we present a novel adaptive control method, built upon the well-known PID architecture, that successfully controls highly nonlinear industrial processes, even under conditions such as strong parameter variations, noise, and instabilities. PMID:24140160

  1. New process modeling [sic], design, and control strategies for energy efficiency, high product quality, and improved productivity in the process industries. Final project report

    SciTech Connect

    Ray, W. Harmon

    2002-06-05

    This project was concerned with the development of process design and control strategies for improving energy efficiency, product quality, and productivity in the process industries. In particular, (i) the resilient design and control of chemical reactors and (ii) the operation of complex processing systems were investigated. Specific topics studied included new process modeling procedures, nonlinear controller designs, and control strategies for multiunit integrated processes. Both fundamental and immediately applicable results were obtained. The new design and operation results from this project were incorporated into computer-aided design software and disseminated to industry. The principles and design procedures have found their way into industrial practice.

  2. (New process modeling, design and control strategies for energy efficiency, high product quality and improved productivity in the process industries)

    SciTech Connect

    Not Available

    1991-01-01

    Highlights are reported of work to date on: resilient design and control of chemical reactors (polymerization, packed bed), operation of complex processing systems (compensators for multivariable systems with delays and Right Half Plane zeroes, process identification and controller design for multivariable systems, nonlinear systems control, distributed parameter systems), and computer-aided design software (CONSYD, POLYRED, expert systems). 15 figs, 54 refs. (DLC)

  3. [New process modeling, design and control strategies for energy efficiency, high product quality and improved productivity in the process industries

    SciTech Connect

    Not Available

    1991-12-31

    Highlights are reported of work to date on: resilient design and control of chemical reactors (polymerization, packed bed), operation of complex processing systems (compensators for multivariable systems with delays and Right Half Plane zeroes, process identification and controller design for multivariable systems, nonlinear systems control, distributed parameter systems), and computer-aided design software (CONSYD, POLYRED, expert systems). 15 figs, 54 refs. (DLC)

  4. Design and tuning of standard additive model based fuzzy PID controllers for multivariable process systems.

    PubMed

    Harinath, Eranda; Mann, George K I

    2008-06-01

    This paper describes a design and two-level tuning method for fuzzy proportional-integral-derivative (FPID) controllers for multivariable processes, where the fuzzy inference uses the standard additive model. The proposed method can be used for any n x n multi-input-multi-output process and guarantees closed-loop stability. The two-level tuning scheme proceeds in two steps: low-level tuning followed by high-level tuning. The low-level tuning adjusts apparent linear gains, whereas the high-level tuning changes the nonlinearity in the normalized fuzzy output. In this paper, two types of FPID configurations are considered, and their performances are evaluated using a real-time multizone temperature control problem with a 3 x 3 process system. PMID:18558531
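
The standard additive model (SAM) inference at the core of such controllers can be sketched briefly. The triangular membership functions and the two rules below are illustrative assumptions; the crisp output is the firing-strength- and volume-weighted centroid average F(x) = sum(a_i*V_i*c_i) / sum(a_i*V_i):

```python
def tri(a, b, c):
    """Triangular membership function on [a, c] peaking at b."""
    def mu(x):
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)
    return mu

def sam_inference(x, rules):
    """SAM defuzzified output: each rule is (membership, volume,
    centroid), and the output is sum(a*V*c) / sum(a*V) over the
    rules fired by input x."""
    num = den = 0.0
    for membership, volume, centroid in rules:
        a = membership(x)
        num += a * volume * centroid
        den += a * volume
    return num / den if den else 0.0

# Two overlapping illustrative rules on a normalized error input.
rules = [(tri(-1.0, 0.0, 1.0), 1.0, 0.0),  # "error near zero -> output 0"
         (tri(0.0, 1.0, 2.0), 1.0, 1.0)]   # "error positive  -> output 1"
```

At x = 0.5 both rules fire with strength 0.5 and the output interpolates halfway between the two centroids.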

  5. Process modeling and supply chain design for advanced biofuel production based on bio-oil gasification

    NASA Astrophysics Data System (ADS)

    Li, Qi

    As a potential substitute for petroleum-based fuel, second-generation biofuels are playing an increasingly important role due to their economic, environmental, and social benefits. With the rapid development of the biofuel industry, there has been a growing literature on techno-economic analysis and supply chain design for biofuel production based on a variety of production pathways. A recently proposed production pathway for advanced biofuel is to convert biomass to bio-oil at widely distributed small-scale fast pyrolysis plants, then gasify the bio-oil to syngas and upgrade the syngas to transportation fuels in a centralized biorefinery. This thesis investigates two types of assessment of this bio-oil gasification pathway: techno-economic analysis based on process modeling and literature data, and supply chain design focused on the optimal number of facilities to build, facility capacities, and logistics decisions under uncertainty. A detailed process model with corn stover as feedstock and liquid fuels as the final products is presented. A techno-economic analysis of the bio-oil gasification pathway is also discussed to assess economic feasibility. Preliminary results show a capital investment of 438 million dollars and a minimum fuel selling price (MSP) of $5.6 per gallon of gasoline equivalent. The sensitivity analysis finds that the MSP is most sensitive to the internal rate of return (IRR), biomass feedstock cost, and fixed capital cost. A two-stage stochastic program is formulated to solve the supply chain design problem considering uncertainties in biomass availability, technology advancement, and biofuel price. The first stage makes the capital investment decisions, including the locations and capacities of the decentralized fast pyrolysis plants and the centralized biorefinery, while the second stage determines the biomass and biofuel flows. 
The numerical results and case study illustrate that considering uncertainties can be
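
The two-stage structure can be illustrated with a deliberately tiny capacity-planning toy, far simpler than the thesis's model: all prices, costs, and the single-facility simplification are assumptions. The first stage fixes capacity before the biomass-availability scenario is revealed; the second-stage recourse simply sells whatever the capacity and the realized biomass allow:

```python
def best_capacity(candidates, scenarios, price=3.0, capex_per_unit=1.0):
    """Toy two-stage stochastic capacity choice.
    First stage: pick a capacity from `candidates`.
    Second stage (recourse): in each scenario, sell
    min(capacity, available biomass) at `price`.
    `scenarios` is a list of (probability, biomass_available) pairs;
    expected profit is maximized by direct enumeration."""
    def expected_profit(cap):
        revenue = sum(p * price * min(cap, b) for p, b in scenarios)
        return revenue - capex_per_unit * cap
    return max(candidates, key=expected_profit)
```

With candidate capacities [0, 50, 100, 150] and two equally likely biomass scenarios of 60 and 120 units, the hedged middle choice of 100 beats both the conservative 50 and the aggressive 150, which is exactly the kind of effect a deterministic (single-scenario) design would miss.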

  6. Process Design of Wastewater Treatment for the NREL Cellulosic Ethanol Model

    SciTech Connect

    Steinwinder, T.; Gill, E.; Gerhardt, M.

    2011-09-01

    This report describes a preliminary process design for treating the wastewater from NREL's cellulosic ethanol production process to the quality levels required for recycle. In this report, Brown and Caldwell report on three main tasks: 1) characterization of the effluent from NREL's ammonia-conditioned hydrolyzate fermentation process; 2) development of the wastewater treatment process design; and 3) development of a capital and operational cost estimate for the treatment concept option. This wastewater treatment design was incorporated into NREL's cellulosic ethanol process design update published in May 2011 (NREL/TP-5100-47764).

  7. The Sulfur-Iodine Cycle: Process Analysis and Design Using Comprehensive Phase Equilibrium Measurements and Modeling

    SciTech Connect

    Thies, Mark C.; O'Connell, J. P.; Gorensek, Maximilian B.

    2010-01-10

    Of the 100+ thermochemical hydrogen cycles that have been proposed, the Sulfur-Iodine (S-I) Cycle is a primary target of international interest for the centralized production of hydrogen from nuclear power. However, the cycle involves complex and highly nonideal phase behavior at extreme conditions that is only beginning to be understood and modeled for process simulation. The consequence is that current designs and efficiency projections have large uncertainties, as they are based on incomplete data that must be extrapolated from property models. This situation prevents reliable assessment of the potential viability of the system and, even more, a basis for efficient process design. The goal of this NERI award (05-006) was to generate phase-equilibrium data, property models, and comprehensive process simulations so that an accurate evaluation of the S-I Cycle could be made. Our focus was on Section III of the Cycle, where the hydrogen is produced by decomposition of hydroiodic acid (HI) in the presence of water and iodine (I2) in a reactive distillation (RD) column. The results of this project were to be transferred to the nuclear hydrogen community in the form of reliable flowsheet models for the S-I process. Many of the project objectives were achieved. At Clemson University, a unique, tantalum-based, phase-equilibrium apparatus incorporating a view cell was designed and constructed for measuring fluid-phase equilibria for mixtures of iodine, HI, and water (known as HIx) at temperatures to 350 °C and pressures to 100 bar. Such measurements were of particular interest for developing a working understanding of the expected operation of the RD column in Section III. The view cell allowed for the IR observation and discernment of vapor-liquid (VL), liquid-liquid, and liquid-liquid-vapor (LLVE) equilibria for HIx systems. For the I2-H2O system, liquid-liquid equilibrium (LLE) was discovered to exist at temperatures up to 310-315 °C, in contrast to the models and

  8. Statistics-enhanced multistage process models for integrated design &manufacturing of poly (vinyl alcohol) treated buckypaper

    NASA Astrophysics Data System (ADS)

    Wang, Kan

    Carbon nanotubes (CNTs) are considered promising engineering materials because of their exceptional mechanical, electrical, and thermal properties. Buckypaper (BP), a thin sheet of assembled CNTs, is an effective way to handle CNTs at the macro scale. Pristine BP is a fragile material held together by weak van der Waals attractions among CNTs. This dissertation introduces a modified filtration-based manufacturing process that uses poly (vinyl alcohol) (PVA) to treat BP. This treatment greatly improves the handleability of BP, reduces spoilage during transfer, and shortens production time. The multistage manufacturing process of PVA-treated BP is discussed in this dissertation, and process models are developed to predict the nanostructure of the final products from the process parameters. Based on the nanostructure, a finite-element-based physical model for predicting Young's modulus is also developed. The accuracy of this physical model is further improved by statistical methods. The aim of this study is to investigate and improve the scalability of the manufacturing process of PVA-treated BP. To achieve this goal, various statistical tools are employed. The unique issues in nanomanufacturing also motivate the development of new statistical tools and the modification of existing ones. Those issues include uncertainty in nanostructure characterization due to the scale, the limited amount of experimental data due to the high cost of raw materials, large variation in the final product due to the random nature of the structure, and the high complexity of the physical models due to the small scale of the structural building blocks. This dissertation addresses those issues by combining engineering field knowledge and statistical methods. The resulting statistics-enhanced physical model provides an approach to designing the manufacturing process of PVA-treated BP for a target property and tailoring the robustness of the final product by manipulating the process parameters. 
In addition

  9. Robust Brain-Machine Interface Design Using Optimal Feedback Control Modeling and Adaptive Point Process Filtering.

    PubMed

    Shanechi, Maryam M; Orsborn, Amy L; Carmena, Jose M

    2016-04-01

    Much progress has been made in brain-machine interfaces (BMI) using decoders such as Kalman filters and finding their parameters with closed-loop decoder adaptation (CLDA). However, current decoders do not model the spikes directly, and hence may limit the processing time-scale of BMI control and adaptation. Moreover, while specialized CLDA techniques for intention estimation and assisted training exist, a unified and systematic CLDA framework that generalizes across different setups is lacking. Here we develop a novel closed-loop BMI training architecture that allows for processing, control, and adaptation using spike events, enables robust control, and extends to various tasks. Moreover, we develop a unified control-theoretic CLDA framework within which intention estimation, assisted training, and adaptation are performed. The architecture incorporates an infinite-horizon optimal feedback-control (OFC) model of the brain's behavior in closed-loop BMI control, and a point process model of spikes. The OFC model infers the user's motor intention during CLDA, a process termed intention estimation. OFC is also used to design an autonomous and dynamic assisted training technique. The point process model allows for neural processing, control, and decoder adaptation with every spike event and at a faster time-scale than current decoders; it also enables dynamic spike-event-based parameter adaptation, unlike current CLDA methods that use batch-based adaptation on much slower adaptation time-scales. We conducted closed-loop experiments in a non-human primate over tens of days to dissociate the effects of these novel CLDA components. The OFC intention estimation improved BMI performance compared with current intention estimation techniques. OFC assisted training allowed the subject to consistently achieve proficient control. 
Spike-event-based adaptation resulted in faster and more consistent performance convergence compared with batch-based methods, and was robust to parameter
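
The spike-event-based (rather than batch) adaptation idea can be sketched with a toy discrete-time point-process model. The exponential rate model, learning rate, and data below are illustrative assumptions, not the paper's decoder: parameters of lambda(x) = exp(b0 + b1*x) are updated at every time bin by stochastic gradient ascent on the Poisson point-process log likelihood, instead of waiting for a batch:

```python
import math

def fit_point_process(xs, counts, lr=0.05, epochs=500):
    """Per-bin (spike-event-based) parameter adaptation: for each time
    bin with covariate x and spike count n, ascend the gradient of the
    point-process log likelihood n*log(lam) - lam, where
    lam = exp(b0 + b1*x).  The gradient in (b0, b1) is (n - lam)*(1, x),
    so every bin produces an immediate small update."""
    b0 = b1 = 0.0
    for _ in range(epochs):
        for x, n in zip(xs, counts):
            lam = math.exp(b0 + b1 * x)
            g = n - lam          # shared factor of the gradient
            b0 += lr * g
            b1 += lr * g * x
    return b0, b1

# Two covariate bins whose maximum-likelihood solution is known in
# closed form: exp(b0) = 1 and exp(b0 + b1) = 2, i.e. b0 = 0, b1 = log 2.
b0, b1 = fit_point_process(xs=[0.0, 1.0], counts=[1, 2])
```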

  10. Robust Brain-Machine Interface Design Using Optimal Feedback Control Modeling and Adaptive Point Process Filtering

    PubMed Central

    Carmena, Jose M.

    2016-01-01

    Much progress has been made in brain-machine interfaces (BMI) using decoders such as Kalman filters and finding their parameters with closed-loop decoder adaptation (CLDA). However, current decoders do not model the spikes directly, and hence may limit the processing time-scale of BMI control and adaptation. Moreover, while specialized CLDA techniques for intention estimation and assisted training exist, a unified and systematic CLDA framework that generalizes across different setups is lacking. Here we develop a novel closed-loop BMI training architecture that allows for processing, control, and adaptation using spike events, enables robust control and extends to various tasks. Moreover, we develop a unified control-theoretic CLDA framework within which intention estimation, assisted training, and adaptation are performed. The architecture incorporates an infinite-horizon optimal feedback-control (OFC) model of the brain’s behavior in closed-loop BMI control, and a point process model of spikes. The OFC model infers the user’s motor intention during CLDA—a process termed intention estimation. OFC is also used to design an autonomous and dynamic assisted training technique. The point process model allows for neural processing, control and decoder adaptation with every spike event and at a faster time-scale than current decoders; it also enables dynamic spike-event-based parameter adaptation unlike current CLDA methods that use batch-based adaptation on much slower adaptation time-scales. We conducted closed-loop experiments in a non-human primate over tens of days to dissociate the effects of these novel CLDA components. The OFC intention estimation improved BMI performance compared with current intention estimation techniques. OFC assisted training allowed the subject to consistently achieve proficient control. Spike-event-based adaptation resulted in faster and more consistent performance convergence compared with batch-based methods, and was robust to
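The spike-event-based decoding idea above can be illustrated with a minimal single-neuron sketch in the spirit of adaptive point process filtering. This is an illustrative toy, not the authors' implementation: the Gaussian-approximation update follows the standard point-process filter form, and all model parameters (`beta0`, `beta1`, the random-walk noise `q`) are invented for the example.

```python
import math
import random

def pp_filter_step(mean, var, spike, dt, beta0, beta1, q=1e-5):
    """One predict/update step of a 1-D point-process filter.

    Spiking model: conditional intensity lambda(v) = exp(beta0 + beta1 * v),
    observed as a 0/1 spike event in a small time bin of width dt.
    """
    # Predict with a random-walk state model: v_k = v_{k-1} + w, w ~ N(0, q)
    mean_p, var_p = mean, var + q
    # Gaussian approximation to the point-process posterior
    lam = math.exp(beta0 + beta1 * mean_p)
    var_u = 1.0 / (1.0 / var_p + (beta1 ** 2) * lam * dt)
    mean_u = mean_p + var_u * beta1 * (spike - lam * dt)
    return mean_u, var_u

# Decode a constant "intended velocity" from simulated spikes of one unit
random.seed(0)
true_v, dt = 1.0, 0.001            # 1 ms bins
beta0, beta1 = math.log(5.0), 1.0  # ~5 Hz baseline rate, velocity tuning
mean, var = 0.0, 1.0
for _ in range(20000):             # 20 s of simulated data
    lam_true = math.exp(beta0 + beta1 * true_v)
    spike = 1 if random.random() < lam_true * dt else 0
    mean, var = pp_filter_step(mean, var, spike, dt, beta0, beta1)
```

Because the update runs at every 1 ms bin rather than on batched windows, the estimate (and, in an adaptive decoder, the parameters) can be refined at the spike-event time-scale the abstract emphasizes.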

  11. A Process Model for Developing Learning Design Patterns with International Scope

    ERIC Educational Resources Information Center

    Lotz, Nicole; Law, Effie Lai-Chong; Nguyen-Ngoc, Anh Vu

    2014-01-01

    This paper investigates the process of identifying design patterns in international collaborative learning environments. In this context, design patterns are referred to as structured descriptions of best practice with pre-defined sections such as problem, solution and consequences. We pay special attention to how the scope of a design pattern is…

  12. Thermal system design and modeling of meniscus controlled silicon growth process for solar applications

    NASA Astrophysics Data System (ADS)

    Wang, Chenlei

    The direct conversion of solar radiation to electricity by photovoltaics has a number of significant advantages as an electricity generator. Solar photovoltaic conversion systems tap an inexhaustible resource that is free of charge and available anywhere in the world. Roofing-tile photovoltaic generation, for example, saves excess thermal heat and preserves the local heat balance, so a considerable reduction of thermal pollution in densely populated city areas can be attained. A semiconductor can convert only photons with energies near its band gap with good efficiency; silicon is not at the maximum efficiency but relatively close to it. The main classes of photovoltaic materials include single- and poly-crystalline silicon, ribbon silicon, crystalline thin-film silicon, amorphous silicon, copper indium diselenide and related compounds, and cadmium telluride. In this dissertation, we focus on melt growth of single- and poly-crystalline silicon manufactured by the Czochralski (Cz) crystal growth process, and on ribbon silicon produced by the edge-defined film-fed growth (EFG) process. These two methods are the most commonly used techniques for growing photovoltaic semiconductors. For each crystal growth process, we introduce the growth mechanism, growth system design, general application, and progress in numerical simulation. Simulation results are shown for both Czochralski and EFG systems, including the temperature distribution of the growth system, the velocity field inside the silicon melt, and the electromagnetic field for the EFG growth system. A magnetic field is applied to the Cz system to reduce melt convection inside the crucible, and this has been simulated in our numerical model. Parametric studies are performed through numerical and analytical models to investigate the relationship between heater power levels and solidification interface movement and shape. 
An inverse problem control scheme is developed to

  13. Mathematical modeling and analysis of EDM process parameters based on Taguchi design of experiments

    NASA Astrophysics Data System (ADS)

    Laxman, J.; Raj, K. Guru

    2015-12-01

    Electro discharge machining (EDM) is a process used for machining very hard metals and deep, complex shapes by metal erosion in all types of electro-conductive materials. The metal is removed through the action of an electric discharge of short duration and high current density between the tool and the workpiece. The eroded metal on the surfaces of both the workpiece and the tool is flushed away by the dielectric fluid. The objective of this work is to develop a mathematical model for an EDM process which provides the necessary equations to predict the metal removal rate, electrode wear rate and surface roughness. Regression analysis is used to investigate the relationship between the various process parameters. The input parameters are peak current, pulse-on time, pulse-off time and tool lift time; metal removal rate, electrode wear rate and surface roughness are the responses. Experiments are conducted on a titanium superalloy based on the Taguchi design of experiments, using an L27 orthogonal array.
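The regression step described above can be illustrated with a toy coded-factor fit. The data below are synthetic, not the paper's EDM measurements, and only two factors are used for brevity; the normal-equation approach is the standard one for fitting such response models.

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting (small systems only)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def fit_linear(X, y):
    """Least-squares fit via the normal equations, with an intercept column."""
    Xi = [[1.0] + row for row in X]
    p, n = len(Xi[0]), len(Xi)
    XtX = [[sum(Xi[r][i] * Xi[r][j] for r in range(n)) for j in range(p)] for i in range(p)]
    Xty = [sum(Xi[r][i] * y[r] for r in range(n)) for i in range(p)]
    return solve(XtX, Xty)

# Coded levels (-1, 0, +1) for two hypothetical factors, e.g. peak current
# and pulse-on time; the response is a noise-free toy function of the levels
X = [[a, b] for a in (-1, 0, 1) for b in (-1, 0, 1)]
y = [2.0 + 1.5 * a + 0.5 * b for a, b in X]
coef = fit_linear(X, y)  # -> approximately [2.0, 1.5, 0.5]
```

With real measurements the fitted coefficients quantify each factor's effect on the response, which is exactly what the paper's regression models do for metal removal rate, electrode wear rate and surface roughness.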

  14. Design and control of energy efficient food drying processes with specific reference to quality; Model development and experimental studies: Moisture movement and dryer design

    SciTech Connect

    Kim, M.; Litchfield, B.; Singh, R.; Liang, H.; Narsimhan, G.; Waananen, K.

    1989-08-01

    The ultimate goal of the project is to develop procedures, techniques, data and other information that will aid in the design of cost effective and energy efficient drying processes that produce high quality foods. This objective has been sought by performing studies to determine the pertinent properties of food products, by developing models to describe the fundamental phenomena of food drying and by testing the models at laboratory scale. Finally, this information is used to develop recommendations and strategies for improved dryer design and control. This volume, Model Development and Experimental Studies, emphasizes the direct and indirect drying processes. An extensive literature review identifies key characteristics of drying models, including controlling process resistances, internal mechanisms of moisture movement, structural and thermodynamic assumptions, and methods for measuring or determining model coefficients and material properties, solving the models, and validating them. Similarities and differences among previous works are noted, and strategies for future drying model development are suggested.

  15. Standardization Process for Space Radiation Models Used for Space System Design

    NASA Technical Reports Server (NTRS)

    Barth, Janet; Daly, Eamonn; Brautigam, Donald

    2005-01-01

    The space system design community has three concerns related to models of the radiation belts and plasma: 1) AP-8 and AE-8 models are not adequate for modern applications; 2) Data that have become available since the creation of AP-8 and AE-8 are not being fully exploited for modeling purposes; 3) When new models are produced, there is no authorizing organization identified to evaluate the models or their datasets for accuracy and robustness. This viewgraph presentation provided an overview of the roadmap adopted by the Working Group Meeting on New Standard Radiation Belt and Space Plasma Models.

  16. Designing risk communications: completing and correcting mental models of hazardous processes, Part I.

    PubMed

    Atman, C J; Bostrom, A; Fischhoff, B; Morgan, M G

    1994-10-01

    Many risk communications are intended to help the lay public make complex decisions about risk. To guide risk communicators with this objective, a mental models approach to the design and characterization of risk communications is proposed. Building on text comprehension and mental models research, this approach offers an integrated set of methods to help the risk communication designer choose and analyze risk communication content, structure, and organization. An applied example shows that two radon brochures designed with this approach present roughly the same expert facts as a radon brochure widely distributed by the U.S. EPA but meet higher standards on other content, structure, and organization criteria. PMID:7800862

  17. A mathematical examination of the Press model for atmospheric turbulence. [aircraft design/random processes]

    NASA Technical Reports Server (NTRS)

    Sidwell, K.

    1975-01-01

    The random process used to model atmospheric turbulence in aircraft response problems is examined. The first, second, and higher order probability density and characteristic functions were developed. The concepts of the Press model lead to an approximate procedure for the analysis of the response of linear dynamic systems to a class of non-Gaussian random processes. The Press model accounts for both the Gaussian and non-Gaussian forms of measured turbulence data. The nonstationary aspects of measured data are explicitly described by the transition properties of the random process. The effects of the distribution of the intensity process upon calculated exceedances are examined. It is concluded that the Press model with a Gaussian intensity distribution gives a conservative prediction of limit load values.
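The closing claim, that a random-intensity (Press-type) model predicts more exceedances than a pure Gaussian of the same variance, can be checked with a quick Monte Carlo sketch. This is an illustration of the general mechanism only; the intensity distribution and threshold here are invented, not taken from the paper.

```python
import random

random.seed(1)
N = 200_000

def exceedance(samples, level):
    """Fraction of samples whose magnitude exceeds the given level."""
    return sum(1 for x in samples if abs(x) > level) / len(samples)

# Press-type turbulence: a random intensity multiplies a locally Gaussian
# process; with intensity ~ |N(0,1)| the product still has unit variance
resp_press = [abs(random.gauss(0, 1)) * random.gauss(0, 1) for _ in range(N)]
resp_gauss = [random.gauss(0, 1) for _ in range(N)]  # pure Gaussian, variance 1

exc_press = exceedance(resp_press, 3.0)
exc_gauss = exceedance(resp_gauss, 3.0)
# The mixture has heavier tails, so exc_press > exc_gauss: the Press-type
# model is conservative relative to a Gaussian of equal variance
```

The same comparison, run at the design limit-load level, is what makes the Press model's exceedance predictions conservative for structural sizing.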

  18. Optimized aerodynamic design process for subsonic transport wing fitted with winglets. [wind tunnel model]

    NASA Technical Reports Server (NTRS)

    Kuhlman, J. M.

    1979-01-01

    The aerodynamic design of a wind-tunnel model of a wing representative of that of a subsonic jet transport aircraft, fitted with winglets, was performed using two recently developed optimal wing-design computer programs. Both potential flow codes use a vortex lattice representation of the near-field of the aerodynamic surfaces for determination of the required mean camber surfaces for minimum induced drag, and both codes use far-field induced drag minimization procedures to obtain the required spanloads. One code uses a discrete vortex wake model for this far-field drag computation, while the second uses a 2-D advanced panel wake model. Wing camber shapes for the two codes are very similar, but the resulting winglet camber shapes differ widely. Design techniques and considerations for these two wind-tunnel models are detailed, including a description of the necessary modifications of the design geometry to format it for use by a numerically controlled machine for the actual model construction.

  19. Rapid Processing of Turner Designs Model 10-AU-005 Internally Logged Fluorescence Data

    EPA Science Inventory

    Continuous recording of dye fluorescence using field fluorometers at selected sampling sites facilitates acquisition of real-time dye tracing data. The Turner Designs Model 10-AU-005 field fluorometer allows for frequent fluorescence readings, data logging, and easy downloading t...

  20. Fracture design modelling

    SciTech Connect

    Crichlow, H.B.; Crichlow, H.B.

    1980-02-07

    A design tool is discussed whereby the various components that enter the design process of a hydraulic fracturing job are combined to provide a realistic appraisal of a stimulation job in the field. An interactive computer model is used to solve the problem numerically to obtain the effects of various parameters on the overall behavior of the system.

  1. Incorporating Eco-Evolutionary Processes into Population Models: Design and Applications

    EPA Science Inventory

    Eco-evolutionary population models are powerful new tools for exploring how evolutionary processes influence plant and animal population dynamics and vice-versa. The need to manage for climate change and other dynamic disturbance regimes is creating a demand for the incorporation of...

  2. Design of RTDA controller for industrial process using SOPDT model with minimum or non-minimum zero.

    PubMed

    Anbarasan, K; Srinivasan, K

    2015-07-01

    This research paper focuses on the design and development of simplified RTDA control law computation formulae for a SOPDT process with a minimum or non-minimum zero. The design of the RTDA control scheme consists of three main components, namely process output prediction, model prediction update and control action computation. A systematic approach for computation of the above three components for a SOPDT process with a minimum or non-minimum zero is developed in this paper. The design, implementation and performance evaluation of the developed controller are demonstrated via simulation examples. The closed loop equation, block diagram representation and theoretical stability derivation for the RTDA controller are developed. The performance of the proposed controller is compared with IMC, SPC, MPC and PID controllers, and it is demonstrated on an industrial non-linear CSTR process. PMID:25820089
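The inverse response that distinguishes a non-minimum-phase SOPDT process can be reproduced with a simple Euler simulation. The transfer-function parameters below are arbitrary illustrative values, not taken from the paper; a negative zero time constant (`tau_z < 0`) places the zero in the right half-plane.

```python
def sopdt_step(K=1.0, tau1=2.0, tau2=1.0, tau_z=-0.5, theta=0.3, dt=0.01, T=10.0):
    """Euler simulation of K*(tau_z*s + 1)*exp(-theta*s) / ((tau1*s + 1)*(tau2*s + 1))
    driven by a unit step; tau_z < 0 gives a non-minimum-phase (inverse) response."""
    n = int(T / dt)
    delay = int(theta / dt)      # dead time as a whole number of steps
    x1 = x2 = 0.0                # states of the two first-order lags
    ys = []
    for k in range(n):
        u = 1.0 if k >= delay else 0.0   # delayed unit step input
        dx1 = (K * u - x1) / tau1
        dx2 = (x1 - x2) / tau2
        ys.append(x2 + tau_z * dx2)      # the zero adds a derivative term
        x1 += dt * dx1
        x2 += dt * dx2
    return ys

y = sopdt_step()
# y first dips below zero (inverse response), then settles near the gain K = 1
```

The initial wrong-way excursion is the feature that makes control design harder for the non-minimum-zero case the paper addresses.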

  3. Testing the Theoretical Design of a Health Risk Message: Reexamining the Major Tenets of the Extended Parallel Process Model

    ERIC Educational Resources Information Center

    Gore, Thomas D.; Bracken, Cheryl Campanella

    2005-01-01

    This study examined the fear control/danger control responses that are predicted by the Extended Parallel Process Model (EPPM). In a campaign designed to inform college students about the symptoms and dangers of meningitis, participants were given either a high-threat/no-efficacy or high-efficacy/no-threat health risk message, thus testing the…

  4. Modeling within-field gate length spatial variation for process-design co-optimization

    NASA Astrophysics Data System (ADS)

    Friedberg, Paul; Cao, Yu; Cain, Jason; Wang, Ruth; Rabaey, Jan; Spanos, Costas

    2005-05-01

    Pelgrom's model suggests that a spatial correlation structure is inherent in the physical properties of semiconductor devices; specifically, devices situated closely together will be subject to a higher degree of correlation than devices separated by larger distances. Since correlation of device gate length values caused by systematic variations in microlithographic processing is known to have a significant impact on the variability of circuit performance, we attempt to extract and understand the nature of spatial correlation across an entire die. Based on exhaustive, full-wafer critical dimension measurements collected using electrical linewidth metrology for wafers processed in a standard 130nm lithography cell, we calculate a spatial correlation metric of gate length over a full-field range in both horizontal and vertical orientations. Using a rudimentary model fit to these results, we investigate the impact of correlation in the spatial distribution on the variability of circuit performance using a series of Monte Carlo analyses in HSPICE; it is confirmed that this correlation does indeed present a significant influence on performance variability. From the same dataset, we also extract both the across-wafer (AW) and within-field (WIF) contributions to systematic variation. We find that the spatial correlation model's shape is strongly related to these two components of variation (both in magnitude as well as by spatial fingerprint). By artificially reducing each of these components of systematic variation, thereby simulating the effects of active, across-field process compensation, we show that spatial correlation is significantly reduced, nearly to zero. This implies that Pelgrom's model may not apply to die-scale separation distances, and that a more accurate correlation theory would combine Pelgrom's model over very short separation distances with a systematic variation model that captures variability over longer distances by means of non
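The correlation-versus-separation metric can be illustrated on synthetic 1-D data, where a smooth systematic component plus independent noise stands in for across-field gate-length variation. All numbers here are invented for the example, not the paper's 130nm measurements.

```python
import math
import random

random.seed(2)
n = 400
# Synthetic gate lengths along one field axis: a smooth systematic trend
# (period 200 device pitches) plus independent random noise
cd = [math.sin(2 * math.pi * i / 200) + 0.3 * random.gauss(0, 1) for i in range(n)]

def corr_at_lag(x, lag):
    """Pearson correlation between the series and itself shifted by 'lag'."""
    a, b = x[:-lag], x[lag:]
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    da = math.sqrt(sum((u - ma) ** 2 for u in a))
    db = math.sqrt(sum((v - mb) ** 2 for v in b))
    return num / (da * db)

r_near = corr_at_lag(cd, 5)    # short separation: strongly correlated
r_far = corr_at_lag(cd, 100)   # half the systematic period: anti-correlated
```

Removing the systematic trend before computing `corr_at_lag` drives both values toward zero, which is the mechanism behind the paper's observation that compensating systematic variation nearly eliminates the apparent spatial correlation.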

  5. Paediatric CT protocol optimisation: a design of experiments to support the modelling and optimisation process.

    PubMed

    Rani, K; Jahnen, A; Noel, A; Wolf, D

    2015-07-01

    In the last decade, several studies have emphasised the need to understand and optimise computed tomography (CT) procedures in order to reduce the radiation dose applied to paediatric patients. To evaluate the influence of the technical parameters on the radiation dose and the image quality, a statistical model has been developed using the design of experiments (DOE) method, which has been successfully used in various fields (industry, biology and finance), applied here to CT procedures for the abdomen of paediatric patients. A Box-Behnken DOE was used in this study. Three mathematical models (contrast-to-noise ratio, noise and CTDIvol) depending on three factors (tube current, tube voltage and level of iterative reconstruction) were developed and validated. They will serve as a basis for the development of a CT protocol optimisation model. PMID:25848116
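As an illustration of the Box-Behnken layout mentioned above, the coded design matrix for three factors can be generated directly: every pair of factors is run at its four corner combinations while the remaining factor sits at its centre level, plus replicated centre points. The number of centre points below is an arbitrary choice for the example.

```python
from itertools import combinations

def box_behnken(k, center_points=3):
    """Box-Behnken design in coded units (-1, 0, +1) for k >= 3 factors."""
    runs = []
    for i, j in combinations(range(k), 2):
        for a in (-1, 1):
            for b in (-1, 1):
                row = [0] * k
                row[i], row[j] = a, b   # vary one factor pair, hold the rest at 0
                runs.append(row)
    runs += [[0] * k for _ in range(center_points)]  # replicated centre runs
    return runs

design = box_behnken(3)
# 3 factors -> 12 edge-midpoint runs + 3 centre runs = 15 runs
```

Because no run uses more than two extreme levels at once, the design avoids the simultaneous-extreme settings (e.g. maximum tube current and voltage together) that a full factorial would require, which is often the practical reason for choosing it in dose studies.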

  6. Enhancing the Design Process for Complex Space Systems through Early Integration of Risk and Variable-Fidelity Modeling

    NASA Technical Reports Server (NTRS)

    Mavris, Dimitri; Osburg, Jan

    2005-01-01

    An important enabler of the new national Vision for Space Exploration is the ability to rapidly and efficiently develop optimized concepts for the manifold future space missions that this effort calls for. The design of such complex systems requires a tight integration of all the engineering disciplines involved, in an environment that fosters interaction and collaboration. The research performed under this grant explored areas where the space systems design process can be enhanced: by integrating risk models into the early stages of the design process, and by including rapid-turnaround variable-fidelity tools for key disciplines. Enabling early assessment of mission risk will allow designers to perform trades between risk and design performance during the initial design space exploration. Entry into planetary atmospheres will require an increased emphasis on the critical disciplines of aero- and thermodynamics. This necessitates the pulling forward of EDL disciplinary expertise into the early stage of the design process. Radiation can have a large potential impact on overall mission designs, in particular for the planned nuclear-powered robotic missions under Project Prometheus and for long-duration manned missions to the Moon, Mars and beyond under Project Constellation. This requires that radiation and associated risk and hazards be assessed and mitigated at the earliest stages of the design process. Hence, RPS is another discipline needed to enhance the engineering competencies of conceptual design teams. Researchers collaborated closely with NASA experts in those disciplines, and in overall space systems design, at Langley Research Center and at the Jet Propulsion Laboratory. This report documents the results of this initial effort.

  7. Computer modeling of high-pressure leaching process of nickel laterite by design of experiments and neural networks

    NASA Astrophysics Data System (ADS)

    Milivojevic, Milovan; Stopic, Srecko; Friedrich, Bernd; Stojanovic, Boban; Drndarevic, Dragoljub

    2012-07-01

    Due to the complex chemical composition of nickel ores, the need to decrease production costs, and the need to increase nickel extraction given the ongoing depletion of high-grade sulfide ores around the world, computer modeling of the nickel ore leaching process has become both a need and a challenge. In this paper, design of experiments (DOE) theory was used to determine the optimal experimental design plan matrix based on the D-optimality criterion. In the high-pressure sulfuric acid leaching (HPSAL) process for nickel laterite from the "Rudjinci" ore in Serbia, the temperature, the sulfuric acid to ore ratio, the stirring speed, and the leaching time were considered as the predictor variables, and the degree of nickel extraction as the response. To model the process, multiple linear regression (MLR) and the response surface method (RSM), together with a two-level and four-factor full factorial central composite design (CCD) plan, were used. The proposed regression models have not proven adequate. Therefore, the artificial neural network (ANN) approach with the same experimental plan was used in order to reduce operational costs, give better modeling accuracy, and provide more successful process optimization. The model is based on multi-layer neural networks with the back-propagation (BP) learning algorithm and the bipolar sigmoid activation function.
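The kind of network described, a multi-layer perceptron trained by back-propagation with a bipolar sigmoid activation, can be sketched in miniature. The bipolar sigmoid is implemented here as tanh (the two are equivalent up to scaling), and the single-input toy dataset is invented for illustration, not the leaching data.

```python
import math
import random

random.seed(3)

def act(z):       # bipolar sigmoid, implemented as tanh
    return math.tanh(z)

def dact(out):    # derivative expressed via the unit's output
    return 1.0 - out * out

# Toy stand-in data: y = 0.5 * sin(pi * x) on x in [-1, 1]
data = [(x / 10.0, 0.5 * math.sin(math.pi * x / 10.0)) for x in range(-10, 11)]

H = 6  # hidden units
w1 = [[random.uniform(-0.5, 0.5), random.uniform(-0.5, 0.5)] for _ in range(H)]
w2 = [random.uniform(-0.5, 0.5) for _ in range(H + 1)]  # last entry is bias
lr = 0.05

def forward(x):
    h = [act(w * x + b) for w, b in w1]
    return h, sum(w2[i] * h[i] for i in range(H)) + w2[H]

def mse():
    return sum((forward(x)[1] - t) ** 2 for x, t in data) / len(data)

mse_before = mse()
for _ in range(3000):                 # online back-propagation
    for x, t in data:
        h, y = forward(x)
        e = y - t                     # output error
        for i in range(H):
            g = e * w2[i] * dact(h[i])  # gradient back-propagated to hidden unit i
            w1[i][0] -= lr * g * x
            w1[i][1] -= lr * g
            w2[i] -= lr * e * h[i]
        w2[H] -= lr * e
mse_after = mse()
```

With four inputs instead of one, the same structure maps the DOE factor settings (temperature, acid-to-ore ratio, stirring speed, leaching time) to predicted nickel extraction.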

  8. Designers' unified cost model

    NASA Technical Reports Server (NTRS)

    Freeman, W.; Ilcewicz, L.; Swanson, G.; Gutowski, T.

    1992-01-01

    The Structures Technology Program Office (STPO) at NASA LaRC has initiated development of a conceptual and preliminary designers' cost prediction model. The model will provide a technically sound method for evaluating the relative cost of different composite structural designs, fabrication processes, and assembly methods that can be compared to equivalent metallic parts or assemblies. The feasibility of developing cost prediction software in a modular form for interfacing with state-of-the-art preliminary design tools and computer aided design programs is being evaluated. The goal of this task is to establish theoretical cost functions that relate geometric design features to summed material cost and labor content in terms of process mechanics and physics. The output of the designers' present analytical tools will be input for the designers' cost prediction model to provide the designer with a database and deterministic cost methodology that allows one to trade and synthesize designs with both cost and weight as objective functions for optimization. This paper presents the team members, approach, goals, plans, and progress to date for development of COSTADE (Cost Optimization Software for Transport Aircraft Design Evaluation).

  9. Designer's unified cost model

    NASA Technical Reports Server (NTRS)

    Freeman, William T.; Ilcewicz, L. B.; Swanson, G. D.; Gutowski, T.

    1992-01-01

    A conceptual and preliminary designers' cost prediction model has been initiated. The model will provide a technically sound method for evaluating the relative cost of different composite structural designs, fabrication processes, and assembly methods that can be compared to equivalent metallic parts or assemblies. The feasibility of developing cost prediction software in a modular form for interfacing with state-of-the-art preliminary design tools and computer aided design programs is being evaluated. The goal of this task is to establish theoretical cost functions that relate geometric design features to summed material cost and labor content in terms of process mechanics and physics. The output of the designers' present analytical tools will be input for the designers' cost prediction model to provide the designer with a database and deterministic cost methodology that allows one to trade and synthesize designs with both cost and weight as objective functions for optimization. The approach, goals, plans, and progress are presented for development of COSTADE (Cost Optimization Software for Transport Aircraft Design Evaluation).

  10. Signal Processing Suite Design

    NASA Technical Reports Server (NTRS)

    Sahr, John D.; Mir, Hasan; Morabito, Andrew; Grossman, Matthew

    2003-01-01

    Our role in this project was to participate in the design of the signal processing suite to analyze plasma density measurements on board a small constellation (3 or 4) of satellites in Low Earth Orbit. As we are new to spacecraft experiments, one of the challenges was simply to gain an understanding of the quantity of data which would flow from the satellites, and possibly to interact with the design teams in generating optimal sampling patterns. For example, as the fleet of satellites was intended to fly through the same volume of space (displaced slightly in time and space), the bulk plasma structure should be common among the spacecraft. Therefore, an optimal, limited-bandwidth data downlink would take advantage of this commonality. Also, motivated by techniques in ionospheric radar, we hoped to investigate the possibility of employing aperiodic sampling in order to gain access to a wider spatial spectrum without suffering aliasing in k-space.

  11. Hydroforming design and process advisor

    SciTech Connect

    Greer, J.T.; Ni, C.M.

    1996-10-10

    The hydroforming process involves hydraulically forming components by conforming them to the inner contours of a die. These contours can be complex and can often cause the material being formed to be stressed to rupture. Considerable process knowledge and materials modeling expertise is required to design hydroform dies and hydroformed parts that are readily formed without being overly stressed. For this CRADA, materials properties for steel tubes subjected to hydraulic stresses were collected; algorithms were developed which combined the materials properties data with process knowledge; and a user-friendly graphical interface was utilized to make the system usable by a design engineer. A prototype hydroforming advisor was completed and delivered to GM. The technical objectives of the CRADA were met, allowing for the development of an intelligent design system, prediction of forming properties related to hydroforming, simulation and modeling of process execution, and design optimization. The design advisor provides a rapid and seamless approach to integrating an otherwise enormous and onerous task of analysis and evaluation.

  12. Prior Design for Dependent Dirichlet Processes: An Application to Marathon Modeling

    PubMed Central

    F. Pradier, Melanie; J. R. Ruiz, Francisco; Perez-Cruz, Fernando

    2016-01-01

    This paper presents a novel application of Bayesian nonparametrics (BNP) for marathon data modeling. We make use of two well-known BNP priors, the single-p dependent Dirichlet process and the hierarchical Dirichlet process, in order to address two different problems. First, we study the impact of age, gender and environment on the runners’ performance. We derive a fair grading method that allows direct comparison of runners regardless of their age and gender. Unlike current grading systems, our approach is based not only on top world records, but on the performances of all runners. The presented methodology for comparison of densities can be adopted in many other applications straightforwardly, providing an interesting perspective to build dependent Dirichlet processes. Second, we analyze the running patterns of the marathoners in time, obtaining information that can be valuable for training purposes. We also show that these running patterns can be used to predict finishing time given intermediate interval measurements. We apply our models to New York City, Boston and London marathons. PMID:26821155

  13. Prior Design for Dependent Dirichlet Processes: An Application to Marathon Modeling.

    PubMed

    F Pradier, Melanie; J R Ruiz, Francisco; Perez-Cruz, Fernando

    2016-01-01

    This paper presents a novel application of Bayesian nonparametrics (BNP) for marathon data modeling. We make use of two well-known BNP priors, the single-p dependent Dirichlet process and the hierarchical Dirichlet process, in order to address two different problems. First, we study the impact of age, gender and environment on the runners' performance. We derive a fair grading method that allows direct comparison of runners regardless of their age and gender. Unlike current grading systems, our approach is based not only on top world records, but on the performances of all runners. The presented methodology for comparison of densities can be adopted in many other applications straightforwardly, providing an interesting perspective to build dependent Dirichlet processes. Second, we analyze the running patterns of the marathoners in time, obtaining information that can be valuable for training purposes. We also show that these running patterns can be used to predict finishing time given intermediate interval measurements. We apply our models to New York City, Boston and London marathons. PMID:26821155
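The Dirichlet process prior underlying both of the paper's BNP models can be illustrated with the standard truncated stick-breaking construction. The concentration parameter `alpha` and the truncation level are arbitrary choices for this example.

```python
import random

random.seed(4)

def stick_breaking(alpha, n_atoms):
    """Truncated stick-breaking construction of DP(alpha, G0) weights."""
    weights, remaining = [], 1.0
    for _ in range(n_atoms):
        frac = random.betavariate(1.0, alpha)  # break off a Beta(1, alpha) piece
        weights.append(remaining * frac)       # weight for this atom
        remaining *= 1.0 - frac                # stick left for later atoms
    return weights

w = stick_breaking(alpha=2.0, n_atoms=100)
# weights are positive and sum to just under 1 (the truncation leaves a remainder)
```

Smaller `alpha` concentrates mass on fewer atoms, so `alpha` effectively controls how many mixture components (e.g. distinct runner-performance clusters) the model tends to use; dependent and hierarchical DPs then share or shift these atoms across covariates such as age and gender.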

  14. System design, development, and production process modeling: A versatile and powerful acquisition management decision support tool

    SciTech Connect

    Rafuse, H.E.

    1996-12-31

    A series of studies have been completed on the manufacturing operations of light, medium, and heavy tactical vehicle system producers to facilitate critical system acquisition resource decisions by the United States Army Program Executive Officer, Tactical Wheeled Vehicles. The principal programs were the Family of Medium Tactical Vehicles (FMTV) production programs at Stewart & Stevenson Services, Inc.; the heavy TWV production programs at the Oshkosh Truck Corporation in Oshkosh, Wisconsin; and the light TWV and 2.5 ton remanufacturing production programs at the AM General Corporation in South Bend, Indiana. Each contractor's production scenarios were analyzed and modeled to accurately quantify the relationship between production rates and unit costs. Specific objectives included identifying (1) Minimum Sustaining Rates to support current and future budgetary requirements and resource programming for potential follow-on procurements, (2) thresholds where production rate changes significantly affect unit costs, and (3) critical production program factors and their impacts to production rate versus unit cost relationships. Two different techniques were utilized initially in conducting the analyses. One technique principally focused on collecting and analyzing applicable historical production program information, where available, to develop a statistical predictive model. A second and much more exhaustive technique focused on a detailed modeling of each contractor's production processes, flows, and operations. A standard architecture of multiple linked functional modules was used for each process model. Using the standard architecture, the individual modules were tailored to specific contractor operations. Each model contains detailed information on manpower, burden rates, material, material price/quantity relationships, capital, manufacturing support, program management, and all related direct and indirect costs applicable to the production programs.

  15. Estimation of design space for an extrusion-spheronization process using response surface methodology and artificial neural network modelling.

    PubMed

    Sovány, Tamás; Tislér, Zsófia; Kristó, Katalin; Kelemen, András; Regdon, Géza

    2016-09-01

The application of the Quality by Design principles is one of the key issues of recent pharmaceutical development. In the past decade much knowledge has been collected about the practical realization of the concept, but many questions remain unanswered. The key requirement of the concept is the mathematical description of the effect of the critical factors and their interactions on the critical quality attributes (CQAs) of the product. The process design space (PDS) is usually determined by the use of design of experiment (DoE) based response surface methodologies (RSM), but inaccuracies in the applied polynomial models often result in over- or underestimation of the real trends and changes, making the calculations uncertain, especially in the edge regions of the PDS. Complementing RSM with artificial neural network (ANN) based models is therefore a commonly used method to reduce these uncertainties. Nevertheless, since most studies focus on a single DoE, there is a lack of comparative studies on different experimental layouts. Therefore, the aim of the present study was to investigate the effect of different DoE layouts (2 level full factorial, Central Composite, Box-Behnken, 3 level fractional and 3 level full factorial design) on model predictability and to compare model sensitivities according to the organization of the experimental data set. It was revealed that the size of the design space could differ by more than 40% depending on the polynomial model used, and this was associated with a considerable shift in its position when higher level layouts were applied. The shift was more considerable when the calculation was based on RSM. Model predictability was also better with ANN based models. Nevertheless, both modelling methods exhibit considerable sensitivity to the organization of the experimental data set, so design layouts in which the extreme factor values are better represented are recommended.
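A minimal sketch of the polynomial RSM step described in this abstract: fitting a second-order response surface to a 2-factor, 3-level full factorial layout and estimating what fraction of the explored region satisfies a CQA limit. The factor coding, the "true" response function, and the acceptance limit are illustrative assumptions, not data from the cited study.

```python
import numpy as np

def quadratic_design_matrix(x1, x2):
    """Columns: intercept, linear, quadratic, and interaction terms."""
    return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])

# 3-level full factorial layout in coded units (-1, 0, +1)
levels = np.array([-1.0, 0.0, 1.0])
x1, x2 = np.meshgrid(levels, levels)
x1, x2 = x1.ravel(), x2.ravel()

# Assumed "true" CQA response with curvature and an interaction term
y = 5.0 + 1.2 * x1 - 0.8 * x2 + 0.5 * x1**2 + 0.3 * x1 * x2

# Least-squares fit of the second-order polynomial model
X = quadratic_design_matrix(x1, x2)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Predict over a fine grid and flag the region meeting an assumed CQA limit
g1, g2 = np.meshgrid(np.linspace(-1, 1, 41), np.linspace(-1, 1, 41))
pred = quadratic_design_matrix(g1.ravel(), g2.ravel()) @ beta
design_space_fraction = float(np.mean(pred >= 5.0))
print(f"fraction of explored region inside design space: {design_space_fraction:.2f}")
```

An ANN surrogate, as compared in the study, would replace the fixed polynomial basis with a learned nonlinear mapping fitted to the same experimental points.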

  16. Model-based control structure design of a full-scale WWTP under the retrofitting process.

    PubMed

    Machado, V C; Lafuente, J; Baeza, J A

    2015-01-01

The anoxic-oxic (A/O) municipal wastewater treatment plant (WWTP) of Manresa (Catalonia, Spain) was studied for a possible conversion to an anaerobic/anoxic/oxic (A2/O) configuration to promote enhanced biological phosphorus removal. The control structure had to be redesigned to satisfy the new need to control phosphorus concentration in addition to ammonium and nitrate concentrations (the main pollutant concentrations). Accordingly, decentralized control structures with proportional-integral-derivative (PID) controllers and centralized control structures with model-predictive controllers (MPC) were designed and tested. All the designed control structures had their performance systematically evaluated with respect to effluent quality and operating costs. The centralized control structure, A2/O-3-MPC, achieved the lowest operating costs with the best effluent quality using the A2/O plant configuration for the Manresa WWTP. The controlled variables used in this control structure were ammonium in the effluent, nitrate at the end of the anoxic zone and phosphate at the end of the anaerobic zone, while the manipulated variables were the internal and external recycle flow rates and the dissolved oxygen setpoint in the aerobic reactors. PMID:26038931
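A minimal sketch of one decentralized loop of the kind compared in this record: a discrete PID controller driving effluent ammonium toward its setpoint by manipulating the dissolved-oxygen (DO) setpoint. The first-order plant model, gains, and limits are illustrative assumptions, not the Manresa plant model.

```python
def pid_step(error, state, kp, ki, kd, dt):
    """One PID update; state = (integral, previous_error)."""
    integral, prev_err = state
    integral += error * dt
    derivative = (error - prev_err) / dt
    u = kp * error + ki * integral + kd * derivative
    return u, (integral, error)

# Toy plant: ammonium removal increases with aeration, plus a constant influent load
nh4, setpoint, dt = 6.0, 2.0, 0.1   # mg/L, mg/L, h
state = (0.0, 0.0)
for _ in range(500):
    error = nh4 - setpoint
    do_set, state = pid_step(error, state, kp=1.5, ki=0.8, kd=0.05, dt=dt)
    do_set = min(max(do_set, 0.0), 4.0)          # actuator limits on the DO setpoint
    nh4 += dt * (-0.6 * do_set * nh4 + 1.0)      # removal term + influent load

print(f"final ammonium: {nh4:.2f} mg/L")
```

The centralized MPC alternative would instead optimize all manipulated variables jointly over a prediction horizon, which is what gave A2/O-3-MPC its advantage in the study.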

  17. Model Based Structural Evaluation & Design of Overpack Container for Bag-Buster Processing of TRU Waste Drums

    SciTech Connect

    D. T. Clark; A. S. Siahpush; G. L. Anderson

    2004-07-01

    This paper describes a materials and computational model based analysis utilized to design an engineered “overpack” container capable of maintaining structural integrity for confinement of transuranic wastes undergoing the cryo-vacuum stress based “Bag-Buster” process and satisfying DOT 7A waste package requirements. The engineered overpack is a key component of the “Ultra-BagBuster” process/system being commercially developed by UltraTech International for potential DOE applications to non-intrusively breach inner confinement layers (poly bags/packaging) within transuranic (TRU) waste drums. This system provides a lower cost/risk approach to mitigate hydrogen gas concentration buildup limitations on transport of high alpha activity organic transuranic wastes. Four evolving overpack design configurations and two materials (low carbon steel and 300 series stainless) were considered and evaluated using non-linear finite element model analyses of structural response. Properties comparisons show that 300-series stainless is required to provide assurance of ductility and structural integrity at both room and cryogenic temperatures. The overpack designs were analyzed for five accidental drop impact orientations onto an unyielding surface (dropped flat on bottom, bottom corner, side, top corner, and top). The first three design configurations failed the bottom and top corner drop orientations (flat bottom, top, and side plates breached or underwent material failure). The fourth design utilized a protruding rim-ring (skirt) below the overpack’s bottom plate and above the overpack’s lid plate to absorb much of the impact energy and maintained structural integrity under all accidental drop loads at both room and cryogenic temperature conditions. Selected drop testing of the final design will be required to confirm design performance.

  18. Modeling robot contour processes

    NASA Astrophysics Data System (ADS)

    Whitney, D. E.; Edsall, A. C.

Robot contour processes include those with contact force, like car body grinding or deburring of complex castings, as well as those with little or no contact force, like inspection. This paper describes ways of characterizing, identifying, and estimating contours and robot trajectories. Contour and robot are modeled as stochastic processes in order to emphasize that both successive robot cycles and successive industrial workpieces are similar but not exactly the same. The stochastic models can be used to identify the state of a workpiece or process, or to design a filter to estimate workpiece shape and robot position from robot-based measurements.
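A minimal sketch of the filtering idea in this abstract: a scalar Kalman filter fusing noisy robot-based measurements of a workpiece contour modeled as a stochastic (random-walk) process. The contour, noise levels, and tuning are illustrative assumptions, not the paper's models.

```python
import numpy as np

def kalman_1d(measurements, q, r, x0=0.0, p0=1.0):
    """Random-walk state model: x_k = x_{k-1} + w_k;  measurement: z_k = x_k + v_k."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p = p + q                      # predict: variance grows by process noise q
        k = p / (p + r)                # Kalman gain
        x = x + k * (z - x)            # update with the measurement residual
        p = (1.0 - k) * p              # posterior variance
        estimates.append(x)
    return np.array(estimates)

rng = np.random.default_rng(0)
true_contour = 2.0 + 0.01 * np.arange(200)          # slowly varying height (mm)
noisy = true_contour + rng.normal(0.0, 0.3, 200)    # sensor noise
est = kalman_1d(noisy, q=1e-3, r=0.09)

# Compare errors after a short burn-in while the filter converges
rms_raw = float(np.sqrt(np.mean((noisy[50:] - true_contour[50:]) ** 2)))
rms_est = float(np.sqrt(np.mean((est[50:] - true_contour[50:]) ** 2)))
print(f"raw RMS error {rms_raw:.3f} mm, filtered RMS error {rms_est:.3f} mm")
```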

  19. A Process for Design Engineering

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2004-01-01

    The American Institute of Aeronautics and Astronautics Design Engineering Technical Committee has developed a draft Design Engineering Process with the participation of the technical community. This paper reviews similar engineering activities, lays out common terms for the life cycle and proposes a Design Engineering Process.

  20. Ethylene process design optimization

    SciTech Connect

    2001-09-01

    Integration of Advanced Technologies will Update Ethylene Plants. Nearly 93 million tons of ethylene are produced annually in chemical plants worldwide, using an energy intensive process that consumes 2.5 quadrillion Btu per year.

  1. Understanding the Process Model of Leadership: Follower Attribute Design and Assessment

    ERIC Educational Resources Information Center

    Antelo, Absael; Henderson, Richard L.; St. Clair, Norman

    2010-01-01

    Early leadership studies produced significant research findings that have helped differentiate between leader and follower personal attributes and their consequent behaviors (SEDL, 1992), but little attention was given to the follower's contribution to the leadership process. This study represents a continuation of research by Henderson, Antelo, &…

  2. ATOMIC-LEVEL MODELING OF CO2 DISPOSAL AS A CARBONATE MINERAL: A SYNERGETIC APPROACH TO OPTIMIZING REACTION PROCESS DESIGN

    SciTech Connect

    A.V.G. Chizmeshya; M.J. McKelvy; J.B. Adams

    2001-11-01

Fossil fuels, especially coal, can support the energy demands of the world for centuries to come, if the environmental problems associated with CO{sub 2} emissions can be overcome. Permanent and safe methods for CO{sub 2} capture and disposal/storage need to be developed. Mineralization of stationary-source CO{sub 2} emissions as carbonates can provide such safe capture and long-term sequestration. Mg-rich lamellar hydroxide mineral carbonation is a leading process candidate, which generates the stable naturally occurring mineral magnesite (MgCO{sub 3}) and water. Key to process cost and viability are the carbonation reaction rate and its degree of completion. This process, which involves simultaneous dehydroxylation and carbonation, is very promising, but far from optimized. In order to optimize the dehydroxylation/carbonation process, an atomic-level understanding of the mechanisms involved is needed. In this investigation Mg(OH){sub 2} was selected as a model Mg-rich lamellar hydroxide carbonation feedstock material due to its chemical and structural simplicity. Since Mg(OH){sub 2} dehydroxylation is intimately associated with the carbonation process, its mechanisms are also of direct interest in understanding and optimizing the process. The aim of the current innovative concepts project is to develop a specialized advanced computational methodology to complement the ongoing experimental inquiry of the atomic level processes involved in CO{sub 2} mineral sequestration. The ultimate goal is to integrate the insights provided by detailed predictive simulations with the data obtained from optical microscopy, FESEM, ion beam analysis, SIMS, TGA, Raman, XRD, and C and H elemental analysis. The modeling studies are specifically designed to enhance the synergism with, and complement the analysis of, existing mineral-CO{sub 2} reaction process studies being carried out under DOE UCR Grant DE-FG2698-FT40112. Direct contact between the simulations and the experimental

  3. DESIGNING PROCESSES FOR ENVIRONMENTAL PROBLEMS

    EPA Science Inventory

    Designing for the environment requires consideration of environmental impacts. The Generalized WAR Algorithm is the methodology that allows the user to evaluate the potential environmental impact of the design of a chemical process. In this methodology, chemicals are assigned val...

  4. Model parameters conditioning on regional hydrologic signatures for process-based design flood estimation in ungauged basins.

    NASA Astrophysics Data System (ADS)

    Biondi, Daniela; De Luca, Davide Luciano

    2015-04-01

The use of rainfall-runoff models represents an alternative to statistical approaches (such as at-site or regional flood frequency analysis) for design flood estimation, and constitutes an answer to the increasing need for synthetic design hydrographs (SDHs) associated with a specific return period. However, the lack of streamflow observations and the consequent high uncertainty associated with parameter estimation usually pose serious limitations to the use of process-based approaches in ungauged catchments, which in contrast represent the majority in practical applications. This work presents the application of a Bayesian procedure that, for a predefined rainfall-runoff model, allows for the assessment of the posterior parameter distribution, using the limited and uncertain information available for the response of an ungauged catchment (Bulygina et al. 2009; 2011). The use of regional estimates of river flow statistics, interpreted as hydrological signatures that measure theoretically relevant system process behaviours (Gupta et al. 2008), within this framework represents a valuable option and has shown significant developments in recent literature to constrain the plausible model response and to reduce the uncertainty in ungauged basins. In this study we rely on the first three L-moments of annual streamflow maxima, for which regressions are available from previous studies (Biondi et al. 2012; Laio et al. 2011). The methodology was applied to a catchment located in southern Italy, and used within a Monte Carlo scheme (MCs) considering both event-based and continuous simulation approaches for design flood estimation. The applied procedure offers promising perspectives for model calibration and uncertainty analysis in ungauged basins; moreover, in the context of design flood estimation, process-based methods coupled with the MCs approach have the advantage of providing an uncertainty analysis of the simulated floods, which represents an asset in risk-based decision-making.
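A minimal sketch of the hydrologic signatures this record relies on: the first three sample L-moments of an annual-maximum flood series, computed from probability-weighted moments (Hosking's unbiased estimators). The flood series itself is an illustrative assumption.

```python
def l_moments(sample):
    """First three sample L-moments via probability-weighted moments b0, b1, b2."""
    x = sorted(sample)                      # order statistics, ascending
    n = len(x)
    b0 = sum(x) / n
    b1 = sum(i * xi for i, xi in enumerate(x)) / (n * (n - 1))
    b2 = sum(i * (i - 1) * xi for i, xi in enumerate(x)) / (n * (n - 1) * (n - 2))
    l1 = b0                                 # L-location (mean)
    l2 = 2.0 * b1 - b0                      # L-scale
    l3 = 6.0 * b2 - 6.0 * b1 + b0           # third L-moment (L-skewness = l3/l2)
    return l1, l2, l3

# Illustrative annual streamflow maxima (m^3/s)
annual_maxima = [112.0, 145.0, 98.0, 210.0, 167.0, 130.0, 95.0, 188.0, 154.0, 241.0]
l1, l2, l3 = l_moments(annual_maxima)
print(f"l1={l1:.1f}, l2={l2:.1f}, L-skewness={l3 / l2:.3f}")
```

In the cited framework, regional regressions provide these signatures for the ungauged basin, and simulated series whose L-moments match them are given higher posterior weight.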

  5. Book Processing Facility Design.

    ERIC Educational Resources Information Center

    Sheahan (Drake)-Stewart Dougall, Marketing and Physical Distribution Consultants, New York, NY.

    The Association of New York Libraries for Technical Services (ANYLTS) is established to develop and run a centralized book processing facility for the public library systems in New York State. ANYLTS plans to receive book orders from the 22 library systems, transmit orders to publishers, receive the volumes from the publishers, print and attach…

  6. A Team-Based Process for Designing Comprehensive, Integrated, Three-Tiered (CI3T) Models of Prevention: How Does My School-Site Leadership Team Design a CI3T Model?

    ERIC Educational Resources Information Center

    Lane, Kathleen Lynne; Oakes, Wendy Peia; Jenkins, Abbie; Menzies, Holly Mariah; Kalberg, Jemma Robertson

    2014-01-01

    Comprehensive, integrated, three-tiered models are context specific and developed by school-site teams according to the core values held by the school community. In this article, the authors provide a step-by-step, team-based process for designing comprehensive, integrated, three-tiered models of prevention that integrate academic, behavioral, and…

  7. Teaching Process Design through Integrated Process Synthesis

    ERIC Educational Resources Information Center

    Metzger, Matthew J.; Glasser, Benjamin J.; Patel, Bilal; Hildebrandt, Diane; Glasser, David

    2012-01-01

    The design course is an integral part of chemical engineering education. A novel approach to the design course was recently introduced at the University of the Witwatersrand, Johannesburg, South Africa. The course aimed to introduce students to systematic tools and techniques for setting and evaluating performance targets for processes, as well as…

  8. Computational design of the basic dynamical processes of the UCLA general circulation model

    NASA Technical Reports Server (NTRS)

    Arakawa, A.; Lamb, V. R.

    1977-01-01

The 12-layer UCLA general circulation model encompassing troposphere and stratosphere (and superjacent 'sponge layer') is described. Prognostic variables are: surface pressure, horizontal velocity, temperature, water vapor and ozone in each layer, planetary boundary layer (PBL) depth, temperature, moisture and momentum discontinuities at PBL top, ground temperature and water storage, and mass of snow on ground. The selection of space finite-difference schemes for homogeneous incompressible flow (with/without a free surface) and nonlinear two-dimensional nondivergent flow, enstrophy-conserving schemes, momentum advection schemes, vertical and horizontal difference schemes, and time-differencing schemes is discussed.

  9. Reengineering the Project Design Process

    NASA Technical Reports Server (NTRS)

    Casani, E.; Metzger, R.

    1994-01-01

    In response to NASA's goal of working faster, better and cheaper, JPL has developed extensive plans to minimize cost, maximize customer and employee satisfaction, and implement small- and moderate-size missions. These plans include improved management structures and processes, enhanced technical design processes, the incorporation of new technology, and the development of more economical space- and ground-system designs. The Laboratory's new Flight Projects Implementation Office has been chartered to oversee these innovations and the reengineering of JPL's project design process, including establishment of the Project Design Center and the Flight System Testbed. Reengineering at JPL implies a cultural change whereby the character of its design process will change from sequential to concurrent and from hierarchical to parallel. The Project Design Center will support missions offering high science return, design to cost, demonstrations of new technology, and rapid development. Its computer-supported environment will foster high-fidelity project life-cycle development and cost estimating.

  10. GAX absorption cycle design process

    SciTech Connect

    Priedeman, D.K.; Christensen, R.N.

    1999-07-01

    This paper presents an absorption system design process that relies on computer simulations that are validated by experimental findings. An ammonia-water absorption heat pump cycle at 3 refrigeration tons (RT) and chillers at 3.3 RT and 5 RT (10.5 kW, 11.6 kW, and 17.6 kW) were initially modeled and then built and tested. The experimental results were used to calibrate both the cycle simulation and the component simulations, yielding computer design routines that could accurately predict component and cycle performance. Each system was a generator-absorber heat exchange (GAX) cycle, and all were sized for residential and light commercial use, where very little absorption equipment is currently used. The specific findings of the 5 RT (17.6 kW) chiller are presented. Modeling incorporated a heat loss from the gas-fired generator and pressure drops in both the evaporator and absorber. Simulation results and experimental findings agreed closely and validated the modeling method and simulation software.

  11. Biological neural networks as model systems for designing future parallel processing computers

    NASA Technical Reports Server (NTRS)

    Ross, Muriel D.

    1991-01-01

One of the more interesting debates of the present day centers on whether human intelligence can be simulated by computer. The author works under the premise that neurons individually are not smart at all. Rather, they are physical units which are impinged upon continuously by other matter that influences the direction of voltage shifts across the units' membranes. It is only through the action of a great many neurons, billions in the case of the human nervous system, that intelligent behavior emerges. What is required to understand even the simplest neural system is painstaking analysis, bit by bit, of the architecture and the physiological functioning of its various parts. The biological neural networks studied, the vestibular utricular and saccular maculas of the inner ear, are among the simplest of the mammalian neural networks to understand and model. While there is still a long way to go to understand even this most simple neural network in sufficient detail for extrapolation to computers and robots, a start has been made. Moreover, the insights obtained and the technologies developed help advance the understanding of the more complex neural networks that underlie human intelligence.

  12. Development of a Model for Measuring Scientific Processing Skills Based on Brain-Imaging Technology: Focused on the Experimental Design Process

    ERIC Educational Resources Information Center

    Lee, Il-Sun; Byeon, Jung-Ho; Kim, Young-shin; Kwon, Yong-Ju

    2014-01-01

    The purpose of this study was to develop a model for measuring experimental design ability based on functional magnetic resonance imaging (fMRI) during biological inquiry. More specifically, the researchers developed an experimental design task that measures experimental design ability. Using the developed experimental design task, they measured…

  13. Ligand modeling and design

    SciTech Connect

    Hay, B.P.

    1997-10-01

The purpose of this work is to develop and implement a molecular design basis for selecting organic ligands that would be used in the cost-effective removal of specific radionuclides from nuclear waste streams. Organic ligands with metal ion specificity are critical components in the development of solvent extraction and ion exchange processes that are highly selective for targeted radionuclides. The traditional approach to the development of such ligands involves lengthy programs of organic synthesis and testing, which in the absence of reliable methods for screening compounds before synthesis, results in wasted research effort. The author's approach breaks down and simplifies this costly process with the aid of computer-based molecular modeling techniques. Commercial software for organic molecular modeling is being configured to examine the interactions between organic ligands and metal ions, yielding an inexpensive, commercially or readily available computational tool that can be used to predict the structures and energies of ligand-metal complexes. Users will be able to correlate the large body of existing experimental data on structure, solution binding affinity, and metal ion selectivity to develop structural design criteria. These criteria will provide a basis for selecting ligands that can be implemented in separations technologies through collaboration with other DOE national laboratories and private industry. The initial focus will be to select ether-based ligands that can be applied to the recovery and concentration of the alkali and alkaline earth metal ions including cesium, strontium, and radium.
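A minimal sketch of the pre-synthesis screening idea in this record: rank candidate ligand poses by a simple molecular-mechanics energy (harmonic metal-oxygen donor bonds plus a Lennard-Jones term between donor atoms). All parameters, geometries, and the energy form are illustrative assumptions, not the commercial force field used in the cited work.

```python
def mm_energy(donor_distances, pair_distances, r0=2.4, kb=50.0,
              sigma=2.8, eps=0.1):
    """Toy molecular-mechanics energy (arbitrary units) of a metal-ligand complex."""
    # Harmonic metal-donor bond terms: penalize deviation from ideal distance r0
    bond = sum(0.5 * kb * (r - r0) ** 2 for r in donor_distances)
    # Lennard-Jones interactions between donor oxygens (strain / crowding)
    lj = sum(4 * eps * ((sigma / d) ** 12 - (sigma / d) ** 6)
             for d in pair_distances)
    return bond + lj

# Two hypothetical ether-ligand poses around an alkali-metal ion
pose_a = mm_energy(donor_distances=[2.4, 2.4, 2.5], pair_distances=[3.4, 3.5])
pose_b = mm_energy(donor_distances=[2.1, 2.8, 2.9], pair_distances=[3.0, 3.2])
best = "A" if pose_a < pose_b else "B"
print(f"pose A energy {pose_a:.2f}, pose B energy {pose_b:.2f}; preferred: {best}")
```

The design criteria in the record would come from correlating such computed energies with experimental binding affinities across a ligand series.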

  14. Modeling Temporal Processes in Early Spacecraft Design: Application of Discrete-Event Simulations for Darpa's F6 Program

    NASA Technical Reports Server (NTRS)

    Dubos, Gregory F.; Cornford, Steven

    2012-01-01

While the ability to model the state of a space system over time is essential during spacecraft operations, the use of time-based simulations remains rare in preliminary design. The absence of the time dimension in most traditional early design tools can however become a hurdle when designing complex systems whose development and operations can be disrupted by various events, such as delays or failures. As the value delivered by a space system is highly affected by such events, exploring the trade space for designs that yield the maximum value calls for the explicit modeling of time. This paper discusses the use of discrete-event models to simulate spacecraft development schedule as well as operational scenarios and on-orbit resources in the presence of uncertainty. It illustrates how such simulations can be utilized to support trade studies, through the example of a tool developed for DARPA's F6 program to assist the design of "fractionated spacecraft".
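A minimal sketch of the discrete-event idea this record describes: a Monte Carlo simulation of a development schedule with uncertain task durations, with completions kept in an event queue. The milestones, delay probabilities, and serial schedule are illustrative assumptions, not DARPA F6 data.

```python
import heapq
import random

def simulate_schedule(tasks, rng):
    """tasks: list of (name, nominal_months, delay_probability).
    Events are (completion_time, name) tuples in a priority queue."""
    queue, t = [], 0.0
    for name, nominal, p_delay in tasks:
        duration = nominal * (1.5 if rng.random() < p_delay else 1.0)
        t += duration                      # serial schedule for simplicity
        heapq.heappush(queue, (t, name))
    completions = [heapq.heappop(queue) for _ in range(len(queue))]
    return completions[-1][0]              # launch date = last completion

tasks = [("design", 12.0, 0.3), ("build", 18.0, 0.4), ("test", 6.0, 0.5)]
rng = random.Random(42)
launch_dates = [simulate_schedule(tasks, rng) for _ in range(2000)]
mean_launch = sum(launch_dates) / len(launch_dates)
print(f"mean launch at {mean_launch:.1f} months; worst case {max(launch_dates):.1f}")
```

Trade studies then compare design alternatives by the value delivered under the resulting launch-date (and failure) distributions, not just by nominal schedules.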

  15. Optimization process in helicopter design

    NASA Technical Reports Server (NTRS)

    Logan, A. H.; Banerjee, D.

    1984-01-01

    In optimizing a helicopter configuration, Hughes Helicopters uses a program called Computer Aided Sizing of Helicopters (CASH), written and updated over the past ten years, and used as an important part of the preliminary design process of the AH-64. First, measures of effectiveness must be supplied to define the mission characteristics of the helicopter to be designed. Then CASH allows the designer to rapidly and automatically develop the basic size of the helicopter (or other rotorcraft) for the given mission. This enables the designer and management to assess the various tradeoffs and to quickly determine the optimum configuration.
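A minimal sketch of the kind of sizing loop a tool like CASH automates: iterate gross weight until the assumed empty-weight and fuel fractions are consistent with a fixed payload for the mission. The fractions and payload are illustrative assumptions, not Hughes or AH-64 data.

```python
def size_gross_weight(payload_kg, empty_fraction=0.55, fuel_fraction=0.25,
                      tol=1e-6, max_iter=1000):
    """Fixed-point iteration: W = payload + (empty + fuel fractions) * W."""
    w = payload_kg                        # initial guess
    for _ in range(max_iter):
        w_new = payload_kg + (empty_fraction + fuel_fraction) * w
        if abs(w_new - w) < tol:
            return w_new
        w = w_new
    raise RuntimeError("sizing iteration did not converge")

gross = size_gross_weight(payload_kg=800.0)
print(f"converged gross weight: {gross:.1f} kg")
```

The iteration converges to payload / (1 - empty_fraction - fuel_fraction); a real sizing code closes the loop through rotor, engine, and mission-performance models instead of fixed fractions.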

  16. Review on Biomass Torrefaction Process and Product Properties and Design of Moving Bed Torrefaction System Model Development

    SciTech Connect

    Jaya Shankar Tumuluru; Christopher T. Wright; Shahab Sokhansanj

    2011-08-01

Torrefaction is currently developing as an important preprocessing step to improve the quality of biomass in terms of physical properties and proximate and ultimate composition. Torrefaction is a slow heating of biomass in an inert or reduced environment to a maximum temperature of 300 C. It can also be defined by the group of products resulting from the partially controlled and isothermal pyrolysis of biomass occurring in a temperature range of 200-230 C and 270-280 C. Thus, the process can also be called a mild pyrolysis, as it occurs in the lower temperature range of the pyrolysis process. At the end of the torrefaction process, a solid uniform product with lower moisture content and higher energy content than raw biomass is produced. Most of the smoke-producing compounds and other volatiles are removed during torrefaction, which produces a final product with a lower mass but a higher heating value. There is a lack of literature on the design aspects of torrefaction reactors and on design sheets for estimating the dimensions of a torrefier for a given capacity. This study includes (a) a detailed review of the torrefaction of biomass covering the process, product properties, off-gas compositions, and methods used, and (b) the design of a moving bed torrefier, taking into account the basic fundamental heat and mass transfer calculations. 
Specific objectives include calculating dimensions such as the diameter and height of the moving packed bed torrefier for capacities ranging from 25-1000 kg/hr, designing the heat loads and gas flow rates, and
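A minimal sketch of the geometric sizing objective stated above: computing moving packed-bed torrefier diameter and height from throughput, solids residence time, bulk density, and an assumed height-to-diameter ratio. All parameter values are illustrative assumptions, not the report's design sheet.

```python
import math

def size_torrefier(capacity_kg_per_h, residence_time_h=0.5,
                   bulk_density_kg_m3=250.0, height_to_diameter=2.5):
    """Return (diameter_m, height_m) for a cylindrical moving packed bed."""
    holdup_kg = capacity_kg_per_h * residence_time_h     # solids inventory in the bed
    volume_m3 = holdup_kg / bulk_density_kg_m3
    # V = (pi/4) d^2 h with h = ratio * d  ->  d = (4V / (pi * ratio))^(1/3)
    d = (4.0 * volume_m3 / (math.pi * height_to_diameter)) ** (1.0 / 3.0)
    return d, height_to_diameter * d

for cap in (25.0, 250.0, 1000.0):
    d, h = size_torrefier(cap)
    print(f"{cap:6.0f} kg/h -> D = {d:.2f} m, H = {h:.2f} m")
```

The heat loads and gas flow rates mentioned in the record would follow from an energy balance over this bed volume, which the geometric step here does not attempt.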

  17. The Process Design Courses at Pennsylvania: Impact of Process Simulators.

    ERIC Educational Resources Information Center

    Seider, Warren D.

    1984-01-01

    Describes the use and impact of process design simulators in process design courses. Discusses topics covered, texts used, computer design simulations, and how they are integrated into the process survey course as well as in plant design projects. (JM)

  18. DESIGNING ENVIRONMENTALLY FRIENDLY CHEMICAL PROCESSES

    EPA Science Inventory

    The design of a chemical process involves many aspects: from profitability, flexibility and reliability to safety to the environment. While each of these is important, in this work, the focus will be on profitability and the environment. Key to the study of these aspects is the ...

  19. Interface design in the process industries

    NASA Technical Reports Server (NTRS)

    Beaverstock, M. C.; Stassen, H. G.; Williamson, R. A.

    1977-01-01

    Every operator runs his plant in accord with his own mental model of the process. In this sense, one characteristic of an ideal man-machine interface is that it be in harmony with that model. With this theme in mind, the paper first reviews the functions of the process operator and compares them with human operators involved in control situations previously studied outside the industrial environment (pilots, air traffic controllers, helmsmen, etc.). A brief history of the operator interface in the process industry and the traditional methodology employed in its design is then presented. Finally, a much more fundamental approach utilizing a model definition of the human operator's behavior is presented.

  20. ESS Cryogenic System Process Design

    NASA Astrophysics Data System (ADS)

    Arnold, P.; Hees, W.; Jurns, J.; Su, X. T.; Wang, X. L.; Weisend, J. G., II

    2015-12-01

    The European Spallation Source (ESS) is a neutron-scattering facility funded and supported in collaboration with 17 European countries in Lund, Sweden. Cryogenic cooling at ESS is vital particularly for the linear accelerator, the hydrogen target moderators, a test stand for cryomodules, the neutron instruments and their sample environments. The paper will focus on specific process design criteria, design decisions and their motivations for the helium cryoplants and auxiliary equipment. Key issues for all plants and their process concepts are energy efficiency, reliability, smooth turn-down behaviour and flexibility. The accelerator cryoplant (ACCP) and the target moderator cryoplant (TMCP) in particular need to be prepared for a range of refrigeration capacities due to the intrinsic uncertainties regarding heat load definitions. Furthermore the paper addresses questions regarding process arrangement, 2 K cooling methodology, LN2 precooling, helium storage, helium purification and heat recovery.

  1. GREENSCOPE: Sustainable Process Modeling

    EPA Science Inventory

    EPA researchers are responding to environmental problems by incorporating sustainability into process design and evaluation. EPA researchers are also developing a tool that allows users to assess modifications to existing and new chemical processes to determine whether changes in...

  2. Feasibility study of an Integrated Program for Aerospace vehicle Design (IPAD). Volume 2: The design process

    NASA Technical Reports Server (NTRS)

    Gillette, W. B.; Turner, M. J.; Southall, J. W.; Whitener, P. C.; Kowalik, J. S.

    1973-01-01

    The extent to which IPAD is to support the design process is identified. Case studies of representative aerospace products were developed as models to characterize the design process and to provide design requirements for the IPAD computing system.

  3. Optimal design activated sludge process by means of multi-objective optimization: case study in Benchmark Simulation Model 1 (BSM1).

    PubMed

    Chen, Wenliang; Yao, Chonghua; Lu, Xiwu

    2014-01-01

Optimal design of activated sludge process (ASP) using multi-objective optimization was studied, and a benchmark process in Benchmark Simulation Model 1 (BSM1) was taken as the target process. The objectives of the study were to achieve four indexes of percentage of effluent violation (PEV), overall cost index (OCI), total volume and total suspended solids, making up four cases for comparative analysis. Models were solved by the non-dominated sorting genetic algorithm in MATLAB. Results show that ineffective solutions can be rejected by adding constraints, and newly added objectives can affect the relationship between the existing objectives; taking Pareto solutions as process parameters, the performance indexes of PEV and OCI can be improved more than with the default process parameters of BSM1, especially for N removal and resistance against dynamic NH4(+)-N in the influent. The results indicate that multi-objective optimization is a useful method for the optimal design of ASP. PMID:24845320
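A minimal sketch of the core of the multi-objective step in this record: extracting the non-dominated (Pareto) set when minimizing two objectives such as PEV and OCI. The candidate designs are illustrative numbers, not BSM1 results, and a real NSGA run would also handle crowding and constraints.

```python
def pareto_front(points):
    """Return points not dominated by any other point (all objectives minimized).
    q dominates p if q is <= p in every objective and differs in at least one."""
    front = []
    for p in points:
        dominated = any(all(q[i] <= p[i] for i in range(len(p))) and q != p
                        for q in points)
        if not dominated:
            front.append(p)
    return front

# Hypothetical candidate designs as (PEV %, OCI) pairs
candidates = [(2.0, 100.0), (1.5, 120.0), (3.0, 90.0), (2.5, 130.0), (1.5, 95.0)]
front = pareto_front(candidates)
print(sorted(front))
```

The study's "taking Pareto solutions as process parameters" step corresponds to picking one member of this front according to the decision maker's trade-off between effluent quality and cost.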

  4. Numerical simulations supporting the process design of ring rolling processes

    NASA Astrophysics Data System (ADS)

    Jenkouk, V.; Hirt, G.; Seitz, J.

    2013-05-01

In conventional Finite Element Analysis (FEA) of radial-axial ring rolling (RAR), the motions of all tools are usually defined prior to simulation in the preprocessing step. However, the real process holds up to 8 degrees of freedom (DOF) that are controlled by industrial control systems according to actual sensor values and preselected control strategies. Since the histories of the motions are unknown before the experiment and depend on sensor data, conventional FEA cannot represent the process prior to the experiment. In order to enable the use of FEA in the process design stage, this approach integrates the industrially applied control algorithms of the real process, including all relevant sensors and actuators, into the FE model of ring rolling. Additionally, the process design of a novel process, axial profiling, in which a profiled roll is used for rolling axially profiled rings, is supported by FEA. Using this approach, suitable control strategies can be tested in a virtual environment before processing.
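A minimal sketch of the coupling idea in this record: a (drastically simplified) ring-growth model stepped together with a control law that adjusts the feed from a simulated diameter sensor, mimicking how the industrial control algorithm is embedded in the FE model. The growth law, gains, and dimensions are illustrative assumptions, not an FE simulation.

```python
def simulate_ring_rolling(target_d, steps=400, dt=0.01, kp=4.0):
    """Toy closed-loop rolling: proportional control of feed from a diameter sensor."""
    d, feed = 0.30, 0.0                        # ring outer diameter (m), feed rate
    history = []
    for _ in range(steps):
        feed = max(0.0, kp * (target_d - d))   # control law reads the "sensor" value d
        d += dt * 0.5 * feed                   # toy growth law: diameter grows with feed
        history.append(d)
    return d, history

final_d, hist = simulate_ring_rolling(target_d=0.50)
print(f"final diameter {final_d:.3f} m")
```

In the cited approach the plant model on the right-hand side is the FE solution itself, so candidate control strategies can be screened virtually before any rolling trial.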

  5. Molecular modeling of directed self-assembly of block copolymers: Fundamental studies of processing conditions and evolutionary pattern design

    NASA Astrophysics Data System (ADS)

    Khaira, Gurdaman Singh

Rapid progress in the semiconductor industry has pushed for smaller feature sizes on integrated electronic circuits. Current photolithographic techniques for nanofabrication have reached their technical limit and are problematic when printing features small enough to meet future industrial requirements. "Bottom-up" techniques, such as the directed self-assembly (DSA) of block copolymers (BCP), are the primary contenders to complement current "top-down" photolithographic ones. For industrial requirements, the defect density from DSA needs to be less than 1 defect per 10 cm by 10 cm. Knowledge of both material synthesis and the thermodynamics of the self-assembly process is required before optimal operating conditions can be found to produce results adequate for industry. The work presented in this thesis is divided into three chapters, each discussing various aspects of DSA as studied via a molecular model that contains the essential physics of BCP self-assembly. Though there are various types of guiding fields that can be used to direct BCPs over large wafer areas with minimum defects, this study focuses only on chemically patterned substrates. The first chapter addresses optimal pattern design by describing a framework in which molecular simulations of various complexities are coupled with an advanced optimization technique to find a pattern that directs a target morphology. It demonstrates the first study in which BCP self-assembly on a patterned substrate is optimized using a three-dimensional description of the block copolymers. For problems pertaining to DSA, the methodology is shown to converge much faster than the traditional random search approach. The second chapter discusses the metrology of BCP thin films using TEM tomography and X-ray scattering techniques, such as CDSAXS and GISAXS. 
X-ray scattering has the advantage of being able to quickly probe the average structure of BCP morphologies over large wafer areas; however, deducing the BCP morphology

  6. Planar Inlet Design and Analysis Process (PINDAP)

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Gruber, Christopher R.

    2005-01-01

    The Planar Inlet Design and Analysis Process (PINDAP) is a collection of software tools that allow the efficient aerodynamic design and analysis of planar (two-dimensional and axisymmetric) inlets. The aerodynamic analysis is performed using the Wind-US computational fluid dynamics (CFD) program. A major element in PINDAP is a Fortran 90 code named PINDAP that can establish the parametric design of the inlet and efficiently model the geometry and generate the grid for CFD analysis with design changes to those parameters. The use of PINDAP is demonstrated for subsonic, supersonic, and hypersonic inlets.

  7. Engineering design: A cognitive process approach

    NASA Astrophysics Data System (ADS)

    Strimel, Greg Joseph

    The intent of this dissertation was to identify the cognitive processes used by advanced pre-engineering students to solve complex engineering design problems. Students in technology and engineering education classrooms are often taught to use an ideal engineering design process that has been generated mostly by educators and curriculum developers. However, the review of literature showed that it is unclear how advanced pre-engineering students cognitively navigate solving a complex and multifaceted problem from beginning to end. Additionally, it was unclear how a student thinks and acts throughout their design process and how this affects the viability of their solution. Therefore, Research Objective 1 was to identify the fundamental cognitive processes students use to design, construct, and evaluate operational solutions to engineering design problems. Research Objective 2 was to determine identifiers within student cognitive processes for monitoring aptitude to successfully design, construct, and evaluate technological solutions. Lastly, Research Objective 3 was to create a conceptual technological and engineering problem-solving model integrating student cognitive processes for the improved development of problem-solving abilities. The methodology of this study included multiple forms of data collection. The participants were first given a survey to determine their prior experience with engineering and to provide a description of the subjects being studied. The participants were then presented an engineering design challenge to solve individually. While they completed the challenge, the participants verbalized their thoughts using an established "think aloud" method. These verbalizations were captured along with participant observational recordings using point-of-view camera technology. Additionally, the participant design journals, design artifacts, solution effectiveness data, and teacher evaluations were collected for analysis to help achieve the

  8. Automation of Design Engineering Processes

    NASA Technical Reports Server (NTRS)

    Torrey, Glenn; Sawasky, Gerald; Courey, Karim

    2004-01-01

    A method, and a computer program that helps to implement the method, have been developed to automate and systematize the retention and retrieval of all the written records generated during the process of designing a complex engineering system. The phrase "all the written records" as used here is meant to be taken literally: it signifies not only final drawings and final engineering calculations but also such ancillary documents as minutes of meetings, memoranda, requests for design changes, approval and review documents, and reports of tests. One important purpose served by the method is to make the records readily available to all involved users via their computer workstations from one computer archive, eliminating the need for voluminous paper files stored in different places. Another important purpose is to facilitate the work of engineers who are charged with sustaining the system but were not involved in the original design decisions. The method helps the sustaining engineers retrieve information that enables them to retrace the reasoning that led to the original design decisions, thereby helping them to understand the system better and to make informed engineering choices pertaining to maintenance and/or modifications of the system. The software used to implement the method is written in Microsoft Access. All of the documents pertaining to the design of a given system are stored in one relational database in such a manner that they can be related to each other via a single tracking number.
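The single-database, single-tracking-number scheme described above can be sketched with Python's built-in sqlite3 module. The table and column names (documents, tracking_no, and so on) are illustrative assumptions, not the actual Microsoft Access schema:

```python
import sqlite3

# In-memory stand-in for the design-record archive (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE documents (
    tracking_no INTEGER,   -- single tracking number relating all records
    doc_type    TEXT,      -- drawing, memo, meeting minutes, test report...
    title       TEXT)""")

records = [
    (1001, "drawing", "Final assembly drawing"),
    (1001, "minutes", "Design review meeting minutes"),
    (1001, "memo",    "Request for design change"),
]
conn.executemany("INSERT INTO documents VALUES (?, ?, ?)", records)

# A sustaining engineer retrieves every record behind one design decision.
history = conn.execute(
    "SELECT doc_type, title FROM documents WHERE tracking_no = ? "
    "ORDER BY doc_type", (1001,)).fetchall()
```

Relating drawings, memoranda, and review documents through one key is what lets a later engineer retrace the reasoning behind a decision from a single query.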

  9. Parametric Design within an Atomic Design Process (ADP) applied to Spacecraft Design

    NASA Astrophysics Data System (ADS)

    Ramos Alarcon, Rafael

    This thesis describes research investigating the development of a model for the initial design of complex systems, with application to spacecraft design. The design model is called an atomic design process (ADP) and contains four fundamental stages (specifications, configurations, trade studies and drivers) that constitute the minimum steps of an iterative process that helps designers find a feasible solution. Representative design models from the aerospace industry are reviewed and are compared with the proposed model. The design model's relevance, adaptability and scalability features are evaluated through a focused design task exercise with two undergraduate teams and a long-term design exercise performed by a spacecraft payload team. The implementation of the design model is explained in the context in which the model has been researched. This context includes the organization (a student-run research laboratory at the University of Michigan), its culture (academically oriented), members that have used the design model and the description of the information technology elements meant to provide support while using the model. This support includes a custom-built information management system that consolidates relevant information that is currently being used in the organization. The information is divided in three domains: personnel development history, technical knowledge base and laboratory operations. The focused study with teams making use of the design model to complete an engineering design exercise consists of the conceptual design of an autonomous system, including a carrier and a deployable lander that form the payload of a rocket with an altitude range of over 1000 meters. Detailed results from each of the stages of the design process while implementing the model are presented, and an increase in awareness of good design practices in the teams while using the model is explained. A long-term investigation using the design model consisting of the

  10. Application of central composite design and artificial neural network in modeling of reactive blue 21 dye removal by photo-ozonation process.

    PubMed

    Mehrizad, Ali; Gharbani, Parvin

    2016-01-01

    The present study deals with the use of central composite design (CCD) and an artificial neural network (ANN) in modeling and optimization of reactive blue 21 (RB21) removal from aqueous media under a photo-ozonation process. Four effective operational parameters (initial concentration of RB21, O3 concentration, UV light intensity, and reaction time) were chosen, and the experiments were designed by CCD based on response surface methodology (RSM). The results obtained from the CCD model were used in modeling the process by ANN. Under optimum conditions (O3 concentration of 3.95 mg/L, UV intensity of 20.5 W/m2, reaction time of 7.77 min and initial dye concentration of 40.21 mg/L), RB21 removal efficiency reached 98.88%. An ANN with a three-layer topology consisting of four input neurons, 14 hidden neurons and one output neuron was designed. The relative significance of each major factor was calculated based on the connection weights of the ANN model. Dye and ozone concentrations were the most important variables in the photo-ozonation of RB21, followed by reaction time and UV light intensity. The comparison of values predicted by CCD and ANN with experimental results showed that both methods were highly efficient in modeling the process. PMID:27386996
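Ranking input factors from ANN connection weights, as the abstract describes, is commonly done with Garson's algorithm. The sketch below uses made-up weights for a network shrunk from the paper's 4-14-1 topology to 4-3-1 for readability; it is an assumed reconstruction, not the authors' code:

```python
def garson_importance(w_ih, w_ho):
    """Relative importance of each input from connection weights
    (Garson's algorithm). w_ih[i][j]: input i -> hidden j; w_ho[j]:
    hidden j -> output."""
    n_in, n_hid = len(w_ih), len(w_ho)
    contrib = [[abs(w_ih[i][j]) * abs(w_ho[j]) for j in range(n_hid)]
               for i in range(n_in)]
    # Normalise each hidden neuron's column, then sum over hidden neurons.
    col_sum = [sum(contrib[i][j] for i in range(n_in)) for j in range(n_hid)]
    score = [sum(contrib[i][j] / col_sum[j] for j in range(n_hid))
             for i in range(n_in)]
    total = sum(score)
    return [s / total for s in score]

# Hypothetical weights; inputs = [dye conc., O3 conc., time, UV intensity].
w_ih = [[0.9, -0.8, 0.7], [0.8, 0.9, -0.6],
        [0.4, 0.3, 0.5], [0.2, -0.2, 0.3]]
w_ho = [0.7, -0.9, 0.5]
importance = garson_importance(w_ih, w_ho)
```

The returned fractions sum to one, giving a ranking of the four operational parameters like the one reported in the study.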

  11. NANEX: Process design and optimization.

    PubMed

    Baumgartner, Ramona; Matić, Josip; Schrank, Simone; Laske, Stephan; Khinast, Johannes; Roblegg, Eva

    2016-06-15

    Previously, we introduced a one-step nano-extrusion (NANEX) process for transferring aqueous nano-suspensions into solid formulations directly in the liquid phase. Nano-suspensions were fed into molten polymers via a side-feeding device and excess water was eliminated via devolatilization. However, the drug content in nano-suspensions is restricted to 30% (w/w), and obtaining sufficiently high drug loadings in the final formulation requires the processing of large amounts of water and thus a fundamental process understanding. To this end, we investigated four polymers with different physicochemical characteristics (Kollidon® VA64, Eudragit® E PO, HPMCAS and PEG 20000) in terms of their maximum water uptake/removal capacity. Process parameters such as throughput and screw speed were adapted and their effect on the mean residence time and filling degree was studied. Additionally, one-dimensional discretization modeling was performed to examine the complex interactions between the screw geometry and the process parameters during water addition/removal. It was established that polymers with a certain water miscibility/solubility can be manufactured via NANEX. Long residence times of the molten polymer in the extruder and low filling degrees in the degassing zone favored the addition/removal of significant amounts of water. The residual moisture content in the final extrudates was comparable to that of extrudates manufactured without water. PMID:27090153
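The 30% (w/w) ceiling on suspension drug content implies a simple mass balance for the water that devolatilization must strip. The sketch below (with an invented 20% target loading, not a value from the paper) shows why high drug loadings mean processing large water amounts:

```python
def water_to_remove(drug_loading_final, suspension_solids=0.30, batch_mass=1.0):
    """Water [kg] that devolatilization must strip per kg of final
    extrudate, when the drug arrives as a nano-suspension whose solids
    fraction is capped at suspension_solids."""
    drug = drug_loading_final * batch_mass          # kg drug required
    suspension = drug / suspension_solids           # kg suspension fed
    return suspension - drug                        # kg water to remove

w = water_to_remove(0.20)   # 20% w/w drug in the final extrudate
```

At a 20% target loading, roughly 0.47 kg of water must be removed per kilogram of extrudate; the requirement grows linearly with the desired loading.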

  12. Conceptual models of information processing

    NASA Technical Reports Server (NTRS)

    Stewart, L. J.

    1983-01-01

    The conceptual information processing issues are examined. Human information processing is defined as an active cognitive process that is analogous to a system. It is the flow and transformation of information within a human. The human is viewed as an active information seeker who is constantly receiving, processing, and acting upon the surrounding environmental stimuli. Human information processing models are conceptual representations of cognitive behaviors. Models of information processing are useful in representing the different theoretical positions and in attempting to define the limits and capabilities of human memory. It is concluded that an understanding of conceptual human information processing models and their applications to systems design leads to a better human factors approach.

  13. Modeling nuclear processes by Simulink

    NASA Astrophysics Data System (ADS)

    Rashid, Nahrul Khair Alang Md

    2015-04-01

    Modelling and simulation are essential parts of the study of dynamic system behaviours. In nuclear engineering, modelling and simulation are important for assessing the expected results of an experiment before the actual experiment is conducted, and in the design of nuclear facilities. In education, modelling can give insight into the dynamics of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve these equations using analytical or numerical methods consume time and distract attention from the objectives of the modelling itself. This paper presents the use of Simulink, a MATLAB toolbox widely used in control engineering, as a modelling platform for the study of nuclear processes, including nuclear reactor behaviour. Starting from the describing equations, Simulink models for heat transfer, the radionuclide decay process, the delayed neutron effect, the reactor point kinetics equations with delayed neutron groups, and the effect of temperature feedback are used as examples.
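One of the listed examples, radionuclide decay, reduces to the single equation dN/dt = -lambda*N; in Simulink this is a gain block feeding an integrator block. A minimal fixed-step stand-in for that diagram, in plain Python rather than Simulink, might look like:

```python
import math

def simulate_decay(n0, half_life, t_end, dt=0.001):
    """Forward-Euler integration of dN/dt = -lambda * N, i.e. one
    integrator block fed by a gain block of -lambda."""
    lam = math.log(2) / half_life
    n, t = n0, 0.0
    while t < t_end:
        n += dt * (-lam * n)   # integrator step
        t += dt
    return n

# Integrating over exactly one half-life should halve the population.
n_final = simulate_decay(n0=1000.0, half_life=5.0, t_end=5.0)
```

The point of the paper stands out even in this toy: the block diagram (or loop) replaces the analytical solution N(t) = N0*exp(-lambda*t) without any symbolic work.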

  14. Modeling nuclear processes by Simulink

    SciTech Connect

    Rashid, Nahrul Khair Alang Md

    2015-04-29

    Modelling and simulation are essential parts of the study of dynamic system behaviours. In nuclear engineering, modelling and simulation are important for assessing the expected results of an experiment before the actual experiment is conducted, and in the design of nuclear facilities. In education, modelling can give insight into the dynamics of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve these equations using analytical or numerical methods consume time and distract attention from the objectives of the modelling itself. This paper presents the use of Simulink, a MATLAB toolbox widely used in control engineering, as a modelling platform for the study of nuclear processes, including nuclear reactor behaviour. Starting from the describing equations, Simulink models for heat transfer, the radionuclide decay process, the delayed neutron effect, the reactor point kinetics equations with delayed neutron groups, and the effect of temperature feedback are used as examples.

  15. Hardware-software-co-design of parallel and distributed systems using a behavioural programming and multi-process model with high-level synthesis

    NASA Astrophysics Data System (ADS)

    Bosse, Stefan

    2011-05-01

    A new design methodology for parallel and distributed embedded systems is presented using the behavioural hardware compiler ConPro, which provides an imperative programming model based on concurrently communicating sequential processes (CSP) with an extensive set of interprocess-communication primitives and guarded atomic actions. The programming language and the compiler-based synthesis process enable the design of constrained power- and resource-aware embedded systems with pure Register-Transfer Logic (RTL) efficiently mapped to FPGA and ASIC technologies. Concurrency is modelled explicitly at the control- and data-path level. Additionally, concurrency at the data-path level can be automatically explored and optimized by different schedulers. The CSP programming model can be synthesized to hardware (SoC) and software (C, ML) models and targets. A common source with identical functional behaviour is used for both the hardware and software implementations. Processes and objects of the entire design can be distributed across different hardware and software platforms, for example, several FPGA components and software executed on several microprocessors, providing a parallel and distributed system. Intersystem, interprocess, and object communication is automatically implemented with serial links, not visible at the programming level. The presented design methodology has the benefit of high modularity and freedom of choice of target technologies and system architecture. Algorithms can be well matched to, and distributed on, different suitable execution platforms and implementation technologies, using a unique programming model that provides a balance of concurrency and resource complexity. An extended case study of a communication protocol used in high-density sensor-actuator networks demonstrates and compares the design of hardware and software targets. The communication protocol is suited for high-density intra- and inter-chip networks.
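The CSP model underlying ConPro (sequential processes that interact only by sending and receiving over channels) can be illustrated, far from RTL synthesis, with Python threads and a blocking queue as the channel; the process names and values here are invented:

```python
import threading
import queue

chan = queue.Queue(maxsize=1)   # depth-1 blocking channel, rendezvous-like
results = []

def producer():
    """Sequential process: computes values and sends them over the channel."""
    for v in range(5):
        chan.put(v * v)          # blocks until the consumer is ready
    chan.put(None)               # end-of-stream sentinel

def consumer():
    """Sequential process: receives until the sentinel arrives."""
    while (v := chan.get()) is not None:
        results.append(v)

threads = [threading.Thread(target=producer), threading.Thread(target=consumer)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

As in the paper's model, neither process touches the other's state; all coupling goes through the channel, which is exactly what lets the compiler map the same source to hardware links or software queues.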

  16. Kinetic Modeling of Microbiological Processes

    SciTech Connect

    Liu, Chongxuan; Fang, Yilin

    2012-08-26

    Kinetic description of microbiological processes is vital for the design and control of microbe-based biotechnologies such as waste water treatment, petroleum oil recovery, and contaminant attenuation and remediation. Various models have been proposed to describe microbiological processes. This editorial article discusses the advantages and limitations of these modeling approaches, including traditional Monod-type models and their derivatives, and recently developed constraint-based approaches. The article also outlines future directions for modeling research best suited to petroleum and environmental biotechnologies.
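A Monod-type model of the kind the editorial discusses couples biomass growth to substrate consumption through a saturating rate law. The batch-culture sketch below uses illustrative parameter values, not data from any of the cited applications:

```python
def monod_batch(x0, s0, mu_max, ks, yield_xs, t_end, dt=0.01):
    """Forward-Euler integration of batch Monod kinetics:
       dX/dt = mu_max * S/(Ks + S) * X,   dS/dt = -(1/Y) * dX/dt,
    where X is biomass, S substrate, Y the yield coefficient."""
    x, s, t = x0, s0, 0.0
    while t < t_end:
        mu = mu_max * s / (ks + s)   # specific growth rate, saturates in S
        dx = mu * x * dt
        x += dx
        s -= dx / yield_xs
        t += dt
    return x, s

x_f, s_f = monod_batch(x0=0.1, s0=10.0, mu_max=0.5, ks=2.0,
                       yield_xs=0.4, t_end=24.0)
```

The yield coefficient ties the two equations together: biomass produced always equals Y times substrate consumed, which is a useful sanity check on any implementation.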

  17. Modeling of pulverized coal combustion processes in a vortex furnace of improved design. Part 1: Flow aerodynamics in a vortex furnace

    NASA Astrophysics Data System (ADS)

    Krasinsky, D. V.; Salomatov, V. V.; Anufriev, I. S.; Sharypov, O. V.; Shadrin, E. Yu.; Anikin, Yu. A.

    2015-02-01

    Some results of the complex experimental and numerical study of aerodynamics and transfer processes in a vortex furnace, whose design was improved via the distributed tangential injection of fuel-air flows through the upper and lower burners, were presented. The experimental study of the aerodynamic characteristics of a spatial turbulent flow was performed on the isothermal laboratory model (at a scale of 1 : 20) of an improved vortex furnace using a laser Doppler measurement system. The comparison of experimental data with the results of the numerical modeling of an isothermal flow for the same laboratory furnace model demonstrated their agreement to be acceptable for engineering practice.

  18. Creativity Processes of Students in the Design Studio

    ERIC Educational Resources Information Center

    Huber, Amy Mattingly; Leigh, Katharine E.; Tremblay, Kenneth R., Jr.

    2012-01-01

    The creative process is a multifaceted and dynamic path of thinking required to execute a project in design-based disciplines. The goal of this research was to test a model outlining the creative design process by investigating student experiences in a design project assignment. The study used an exploratory design to collect data from student…

  19. Business Process Modeling: Perceived Benefits

    NASA Astrophysics Data System (ADS)

    Indulska, Marta; Green, Peter; Recker, Jan; Rosemann, Michael

    The process-centered design of organizations and information systems is globally seen as an appropriate response to the increased economic pressure on organizations. At the methodological core of process-centered management is process modeling. However, business process modeling in large initiatives can be a time-consuming and costly exercise, making it potentially difficult to convince executive management of its benefits. To date, and despite substantial interest and research in the area of process modeling, the understanding of the actual benefits of process modeling in academia and practice is limited. To address this gap, this paper explores the perception of benefits derived from process modeling initiatives, as reported through a global Delphi study. The study incorporates the views of three groups of stakeholders - academics, practitioners and vendors. Our findings lead to the first identification and ranking of 19 unique benefits associated with process modeling. The study in particular found that process modeling benefits vary significantly between practitioners and academics. We argue that the variations may point to a disconnect between research projects and practical demands.

  20. The watershed-scale optimized and rearranged landscape design (WORLD) model and local biomass processing depots for sustainable biofuel production: Integrated life cycle assessments

    SciTech Connect

    Eranki, Pragnya L.; Manowitz, David H.; Bals, Bryan D.; Izaurralde, Roberto C.; Kim, Seungdo; Dale, Bruce E.

    2013-07-23

    An array of feedstocks is being evaluated as potential raw material for cellulosic biofuel production. Thorough assessments are required in regional landscape settings before these feedstocks can be cultivated and sustainable management practices can be implemented. On the processing side, a potential solution to the logistical challenges of large biorefineries is provided by a network of distributed processing facilities called local biomass processing depots. A large-scale cellulosic ethanol industry is likely to emerge soon in the United States, and we have the opportunity to influence the sustainability of this emerging industry. The watershed-scale optimized and rearranged landscape design (WORLD) model estimates land allocations for different cellulosic feedstocks at biorefinery scale without displacing current animal nutrition requirements. This model also incorporates a network of the aforementioned depots. An integrated life cycle assessment is then conducted over the unified system of optimized feedstock production, processing, and associated transport operations to evaluate net energy yields (NEYs) and environmental impacts.

  1. Design and validation of an intelligent patient monitoring and alarm system based on a fuzzy logic process model.

    PubMed

    Becker, K; Thull, B; Käsmacher-Leidinger, H; Stemmer, J; Rau, G; Kalff, G; Zimmermann, H J

    1997-09-01

    The process of patient care performed by an anaesthesiologist during highly invasive surgery requires fundamental knowledge of the physiologic processes and long-standing experience in patient management to cope with the inter-individual variability of patients. Biomedical engineering research improves the patient monitoring task by providing technical devices to measure a large number of a patient's vital parameters. These measurements improve the safety of the patient during the surgical procedure, because pathological states can be recognised earlier, but they may also increase the cognitive load on the physician. In order to reduce cognitive strain and to support intra-operative monitoring, an intelligent patient monitoring and alarm system has been proposed and implemented which evaluates a patient's haemodynamic state on the basis of the current vital parameter constellation with a knowledge-based approach. In this paper, general design aspects and the evaluation of the intelligent patient monitoring and alarm system in the operating theatre are described. The validation of the system's inference engine was performed in two steps. First, the knowledge base was validated with real patient data acquired online in the operating theatre. Second, a research prototype of the whole system was implemented in the operating theatre. In the first step, the anaesthetists were asked to enter a state variable evaluation into a recording system before a drug application or any other intervention on the patient. These state variable evaluations were compared to those generated by the intelligent alarm system on the same vital parameter constellations. Altogether, 641 state variable evaluations were entered by six different physicians. In total, the sensitivity of alarm recognition is 99.3%, the specificity is 66% and the predictability is 45%. The second step was performed using a research

  2. Design of intelligent controllers for exothermal processes

    NASA Astrophysics Data System (ADS)

    Nagarajan, Ramachandran; Yaacob, Sazali

    2001-10-01

    Chemical industries such as resin or soap manufacturing have reaction systems that work with at least two chemicals. Mixing of chemicals even at room temperature can initiate an exothermic reaction, which produces a sudden increase of heat energy within the mixture. The quantity of heat and the dynamics of heat generation are unknown, unpredictable and time varying. Proper control of heat has to be accomplished in order to achieve a high-quality product. Uncontrolled or poorly controlled heat yields an unusable product, may damage materials and systems, and may even harm human beings. Control of the heat due to an exothermic reaction cannot be achieved using conventional methods such as PID control or identification-based control, since all of these require at least an approximate mathematical model of the exothermic process, and modeling an exothermal process is yet to be properly conceived. This paper discusses a design methodology for controlling such a process. A pilot plant of a reaction system has been constructed and utilized for designing and incorporating the proposed fuzzy logic based intelligent controller. Both a conventional and an adaptive form of fuzzy logic control were used in testing the performance. The test results confirm the effectiveness of the controllers in controlling exothermic heat.
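A fuzzy controller of the kind described maps a measured quantity to a control action through membership functions and a rule base, with no process model required. The sketch below (triangular memberships, three rules, weighted-average defuzzification) uses invented ranges and consequents, not the pilot-plant rule base:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def cooling_command(temp_error):
    """Map reactor temperature error (degC above setpoint) to a coolant
    valve opening in [0, 1] via three rules and weighted-average
    defuzzification (hypothetical tuning)."""
    small = tri(temp_error, -5.0, 0.0, 10.0)
    medium = tri(temp_error, 5.0, 15.0, 25.0)
    large = tri(temp_error, 20.0, 40.0, 60.0)
    # Rule consequents: small error -> valve 0.1, medium -> 0.5, large -> 0.95
    num = small * 0.1 + medium * 0.5 + large * 0.95
    den = small + medium + large
    return num / den if den else 0.0

valve = cooling_command(18.0)
```

An adaptive variant, as tested in the paper, would additionally tune the membership ranges or consequents online from observed performance.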

  3. Design Thinking in Elementary Students' Collaborative Lamp Designing Process

    ERIC Educational Resources Information Center

    Kangas, Kaiju; Seitamaa-Hakkarainen, Pirita; Hakkarainen, Kai

    2013-01-01

    Design and Technology education is potentially a rich environment for successful learning, if the management of the whole design process is emphasised, and students' design thinking is promoted. The aim of the present study was to unfold the collaborative design process of one team of elementary students, in order to understand their multimodal…

  4. MODELING TREE LEVEL PROCESSES

    EPA Science Inventory

    An overview of three main types of simulation approach (explanatory, abstraction, and estimation) is presented, along with a discussion of their capabilities, limitations, and the steps required for their validation. A process model being developed through the Forest Response Prog...

  5. Models of the Reading Process

    PubMed Central

    Rayner, Keith; Reichle, Erik D.

    2010-01-01

    Reading is a complex skill involving the orchestration of a number of components. Researchers often talk about a “model of reading” when talking about only one aspect of the reading process (for example, models of word identification are often referred to as “models of reading”). Here, we review prominent models that are designed to account for (1) word identification, (2) syntactic parsing, (3) discourse representations, and (4) how certain aspects of language processing (e.g., word identification), in conjunction with other constraints (e.g., limited visual acuity, saccadic error, etc.), guide readers’ eyes. Unfortunately, it is the case that these various models addressing specific aspects of the reading process seldom make contact with models dealing with other aspects of reading. Thus, for example, the models of word identification seldom make contact with models of eye movement control, and vice versa. While this may be unfortunate in some ways, it is quite understandable in other ways because reading itself is a very complex process. We discuss prototypical models of aspects of the reading process in the order mentioned above. We do not review all possible models, but rather focus on those we view as being representative and most highly recognized. PMID:21170142

  6. Using an Analogical Thinking Model as an Instructional Tool to Improve Student Cognitive Ability in Architecture Design Learning Process

    ERIC Educational Resources Information Center

    Wu, Yun-Wu; Weng, Kuo-Hua

    2013-01-01

    Lack of creativity is a problem often plaguing students from design-related departments. Therefore, this study is intended to incorporate analogical thinking in the education of architecture design to enhance students' learning and their future career performance. First, this study explores the three aspects of architecture design curricula,…

  7. ESS Accelerator Cryoplant Process Design

    NASA Astrophysics Data System (ADS)

    Wang, X. L.; Arnold, P.; Hees, W.; Hildenbeutel, J.; Weisend, J. G., II

    2015-12-01

    The European Spallation Source (ESS) is a neutron-scattering facility being built with extensive international collaboration in Lund, Sweden. The ESS accelerator will deliver protons with 5 MW of power to the target at 2.0 GeV, with a nominal current of 62.5 mA. The superconducting part of the accelerator is about 300 meters long and contains 43 cryomodules. The ESS accelerator cryoplant (ACCP) will provide the cooling for the cryomodules and the cryogenic distribution system that delivers the helium to the cryomodules. The ACCP will cover three cryogenic circuits: bath cooling for the cavities at 2 K, the thermal shields at around 40 K, and the power coupler thermalisation with 4.5 K forced helium cooling. The open competitive bid for the ACCP took place in 2014, with Linde Kryotechnik AG being selected as the vendor. This paper summarizes the progress in the ACCP development and engineering. The current status is presented, including final cooling requirements, preliminary process design, system configuration, machine concept and layout, main parameters and features, the solution for the acceptance tests, and exergy analysis and efficiency.
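The exergy analysis mentioned for the ACCP rests on the Carnot limit: removing heat Q at temperature T_c against ambient T_0 costs at least Q*(T_0/T_c - 1) of work. A sketch with round numbers (the 300 K ambient is an assumption, and no ESS heat loads are used):

```python
def carnot_specific_work(t_cold, t_ambient=300.0):
    """Minimum (reversible) work per watt of heat lifted at t_cold [K],
    i.e. the Carnot coefficient T_ambient/t_cold - 1."""
    return t_ambient / t_cold - 1.0

# The three circuit temperatures named in the abstract.
w_2k  = carnot_specific_work(2.0)    # 2 K cavity baths
w_45k = carnot_specific_work(4.5)    # 4.5 K power coupler cooling
w_40k = carnot_specific_work(40.0)   # ~40 K thermal shields
```

Even before plant inefficiencies, each watt deposited at 2 K costs at least about 149 W of compressor work, which is why the 2 K circuit dominates cryoplant sizing and why exergy efficiency is tracked per circuit.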

  8. Optimal design of solidification processes

    NASA Technical Reports Server (NTRS)

    Dantzig, Jonathan A.; Tortorelli, Daniel A.

    1991-01-01

    An optimal design algorithm is presented for the analysis of general solidification processes, and is demonstrated for the growth of GaAs crystals in a Bridgman furnace. The system is optimal in the sense that the prespecified temperature distribution in the solidifying materials is obtained to maximize product quality. The optimization uses traditional numerical programming techniques which require the evaluation of cost and constraint functions and their sensitivities. The finite element method is incorporated to analyze the crystal solidification problem, evaluate the cost and constraint functions, and compute the sensitivities. These techniques are demonstrated in the crystal growth application by determining an optimal furnace wall temperature distribution to obtain the desired temperature profile in the crystal, and hence to maximize the crystal's quality. Several numerical optimization algorithms are studied to determine the proper convergence criteria, effective 1-D search strategies, appropriate forms of the cost and constraint functions, etc. In particular, we incorporate the conjugate gradient and quasi-Newton methods for unconstrained problems. The efficiency and effectiveness of each algorithm is presented in the example problem.
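Of the algorithms the abstract names, the conjugate gradient method is easiest to show in miniature. For a quadratic cost f(x) = 0.5*x'Ax - b'x (a toy stand-in for the furnace-wall temperature objective, not the paper's finite-element cost function), linear CG minimizes f using only matrix-vector products:

```python
def conjugate_gradient(A, b, tol=1e-10, max_iter=100):
    """Linear CG: minimises 0.5 x^T A x - b^T x for symmetric
    positive-definite A, equivalently solves A x = b."""
    n = len(b)
    x = [0.0] * n
    r = b[:]                 # residual b - A x, with x = 0
    p = r[:]                 # first search direction
    rs = sum(v * v for v in r)
    for _ in range(max_iter):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rs / sum(p[i] * Ap[i] for i in range(n))   # exact line search
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        rs_new = sum(v * v for v in r)
        if rs_new < tol:
            break
        p = [r[i] + (rs_new / rs) * p[i] for i in range(n)]  # conjugate direction
        rs = rs_new
    return x

A = [[4.0, 1.0], [1.0, 3.0]]   # small SPD toy matrix
b = [1.0, 2.0]
x = conjugate_gradient(A, b)
```

In the paper's setting, the gradient of the cost comes from finite-element sensitivities rather than an explicit matrix, but the search logic (line search along conjugate directions) is the same.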

  9. Biosphere Process Model Report

    SciTech Connect

    J. Schmitt

    2000-05-25

    To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor

  10. Computer-aided software development process design

    NASA Technical Reports Server (NTRS)

    Lin, Chi Y.; Levary, Reuven R.

    1989-01-01

    The authors describe an intelligent tool designed to aid managers of software development projects in planning, managing, and controlling the development process of medium- to large-scale software projects. Its purpose is to reduce uncertainties in the budget, personnel, and schedule planning of software development projects. It is based on a dynamic model of the software development and maintenance life-cycle process. This dynamic process is composed of a number of time-varying, interacting developmental phases, each characterized by its intended functions and requirements. System dynamics is used as a modeling methodology. The resulting Software LIfe-Cycle Simulator (SLICS) and the hybrid expert simulation system of which it is a subsystem are described.

  11. "From the Formal to the Innovative": The Use of Case Studies and Sustainable Projects in Developing a Design Process Model for Educating Product/Industrial Designers

    ERIC Educational Resources Information Center

    Oakes, G. L.; Felton, A. J.; Garner, K. B.

    2006-01-01

    The BSc in computer aided product design (CAPD) course at the University of Wolverhampton was conceived as a collaborative venture in 1989 between the School of Engineering and the School of Art and Design. The award was at the forefront of forging interdisciplinary collaboration at undergraduate level in the field of product design. It has…

  12. Microwave sintering process model.

    PubMed

    Peng, Hu; Tinga, W R; Sundararaj, U; Eadie, R L

    2003-01-01

    To simulate and optimize the microwave sintering process for silicon nitride and tungsten carbide/cobalt toolbits, a microwave sintering process model has been built. A cylindrical sintering furnace was used, containing a heat-insulating layer, a susceptor layer, and an alumina tube holding the green toolbit parts between parallel, electrically conductive graphite plates. Dielectric and absorption properties of the silicon nitride green parts, the tungsten carbide/cobalt green parts, and an oxidizable susceptor material were measured using perturbation and waveguide transmission methods. Microwave absorption data were measured over a temperature range from 20 degrees C to 800 degrees C. These data were then used in the microwave process model, which assumed plane-wave propagation along the radial direction and included the microwave reflection at each interface between the materials and the microwave absorption in the bulk materials. Heat transfer between the components inside the cylindrical sintering furnace was also included in the model. The simulated heating process data for both silicon nitride and tungsten carbide/cobalt samples closely follow the experimental data. By varying the physical parameters of the sintering furnace model, such as the thickness of the susceptor layer, the thickness of the alumina tube wall, the sample load volume, and the graphite plate mass, the model predicts their effects, which is helpful in optimizing those parameters in the industrial sintering process. PMID:15323110
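    The plane-wave part of such a model can be sketched as a layer-by-layer power balance: at each material interface a fraction of the incident power is reflected, and the transmitted part decays exponentially in the bulk. The layer values below are illustrative placeholders, not the measured properties from the paper.

```python
import math

def layer_absorption(p_in, layers):
    """Track plane-wave power through a stack of material layers.

    Each layer is (reflectance, absorption_coeff_per_m, thickness_m);
    the values used here are illustrative, not measured properties.
    Returns the power absorbed in each layer and the power leaving the stack.
    """
    absorbed = []
    p = p_in
    for r, alpha, d in layers:
        p_t = p * (1.0 - r)                 # power transmitted past the interface
        p_out = p_t * math.exp(-alpha * d)  # exponential decay in the bulk
        absorbed.append(p_t - p_out)
        p = p_out
    return absorbed, p

# Hypothetical susceptor / alumina tube / green-part stack, 100 W incident
stack = [(0.05, 8.0, 0.01), (0.02, 0.5, 0.005), (0.10, 15.0, 0.02)]
absorbed, remaining = layer_absorption(100.0, stack)
```

    The per-layer absorbed powers would then feed a heat-transfer balance of the kind the paper couples to this absorption model.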

  13. Foam process models.

    SciTech Connect

    Moffat, Harry K.; Noble, David R.; Baer, Thomas A.; Adolf, Douglas Brian; Rao, Rekha Ranjana; Mondy, Lisa Ann

    2008-09-01

    In this report, we summarize our work on developing a production-level foam processing computational model suitable for predicting the self-expansion of foam in complex geometries. The model is based on a finite element representation of the equations of motion, with the movement of the free surface represented using the level set method, and has been implemented in SIERRA/ARIA. An empirically based time- and temperature-dependent density model is used to encapsulate the complex physics of foam nucleation and growth in a numerically tractable model. The change in density with time is at the heart of the foam self-expansion, as it creates the motion of the foam. This continuum-level model uses a homogenized description of the foam, which does not include the gas explicitly. Results from the model are compared to temperature-instrumented flow visualization experiments giving the location of the foam front as a function of time for our EFAR model system.
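    A minimal sketch of the empirical density idea: density relaxes from the unfoamed value toward a final foam density at a temperature-dependent rate, and mass conservation (V = m/ρ) turns the density change into expansion. All functional forms and parameter values below are hypothetical, not the EFAR fit.

```python
import math

def foam_density(t, T, rho0=1000.0, rho_min=50.0, k0=0.5, Ea_over_R=3000.0):
    """Hypothetical time- and temperature-dependent foam density (kg/m^3).

    Density relaxes exponentially from rho0 toward rho_min at an
    Arrhenius-like, temperature-dependent rate; parameters are illustrative.
    """
    rate = k0 * math.exp(-Ea_over_R / T)
    return rho_min + (rho0 - rho_min) * math.exp(-rate * t)

def foam_volume(mass_kg, t, T):
    """Foam volume implied by mass conservation: V = m / rho(t, T)."""
    return mass_kg / foam_density(t, T)

# Expansion of 1 kg of foam precursor held at 350 K
volumes = [foam_volume(1.0, t, 350.0) for t in (0.0, 10.0, 60.0)]
```

    In the full model this density change drives the momentum equations; here it simply makes the computed volume grow monotonically with time.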

  14. Design Science Research for Business Process Design: Organizational Transition at Intersport Sweden

    NASA Astrophysics Data System (ADS)

    Lind, Mikael; Rudmark, Daniel; Seigerroth, Ulf

    Business processes need to be aligned with business strategies. This paper elaborates on experiences from a business process design effort in an action research project performed at Intersport Sweden. The purpose of this project was to create a solid base for taking the retail chain Intersport into a new organizational state in which the new process design is aligned with strategic goals. Although business process modeling is concerned with creating artifacts, design science research in information systems has traditionally had little impact on research on business process models. In this paper, we address the question of how design science research can contribute to business process design. Three heuristic guidelines for creating organizational commitment and strategic alignment in process design are presented. The guidelines are derived from the successful actions taken in the research project. The development of these guidelines is used as a basis to reflect upon the contribution of design science research to business process design.

  15. 76 FR 70368 - Disaster Designation Process

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-14

    ... USDA Secretarial disaster designation process. FSA proposes to simplify the processes and delegate them... rule would update the language to reflect current practice. The current regulations require that a... proposes to simplify the USDA Secretarial designation process from a six-step process to a two-step...

  16. Photonic IC design software and process design kits

    NASA Astrophysics Data System (ADS)

    Korthorst, Twan; Stoffer, Remco; Bakker, Arjen

    2015-04-01

    This review discusses photonic IC design software tools, examines existing design flows for photonics design and how these fit different design styles and describes the activities in collaboration and standardization within the silicon photonics group from Si2 and by members of the PDAFlow Foundation to improve design flows. Moreover, it will address the lowering of access barriers to the technology by providing qualified process design kits (PDKs) and improved integration of photonic integrated circuit simulations, physical simulations, mask layout, and verification.

  17. The 9-Step Problem Design Process for Problem-Based Learning: Application of the 3C3R Model

    ERIC Educational Resources Information Center

    Hung, Woei

    2009-01-01

    The design of problems is crucial for the effectiveness of problem-based learning (PBL). Research has shown that PBL problems have not always been effective. Ineffective PBL problems could affect whether students acquire sufficient domain knowledge, activate appropriate prior knowledge, and properly direct their own learning. This paper builds on…

  18. Graphic Design in Libraries: A Conceptual Process

    ERIC Educational Resources Information Center

    Ruiz, Miguel

    2014-01-01

    Providing successful library services requires efficient and effective communication with users; therefore, it is important that content creators who develop visual materials understand key components of design and, specifically, develop a holistic graphic design process. Graphic design, as a form of visual communication, is the process of…

  19. Instructional Design Processes and Traditional Colleges

    ERIC Educational Resources Information Center

    Vasser, Nichole

    2010-01-01

    Traditional colleges who have implemented distance education programs would benefit from using instructional design processes to develop their courses. Instructional design processes provide the framework for designing and delivering quality online learning programs in a highly-competitive educational market. Traditional college leaders play a…

  20. 77 FR 41248 - Disaster Designation Process

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-13

    ... designation regulations to provide for changes in the designation process (76 FR 70368-70374). In general, that rule proposed to simplify the disaster designation process and to delegate the authority for... 759.6 has also been changed from the proposed rule to remove proposed language referring to a...

  1. Reengineering the JPL Spacecraft Design Process

    NASA Technical Reports Server (NTRS)

    Briggs, C.

    1995-01-01

    This presentation describes the factors that have emerged in the evolved process of reengineering the unmanned spacecraft design process at the Jet Propulsion Laboratory in Pasadena, California. Topics discussed include: New facilities, new design factors, new system-level tools, complex performance objectives, changing behaviors, design integration, leadership styles, and optimization.

  2. Modeling Production Plant Forming Processes

    SciTech Connect

    Rhee, M; Becker, R; Couch, R; Li, M

    2004-09-22

    Engineering has simulation tools and experience in modeling forming processes. Y-12 personnel have expressed interest in validating our tools and experience against their manufacturing process activities, such as rolling, casting, and forging. We have demonstrated numerical capabilities in a collaborative DOE/OIT project with ALCOA that is nearing successful completion. The goal was to use ALE3D to model Alcoa's slab rolling process in order to demonstrate a computational tool that would allow Alcoa to define a rolling schedule that would minimize the probability of ingot fracture, thus reducing waste and energy consumption. The work is intended to lead to long-term collaboration with Y-12 and perhaps involvement with other components of the weapons production complex. Using simulations to aid in the design of forming processes can decrease time to production, reduce forming trials and associated expenses, and guide development of products with greater uniformity and less scrap.

  3. NASA System Engineering Design Process

    NASA Technical Reports Server (NTRS)

    Roman, Jose

    2011-01-01

    This slide presentation reviews NASA's use of systems engineering for the complete life cycle of a project. Systems engineering is a methodical, disciplined approach for the design, realization, technical management, operations, and retirement of a system. Each phase of a NASA project is terminated with a Key decision point (KDP), which is supported by major reviews.

  4. Launch Vehicle Design Process: Characterization, Technical Integration, and Lessons Learned

    NASA Technical Reports Server (NTRS)

    Blair, J. C.; Ryan, R. S.; Schutzenhofer, L. A.; Humphries, W. R.

    2001-01-01

    Engineering design is a challenging activity for any product. Since launch vehicles are highly complex and interconnected and have extreme energy densities, their design represents a challenge of the highest order. The purpose of this document is to delineate and clarify the design process associated with the launch vehicle for space flight transportation. The goal is to define and characterize a baseline for the space transportation design process. This baseline can be used as a basis for improving effectiveness and efficiency of the design process. The baseline characterization is achieved via compartmentalization and technical integration of subsystems, design functions, and discipline functions. First, a global design process overview is provided in order to show responsibility, interactions, and connectivity of overall aspects of the design process. Then design essentials are delineated in order to emphasize necessary features of the design process that are sometimes overlooked. Finally the design process characterization is presented. This is accomplished by considering project technical framework, technical integration, process description (technical integration model, subsystem tree, design/discipline planes, decision gates, and tasks), and the design sequence. Also included in the document are a snapshot relating to process improvements, illustrations of the process, a survey of recommendations from experienced practitioners in aerospace, lessons learned, references, and a bibliography.

  5. Estimation of environment-related properties of chemicals for design of sustainable processes: development of group-contribution+ (GC+) property models and uncertainty analysis.

    PubMed

    Hukkerikar, Amol Shivajirao; Kalakul, Sawitree; Sarup, Bent; Young, Douglas M; Sin, Gürkan; Gani, Rafiqul

    2012-11-26

    of the developed property models for the estimation of environment-related properties and uncertainties of the estimated property values is highlighted through an illustrative example. The developed property models provide reliable estimates of environment-related properties needed to perform process synthesis, design, and analysis of sustainable chemical processes and allow one to evaluate the effect of uncertainties of estimated property values on the calculated performance of processes giving useful insights into quality and reliability of the design of sustainable processes. PMID:23039255

  6. The Architectural and Interior Design Planning Process.

    ERIC Educational Resources Information Center

    Cohen, Elaine

    1994-01-01

    Explains the planning process in designing effective library facilities and discusses library building requirements that result from electronic information technologies. Highlights include historical structures; Americans with Disabilities Act; resource allocation; electrical power; interior spaces; lighting; design development; the roles of…

  7. A Model of Continuous Improvement in High Schools: A Process for Research, Innovation Design, Implementation, and Scale

    ERIC Educational Resources Information Center

    Cohen-Vogel, Lora; Cannata, Marisa; Rutledge, Stacey A.; Socol, Allison Rose

    2016-01-01

    This chapter describes a model for continuous improvement that guides the work of the National Center on Scaling Up Effective Schools, or NCSU. NCSU is a research and development center funded by the Institute for Education Sciences, the research arm of the United States Department of Education. At the core of the Center's work is an innovative…

  8. Forging process design for risk reduction

    NASA Astrophysics Data System (ADS)

    Mao, Yongning

    In this dissertation, forging process design has been investigated with a primary focus on risk reduction. Different forged components have been studied, especially those that could cause catastrophic loss on failure. As an effective modeling methodology, finite element analysis is applied extensively in this work. Three examples, a titanium compressor disk, a superalloy turbine disk, and a titanium hip prosthesis, are discussed to demonstrate this approach. Discrete defects such as hard-alpha anomalies are known to cause disastrous failure if they are present in stress-critical components. In this research, hard-alpha inclusion movement during forging of a titanium compressor disk is studied by finite element analysis. By combining results from the finite element method (FEM), regression modeling, and Monte Carlo simulation, it is shown that changing the forging path can mitigate the failure risk of the components in service. The second example concerns a turbine disk made of superalloy IN 718. The effect of forging on microstructure is the main consideration in this study, since microstructure defines the as-forged disk properties; for given forging conditions, the preform has its own effect on the microstructure. Through a sensitivity study it is found that forging temperature and speed have a significant influence on the microstructure. To choose processing parameters that optimize the microstructure, the dependence of microstructure on die speed and temperature is thoroughly studied using design of numerical experiments, and optimal solutions are determined for various desired goals. The narrow processing window of titanium alloys makes isothermal forging a preferred way to produce forged parts without forging defects. However, the cost of isothermal forging (dies at the same temperature as the workpiece) limits its wide application.
In this research, it has been demonstrated that with proper process design, the die
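    The FEM-regression-Monte Carlo chain described for the compressor disk can be sketched with a toy surrogate: a hypothetical regression formula stands in for the FEM peak-stress results, and Monte Carlo sampling over the uncertain inclusion location yields a failure probability for a given die speed. All coefficients and distributions below are illustrative, not fitted to real forging data.

```python
import math
import random

def surrogate_peak_stress(inclusion_depth_m, die_speed_mm_s):
    """Hypothetical regression surrogate standing in for FEM results:
    peak local stress (MPa) near a hard-alpha inclusion. Shallower
    inclusions and faster dies raise the stress (coefficients illustrative)."""
    return (500.0
            + 2000.0 * math.exp(-20.0 * inclusion_depth_m)
            + 30.0 * die_speed_mm_s)

def failure_probability(die_speed_mm_s, strength_mpa=900.0, n=20000, seed=0):
    """Monte Carlo over the uncertain inclusion location."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        depth = rng.uniform(0.0, 0.5)  # inclusion depth below surface, m
        if surrogate_peak_stress(depth, die_speed_mm_s) > strength_mpa:
            failures += 1
    return failures / n

p_fast = failure_probability(die_speed_mm_s=10.0)
p_slow = failure_probability(die_speed_mm_s=2.0)
```

    Comparing the two probabilities mimics how a change in forging parameters (here, die speed) is evaluated for its effect on failure risk.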

  9. XML-based product information processing method for product design

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen Yu

    2012-01-01

    The design knowledge of modern mechatronics products centers on information processing in knowledge-intensive engineering; product design innovation is therefore essentially innovation in knowledge and information processing. Based on an analysis of the role of mechatronics product design knowledge and of information management features, a unified XML-based product information processing model is proposed. The information processing model of product design includes functional knowledge, structural knowledge, and their relationships. XML-based representations are proposed for product function elements, product structure elements, and the mapping between function and structure. The information processing of a parallel friction roller is given as an example, demonstrating that the method is helpful for knowledge-based design systems and product innovation.
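    A minimal sketch of what such an XML model might look like, using hypothetical element names (the paper's actual schema is not given): function elements, structure elements, and mappings between them, resolved here with Python's standard xml.etree parser.

```python
import xml.etree.ElementTree as ET

# Hypothetical schema for a parallel friction roller: function elements,
# structure elements, and the function-to-structure mapping (names illustrative).
doc = """
<product name="parallel-friction-roller">
  <functions>
    <function id="F1">transmit torque</function>
    <function id="F2">reduce sliding friction</function>
  </functions>
  <structures>
    <structure id="S1">roller shaft</structure>
    <structure id="S2">friction ring</structure>
  </structures>
  <mappings>
    <map function="F1" structure="S1"/>
    <map function="F2" structure="S2"/>
  </mappings>
</product>
"""

root = ET.fromstring(doc)
# Resolve each mapping into (function text, structure text) pairs
funcs = {f.get("id"): f.text for f in root.iter("function")}
structs = {s.get("id"): s.text for s in root.iter("structure")}
pairs = [(funcs[m.get("function")], structs[m.get("structure")])
         for m in root.iter("map")]
```

    Keeping the mapping as explicit elements, rather than nesting structures under functions, lets many-to-many function/structure relationships be expressed without duplicating either side.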

  11. Model for Vaccine Design by Prediction of B-Epitopes of IEDB Given Perturbations in Peptide Sequence, In Vivo Process, Experimental Techniques, and Source or Host Organisms

    PubMed Central

    González-Díaz, Humberto; Pérez-Montoto, Lázaro G.; Ubeira, Florencio M.

    2014-01-01

    Perturbation methods add variation terms to a known experimental solution of one problem to approach a solution for a related problem without a known exact solution. One problem of this type in immunology is the prediction of the possible epitope activity of a peptide after a perturbation or variation in the structure of a known peptide and/or other boundary conditions (host organism, biological process, and experimental assay). However, to the best of our knowledge, there are no reports of general-purpose perturbation models to solve this problem. In a recent work, we introduced a new quantitative structure-property relationship theory for the study of perturbations in complex biomolecular systems. In this work, we developed the first model able to classify more than 200,000 cases of perturbations with accuracy, sensitivity, and specificity >90% both in training and validation series. The perturbations include structural changes in >50,000 peptides determined in experimental assays with boundary conditions involving >500 source organisms, >50 host organisms, >10 biological processes, and >30 experimental techniques. The model may be useful for the prediction of new epitopes or the optimization of known peptides towards computational vaccine design. PMID:24741624

  12. A design optimization process for Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Chamberlain, Robert G.; Fox, George; Duquette, William H.

    1990-01-01

    The Space Station Freedom Program is used to develop and implement a process for design optimization. Because the relative worth of arbitrary design concepts cannot be assessed directly, comparisons must be based on designs that provide the same performance from the point of view of station users; such designs can be compared in terms of life cycle cost. Since the technology required to produce a space station is widely dispersed, a decentralized optimization process is essential. A formulation of the optimization process is provided and the mathematical models designed to facilitate its implementation are described.
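    The idea of comparing designs that deliver the same user performance by life cycle cost can be sketched with a hypothetical cost model (all figures illustrative): up-front development and production costs plus discounted annual operations cost.

```python
def life_cycle_cost(development, production, operations_per_year, years,
                    discount_rate=0.0):
    """Life-cycle cost of a design alternative (illustrative model):
    up-front costs plus discounted annual operations cost."""
    ops = sum(operations_per_year / (1.0 + discount_rate) ** y
              for y in range(1, years + 1))
    return development + production + ops

# Two hypothetical station designs assumed to deliver equal user performance
design_a = life_cycle_cost(5.0e9, 9.0e9, 1.2e9, 15, 0.03)
design_b = life_cycle_cost(6.5e9, 8.0e9, 1.0e9, 15, 0.03)
preferred = "B" if design_b < design_a else "A"
```

    Holding performance fixed is what makes the single-number comparison meaningful; otherwise cheaper designs could simply be worse ones.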

  13. Ligand modeling and design

    SciTech Connect

    Hay, B.

    1996-10-01

    The purpose of this work is to develop and implement a molecular design basis for selecting organic ligands that would be used in applications for the cost-effective removal of specific radionuclides from nuclear waste streams.

  14. Process Design Manual for Nitrogen Control.

    ERIC Educational Resources Information Center

    Parker, Denny S.; And Others

    This manual presents theoretical and process design criteria for the implementation of nitrogen control technology in municipal wastewater treatment facilities. Design concepts are emphasized through examination of data from full-scale and pilot installations. Design data are included on biological nitrification and denitrification, breakpoint…

  15. Radiolysis Process Model

    SciTech Connect

    Buck, Edgar C.; Wittman, Richard S.; Skomurski, Frances N.; Cantrell, Kirk J.; McNamara, Bruce K.; Soderquist, Chuck Z.

    2012-07-17

    Assessing the performance of spent (used) nuclear fuel in a geological repository requires quantification of time-dependent phenomena that may influence its behavior on a time scale of up to millions of years. A high-level waste repository environment will be a dynamic redox system because of the time-dependent generation of radiolytic oxidants and reductants and the corrosion of Fe-bearing canister materials. One major difference between used fuel and natural analogues, including unirradiated UO2, is the intense radiolytic field. The radiation emitted by used fuel can produce radiolysis products in the presence of water vapor or a thin film of water (including OH• and H• radicals, O2-, eaq, H2O2, H2, and O2) that may increase the waste form degradation rate and change radionuclide behavior. Because H2O2 is the dominant oxidant for spent nuclear fuel in an O2-depleted water environment, the most sensitive parameters have been identified with respect to predictions of a radiolysis model under typical conditions. Compared with the full model of about 100 reactions, it was found that only 30-40 of the reactions are required to determine [H2O2] to one part in 10^5 and to preserve most of the predictions for major species. This allows a systematic approach for model simplification and offers guidance in designing experiments for validation.
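    The 100-to-30-40 reaction reduction suggests a simple greedy procedure: drop a reaction, re-evaluate the key concentration, and keep the drop only if the change stays within tolerance. The toy "model" below is a stand-in (a sum of per-reaction contributions), not the actual radiolysis kinetics; the split between dominant and negligible reactions is hypothetical.

```python
def h2o2_concentration(rates):
    """Toy stand-in for the full radiolysis model: steady-state [H2O2]
    as a sum of per-reaction contributions (purely illustrative)."""
    return sum(rates.values())

def reduce_model(rates, rel_tol=1e-5):
    """Greedily drop reactions whose removal perturbs [H2O2] by less
    than rel_tol (relative), mimicking the ~100 -> 30-40 reduction."""
    full = h2o2_concentration(rates)
    kept = dict(rates)
    for name in list(kept):
        trial = {k: v for k, v in kept.items() if k != name}
        if abs(h2o2_concentration(trial) - full) / full < rel_tol:
            kept = trial
    return kept

# 100 hypothetical reactions: 35 dominant, the rest negligible
rates = {f"r{i}": (1.0 if i < 35 else 1e-9) for i in range(100)}
reduced = reduce_model(rates)
```

    For a real reaction network the re-evaluation step would be an ODE solve rather than a sum, but the pruning logic is the same.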

  16. Hydrocarbon Processing's process design and optimization '96

    SciTech Connect

    1996-06-01

    This paper compiles information on hydrocarbon processes, describing the application, objective, economics, commercial installations, and licensor. Processes include: alkylation, ammonia, catalytic reformer, crude fractionator, crude unit, vacuum unit, dehydration, delayed coker, distillation, ethylene furnace, FCCU, polymerization, gas sweetening, hydrocracking, hydrogen, hydrotreating (naphtha, distillate, and resid desulfurization), natural gas processing, olefins, polyethylene terephthalate, refinery, styrene, sulfur recovery, and VCM furnace.

  17. Solid model design simplification

    SciTech Connect

    Ames, A.L.; Rivera, J.J.; Webb, A.J.; Hensinger, D.M.

    1997-12-01

    This paper documents an investigation of approaches to improving the quality of Pro/Engineer-created solid model data for use by downstream applications. The investigation identified a number of sources of problems caused by deficiencies in Pro/Engineer's geometric engine, and developed prototype software capable of detecting many of these problems and guiding users towards simplified, usable models. The prototype software was tested using Sandia production solid models, and provided significant leverage in attacking the simplification problem.

  18. New Vistas in Chemical Product and Process Design.

    PubMed

    Zhang, Lei; Babi, Deenesh K; Gani, Rafiqul

    2016-06-01

    Design of chemicals-based products is broadly classified into those that are process centered and those that are product centered. In this article, the designs of both classes of products are reviewed from a process systems point of view; developments related to the design of the chemical product, its corresponding process, and its integration are highlighted. Although significant advances have been made in the development of systematic model-based techniques for process design (also for optimization, operation, and control), much work is needed to reach the same level for product design. Timeline diagrams illustrating key contributions in product design, process design, and integrated product-process design are presented. The search for novel, innovative, and sustainable solutions must be matched by consideration of issues related to the multidisciplinary nature of problems, the lack of data needed for model development, solution strategies that incorporate multiscale options, and reliability versus predictive power. The need for an integrated model-experiment-based design approach is discussed together with benefits of employing a systematic computer-aided framework with built-in design templates. PMID:27088667

  19. WORKSHOP ON ENVIRONMENTALLY CONSCIOUS CHEMICAL PROCESS DESIGN

    EPA Science Inventory

    To encourage the consideration of environmental issues during chemical process design, the USEPA has developed techniques and software tools to evaluate the relative environmental impact of a chemical process. These techniques and tools aid in the risk management process by focus...

  20. Model-based software design

    NASA Technical Reports Server (NTRS)

    Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui; Yenne, Britt; Vansickle, Larry; Ballantyne, Michael

    1992-01-01

    Domain-specific knowledge is required to create specifications, generate code, and understand existing systems. Our approach to automating software design is based on instantiating an application domain model with industry-specific knowledge and then using that model to achieve the operational goals of specification elicitation and verification, reverse engineering, and code generation. Although many different specification models can be created from any particular domain model, each specification model is consistent and correct with respect to the domain model.

  1. Diagnostic study, design and implementation of an integrated model of care in France: a bottom-up process with continuous leadership

    PubMed Central

    de Stampa, Matthieu; Vedel, Isabelle; Mauriat, Claire; Bagaragaza, Emmanuel; Routelous, Christelle; Bergman, Howard; Lapointe, Liette; Cassou, Bernard; Ankri, Joel; Henrard, Jean-Claude

    2010-01-01

    Background Sustaining integrated care is difficult, in large part because of problems encountered securing the participation of health care and social service professionals and, in particular, general practitioners (GPs). Purpose To present an innovative bottom-up and pragmatic strategy used to implement a new integrated care model in France for community-dwelling elderly people with complex needs. Results In the first step, a diagnostic study was conducted with face-to-face interviews to gather data on current practices from a sample of health and social stakeholders working with elderly people. In the second step, an integrated care model called Coordination Personnes Agées (COPA) was designed by the same major stakeholders in order to define its detailed characteristics based on the local context. In the third step, the model was implemented in two phases: adoption and maintenance. This strategy was sustained by continuous and flexible leadership throughout the process: initially a mixed leadership (clinician and researcher), followed in the implementation phase by a dual leadership (clinician and service managers). Conclusion The implementation of this bottom-up and pragmatic strategy relied on establishing a collaborative dynamic among health and social stakeholders. This enhanced their involvement throughout the implementation phase, particularly among the GPs, and allowed them to support the changes in practices and service arrangements. PMID:20216954

  2. Launch Vehicle Design Process Description and Training Formulation

    NASA Technical Reports Server (NTRS)

    Atherton, James; Morris, Charles; Settle, Gray; Teal, Marion; Schuerer, Paul; Blair, James; Ryan, Robert; Schutzenhofer, Luke

    1999-01-01

    A primary NASA priority is to reduce the cost and improve the effectiveness of launching payloads into space. As a consequence, significant improvements are being sought in the effectiveness, cost, and schedule of the launch vehicle design process. In order to provide a basis for understanding and improving the current design process, a model has been developed for this complex, interactive process, as reported in the references. This model requires further expansion in some specific design functions. Also, a training course for less-experienced engineers is needed to provide understanding of the process, to provide guidance for its effective implementation, and to provide a basis for major improvements in launch vehicle design process technology. The objective of this activity is to expand the description of the design process to include all pertinent design functions, and to develop a detailed outline of a training course on the design process for launch vehicles for use in educating engineers whose experience with the process has been minimal. Building on a previously-developed partial design process description, parallel sections have been written for the Avionics Design Function, the Materials Design Function, and the Manufacturing Design Function. Upon inclusion of these results, the total process description will be released as a NASA TP. The design function sections herein include descriptions of the design function responsibilities, interfaces, interactive processes, decisions (gates), and tasks. Associated figures include design function planes, gates, and tasks, along with other pertinent graphics. Also included is an expanded discussion of how the design process is divided, or compartmentalized, into manageable parts to achieve efficient and effective design. A detailed outline for an intensive two-day course on the launch vehicle design process has been developed herein, and is available for further expansion. The course is in an interactive lecture

  3. Modeling pellet impact drilling process

    NASA Astrophysics Data System (ADS)

    Kovalyov, A. V.; Ryabchikov, S. Ya; Isaev, Ye D.; Ulyanova, O. S.

    2016-03-01

    The paper describes pellet impact drilling, which could be used to increase the drilling speed and the rate of penetration when drilling hard rocks. Pellet impact drilling implies rock destruction by metal pellets with high kinetic energy in the immediate vicinity of the earth formation encountered. The pellets are circulated in the bottom hole by a high-velocity fluid jet, which is the principal component of the ejector pellet impact drill bit. The experiments conducted have allowed the pellet impact drilling process to be modeled, creating the scientific and methodological basis for the engineering design of drilling operations under different geotechnical conditions.
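    The energy scale of the process can be illustrated with back-of-the-envelope numbers (pellet size, speed, and impact rate below are hypothetical, not taken from the experiments): each pellet delivers E = ½mv², and the impact rate sets the average power delivered to the rock face.

```python
import math

def pellet_kinetic_energy(diameter_m, velocity_m_s, density_kg_m3=7800.0):
    """Kinetic energy (J) of a single spherical pellet: E = 0.5 * m * v^2.
    Density defaults to steel; all inputs here are illustrative."""
    volume = (math.pi / 6.0) * diameter_m ** 3
    mass = density_kg_m3 * volume
    return 0.5 * mass * velocity_m_s ** 2

def impact_power(energy_per_pellet_j, pellets_per_second):
    """Average power delivered to the rock face (W)."""
    return energy_per_pellet_j * pellets_per_second

e = pellet_kinetic_energy(0.003, 40.0)  # 3 mm pellet at 40 m/s
p = impact_power(e, 500.0)              # 500 impacts per second
```

    Because E scales with v², the jet velocity that accelerates the pellets dominates the delivered energy, which is why the ejector design of the drill bit is the critical element.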

  4. Chemical Process Design: An Integrated Teaching Approach.

    ERIC Educational Resources Information Center

    Debelak, Kenneth A.; Roth, John A.

    1982-01-01

    Reviews a one-semester senior plant design/laboratory course, focusing on course structure, student projects, laboratory assignments, and course evaluation. Includes discussion of laboratory exercises related to process waste water and sludge. (SK)

  5. Molecular Thermodynamics for Chemical Process Design

    ERIC Educational Resources Information Center

    Prausnitz, J. M.

    1976-01-01

    Discusses that aspect of thermodynamics which is particularly important in chemical process design: the calculation of the equilibrium properties of fluid mixtures, especially as required in phase-separation operations. (MLH)

  6. MIDAS: a framework for integrated design and manufacturing process

    NASA Astrophysics Data System (ADS)

    Chung, Moon Jung; Kwon, Patrick; Pentland, Brian

    2000-10-01

    In this paper, we present the development of a framework for managing design and manufacturing processes in a distributed environment. The framework offers the following facilities: (1) representing complicated engineering design processes, (2) coordinating design activities and executing the process in a distributed environment, and (3) supporting collaborative design by sharing data and processes. Process flow graphs, which consist of tasks and the corresponding input and output data, are used to depict the engineering design process on a process modeling browser. The engineering activities in the represented processes can be executed in a distributed environment through the cockpit of the framework. Communication among the engineers involved, in support of collaborative design, takes place on the collaborative design browser, with SML underlying data structures representing process information to make the framework extensible and platform-independent. The formal and flexible approach of the proposed framework to integrating engineering design processes can also be effectively applied to coordinate concurrent engineering activities in a distributed environment.

  7. Optimal designs for copula models

    PubMed Central

    Perrone, E.; Müller, W.G.

    2016-01-01

    Copula modelling has in the past decade become a standard tool in many areas of applied statistics. However, a largely neglected aspect concerns the design of related experiments. Particular issues are whether the estimation of copula parameters can be enhanced by optimizing experimental conditions, and how robust the parameter estimates for the model are with respect to the type of copula employed. In this paper an equivalence theorem for (bivariate) copula models is provided that allows the formulation of efficient design algorithms and quick checks of whether designs are optimal or at least efficient. Some examples illustrate that considerable gains in design efficiency can be achieved in practical situations. A natural comparison between different copula models with respect to design efficiency is provided as well. PMID:27453616
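
    The "quick checks of whether designs are optimal or at least efficient" that the abstract mentions can be illustrated in a generic (non-copula) setting via relative D-efficiency of normalized information matrices. The quadratic-regression example below is an illustrative assumption, not taken from the paper:

    ```python
    import numpy as np

    def d_efficiency(X_design, X_ref):
        """Relative D-efficiency of one exact design versus a reference design,
        for a linear model whose model matrix rows are f(x)."""
        p = X_design.shape[1]
        M1 = X_design.T @ X_design / len(X_design)  # normalized information matrix
        M2 = X_ref.T @ X_ref / len(X_ref)
        return (np.linalg.det(M1) / np.linalg.det(M2)) ** (1.0 / p)

    def quad(xs):
        """Model matrix for quadratic regression 1 + x + x^2 on the given points."""
        xs = np.asarray(xs, dtype=float)
        return np.column_stack([np.ones_like(xs), xs, xs ** 2])

    # The D-optimal design for the quadratic on [-1, 1] puts equal weight on
    # {-1, 0, 1}; an equispaced 5-point design is noticeably less efficient.
    optimal = quad([-1.0, 0.0, 1.0])
    equispaced = quad([-1.0, -0.5, 0.0, 0.5, 1.0])
    eff = d_efficiency(equispaced, optimal)  # roughly 0.84
    ```

    An efficiency below 1 quantifies how much information is sacrificed relative to the reference design, which is the kind of comparison the paper's equivalence theorem makes cheap to verify.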

  8. The Engineering Process in Construction & Design

    ERIC Educational Resources Information Center

    Stoner, Melissa A.; Stuby, Kristin T.; Szczepanski, Susan

    2013-01-01

    Recent research suggests that high-impact activities in science and math classes promote positive attitudinal shifts in students. By implementing high-impact activities, such as designing a school and a skate park, mathematical thinking can be linked to the engineering design process. This hands-on approach, when possible, to demonstrate or…

  9. Distributed Group Design Process: Lessons Learned.

    ERIC Educational Resources Information Center

    Eseryel, Deniz; Ganesan, Radha

    A typical Web-based training development team consists of a project manager, an instructional designer, a subject-matter expert, a graphic artist, and a Web programmer. The typical scenario involves team members working together in the same setting during the entire design and development process. What happens when the team is distributed, that is…

  10. Design of Nanomaterial Synthesis by Aerosol Processes

    PubMed Central

    Buesser, Beat; Pratsinis, Sotiris E.

    2013-01-01

    Aerosol synthesis of materials is a vibrant field of particle technology and chemical reaction engineering. Examples include the manufacture of carbon blacks, fumed SiO2, pigmentary TiO2, ZnO vulcanizing catalysts, filamentary Ni, and optical fibers, materials that impact transportation, construction, pharmaceuticals, energy, and communications. Parallel to this, development of novel, scalable aerosol processes has enabled synthesis of new functional nanomaterials (e.g., catalysts, biomaterials, electroceramics) and devices (e.g., gas sensors). This review provides an access point for engineers to the multiscale design of aerosol reactors for the synthesis of nanomaterials using continuum, mesoscale, molecular dynamics, and quantum mechanics models spanning 10 and 15 orders of magnitude in length and time, respectively. Key design features are the rapid chemistry; the high particle concentrations but low volume fractions; the attainment of a self-preserving particle size distribution by coagulation; the ratio of the characteristic times of coagulation and sintering, which controls the extent of particle aggregation; and the narrowing of the aggregate primary particle size distribution by sintering. PMID:22468598

  11. Information Network Model Query Processing

    NASA Astrophysics Data System (ADS)

    Song, Xiaopu

    Information Networking Model (INM) [31] is a novel database model for managing real-world objects and relationships. It naturally and directly supports various kinds of static and dynamic relationships between objects. In INM, objects are networked through various natural and complex relationships. The INM Query Language (INM-QL) [30] is designed to explore such an information network; retrieve information about schema, instances, their attributes, relationships, and context-dependent information; and process query results in a user-specified form. An INM database management system has been implemented using Berkeley DB, and it supports INM-QL. This thesis focuses on the implementation of the subsystem that effectively and efficiently processes INM-QL. The subsystem provides a lexical and syntactic analyzer of INM-QL, and it is able to choose appropriate evaluation strategies and index mechanisms to process INM-QL queries without user intervention. It also uses intermediate result structures to hold intermediate query results, and other auxiliary structures to reduce the complexity of query processing.

  12. INTEGRATED FISCHER TROPSCH MODULAR PROCESS MODEL

    SciTech Connect

    Donna Post Guillen; Richard Boardman; Anastasia M. Gribik; Rick A. Wood; Robert A. Carrington

    2007-12-01

    With declining petroleum reserves, increased world demand, and unstable politics in some of the world’s richest oil producing regions, the capability for the U.S. to produce synthetic liquid fuels from domestic resources is critical to national security and economic stability. Coal, biomass and other carbonaceous materials can be converted to liquid fuels using several conversion processes. The leading candidate for large-scale conversion of coal to liquid fuels is the Fischer-Tropsch (FT) process. Process configuration, component selection, and performance are interrelated and dependent on feed characteristics. This paper outlines a flexible modular approach to model an integrated FT process that utilizes a library of key component models, supporting kinetic data, and material and transport properties, allowing rapid development of custom integrated plant models. The modular construction will permit rapid assessment of alternative designs and feedstocks. The modeling approach consists of three thrust areas, or “strands”: model/module development, integration of the model elements into an end-to-end integrated system model, and utilization of the model for plant design. Strand 1, model/module development, entails identifying, developing, and assembling a library of codes, user blocks, and data for FT process unit operations for a custom feedstock and plant description. Strand 2, integration development, provides the framework for linking these component and subsystem models to form an integrated FT plant simulation. Strand 3, plant design, includes testing and validation of the comprehensive model and performing design evaluation analyses.

  13. Designing Competitive Service Models

    NASA Astrophysics Data System (ADS)

    Martinez, Veronica; Turner, Trevor

    The explosives developed in Europe in the late nineteenth and early twentieth century by the famous Swede and founder of the Nobel Peace Prize, Alfred Nobel, were extremely durable and, apart from the introduction of the electric detonator, remained in use with minor modifications for almost a century (Fig. 5.1a). In the 1970s a new invention started a process of change that has transformed the explosives business from a supplier of products into a provider of a service. Survival very much depended on the agility of ICI Explosives UK, hereinafter referred to as "ICI Explosives," in adapting to the new competitive environment. Manufacturing excellence was not a solution. Innovative thinking was required to sustain the business as changes in technology reduced the complexity that had protected the business from serious competition for over a century.

  14. Knowledge modeling for software design

    NASA Technical Reports Server (NTRS)

    Shaw, Mildred L. G.; Gaines, Brian R.

    1992-01-01

    This paper develops a modeling framework for systems engineering that encompasses systems modeling, task modeling, and knowledge modeling, and allows knowledge engineering and software engineering to be seen as part of a unified developmental process. This framework is used to evaluate what novel contributions the 'knowledge engineering' paradigm has made and how these impact software engineering.

  15. HYNOL PROCESS ENGINEERING: PROCESS CONFIGURATION, SITE PLAN, AND EQUIPMENT DESIGN

    EPA Science Inventory

    The report describes the design of the hydropyrolysis reactor system of the Hynol process. (NOTE: A bench scale methanol production facility is being constructed to demonstrate the technical feasibility of producing methanol from biomass using the Hynol process. The plant is bein...

  16. Computer Applications in the Design Process.

    ERIC Educational Resources Information Center

    Winchip, Susan

    Computer Assisted Design (CAD) and Computer Assisted Manufacturing (CAM) are emerging technologies now being used in home economics and interior design applications. A microcomputer in a computer network system is capable of executing computer graphic functions such as three-dimensional modeling, as well as utilizing office automation packages to…

  17. Chemical kinetics and oil shale process design

    SciTech Connect

    Burnham, A.K.

    1993-07-01

    Oil shale processes are reviewed with the goal of showing how chemical kinetics influences the design and operation of different processes for different types of oil shale. Reaction kinetics are presented for organic pyrolysis, carbon combustion, carbonate decomposition, and sulfur and nitrogen reactions.
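
    As a minimal sketch of how pyrolysis kinetics feed into process design, the first-order Arrhenius conversion below shows the strong temperature sensitivity that makes such kinetics design-critical. The pre-exponential factor and activation energy here are illustrative placeholders, not fitted oil-shale values:

    ```python
    import numpy as np

    R = 8.314  # gas constant, J/(mol*K)

    def first_order_conversion(T, t, A=1.0e13, Ea=2.2e5):
        """Fraction converted after time t (s) at temperature T (K) for a
        first-order reaction with Arrhenius rate k = A * exp(-Ea / (R*T)).
        A (1/s) and Ea (J/mol) are illustrative, not measured values."""
        k = A * np.exp(-Ea / (R * T))
        return 1.0 - np.exp(-k * t)
    ```

    With these placeholder parameters, a 50 K increase in retort temperature changes a 10-minute conversion from partial to nearly complete, which is why retort design hinges on accurate kinetics.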

  18. 32nm design rule and process exploration flow

    NASA Astrophysics Data System (ADS)

    Zhang, Yunqiang; Cobb, Jonathan; Yang, Amy; Li, Ji; Lucas, Kevin; Sethi, Satyendra

    2008-10-01

    Semiconductor manufacturers spend hundreds of millions of dollars and years of development time to create a new manufacturing process and to design frontrunner products to work on the new process. A considerable percentage of this large investment is aimed at producing the process design rules and related lithography technology to pattern the new products successfully. Significant additional cost and time are needed in both process and design development if the design rules or lithography strategy must be modified. Therefore, early and accurate prediction of both process design rules and lithography options is necessary to minimize cost and schedule in semiconductor development. This paper describes a methodology to determine the optimum design rules and lithography conditions with high accuracy early in the development lifecycle. We present results from the 32nm logic node, but the methodology can be extended to the 22nm node or any other node. This work involves: automated generation of extended realistic logic test layouts utilizing programmed test structures for a variety of design rules; determination of a range of optical illumination and process conditions to test for each critical design layer; use of these illumination conditions to create an extrapolatable process-window OPC model matched to rigorous TCAD lithography focus-exposure full chemically amplified resist models; creation of reticle enhancement technique (RET) and OPC recipes flexible enough to be used over a variety of design rule and illumination conditions; and OPC verification to find, categorize, and report all patterning issues found in the different design and illumination variations. We describe the individual steps in the methodology in detail and provide results of its use for 32nm node design rule and process optimization.

  19. Instructional Design and Directed Cognitive Processing.

    ERIC Educational Resources Information Center

    Bovy, Ruth Colvin

    This paper argues that the information processing model provides a promising basis on which to build a comprehensive theory of instruction. Characteristics of the major information processing constructs are outlined including attention, encoding and rehearsal, working memory, long term memory, retrieval, and metacognitive processes, and a unifying…

  20. Electromagnetic modeling in accelerator designs

    SciTech Connect

    Cooper, R.K.; Chan, K.C.D.

    1990-01-01

    Through the years, electromagnetic modeling using computers has proved to be a cost-effective tool for accelerator design. Traditionally, electromagnetic modeling of accelerators has been limited to resonator and magnet designs in two dimensions. In recent years, with the availability of powerful computers, electromagnetic modeling of accelerators has advanced significantly. It is apparent that breakthroughs have been made during the last decade in two important areas: three-dimensional modeling and time-domain simulation. Success in both these areas has been made possible by the increasing size and speed of computers. In this paper, the advances in these two areas are described.

  1. Visual Modelling of Learning Processes

    ERIC Educational Resources Information Center

    Copperman, Elana; Beeri, Catriel; Ben-Zvi, Nava

    2007-01-01

    This paper introduces various visual models for the analysis and description of learning processes. The models analyse learning on two levels: the dynamic level (as a process over time) and the functional level. Two types of model for dynamic modelling are proposed: the session trace, which documents a specific learner in a particular learning…

  2. Corn stover semi-mechanistic enzymatic hydrolysis model with tight parameter confidence intervals for model-based process design and optimization.

    PubMed

    Scott, Felipe; Li, Muyang; Williams, Daniel L; Conejeros, Raúl; Hodge, David B; Aroca, Germán

    2015-02-01

    Uncertainty associated with the estimated values of the parameters in a model is a key piece of information for decision makers and model users. However, this information is typically not reported, or the confidence intervals are too large to be useful. A semi-mechanistic model for the enzymatic saccharification of dilute-acid-pretreated corn stover is proposed in this work. The model is a modification of an existing one, providing a statistically significant improved fit to a set of experimental data that includes varying initial solid loadings (10-25% w/w) and the use of the pretreatment liquor and washed solids with or without supplementation of key inhibitors. A subset of 8 out of 17 parameters was identified, showing sufficiently tight confidence intervals to be used in uncertainty propagation and model analysis without requiring interval truncation via expert judgment. PMID:25496946
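
    The workflow of reporting parameter confidence intervals alongside fitted values can be sketched generically: fit a kinetic curve to data and derive t-based intervals from the optimizer's covariance matrix. The saturation-kinetics model and synthetic data below are illustrative stand-ins, not the paper's corn stover model:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy import stats

    def mm_rate(S, vmax, km):
        """Michaelis-Menten-type saturation kinetics (illustrative stand-in)."""
        return vmax * S / (km + S)

    # Synthetic noisy data generated from known "true" parameters (2.0, 8.0).
    rng = np.random.default_rng(0)
    S = np.linspace(1.0, 50.0, 12)
    y = mm_rate(S, 2.0, 8.0) + rng.normal(0.0, 0.02, S.size)

    popt, pcov = curve_fit(mm_rate, S, y, p0=[1.0, 5.0])
    se = np.sqrt(np.diag(pcov))                       # standard errors
    tval = stats.t.ppf(0.975, df=S.size - len(popt))  # 95% two-sided t quantile
    ci = [(p - tval * s, p + tval * s) for p, s in zip(popt, se)]
    ```

    Intervals that are narrow relative to the estimates (as the abstract reports for 8 of 17 parameters) are what make downstream uncertainty propagation meaningful.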

  3. The concepts of energy, environment, and cost for process design

    SciTech Connect

    Abu-Khader, M.M.; Speight, J.G.

    2004-05-01

    The process industries (specifically, energy and chemicals) are characterized by a variety of reactors and reactions to bring about successful process operations. The design of energy-related and chemical processes and their evolution is a complex process that determines the competitiveness of these industries, as well as their environmental impact. Thus, we have developed an Enviro-Energy Concept designed to facilitate sustainable industrial development. The Complete Onion Model represents a complete methodology for chemical process design and illustrates all of the requirements to achieve the best possible design within the accepted environmental standards. Currently, NOx emissions from industrial processes continue to receive maximum attention; therefore, the problem of NOx emissions from industrial sources such as power stations and nitric acid plants is considered. Selective Catalytic Reduction (SCR) is one of the most promising and effective commercial technologies and is considered the Best Available Control Technology (BACT) for NOx reduction. The solution to the NOx emissions problem lies in modifying the chemical process design and/or installing an end-of-pipe technology. The degree of integration between the process design and the installed technology plays a critical role in the capital cost evaluation. Therefore, integrating process units and then optimizing the design has a vital effect on the total cost. Both the environmental regulations and the cost evaluation are the boundary constraints of the optimum solution.

  4. Modeling Languages Refine Vehicle Design

    NASA Technical Reports Server (NTRS)

    2009-01-01

    Cincinnati, Ohio's TechnoSoft Inc. is a leading provider of object-oriented modeling and simulation technology used for commercial and defense applications. With funding from Small Business Innovation Research (SBIR) contracts issued by Langley Research Center, the company continued development on its adaptive modeling language, or AML, originally created for the U.S. Air Force. TechnoSoft then created what is now known as its Integrated Design and Engineering Analysis Environment, or IDEA, which can be used to design a variety of vehicles and machinery. IDEA's customers include clients in green industries, such as designers for power plant exhaust filtration systems and wind turbines.

  5. An investigation of radiometer design using digital processing techniques

    NASA Technical Reports Server (NTRS)

    Lawrence, R. W.

    1981-01-01

    The use of digital signal processing techniques in Dicke switching radiometer design was investigated. The general approach was to develop an analytical model of the existing analog radiometer and identify factors which adversely affect its performance. A digital processor was then proposed to verify the feasibility of using digital techniques to minimize these adverse effects and improve radiometer performance. Analysis and preliminary test results comparing the digital and analog processing approaches in radiometer design are presented.

  6. Macrocell design for concurrent signal processing

    SciTech Connect

    Pope, S.P.; Brodersen, R.W.

    1983-01-01

    Macrocells serve as subsystems at the top level of the hardware design hierarchy. The authors present the macrocell design technique as applied to the implementation of real-time, sampled-data signal processing functions. The design of such circuits is particularly challenging due to the computationally intensive nature of signal-processing algorithms and the constraints of real-time operation. The most efficient designs make use of a high degree of concurrency, a property facilitated by the macrocell approach. Two circuit projects whose development resulted largely from the macrocell methodology described are used as examples throughout the report: a linear-predictive vocoder circuit, and a front-end filter-bank chip for a speech recognition system. Both are monolithic multiprocessor implementations: the LPC vocoder circuit contains three processors, the filter-bank chip two. 10 references.

  7. Defining process design space for monoclonal antibody cell culture.

    PubMed

    Abu-Absi, Susan Fugett; Yang, LiYing; Thompson, Patrick; Jiang, Canping; Kandula, Sunitha; Schilling, Bernhard; Shukla, Abhinav A

    2010-08-15

    The concept of design space has been taking root as a foundation of in-process control strategies for biopharmaceutical manufacturing processes. During mapping of the process design space, the multidimensional combination of operational variables is studied to quantify the impact on process performance in terms of productivity and product quality. An efficient methodology to map the design space for a monoclonal antibody cell culture process is described. A failure modes and effects analysis (FMEA) was used as the basis for the process characterization exercise. This was followed by an integrated study of the inoculum stage of the process, which includes progressive shake flask and seed bioreactor steps. The operating conditions for the seed bioreactor were studied in an integrated fashion with the production bioreactor, using a two-stage design of experiments (DOE) methodology to enable optimization of operating conditions. A two-level Resolution IV design was followed by a central composite design (CCD). These experiments enabled identification of the edge of failure and classification of the operational parameters as non-key, key, or critical. In addition, the models generated from the data provide further insight into balancing productivity of the cell culture process with product quality considerations. Finally, process- and product-related impurity clearance was evaluated by studies linking the upstream process with downstream purification. Production bioreactor parameters that directly influence antibody charge variants and glycosylation in CHO systems were identified. PMID:20589669
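
    The central composite design (CCD) mentioned in the abstract has a standard run layout: factorial corners, axial (star) points, and replicated center points. A minimal generator of coded CCD runs, as a generic sketch rather than the authors' actual experimental plan, is:

    ```python
    from itertools import product

    def central_composite(k, alpha=1.414, n_center=3):
        """Coded runs of a central composite design for k factors:
        2^k factorial corners, 2k axial (star) points at +/-alpha,
        and n_center replicated center points."""
        corners = [list(p) for p in product([-1.0, 1.0], repeat=k)]
        axial = []
        for i in range(k):
            for a in (-alpha, alpha):
                pt = [0.0] * k
                pt[i] = a
                axial.append(pt)
        centers = [[0.0] * k for _ in range(n_center)]
        return corners + axial + centers

    runs = central_composite(2)  # 4 corners + 4 axial + 3 centers = 11 runs
    ```

    The axial points are what allow a CCD to estimate the quadratic curvature needed to locate an edge of failure inside the studied region.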

  8. Design of penicillin fermentation process simulation system

    NASA Astrophysics Data System (ADS)

    Qi, Xiaoyu; Yuan, Zhonghu; Qi, Xiaoxuan; Zhang, Wenqi

    2011-10-01

    Real-time monitoring of batch processes attracts increasing attention: it can ensure safety and provide products of consistent quality. The design of a simulation system for batch process fault diagnosis is therefore of great significance. In this paper, penicillin fermentation, a typical non-linear, dynamic, multi-stage batch production process, is taken as the research object. A visual human-machine interactive simulation software system based on the Windows operating system is developed. The simulation system provides an effective platform for research on batch process fault diagnosis.

  9. COMPUTER ASSISTED PRELIMINARY DESIGN FOR DRINKING WATER TREATMENT PROCESS SYSTEMS

    EPA Science Inventory

    The purpose of the study was to develop an interactive computer program to aid the design engineer in evaluating the performance and cost for any proposed drinking water treatment system consisting of individual unit processes. The 25 unit process models currently in the program ...

  10. Design, Control and in Situ Visualization of Gas Nitriding Processes

    PubMed Central

    Ratajski, Jerzy; Olik, Roman; Suszko, Tomasz; Dobrodziej, Jerzy; Michalski, Jerzy

    2010-01-01

    The article presents a complex system for the design, in situ visualization, and control of a commonly used surface treatment process: gas nitriding. The computer-aided design concept uses analytical mathematical models and artificial intelligence methods. As a result, poly-optimization and poly-parametric simulation of the course of the process become possible, combined with visualization of the changes in process parameter values as a function of time, as well as prediction of the properties of nitrided layers. For in situ visualization of the growth of the nitrided layer, computer procedures were developed which correlate the direct and differential voltage-time traces of the process result sensor (a magnetic sensor) with the corresponding layer growth stage. These procedures make it possible to combine, during the process, the registered voltage-time traces with the models of the process. PMID:22315536

  11. Design, control and in situ visualization of gas nitriding processes.

    PubMed

    Ratajski, Jerzy; Olik, Roman; Suszko, Tomasz; Dobrodziej, Jerzy; Michalski, Jerzy

    2010-01-01

    The article presents a complex system for the design, in situ visualization, and control of a commonly used surface treatment process: gas nitriding. The computer-aided design concept uses analytical mathematical models and artificial intelligence methods. As a result, poly-optimization and poly-parametric simulation of the course of the process become possible, combined with visualization of the changes in process parameter values as a function of time, as well as prediction of the properties of nitrided layers. For in situ visualization of the growth of the nitrided layer, computer procedures were developed which correlate the direct and differential voltage-time traces of the process result sensor (a magnetic sensor) with the corresponding layer growth stage. These procedures make it possible to combine, during the process, the registered voltage-time traces with the models of the process. PMID:22315536

  12. Object Oriented Modeling and Design

    NASA Technical Reports Server (NTRS)

    Shaykhian, Gholam Ali

    2007-01-01

    The Object Oriented Modeling and Design seminar is intended for software professionals and students; it covers the concepts and a language-independent graphical notation that can be used to analyze problem requirements and design a solution to the problem. The seminar discusses the three kinds of object-oriented models: class, state, and interaction. The class model represents the static structure of a system; the state model describes the aspects of a system that change over time, as well as control behavior; and the interaction model describes how objects collaborate to achieve overall results. Existing knowledge of object-oriented programming may benefit the learning of modeling and good design. Specific expectations are: create a class model; read, recognize, and describe a class model; describe associations and links; show abstract classes used with multiple inheritance; explain metadata, reification, and constraints; group classes into a package; read, recognize, and describe a state model; explain states and transitions; read, recognize, and describe an interaction model; explain use cases and use case relationships; show concurrency in an activity diagram; and show object interactions in a sequence diagram.

  13. Teaching sustainable design: A collaborative process

    SciTech Connect

    Theis, C.C.

    1997-12-31

    This paper describes a collaborative educational experience in the Schools of Architecture and Landscape Architecture at Louisiana State University. During the Fall Semester of 1996 an upper-level architectural design studio worked with a peer group of landscape architecture students on the design of a master plan for an environmentally sensitive residential development on Cat Island, a barrier island located approximately eight miles south of Gulfport, Mississippi. This paper presents the methodology and results of the project, describes the collaborative process, and assesses both the viability of the design solutions and the value of the educational experience.

  14. Protein designs in HP models

    NASA Astrophysics Data System (ADS)

    Gupta, Arvind; Khodabakhshi, Alireza Hadj; Maňuch, Ján; Rafiey, Arash; Stacho, Ladislav

    2009-07-01

    The inverse protein folding problem is that of designing an amino acid sequence which folds into a prescribed shape. This problem arises in drug design, where a particular structure is necessary to ensure proper protein-protein interactions, and could have applications in nanotechnology. A major challenge in designing proteins with native folds that attain a specific shape is to avoid proteins that have multiple native folds (unstable proteins). In this technical note we present our results on protein designs in the variant of the Hydrophobic-Polar (HP) model introduced by Dill [6] on the 2D square lattice. The HP model distinguishes only polar and hydrophobic monomers and counts only the number of hydrophobic contacts in the energy function. To achieve better stability of our designs we use the Hydrophobic-Polar-Cysteine (HPC) model, which distinguishes a third type of monomer called "cysteine" and also incorporates disulfide bridges (SS-bridges) into the energy function. We present stable designs on the 2D square lattice and the 3D hexagonal prism lattice in the HPC model.

  15. Process design for Al backside contacts

    SciTech Connect

    Chalfoun, L.L.; Kimerling, L.C.

    1995-08-01

    It is known that properly alloyed aluminum backside contacts can improve silicon solar cell efficiency. To use this knowledge to fullest advantage, we have studied the gettering process that occurs during contact formation and the microstructure of the contact and backside junction region. With an understanding of the alloying step, optimized fabrication processes can be designed. To study gettering, single crystal silicon wafers were coated with aluminum on both sides and subjected to heat treatments. Results are described.

  16. Global optimization of bilinear engineering design models

    SciTech Connect

    Grossmann, I.; Quesada, I.

    1994-12-31

    Recently, Quesada and Grossmann proposed a global optimization algorithm for solving NLP problems involving linear fractional and bilinear terms, motivated by a number of applications in process design. The proposed method relies on the derivation of a convex NLP underestimator problem that is used within a spatial branch and bound search. This paper explores the use of alternative bounding approximations for constructing the underestimator problem. These are applied in the global optimization of problems arising in different engineering areas, for which different relaxations are proposed depending on the mathematical structure of the models. These relaxations include linear and nonlinear underestimator problems. Reformulations that generate additional estimator functions are also employed. Examples from process design, structural design, portfolio investment, and layout design are presented.
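
    Convex underestimators for bilinear terms of the kind this line of work builds on are classically the McCormick envelopes of w = x·y over a bounding box. A minimal sketch of the envelope (generic textbook form, not the authors' specific relaxation) is:

    ```python
    def mccormick(x, y, xL, xU, yL, yU):
        """McCormick envelope of the bilinear term w = x*y on [xL,xU] x [yL,yU]:
        returns (convex underestimate, concave overestimate) at the point (x, y)."""
        under = max(xL * y + x * yL - xL * yL,
                    xU * y + x * yU - xU * yU)
        over = min(xU * y + x * yL - xU * yL,
                   xL * y + x * yU - xL * yU)
        return under, over

    # Sample point in the unit box: the true product lies inside the envelope.
    under, over = mccormick(0.3, 0.7, 0.0, 1.0, 0.0, 1.0)
    ```

    The envelope is exact at the corners of the box and loosest in the interior, which is why branching (subdividing the box) tightens the relaxation in a spatial branch and bound search.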

  17. Flexible Processing and the Design of Grammar

    ERIC Educational Resources Information Center

    Sag, Ivan A.; Wasow, Thomas

    2015-01-01

    We explore the consequences of letting the incremental and integrative nature of language processing inform the design of competence grammar. What emerges is a view of grammar as a system of local monotonic constraints that provide a direct characterization of the signs (the form-meaning correspondences) of a given language. This…

  18. Dynamic Process Simulation for Analysis and Design.

    ERIC Educational Resources Information Center

    Nuttall, Herbert E., Jr.; Himmelblau, David M.

    A computer program for the simulation of complex continuous processes in real time in an interactive mode is described. The program is user oriented, flexible, and provides both numerical and graphic output. The program has been used in classroom teaching and computer-aided design. Typical input and output are illustrated for a sample problem to…

  19. Modeling of fluidized bed silicon deposition process

    NASA Technical Reports Server (NTRS)

    Kim, K.; Hsu, G.; Lutwack, R.; Praturi, A. K.

    1977-01-01

    The model is intended for use as a means of improving fluidized bed reactor design and for formulating the research program in support of the Silicon Material Task contracts for the development of the fluidized bed silicon deposition process. A computer program derived from the simple model is also described. Results of some sample calculations using the computer program are shown.

  20. Generic Model Host System Design

    SciTech Connect

    Chu, Chungming; Wu, Juhao; Qiang, Ji; Shen, Guobao (Brookhaven)

    2012-06-22

    There are many simulation codes for accelerator modelling; each has particular strengths, but none covers every need. A platform that can host multiple modelling tools would be ideal for various purposes. The model platform, along with infrastructure support, can be used not only for online applications but also for offline purposes. A collaboration has been formed to provide such a platform. In order to achieve it, a common set of physics data structures has to be defined. An Application Programming Interface (API) for physics applications should also be defined within a model data provider. A preliminary platform design and prototype are discussed.

  1. Optimal Experimental Design for Model Discrimination

    PubMed Central

    Myung, Jay I.; Pitt, Mark A.

    2009-01-01

    Models of a psychological process can be difficult to discriminate experimentally because it is not easy to determine the values of the critical design variables (e.g., presentation schedule, stimulus structure) that will be most informative in differentiating them. Recent developments in sampling-based search methods in statistics make it possible to determine these values, and thereby identify an optimal experimental design. After describing the method, it is demonstrated in two content areas in cognitive psychology in which models are highly competitive: retention (i.e., forgetting) and categorization. The optimal design is compared with the quality of designs used in the literature. The findings demonstrate that design optimization has the potential to increase the informativeness of the experimental method. PMID:19618983
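    The core idea can be miniaturized. A hedged sketch (not the authors' sampling-based algorithm; model forms are standard, parameter values invented): choose the retention interval at which a power-law and an exponential forgetting model disagree most, since data collected there are the most diagnostic for discriminating them.

    ```python
    # Pick the most informative retention interval for discriminating
    # two forgetting models (illustrative design search, not the paper's
    # method; parameter values are invented).
    import math

    def power_model(t, a=0.9, b=0.4):        # power-law forgetting
        return a * (t + 1) ** (-b)

    def exp_model(t, a=0.9, b=0.12):         # exponential forgetting
        return a * math.exp(-b * t)

    candidate_times = [t / 2 for t in range(1, 81)]   # 0.5 ... 40.0
    best_t = max(candidate_times,
                 key=lambda t: abs(power_model(t) - exp_model(t)))
    gap = abs(power_model(best_t) - exp_model(best_t))
    ```

    The real problem is harder because parameters are uncertain and designs are multidimensional, which is exactly what the sampling-based search handles.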

  2. Using scoping as a design process

    SciTech Connect

    Mulvihill, P. R.; Jacobs, P.

    1998-07-01

    Skillful use of the scoping phase of environmental assessment (EA) is critical in cases involving a wide diversity of stakeholders and perspectives. Scoping can exert a strong influence in shaping a relevant impact assessment and increasing the probability of a process that satisfies stakeholders. This article explores key challenges facing scoping processes conducted in highly pluralistic settings. Elements of a notable case study--the scoping process conducted in 1992 for the proposed Great Whale Hydroelectric project in Northern Quebec--are discussed to illustrate innovative approaches. When used as a design process, scoping can ensure that EA reflects the different value sets and cultures that are at play, particularly where diverse knowledge systems and ways of describing environmental components and impacts exist. As it sets the stage for subsequent steps in the EA process, scoping needs to be a sufficiently broad umbrella that accommodates diverse approaches to identifying, classifying, and assessing impacts.

  3. Product and Process Improvement Using Mixture-Process Variable Designs and Robust Optimization Techniques

    SciTech Connect

    Sahni, Narinder S.; Piepel, Gregory F.; Naes, Tormod

    2009-04-01

    The quality of an industrial product depends on the raw material proportions and the process variable levels, both of which need to be taken into account in designing a product. This article presents a case study from the food industry in which both kinds of variables were studied by combining a constrained mixture experiment design and a central composite process variable design. Based on the natural structure of the situation, a split-plot experiment was designed and models involving the raw material proportions and process variable levels (separately and combined) were fitted. Combined models were used to study: (i) the robustness of the process to variations in raw material proportions, and (ii) the robustness of the raw material recipes with respect to fluctuations in the process variable levels. Further, the expected variability in the robust settings was studied using the bootstrap.
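    The bootstrap step at the end generalizes readily. A minimal sketch (the replicate measurements here are invented): resample the responses observed at the chosen robust settings to estimate the standard error of the mean response.

    ```python
    # Bootstrap standard error of the mean response at fixed settings
    # (illustrative; data values are invented).
    import random
    random.seed(0)

    replicates = [12.1, 11.8, 12.4, 12.0, 11.9, 12.3, 12.2, 11.7]

    def bootstrap_se(data, n_boot=2000):
        means = []
        for _ in range(n_boot):
            sample = [random.choice(data) for _ in data]  # resample with replacement
            means.append(sum(sample) / len(sample))
        mu = sum(means) / n_boot
        var = sum((m - mu) ** 2 for m in means) / (n_boot - 1)
        return var ** 0.5

    se = bootstrap_se(replicates)
    ```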

  4. Component Latent Trait Models for Test Design.

    ERIC Educational Resources Information Center

    Embretson, Susan Whitely

    Latent trait models are presented that can be used for test design in the context of a theory about the variables that underlie task performance. Examples of methods for decomposing and testing hypotheses about the theoretical variables in task performance are given. The methods can be used to determine the processing components that are involved…

  5. Optimal Experimental Design for Model Discrimination

    ERIC Educational Resources Information Center

    Myung, Jay I.; Pitt, Mark A.

    2009-01-01

    Models of a psychological process can be difficult to discriminate experimentally because it is not easy to determine the values of the critical design variables (e.g., presentation schedule, stimulus structure) that will be most informative in differentiating them. Recent developments in sampling-based search methods in statistics make it…

  6. Case study: Lockheed-Georgia Company integrated design process

    NASA Technical Reports Server (NTRS)

    Waldrop, C. T.

    1980-01-01

    A case study of the development of an Integrated Design Process is presented. The approach taken in preparing for the development of an integrated design process includes some of the IPAD approaches such as developing a Design Process Model, cataloging Technical Program Elements (TPE's), and examining data characteristics and interfaces between contiguous TPE's. The implementation plan is based on an incremental development of capabilities over a period of time with each step directed toward, and consistent with, the final architecture of a total integrated system. Because of time schedules and different computer hardware, this system will not be the same as the final IPAD release; however, many IPAD concepts will no doubt prove applicable as the best approach. Full advantage will be taken of the IPAD development experience. A scenario that could be typical for many companies, even outside the aerospace industry, in developing an integrated design process for an IPAD-type environment is represented.

  7. A biofilm model for engineering design.

    PubMed

    Takács, I; Bye, C M; Chapman, K; Dold, P L; Fairlamb, P M; Jones, R M

    2007-01-01

    A biofilm model is presented for process engineering purposes--wastewater treatment plant design, upgrade and optimisation. The model belongs in the 1D dynamic layered biofilm model category, with modifications that allow it to be used with one parameter set for a large range of process situations. The biofilm model is integrated with a general activated sludge/anaerobic digestion model combined with a chemical equilibrium, precipitation and pH module. This allows the model to simulate the complex interactions that occur in the aerobic, anoxic and anaerobic layers of the biofilm. The model has been tested and is shown to match a variety of design guidelines, as well as experimental results from batch testing and full-scale plant operation. Both moving bed bioreactors (MBBR) and integrated fixed film activated sludge (IFAS) systems were simulated using the same model and parameter set. A new steady-state solver generates fast solutions and allows interactive design work with the complex model. PMID:17547002
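    The simplest member of the 1D layered category is easy to sketch (illustrative only; all parameter values are invented): steady-state diffusion with first-order substrate uptake through the film, relaxed to convergence on a grid and checked against the analytic Thiele-modulus solution.

    ```python
    # Steady-state substrate profile in a 1D biofilm with first-order
    # uptake: D*S'' = k*S, S = S0 at the liquid interface, zero flux at
    # the substratum.  Solved by Gauss-Seidel relaxation.
    import math

    L, D, k, S0, n = 200e-6, 1e-9, 5e-3, 8.0, 51   # thickness m, diffusivity m2/s, rate 1/s, bulk conc, grid pts
    dx = L / (n - 1)
    S = [S0] * n
    for _ in range(20000):                  # relax to steady state
        for i in range(1, n - 1):
            S[i] = (S[i - 1] + S[i + 1]) / (2 + k * dx * dx / D)
        S[-1] = S[-2]                       # zero-flux wall at the substratum

    phi = L * math.sqrt(k / D)              # Thiele modulus
    S_wall_exact = S0 / math.cosh(phi)      # analytic wall concentration
    ```

    A production model layers multiple species and reactions on this same grid, which is why a fast steady-state solver matters for interactive design work.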

  8. Object-oriented models of cognitive processing.

    PubMed

    Mather, G

    2001-05-01

    Information-processing models of vision and cognition are inspired by procedural programming languages. Models that emphasize object-based representations are closely related to object-oriented programming languages. The concepts underlying object-oriented languages provide a theoretical framework for cognitive processing that differs markedly from that offered by procedural languages. This framework is well-suited to a system designed to deal flexibly with discrete objects and unpredictable events in the world. PMID:11323249
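    The contrast with procedural pipelines can be made concrete with a toy class (names and event format invented): each perceptual object carries its own state and behaviour, and events are dispatched to whichever object they concern.

    ```python
    # Toy object-based representation: state and behaviour live with the
    # object, rather than in a fixed procedural pipeline.

    class PerceptualObject:
        def __init__(self, identity, location):
            self.identity = identity
            self.location = location

        def handle(self, event):
            # each object interprets events relative to its own state
            if event["kind"] == "motion":
                self.location = event["new_location"]
            return self.location

    ball = PerceptualObject("ball", (0, 0))
    ball.handle({"kind": "motion", "new_location": (3, 4)})
    ```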

  9. DESIGNING ENVIRONMENTAL, ECONOMIC AND ENERGY EFFICIENT CHEMICAL PROCESSES

    EPA Science Inventory

    The design and improvement of chemical processes can be very challenging. The earlier energy conservation, process economics and environmental aspects are incorporated into the process development, the easier and less expensive it is to alter the process design. Process emissio...

  10. Software-Engineering Process Simulation (SEPS) model

    NASA Technical Reports Server (NTRS)

    Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.

    1992-01-01

    The Software Engineering Process Simulation (SEPS) model, developed at JPL, is described. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among various software life cycle development activities and management decision-making processes. The model is designed to be a planning tool for examining tradeoffs of cost, schedule, and functionality, and for testing the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and perform postmortem assessments.
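    The feedback flavor of such models can be shown in miniature (a toy sketch, not the actual SEPS structure; all constants invented): a single rework loop, in which a fraction of completed work returns as defects that consume later capacity, already stretches the schedule in the way these simulations study.

    ```python
    # Toy system-dynamics loop: new work breeds rework, rework diverts
    # capacity from new work (invented constants, weekly time step).

    def weeks_to_complete(total=100.0, staff=5.0, rate=0.5,
                          rework_frac=0.15, max_weeks=200):
        """Weeks until all tasks are done and rework is cleared."""
        done, rework = 0.0, 0.0
        for week in range(max_weeks):
            capacity = staff * rate
            fix = min(0.3 * capacity, rework)      # effort diverted to rework
            new = min(capacity - fix, total - done)
            rework += rework_frac * new - fix      # feedback: new work breeds defects
            done += new
            if done >= total and rework < 0.01:
                return week + 1
        return max_weeks

    baseline = weeks_to_complete(rework_frac=0.0)  # 40 weeks under these constants
    with_rework = weeks_to_complete()              # longer, due to the feedback loop
    ```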

  11. Information Flow in the Launch Vehicle Design/Analysis Process

    NASA Technical Reports Server (NTRS)

    Humphries, W. R., Sr.; Holland, W.; Bishop, R.

    1999-01-01

    This paper describes the results of a team effort aimed at defining the information flow between disciplines at the Marshall Space Flight Center (MSFC) engaged in the design of space launch vehicles. The information flow is modeled at a first level and is described using three types of templates: an N x N diagram, discipline flow diagrams, and discipline task descriptions. It is intended to provide engineers with an understanding of the connections between what they do and where it fits in the overall design process of the project. It is also intended to provide design managers with a better understanding of information flow in the launch vehicle design cycle.

  12. Composting process design criteria. II. Detention time

    SciTech Connect

    Haug, R.T.

    1986-09-01

    Attention has always been directed to detention time as a criterion for the design and operation of composting systems. Perhaps this is a logical outgrowth of work on liquid phase systems, where detention time is a fundamental parameter of design. Unlike liquid phase systems, however, the interpretation of detention time and the actual values required for design have not been universally accepted in the case of composting. As a case in point, most compost systems incorporate facilities for curing the compost product. However, curing often is considered after the fact or as an add-on with little relationship to the first-stage, high-rate phase, whether reactor (in-vessel), static pile, or windrow. Design criteria for curing and the relationships between the first-stage, high-rate and second-stage, curing phases of a composting system have been unclear. In Part 2 of this paper, the concepts of hydraulic retention time (HRT) and solids residence time (SRT) are applied to the composting process. Definitions and design criteria for each are proposed. Based on these criteria, the first and second stages can be designed and integrated into a complete composting system.
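    Transplanted from liquid-phase practice, the two residence-time definitions are simple ratios (a sketch with invented numbers):

    ```python
    # Residence-time definitions carried over from liquid-phase design
    # (numbers invented for illustration).

    def hrt(reactor_volume_m3, feed_rate_m3_per_day):
        """Hydraulic retention time, days: V / Q."""
        return reactor_volume_m3 / feed_rate_m3_per_day

    def srt(solids_in_system_kg, solids_wasted_kg_per_day):
        """Solids residence time, days: solids inventory / wastage rate."""
        return solids_in_system_kg / solids_wasted_kg_per_day

    first_stage_hrt = hrt(500.0, 50.0)     # high-rate stage: 10.0 days
    curing_srt = srt(3000.0, 100.0)        # curing stage: 30.0 days
    ```

    Sizing each stage against its own criterion is what lets the high-rate and curing phases be designed as one integrated system.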

  13. Penetrator reliability investigation and design exploration : from conventional design processes to innovative uncertainty-capturing algorithms.

    SciTech Connect

    Martinez-Canales, Monica L.; Heaphy, Robert; Gramacy, Robert B.; Taddy, Matt; Chiesa, Michael L.; Thomas, Stephen W.; Swiler, Laura Painton; Hough, Patricia Diane; Lee, Herbert K. H.; Trucano, Timothy Guy; Gray, Genetha Anne

    2006-11-01

    This project focused on research and algorithmic development in optimization under uncertainty (OUU) problems driven by earth penetrator (EP) designs. While taking into account uncertainty, we addressed three challenges in current simulation-based engineering design and analysis processes. The first challenge required leveraging small local samples, already constructed by optimization algorithms, to build effective surrogate models. We used Gaussian Process (GP) models to construct these surrogates. We developed two OUU algorithms using 'local' GPs (OUU-LGP) and one OUU algorithm using 'global' GPs (OUU-GGP) that appear competitive or better than current methods. The second challenge was to develop a methodical design process based on multi-resolution, multi-fidelity models. We developed a Multi-Fidelity Bayesian Auto-regressive process (MF-BAP). The third challenge involved the development of tools that are computationally feasible and accessible. We created MATLAB® and initial DAKOTA implementations of our algorithms.
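    The surrogate-building step can be sketched compactly. A minimal Gaussian-process interpolator with a squared-exponential kernel (illustrative only, not the report's implementation; the sample points stand in for expensive simulation runs, and the hyperparameters are assumed):

    ```python
    # Minimal GP surrogate: fit a squared-exponential-kernel GP to a few
    # "simulation" samples and predict at a new point.
    import numpy as np

    def rbf(A, B, length=0.5):
        d2 = (A[:, None] - B[None, :]) ** 2
        return np.exp(-d2 / (2 * length ** 2))

    def gp_predict(x_train, y_train, x_test, noise=1e-6):
        K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
        Ks = rbf(x_test, x_train)
        alpha = np.linalg.solve(K, y_train)   # weights on training points
        return Ks @ alpha

    x = np.array([0.0, 0.3, 0.6, 0.9, 1.2])
    y = np.sin(2 * x)                          # stand-in for expensive runs
    pred = gp_predict(x, y, np.array([0.45]))
    ```

    The appeal for OUU is that a handful of local samples, already paid for by the optimizer, yields a cheap model that can be evaluated thousands of times.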

  14. Incorporating manufacturability constraints into the design process of heterogeneous objects

    NASA Astrophysics Data System (ADS)

    Hu, Yuna; Blouin, Vincent Y.; Fadel, Georges M.

    2004-11-01

    Rapid prototyping (RP) technology, such as Laser Engineered Net Shaping (LENS™), can be used to fabricate heterogeneous objects with gradient variations in material composition. These objects are generally characterized by enhanced functional performance. Past research on the design of such objects has focused on representation, modeling, and functional performance. However, the inherent constraints in RP processes, such as system capability and processing time, lead to heterogeneous objects that may not meet the designer's original intent. To overcome this situation, the research presented in this paper focuses on the identification and implementation of manufacturing constraints in the design process. A node-based finite element modeling technique is used for the representation and analysis, and the multicriteria design problem corresponds to finding the nodal material compositions that minimize structural weight and maximize thermal performance. The optimizer used in this research is a real-valued Evolution Strategy (ES), which is well suited for this type of multi-modal problem. Two limitations of the LENS manufacturing process, which have an impact on the design process, are identified and implemented. One of them is related to the manufacturing time, which is considered as an additional criterion to be minimized in the design problem for a preselected tool path. A brake disc rotor made of two materials, aluminum for lightweight and steel for superior thermal characteristics, is used to illustrate the tradeoff between manufacturability and functionality.
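    The optimizer's flavor can be conveyed by a bare-bones (1+1) evolution strategy (a sketch only, far simpler than the paper's real-valued ES; the one-variable objective is an invented stand-in for the weight/thermal trade-off):

    ```python
    # (1+1)-ES: mutate the single parent, keep the better of the two,
    # and adapt the step size (crude 1/5th-rule style).
    import math
    import random
    random.seed(1)

    def objective(x):                  # invented multimodal stand-in
        return (x - 0.7) ** 2 + 0.1 * abs(math.sin(8 * x))

    x, sigma = 0.0, 0.3
    best = objective(x)
    for _ in range(500):
        cand = x + random.gauss(0, sigma)
        f = objective(cand)
        if f < best:                   # accept improvements only
            x, best = cand, f
            sigma *= 1.1               # widen step after success
        else:
            sigma *= 0.98              # shrink step after failure
    ```

    Because acceptance depends only on objective comparisons, the same loop works unchanged when the objective is a full finite-element evaluation.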

  15. Multiwavelet design for cardiac signal processing.

    PubMed

    Peeters, R L M; Karel, J M H; Westra, R L; Haddad, S A P; Serdijn, W A

    2006-01-01

    An approach for designing multiwavelets is introduced, for use in cardiac signal processing. The parameterization of the class of multiwavelets is in terms of associated FIR polyphase all-pass filters. Orthogonality and a balanced vanishing moment of order 1 are built into the parameterization. An optimization criterion is developed to associate the wavelets with different meaningful segments of a signal. This approach is demonstrated on the simultaneous detection of QRS-complexes and T-peaks in ECG signals. PMID:17946917

  16. Design of Training Systems Utility Assessment: The Training Process Flow and System Capabilities/Requirements and Resources Models Operating in the TRAPAC Environment.

    ERIC Educational Resources Information Center

    Duffy, Larry R.

    The report summarizes the results of a field test conducted for the purpose of determining the utility to Naval training of the Systems Capabilities/Requirements and Resources (SCRR) and the Training Process Flow (TPF) computer-based mathematical models. Basic descriptions of the SCRR and the TPF and their development are given. Training…

  17. Business process modeling in healthcare.

    PubMed

    Ruiz, Francisco; Garcia, Felix; Calahorra, Luis; Llorente, César; Gonçalves, Luis; Daniel, Christel; Blobel, Bernd

    2012-01-01

    The importance of the process point of view is not restricted to a specific enterprise sector. In the field of health, as a result of the nature of the service offered, health institutions' processes are also the basis for decision making aimed at their objective of providing quality medical assistance. In this chapter the application of business process modelling using the Business Process Modelling Notation (BPMN) standard is described. The main challenges of business process modelling in healthcare are the definition of healthcare processes, the multi-disciplinary nature of healthcare, the flexibility and variability of the activities involved in healthcare processes, the need for interoperability between multiple information systems, and the continuous updating of scientific knowledge in healthcare. PMID:22925789

  18. Liberating Expression: A Freehand Approach to Business Process Modeling

    NASA Astrophysics Data System (ADS)

    Mangano, Nicolas; Sukaviriya, Noi

    Tools that support business process modeling are designed for experienced users to draw a process with precision and professional appearance. These tools are not conducive to sketching quick business design ideas. This demo proposal presents Inkus, a non-intrusive business process sketching tool that allows freehand sketches of process ideas and slowly brings the users to the required common business vocabulary. Our goal is to help unleash creativity in business designers and enrich the design process with values beyond drawing.

  19. Chip Design Process Optimization Based on Design Quality Assessment

    NASA Astrophysics Data System (ADS)

    Häusler, Stefan; Blaschke, Jana; Sebeke, Christian; Rosenstiel, Wolfgang; Hahn, Axel

    2010-06-01

    Nowadays, managing product development projects is increasingly challenging. Especially the IC design of ASICs with both analog and digital components (mixed-signal design) is becoming more and more complex, while the time-to-market window narrows at the same time. Still, high quality standards must be fulfilled. Projects and their status are becoming less transparent due to this complexity, which makes the planning and execution of projects rather difficult. Therefore, there is a need for efficient project control. A main challenge is the objective evaluation of the current development status: are all requirements successfully verified? Are all intermediate goals achieved? Companies often develop special solutions that are not reusable in other projects. This makes the quality measurement process itself less efficient and produces too much overhead. The method proposed in this paper is a contribution to solving these issues. It is applied at a German design house for analog mixed-signal IC design. This paper presents the results of a case study and introduces an optimized project scheduling on the basis of quality assessment results.

  20. Designing and encoding models for synthetic biology

    PubMed Central

    Endler, Lukas; Rodriguez, Nicolas; Juty, Nick; Chelliah, Vijayalakshmi; Laibe, Camille; Li, Chen; Le Novère, Nicolas

    2009-01-01

    A key component of any synthetic biology effort is the use of quantitative models. These models and their corresponding simulations allow optimization of a system design, as well as guiding their subsequent analysis. Once a domain mostly reserved for experts, dynamical modelling of gene regulatory and reaction networks has been an area of growth over the last decade. There has been a concomitant increase in the number of software tools and standards, thereby facilitating model exchange and reuse. We give here an overview of the model creation and analysis processes as well as some software tools in common use. Using markup language to encode the model and associated annotation, we describe the mining of components, their integration in relational models, formularization and parametrization. Evaluation of simulation results and validation of the model close the systems biology ‘loop’. PMID:19364720
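    The model creation and simulation step described above can be miniaturized: a negatively autoregulated gene, the classic first example in such models, written as one ODE and integrated by forward Euler (parameter values invented for illustration).

    ```python
    # Negative autoregulation: dx/dt = beta / (1 + (x/K)^n) - gamma*x,
    # integrated by forward Euler (invented parameters).

    def simulate(beta=1.0, K=0.5, n=2, gamma=0.2, dt=0.01, t_end=50.0):
        x, t = 0.0, 0.0
        while t < t_end:
            production = beta / (1 + (x / K) ** n)   # Hill-type repression
            x += dt * (production - gamma * x)
            t += dt
        return x

    steady = simulate()    # settles at the steady state of the circuit
    ```

    In practice the model would be encoded in a markup language such as SBML and handed to a simulator, but the underlying object is exactly this kind of rate equation.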

  1. Electrophysiological models of neural processing.

    PubMed

    Nelson, Mark E

    2011-01-01

    The brain is an amazing information processing system that allows organisms to adaptively monitor and control complex dynamic interactions with their environment across multiple spatial and temporal scales. Mathematical modeling and computer simulation techniques have become essential tools in understanding diverse aspects of neural processing ranging from sub-millisecond temporal coding in the sound localization circuitry of barn owls to long-term memory storage and retrieval in humans that can span decades. The processing capabilities of individual neurons lie at the core of these models, with the emphasis shifting upward and downward across different levels of biological organization depending on the nature of the questions being addressed. This review provides an introduction to the techniques for constructing biophysically based models of individual neurons and local networks. Topics include Hodgkin-Huxley-type models of macroscopic membrane currents, Markov models of individual ion-channel currents, compartmental models of neuronal morphology, and network models involving synaptic interactions among multiple neurons. PMID:21064164
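    At the simplest end of that spectrum, the construction pattern (membrane equation, threshold, reset) fits in a few lines. A leaky integrate-and-fire sketch, far below Hodgkin-Huxley in biophysical detail but built the same way; the constants are generic textbook choices, not from the review:

    ```python
    # Leaky integrate-and-fire neuron: integrate the membrane equation,
    # fire and reset on threshold crossing (generic constants).

    def lif_spikes(I=1.5, tau=10.0, v_rest=0.0, v_th=1.0, v_reset=0.0,
                   dt=0.1, t_end=200.0):
        v, spikes = v_rest, 0
        for _ in range(int(t_end / dt)):
            v += dt * (-(v - v_rest) + I) / tau   # dv/dt = (-(v - v_rest) + I)/tau
            if v >= v_th:
                v = v_reset
                spikes += 1
        return spikes

    n = lif_spikes()            # suprathreshold drive: regular spiking
    quiet = lif_spikes(I=0.5)   # subthreshold drive: no spikes
    ```

    A Hodgkin-Huxley model replaces the single leak term with voltage-gated sodium and potassium currents, but the integration loop is the same.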

  2. Thinking and the Design Process. DIUL-RR-8414.

    ERIC Educational Resources Information Center

    Moulin, Bernard

    Designed to focus attention on the design process in such computer science activities as information systems design, database design, and expert systems design, this paper examines three main phases of the design process: understanding the context of the problem, identifying the problem, and finding a solution. The processes that these phases…

  3. DECISION SUPPORT SYSTEM TO ENHANCE AND ENCOURAGE SUSTAINABLE CHEMICAL PROCESS DESIGN

    EPA Science Inventory

    There is an opportunity to minimize the potential environmental impacts (PEIs) of industrial chemical processes by providing process designers with timely data and models elucidating environmentally favorable design options. The second generation of the Waste Reduction (WAR) algo...

  4. A FRAMEWORK TO DESIGN AND OPTIMIZE CHEMICAL FLOODING PROCESSES

    SciTech Connect

    Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori

    2005-07-01

    The goal of this proposed research is to provide an efficient and user friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user friendly interface to identify the variables that have the most impact on oil recovery using the concept of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objectives of Task 1 are to develop three primary modules representing reservoir, chemical, and well data. The modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables and to develop the response surface maps to identify the significant variables from each module. The objective of Task 3 is to develop the economic model designed specifically for the chemical processes targeted in this proposal and interface the economic model with UTCHEM production output. Task 4 covers validation of the framework and simulations of oil reservoirs to screen, design, and optimize the chemical processes.

  5. A FRAMEWORK TO DESIGN AND OPTIMIZE CHEMICAL FLOODING PROCESSES

    SciTech Connect

    Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori

    2004-11-01

    The goal of this proposed research is to provide an efficient and user friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user friendly interface to identify the variables that have the most impact on oil recovery using the concept of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objectives of Task 1 are to develop three primary modules representing reservoir, chemical, and well data. The modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables and to develop the response surface maps to identify the significant variables from each module. The objective of Task 3 is to develop the economic model designed specifically for the chemical processes targeted in this proposal and interface the economic model with UTCHEM production output. Task 4 covers validation of the framework and simulations of oil reservoirs to screen, design, and optimize the chemical processes.

  6. A Framework to Design and Optimize Chemical Flooding Processes

    SciTech Connect

    Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori

    2006-08-31

    The goal of this proposed research is to provide an efficient and user friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user friendly interface to identify the variables that have the most impact on oil recovery using the concept of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objectives of Task 1 are to develop three primary modules representing reservoir, chemical, and well data. The modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables and to develop the response surface maps to identify the significant variables from each module. The objective of Task 3 is to develop the economic model designed specifically for the chemical processes targeted in this proposal and interface the economic model with UTCHEM production output. Task 4 covers validation of the framework and simulations of oil reservoirs to screen, design, and optimize the chemical processes.

  7. PRELIMINARY DESIGN FOR DRINKING WATER TREATMENT PROCESS SYSTEMS

    EPA Science Inventory

    A computer model has been developed for use in estimating the performance and associated costs of proposed and existing water supply systems. Design procedures and cost-estimating relationships for 25 unit processes that can be used for drinking water treatment are contained with...

  8. Reliability Methods for Shield Design Process

    NASA Technical Reports Server (NTRS)

    Tripathi, R. K.; Wilson, J. W.

    2002-01-01

    Providing protection against the hazards of space radiation is a major challenge to the exploration and development of space. The great cost of added radiation shielding is a potential limiting factor in deep space operations. In this enabling technology, we have developed methods for optimized shield design over multi-segmented missions involving multiple work and living areas in the transport and duty phases of space missions. The total shield mass over all pieces of equipment and habitats is optimized subject to career dose and dose rate constraints. An important component of this technology is the estimation of the two most commonly identified uncertainties in radiation shield design: the shielding properties of the materials used and the understanding of the biological response of the astronaut to the radiation leaking through the materials into the living space. The largest uncertainty, of course, is in the biological response to the high charge and energy (HZE) ions of the galactic cosmic rays. These uncertainties are blended with the optimization design procedure to formulate reliability-based methods for shield design processes. The details of the methods will be discussed.
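    The optimization being described can be miniaturized (an illustrative sketch only; the attenuation constants are invented and exhaustive search stands in for the real procedure): choose shield thicknesses for two mission segments to minimize total shielding subject to a career-dose cap, assuming simple exponential attenuation.

    ```python
    # Minimize total shield thickness over two mission segments subject
    # to a career-dose cap (invented constants, brute-force search).
    import math

    def total_dose(t1, t2, d1=40.0, d2=25.0, mu=0.05):
        """Dose through two segment shields, exponential attenuation."""
        return d1 * math.exp(-mu * t1) + d2 * math.exp(-mu * t2)

    dose_cap = 20.0
    best = None                      # (total thickness, t1, t2)
    for t1 in range(0, 81):
        for t2 in range(0, 81):
            if total_dose(t1, t2) <= dose_cap:
                if best is None or t1 + t2 < best[0]:
                    best = (t1 + t2, t1, t2)
    ```

    The optimum puts more shielding on the segment with the higher unshielded dose, the same mass-allocation trade the multi-segment method formalizes.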

  9. A stress index model for balloon design

    NASA Technical Reports Server (NTRS)

    Smith, I. S.

    1987-01-01

    A NASA stress index model, SINDEX, is discussed which establishes the relative stress magnitudes along a balloon gore as a function of altitude. Application of the model to a database of over 550 balloon flights demonstrates the effectiveness of the method. The results show a strong correlation between stress levels and failure rates, with the point of maximum stress coinciding with the observed failure locations. It is suggested that the model may be used during the balloon design process to lower the levels of stress in the balloon.

  10. Social Models: Blueprints or Processes?

    ERIC Educational Resources Information Center

    Little, Graham R.

    1981-01-01

    Discusses the nature and implications of two different models for societal planning: (1) the problem-solving process approach based on Karl Popper; and (2) the goal-setting "blueprint" approach based on Karl Marx. (DC)

  11. Parametric design methodology for chemical processes using a simulator

    SciTech Connect

    Diwekar, U. M.; Rubin, E. S.

    1994-02-01

    Parameter design is a method, popularized by the Japanese quality expert G. Taguchi, for designing products and manufacturing processes that are robust in the face of uncontrollable variations. At the design stage, the goal of parameter design is to identify design settings that make the product performance less sensitive to the effects of manufacturing and environmental variations and deterioration. Because parameter design reduces performance variation by reducing the influence of the sources of variation rather than by controlling them, it is a cost-effective technique for improving quality. A recent study on the application of parameter design methodology to chemical processes reported that the use of Taguchi's method was not justified, and a method based on Monte Carlo simulation combined with optimization was shown to be more effective. However, this method is computationally intensive, as a large number of samples are necessary to achieve the given accuracy. Additionally, determination of the number of sample runs required is based on experimentation due to a lack of systematic sampling methods. In an attempt to overcome these problems, the use of a stochastic modeling capability combined with an optimizer is presented in this paper. The objective is that of providing an effective means for applying parameter design methodologies to chemical processes using the ASPEN simulator. This implementation not only presents a generalized tool for use by chemical engineers at large but also provides systematic estimates of the number of sample runs required to attain the specified accuracy. The stochastic model employs the technique of Latin hypercube sampling instead of the traditional Monte Carlo technique and hence has a great potential to reduce the required number of samples. The methodology is illustrated via an example problem of designing a chemical process.
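    Latin hypercube sampling, the variance-reduction device named above, is easy to sketch in pure Python: each variable's range is cut into n equal-probability strata, one draw is placed in each stratum, and the strata are randomly paired across variables.

    ```python
    # Latin hypercube sample of n points in [0, 1)^d: per dimension,
    # one uniform draw inside each of n strata, strata shuffled so the
    # pairing across dimensions is random.
    import random
    random.seed(0)

    def latin_hypercube(n_samples, n_vars):
        cols = []
        for _ in range(n_vars):
            perm = list(range(n_samples))
            random.shuffle(perm)
            cols.append([(p + random.random()) / n_samples for p in perm])
        return list(zip(*cols))          # rows are sample points

    pts = latin_hypercube(10, 2)
    ```

    Unlike plain Monte Carlo, every stratum of every variable is guaranteed to be sampled exactly once, which is the source of the reduced sample requirement.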

  12. Rapid Modeling, Assembly and Simulation in Design Optimization

    NASA Technical Reports Server (NTRS)

    Housner, Jerry

    1997-01-01

    A new capability for design is reviewed. This capability provides for rapid assembly of detailed finite element models early in the design process, where costs are most effectively impacted. This creates an engineering environment which enables comprehensive analysis and design optimization early in the design process. Graphical interactive computing makes it possible for the engineer to interact with the design while performing comprehensive design studies. This rapid assembly capability is enabled by the use of Interface Technology to couple independently created models, which can be archived and made accessible to the designer. Results are presented to demonstrate the capability.

  13. Bates solar industrial process-steam application: preliminary design review

    SciTech Connect

    Not Available

    1980-01-07

    The design is analyzed for a parabolic trough solar process heat system for a cardboard corrugation fabrication facility in Texas. The program is briefly reviewed, including an analysis of the plant and process. The performance modeling for the system is discussed, and the solar system structural design, collector subsystem, heat transport and distribution subsystem are analyzed. The selection of the heat transfer fluid, and ullage and fluid maintenance are discussed, and the master control system and data acquisition system are described. Testing of environmental degradation of materials is briefly discussed. A brief preliminary cost analysis is included. (LEW)

  14. Advanced computational research in materials processing for design and manufacturing

    SciTech Connect

    Zacharia, T.

    1994-12-31

    The computational requirements for the design and manufacture of automotive components have increased dramatically with the goal of producing automobiles with three times the mileage. Automotive component design systems are becoming increasingly reliant on structural analysis, requiring larger and more complex analyses: more three-dimensional analyses, larger model sizes, and routine consideration of transient and non-linear effects. Such analyses must be performed rapidly to minimize delays in the design and development process, which drives the need for parallel computing. This paper briefly describes advanced computational research in superplastic forming and automotive crashworthiness.

  15. Process Improvement Through Tool Integration in Aero-Mechanical Design

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2010-01-01

    Emerging capabilities in commercial design tools promise to significantly improve the multi-disciplinary and inter-disciplinary design and analysis coverage for aerospace mechanical engineers. This paper explores the analysis process for two example problems of a wing and flap mechanical drive system and an aircraft landing gear door panel. The examples begin with the design solid models and include various analysis disciplines such as structural stress and aerodynamic loads. Analytical methods include CFD, multi-body dynamics with flexible bodies and structural analysis. Elements of analysis data management, data visualization and collaboration are also included.

  16. Innovative machine designs for radiation processing

    NASA Astrophysics Data System (ADS)

    Vroom, David

    2007-12-01

    In the 1990s Raychem Corporation established a program to investigate the commercialization of several promising applications involving the combined use of its core competencies in materials science, radiation chemistry and e-beam radiation technology. The applications investigated included those that would extend Raychem's well known heat-recoverable polymer and wire and cable product lines, as well as new potential applications such as remediation of contaminated aqueous streams. A central part of the program was the development of new accelerator technology designed to improve quality, lower processing costs and efficiently process conformable materials such as liquids. A major emphasis with this new irradiation technology was to treat the accelerator and product handling systems as one integrated system, not as two complementary systems.

  17. Modeling Tool Advances Rotorcraft Design

    NASA Technical Reports Server (NTRS)

    2007-01-01

    Continuum Dynamics Inc. (CDI), founded in 1979, specializes in advanced engineering services, including fluid dynamic modeling and analysis for aeronautics research. The company has completed a number of SBIR research projects with NASA, including early rotorcraft work done through Langley Research Center, but more recently, out of Ames Research Center. NASA Small Business Innovation Research (SBIR) grants on helicopter wake modeling resulted in the Comprehensive Hierarchical Aeromechanics Rotorcraft Model (CHARM), a tool for studying helicopter and tiltrotor unsteady free wake modeling, including distributed and integrated loads, and performance prediction. Application of the software code in a blade redesign program for Carson Helicopters, of Perkasie, Pennsylvania, increased the payload and cruise speeds of its S-61 helicopter. Follow-on development resulted in a $24 million revenue increase for Sikorsky Aircraft Corporation, of Stratford, Connecticut, as part of the company's rotor design efforts. Now under continuous development for more than 25 years, CHARM models the complete aerodynamics and dynamics of rotorcraft in general flight conditions. CHARM has been used to model a broad spectrum of rotorcraft attributes, including performance, blade loading, blade-vortex interaction noise, air flow fields, and hub loads. The highly accurate software is currently in use by all major rotorcraft manufacturers, NASA, the U.S. Army, and the U.S. Navy.

  18. Designer cell signal processing circuits for biotechnology.

    PubMed

    Bradley, Robert W; Wang, Baojun

    2015-12-25

    Microorganisms are able to respond effectively to diverse signals from their environment and internal metabolism owing to their inherent sophisticated information processing capacity. A central aim of synthetic biology is to control and reprogramme the signal processing pathways within living cells so as to realise repurposed, beneficial applications ranging from disease diagnosis and environmental sensing to chemical bioproduction. To date most examples of synthetic biological signal processing have been built based on digital information flow, though analogue computing is being developed to cope with more complex operations and larger sets of variables. Great progress has been made in expanding the categories of characterised biological components that can be used for cellular signal manipulation, thereby allowing synthetic biologists to more rationally programme increasingly complex behaviours into living cells. Here we present a current overview of the components and strategies that exist for designer cell signal processing and decision making, discuss how these have been implemented in prototype systems for therapeutic, environmental, and industrial biotechnological applications, and examine emerging challenges in this promising field. PMID:25579192

  19. Designer cell signal processing circuits for biotechnology

    PubMed Central

    Bradley, Robert W.; Wang, Baojun

    2015-01-01

    Microorganisms are able to respond effectively to diverse signals from their environment and internal metabolism owing to their inherent sophisticated information processing capacity. A central aim of synthetic biology is to control and reprogramme the signal processing pathways within living cells so as to realise repurposed, beneficial applications ranging from disease diagnosis and environmental sensing to chemical bioproduction. To date most examples of synthetic biological signal processing have been built based on digital information flow, though analogue computing is being developed to cope with more complex operations and larger sets of variables. Great progress has been made in expanding the categories of characterised biological components that can be used for cellular signal manipulation, thereby allowing synthetic biologists to more rationally programme increasingly complex behaviours into living cells. Here we present a current overview of the components and strategies that exist for designer cell signal processing and decision making, discuss how these have been implemented in prototype systems for therapeutic, environmental, and industrial biotechnological applications, and examine emerging challenges in this promising field. PMID:25579192

  20. Metrics for Business Process Models

    NASA Astrophysics Data System (ADS)

    Mendling, Jan

    Up until now, there has been little research on why people introduce errors in real-world business process models. In a more general context, Simon [404] points to the limitations of cognitive capabilities and concludes that humans act rationally only to a certain extent. Concerning modeling errors, this argument would imply that human modelers lose track of the interrelations of large and complex models due to their limited cognitive capabilities, and introduce errors that they would not insert in a small model. A recent study by Mendling et al. [275] explores to what extent certain complexity metrics of business process models have the potential to serve as error determinants. The authors conclude that complexity indeed appears to have an impact on error probability. Before we can test such a hypothesis in a more general setting, we have to establish an understanding of how we can define determinants that drive error probability and how we can measure them.
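    As a concrete illustration of the kind of metric studied as an error determinant, the sketch below computes two simple structural measures of a process graph: model size and the coefficient of network connectivity (arcs per node). The dictionary encoding is a hypothetical simplification, not Mendling's actual tooling.

    ```python
    def size(process):
        """Number of nodes (tasks, events, gateways) in the model."""
        return len(process["nodes"])

    def connectivity(process):
        """Coefficient of network connectivity: arcs per node. Denser
        models tend to be harder to read and more error-prone."""
        return len(process["arcs"]) / len(process["nodes"])

    # A toy sequential model: start -> task A -> task B -> end
    model = {
        "nodes": ["start", "A", "B", "end"],
        "arcs": [("start", "A"), ("A", "B"), ("B", "end")],
    }
    ```

    Testing the hypothesis above amounts to checking whether such measures correlate with observed error rates across a corpus of real models.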

  1. Command Process Modeling & Risk Analysis

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila

    2011-01-01

    Commanding errors may be caused by a variety of root causes. It is important to understand the relative significance of each of these causes when making institutional investment decisions. One of these causes is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within the command and control process. These models include simulation analysis and probabilistic risk assessment models.

  2. Understanding backward design to strengthen curricular models.

    PubMed

    Emory, Jan

    2014-01-01

    Nurse educators have responded to the call for transformation in education. Challenges remain in planning curricular implementation to facilitate understanding of essential content for student success on licensure examinations and in professional practice. The conceptual framework Backward Design (BD) can support and guide curriculum decisions. Using BD principles in conjunction with educational models can strengthen and improve curricula. This article defines and describes the BD process, and identifies reported benefits for nursing education. PMID:24743175

  3. Learning from the Pros: How Experienced Designers Translate Instructional Design Models into Practice

    ERIC Educational Resources Information Center

    Ertmer, Peggy A.; York, Cindy S.; Gedik, Nuray

    2009-01-01

    Understanding how experienced designers approach complex design problems provides new perspectives on how they translate instructional design (ID) models and processes into practice. In this article, the authors describe the results of a study in which 16 "seasoned" designers shared compelling stories from practice that offered insights into their…

  4. Computer aided microbial safety design of food processes.

    PubMed

    Schellekens, M; Martens, T; Roberts, T A; Mackey, B M; Nicolaï, B M; Van Impe, J F; De Baerdemaeker, J

    1994-12-01

    To reduce the time required for product development, to avoid expensive experimental tests, and to quantify safety risks for fresh products and the consequences of processing, there is a growing interest in computer aided food process design. This paper discusses the application of hybrid object-oriented and rule-based expert system technology to represent the data and knowledge of microbial experts and food engineers. Finite element models for heat transfer calculation routines, microbial growth and inactivation models and texture kinetics are combined with food composition data, thermophysical properties, process steps and expert knowledge on the type and quantity of microbial contamination. A prototype system has been developed to evaluate the effect of changes in food composition, process steps and process parameters on the microbiological safety and textural quality of foods. PMID:7703003
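    The inactivation models mentioned above are often log-linear in time at a fixed temperature. A minimal sketch, with the decimal reduction time (D-value) as the only parameter, is given below; this is the textbook form, not the prototype system's actual routine, and the names are invented.

    ```python
    def survivors(n0, minutes, d_value):
        """Log-linear thermal inactivation: every d_value minutes at
        the reference temperature reduces the viable count tenfold."""
        return n0 * 10 ** (-minutes / d_value)

    def log_reduction(minutes, d_value):
        """Decimal reductions achieved by a hold of the given length."""
        return minutes / d_value
    ```

    For example, a 3-minute hold for an organism with a 1-minute D-value gives a 3-log reduction, taking a population of a million cells down to about a thousand.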

  5. Virtual Welded - Joint Design Integrating Advanced Materials and Processing Technology

    SciTech Connect

    Yang, Zhishang; Ludewig, Howard W.; Babu, S. Suresh

    2005-06-30

    Virtual Welded-Joint Design, a systematic modeling approach, has been developed in this project to predict the relationships among welding process, microstructure, properties, residual stress, and ultimate weld fatigue strength. This systematic modeling approach was applied to the welding of high strength steel. A special welding wire was developed in this project to introduce compressive residual stress at the weld toe. The results from both modeling and experiments demonstrated that more than 10x fatigue life improvement can be achieved in high strength steel welds by the combination of compressive residual stress from the special welding wire and the desired weld bead shape from a unique welding process. The results indicate a technology breakthrough in the design of lightweight, high fatigue performance welded structures using high strength steels.

  6. A Digital Methodology for the Design Process of Aerospace Assemblies with Sustainable Composite Processes & Manufacture

    NASA Astrophysics Data System (ADS)

    McEwan, W.; Butterfield, J.

    2011-05-01

    The well established benefits of composite materials are driving a significant shift in design and manufacture strategies for original equipment manufacturers (OEMs). Thermoplastic composites have advantages over traditional thermosetting materials with regard to sustainability and environmental impact, features which are becoming increasingly pertinent in the aerospace arena. However, when sustainability and environmental impact are considered as design drivers, integrated methods for part design and product development must be developed so that any benefits of sustainable composite material systems can be assessed during the design process. These methods must include mechanisms to account for process-induced part variation and techniques related to re-forming, recycling and decommissioning, which are in their infancy. It is proposed in this paper that predictive techniques related to material specification, part processing and product cost of thermoplastic composite components be integrated within a Through Life Management (TLM) product development methodology, as part of a larger strategy of product system modeling, to improve disciplinary concurrency and realistic part performance, and to place sustainability at the heart of the design process. This paper reports the enhancement of digital manufacturing tools as a means of drawing simulated part manufacturing scenarios, real time costing mechanisms, and broader lifecycle performance data capture into the design cycle. The work demonstrates predictive processes for sustainable composite product manufacture and how a Product-Process-Resource (PPR) structure can be customised and enhanced to include design intent driven by `Real' part geometry and consequent assembly.

  7. Cost Models for MMC Manufacturing Processes

    NASA Technical Reports Server (NTRS)

    Elzey, Dana M.; Wadley, Haydn N. G.

    1996-01-01

    The quality cost modeling (QCM) tool is intended to be a relatively simple-to-use device for obtaining a first-order assessment of the quality-cost relationship for a given process-material combination. The QCM curve is a plot of cost versus quality (an index indicating microstructural quality), which is unique for a given process-material combination. The QCM curve indicates the tradeoff between cost and performance, thus enabling one to evaluate affordability. Additionally, the effect of changes in process design, raw materials, and process conditions on the cost-quality relationship can be evaluated. Such results might indicate the most efficient means to obtain improved quality at reduced cost by process design refinements, the implementation of sensors and models for closed loop process control, or improvement in the properties of raw materials being fed into the process. QCM also allows alternative processes for producing the same or similar material to be compared in terms of their potential for producing competitively priced, high quality material. Aside from demonstrating the usefulness of the QCM concept, this is one of the main foci of the present research program, namely to compare processes for making continuous fiber reinforced, metal matrix composites (MMC's). Two processes, low pressure plasma spray deposition and tape casting are considered for QCM development. This document consists of a detailed look at the design of the QCM approach, followed by discussion of the application of QCM to each of the selected MMC manufacturing processes along with results, comparison of processes, and finally, a summary of findings and recommendations.
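    A QCM curve of the kind described can be sketched with a toy cost model in which scrap and rework losses dominate as the quality index approaches its ceiling. The functional form, parameter names, and numbers below are all hypothetical; they illustrate only the shape of the cost-quality tradeoff and the process-comparison use case, not the actual QCM formulation.

    ```python
    def unit_cost(quality, base_cost, scrap_factor):
        """Toy QCM curve: cost per good part rises steeply as the
        quality index q in [0, 1) approaches 1, because scrap and
        rework dominate at high quality targets."""
        if not 0.0 <= quality < 1.0:
            raise ValueError("quality index must lie in [0, 1)")
        return base_cost + scrap_factor * quality / (1.0 - quality)

    def cheaper_process(quality, processes):
        """Pick the (name, base_cost, scrap_factor) tuple that meets
        the target quality at the lowest unit cost."""
        return min(processes, key=lambda p: unit_cost(quality, p[1], p[2]))[0]
    ```

    With two notional processes, the cheaper choice can flip as the quality target rises, which is exactly the comparison the QCM approach is meant to support.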

  8. Human visual performance model for crewstation design

    NASA Astrophysics Data System (ADS)

    Larimer, James O.; Prevost, Michael P.; Arditi, Aries R.; Azueta, Steven; Bergen, James R.; Lubin, Jeffrey

    1991-08-01

    In a cockpit, the crewstation of an airplane, the ability of the pilot to unambiguously perceive rapidly changing information both internal and external to the crewstation is critical. To assess the impact of crewstation design decisions on the pilot's ability to perceive information, the designer needs a means of evaluating the trade-offs that result from different designs. The Visibility Modeling Tool (VMT) provides the designer with a CAD tool for assessing these trade-offs. It combines the technologies of computer graphics, computational geometry, human performance modeling and equipment modeling into a computer-based interactive design tool. Through a simple interactive interface, a designer can manipulate design parameters such as the geometry of the cockpit, environmental factors such as ambient lighting, pilot parameters such as point of regard and adaptation state, and equipment parameters such as the location of displays, their size and the contrast of displayed symbology. VMT provides an end-to-end analysis that answers questions such as 'Will the pilot be able to read the display?' Performance data can be projected, in the form of 3D contours, into the crewstation graphic model, providing the designer with a footprint of the operator's visual capabilities, defining, for example, the regions in which fonts of a particular type, size and contrast can be read without error. Geometrical data such as the pilot's volume field of view, occlusions caused by facial geometry, helmet margins, and objects in the crewstation can also be projected into the crewstation graphic model with respect to the coordinates of the aviator's eyes and fixation point. The intersections of the projections with objects in the crewstation delineate the area of coverage, masking, or occlusion associated with the objects. Objects in the crewstation space can be projected onto models of the operator's retinas. These projections can be used to provide the designer with the
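    The legibility question quoted above ("Will the pilot be able to read the display?") ultimately reduces to the visual angle a character subtends at the pilot's eye. A minimal sketch follows; the 16-arcminute threshold is an assumption chosen for illustration, not a value from the VMT itself.

    ```python
    import math

    def visual_angle_arcmin(size_m, distance_m):
        """Visual angle subtended by a character of the given height
        at the given viewing distance, in minutes of arc."""
        return math.degrees(2 * math.atan(size_m / (2 * distance_m))) * 60

    def legible(size_m, distance_m, threshold_arcmin=16.0):
        """Rough legibility check: design guidance typically asks for
        characters of roughly 16-20 arcmin or more. The default
        threshold here is an assumed placeholder value."""
        return visual_angle_arcmin(size_m, distance_m) >= threshold_arcmin
    ```

    A 5 mm character viewed from 0.7 m subtends about 24.6 arcmin and passes the assumed threshold, while a 2 mm character at the same distance does not.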

  9. Implementation of the Business Process Modelling Notation (BPMN) in the modelling of anatomic pathology processes

    PubMed Central

    Rojo, Marcial García; Rolón, Elvira; Calahorra, Luis; García, Felix Óscar; Sánchez, Rosario Paloma; Ruiz, Francisco; Ballester, Nieves; Armenteros, María; Rodríguez, Teresa; Espartero, Rafael Martín

    2008-01-01

    Background Process orientation is one of the essential elements of quality management systems, including those in use in healthcare. Business processes in hospitals are very complex and variable. BPMN (Business Process Modelling Notation) is a user-oriented language specifically designed for the modelling of business (organizational) processes. We know of no previous experience using this notation to model Pathology processes, in Spain or elsewhere. We present our experience in elaborating conceptual models of Pathology processes, as part of a global programmed surgical patient process, using BPMN. Methods With the objective of analyzing the use of BPMN notation in real cases, a multidisciplinary work group was created, including software engineers from the Dep. of Technologies and Information Systems of the University of Castilla-La Mancha and health professionals and administrative staff from the Hospital General de Ciudad Real. The collaborative work was carried out in six phases: informative meetings, intensive training, process selection, definition of the work method, process description by hospital experts, and process modelling. Results The modelling of Anatomic Pathology processes using BPMN is presented. The subprocesses presented are those corresponding to the surgical pathology examination of samples coming from the operating theatre, including the planning and realization of frozen studies. Conclusion The modelling of Anatomic Pathology subprocesses has allowed the creation of an understandable graphical model, where management and improvements are more easily implemented by health professionals. PMID:18673511

  10. Integrating Surface Modeling into the Engineering Design Graphics Curriculum

    ERIC Educational Resources Information Center

    Hartman, Nathan W.

    2006-01-01

    It has been suggested there is a knowledge base that surrounds the use of 3D modeling within the engineering design process and correspondingly within engineering design graphics education. While solid modeling receives a great deal of attention and discussion relative to curriculum efforts, and rightly so, surface modeling is an equally viable 3D…