Sample records for process design based

  1. On Intelligent Design and Planning Method of Process Route Based on Gun Breech Machining Process

    NASA Astrophysics Data System (ADS)

    Hongzhi, Zhao; Jian, Zhang

    2018-03-01

    The paper presents an approach to intelligent design and planning of the process route for gun breech machining, addressing several problems of the traditional approach: the complexity of the gun breech machining process, the tedium of route design, and the long, hard-to-manage design cycle. Based on the gun breech machining process, an intelligent design and planning system for the process route is developed using DEST and VC++. The system includes two functional modules: intelligent process route design and process route planning. The intelligent design module analyzes the gun breech machining process and summarizes breech process knowledge to build the knowledge base and inference engine, from which the gun breech process route is output automatically. Building on the intelligent design module, the final process route is created, edited and managed in the process route planning module.

  2. Biomimetic design processes in architecture: morphogenetic and evolutionary computational design.

    PubMed

    Menges, Achim

    2012-03-01

    Design computation has profound impact on architectural design methods. This paper explains how computational design enables the development of biomimetic design processes specific to architecture, and how they need to be significantly different from established biomimetic processes in engineering disciplines. The paper first explains the fundamental difference between computer-aided and computational design in architecture, as the understanding of this distinction is of critical importance for the research presented. Thereafter, the conceptual relation and possible transfer of principles from natural morphogenesis to design computation are introduced and the related developments of generative, feature-based, constraint-based, process-based and feedback-based computational design methods are presented. This morphogenetic design research is then related to exploratory evolutionary computation, followed by the presentation of two case studies focusing on the exemplary development of spatial envelope morphologies and urban block morphologies.

  3. A systems-based approach for integrated design of materials, products and design process chains

    NASA Astrophysics Data System (ADS)

    Panchal, Jitesh H.; Choi, Hae-Jin; Allen, Janet K.; McDowell, David L.; Mistree, Farrokh

    2007-12-01

    The concurrent design of materials and products provides designers with flexibility to achieve design objectives that were not previously accessible. However, the improved flexibility comes at a cost of increased complexity of the design process chains and the materials simulation models used for executing the design chains. Efforts to reduce the complexity generally result in increased uncertainty. We contend that a systems-based approach is essential for managing both the complexity and the uncertainty in design process chains and simulation models in concurrent material and product design. Our approach is based on simplifying the design process chains systematically such that the resulting uncertainty does not significantly affect the overall system performance. Similarly, instead of striving for accurate models for multiscale systems (that are inherently complex), we rely on making design decisions that are robust to uncertainties in the models. Accordingly, we pursue hierarchical modeling in the context of design of multiscale systems. In this paper our focus is on design process chains. We present a systems-based approach, premised on the assumption that complex systems can be designed efficiently by managing the complexity of design process chains. The approach relies on (a) the use of reusable interaction patterns to model design process chains, and (b) consideration of design process decisions using value-of-information-based metrics. The approach is illustrated using a Multifunctional Energetic Structural Material (MESM) design example. Energetic materials store considerable energy which can be released through shock-induced detonation; conventionally, they are not engineered for strength properties. The design objectives for the MESM in this paper include both sufficient strength and energy release characteristics. The design is carried out by using models at different length and time scales that simulate different aspects of the system. Finally, by applying the method to the MESM design problem, we show that the integrated design of materials and products can be carried out more efficiently by explicitly accounting for design process decisions with the hierarchy of models.
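
    The value-of-information idea mentioned above can be made concrete with a toy calculation. The sketch below computes the expected value of perfect information for a single design-process decision (whether refining a model is worth its cost); the payoff table, state probabilities, and alternative names are illustrative assumptions, not values from the MESM study.

```python
# Minimal sketch of a value-of-information metric for a design-process decision:
# should we refine a model before committing to a design alternative?
# The payoff table and probabilities below are illustrative assumptions,
# not data from the MESM study.
import numpy as np

# payoff[d, s]: utility of design alternative d if the true model state is s
payoff = np.array([
    [10.0,  2.0],   # alternative A
    [ 6.0,  7.0],   # alternative B
])
p_state = np.array([0.6, 0.4])   # prior belief over model states

# Decide now, using only the prior (no model refinement).
value_without_info = (payoff @ p_state).max()

# Decide after a (hypothetical) perfect model resolves the state.
value_with_perfect_info = (payoff.max(axis=0) * p_state).sum()

# Expected value of perfect information: an upper bound on what a
# model-refinement step in the design process chain can be worth.
evpi = value_with_perfect_info - value_without_info
print(f"EVPI = {evpi:.2f}")   # refine the model only if it costs less than this
```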

  4. Applying Constructivist and Objectivist Learning Theories in the Design of a Web-based Course: Implications for Practice.

    ERIC Educational Resources Information Center

    Moallem, Mahnaz

    2001-01-01

    Provides an overview of the process of designing and developing a Web-based course using instructional design principles and models, including constructivist and objectivist theories. Explains the process of implementing an instructional design model in designing a Web-based undergraduate course and evaluates the model based on course evaluations.…

  5. Conceptual design of distillation-based hybrid separation processes.

    PubMed

    Skiborowski, Mirko; Harwardt, Andreas; Marquardt, Wolfgang

    2013-01-01

    Hybrid separation processes combine different separation principles and constitute a promising design option for the separation of complex mixtures. Particularly, the integration of distillation with other unit operations can significantly improve the separation of close-boiling or azeotropic mixtures. Although the design of single-unit operations is well understood and supported by computational methods, the optimal design of flowsheets of hybrid separation processes is still a challenging task. The large number of operational and design degrees of freedom requires a systematic and optimization-based design approach. To this end, a structured approach, the so-called process synthesis framework, is proposed. This article reviews available computational methods for the conceptual design of distillation-based hybrid processes for the separation of liquid mixtures. Open problems are identified that must be addressed to finally establish a structured process synthesis framework for such processes.

  6. Managing Analysis Models in the Design Process

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2006-01-01

    Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.

  7. New Vistas in Chemical Product and Process Design.

    PubMed

    Zhang, Lei; Babi, Deenesh K; Gani, Rafiqul

    2016-06-07

    Design of chemicals-based products is broadly classified into those that are process centered and those that are product centered. In this article, the designs of both classes of products are reviewed from a process systems point of view; developments related to the design of the chemical product, its corresponding process, and its integration are highlighted. Although significant advances have been made in the development of systematic model-based techniques for process design (also for optimization, operation, and control), much work is needed to reach the same level for product design. Timeline diagrams illustrating key contributions in product design, process design, and integrated product-process design are presented. The search for novel, innovative, and sustainable solutions must be matched by consideration of issues related to the multidisciplinary nature of problems, the lack of data needed for model development, solution strategies that incorporate multiscale options, and reliability versus predictive power. The need for an integrated model-experiment-based design approach is discussed together with benefits of employing a systematic computer-aided framework with built-in design templates.

  8. Affordable Design: A Methodology to Implement Process-Based Manufacturing Cost into the Traditional Performance-Focused Multidisciplinary Design Optimization

    NASA Technical Reports Server (NTRS)

    Bao, Han P.; Samareh, J. A.

    2000-01-01

    The primary objective of this paper is to demonstrate the use of process-based manufacturing and assembly cost models in a traditional performance-focused multidisciplinary design and optimization process. The use of automated cost-performance analysis is an enabling technology that could bring realistic process-based manufacturing and assembly cost into multidisciplinary design and optimization. In this paper, we present a new methodology for incorporating process costing into a standard multidisciplinary design optimization process. Material, manufacturing process, and assembly process costs could then be used as the objective function for the optimization method. A case study involving forty-six different configurations of a simple wing is presented, indicating that a design based on performance criteria alone may not necessarily be the most affordable as far as manufacturing and assembly cost is concerned.

  9. Towards a Web-Based Handbook of Generic, Process-Oriented Learning Designs

    ERIC Educational Resources Information Center

    Marjanovic, Olivera

    2005-01-01

    Process-oriented learning designs are innovative learning activities that include a set of inter-related learning tasks and are generic (could be used across disciplines). An example includes a problem-solving process widely used in problem-based learning today. Most of the existing process-oriented learning designs are not documented, let alone…

  10. A Framework for Preliminary Design of Aircraft Structures Based on Process Information. Part 1

    NASA Technical Reports Server (NTRS)

    Rais-Rohani, Masoud

    1998-01-01

    This report discusses the general framework and development of a computational tool for preliminary design of aircraft structures based on process information. The described methodology is suitable for multidisciplinary design optimization (MDO) activities associated with integrated product and process development (IPPD). The framework consists of three parts: (1) product and process definitions; (2) engineering synthesis; and (3) optimization. The product and process definitions are part of the input information provided by the design team. The backbone of the system is its ability to analyze a given structural design for performance as well as manufacturability and cost assessment. The system uses a database on material systems and manufacturing processes. Based on the identified set of design variables and an objective function, the system is capable of performing optimization subject to manufacturability, cost, and performance constraints. The accuracy of the manufacturability measures and cost models discussed here depends largely on the available data on specific methods of manufacture and assembly and the associated labor requirements. As such, our focus in this research has been on the methodology itself and not so much on its accurate implementation in an industrial setting. A three-tier approach is presented for an IPPD-MDO based design of aircraft structures. The variable-complexity cost estimation methodology and an approach for integrating manufacturing cost assessment into the design process are also discussed. This report is presented in two parts. In the first part, the design methodology is presented, and the computational design tool is described. In the second part, a prototype model of the preliminary design Tool for Aircraft Structures based on Process Information (TASPI) is described. Part two also contains an example problem that applies the methodology described here for evaluation of six different design concepts for a wing spar.

  11. Intelligent design of permanent magnet synchronous motor based on CBR

    NASA Astrophysics Data System (ADS)

    Li, Cong; Fan, Beibei

    2018-05-01

    Many problems arise in the design process of the permanent magnet synchronous motor (PMSM), such as the complexity of the design process, over-reliance on designers' experience, and the lack of accumulation and inheritance of design knowledge. To solve these problems, a PMSM design method based on CBR is proposed. In this paper, a case-similarity calculation method for case-based reasoning (CBR) is proposed for retrieving a suitable initial design scheme. This method helps designers produce a conceptual PMSM solution quickly by referencing previous design cases. The case retention process gives the system a self-enriching function, which improves the design ability of the system with continued use.
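
    The case-retrieval step that the abstract describes can be sketched as a weighted nearest-neighbour similarity calculation. The example below is a hedged illustration only; the attributes, weights, ranges, and stored cases are hypothetical and are not taken from the paper's PMSM case base.

```python
# Sketch of the case-retrieval step in a CBR-style motor design aid:
# weighted nearest-neighbour similarity between a new requirement and stored
# cases. The attributes, weights, and case values are hypothetical examples,
# not data from the PMSM system described in the paper.

# Each case: design requirements (inputs) plus the stored design scheme.
case_base = [
    {"rated_power_kW": 5.5, "rated_speed_rpm": 1500, "torque_Nm": 35, "scheme": "Case-01"},
    {"rated_power_kW": 7.5, "rated_speed_rpm": 3000, "torque_Nm": 24, "scheme": "Case-02"},
    {"rated_power_kW": 11.0, "rated_speed_rpm": 1500, "torque_Nm": 70, "scheme": "Case-03"},
]
weights = {"rated_power_kW": 0.4, "rated_speed_rpm": 0.3, "torque_Nm": 0.3}
ranges  = {"rated_power_kW": 20.0, "rated_speed_rpm": 6000.0, "torque_Nm": 100.0}

def similarity(query, case):
    """Weighted sum of per-attribute similarities normalised to [0, 1]."""
    return sum(
        w * (1.0 - abs(query[a] - case[a]) / ranges[a])
        for a, w in weights.items()
    )

def retrieve(query):
    """Return the most similar stored case as the initial design scheme."""
    return max(case_base, key=lambda c: similarity(query, c))

new_requirement = {"rated_power_kW": 6.0, "rated_speed_rpm": 1500, "torque_Nm": 38}
best = retrieve(new_requirement)
print(best["scheme"], round(similarity(new_requirement, best), 3))
```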

  12. Enhancing Users' Participation in Business Process Modeling through Ontology-Based Training

    NASA Astrophysics Data System (ADS)

    Macris, A.; Malamateniou, F.; Vassilacopoulos, G.

    Successful business process design requires active participation of users who are familiar with organizational activities and business process modelling concepts. Hence, there is a need to provide users with reusable, flexible, agile and adaptable training material in order to enable them to instil their knowledge and expertise in business process design and automation activities. Knowledge reusability is of paramount importance in designing training material on process modelling since it enables users to participate actively in process design/redesign activities stimulated by the changing business environment. This paper presents a prototype approach for the design and use of training material that provides significant advantages to both the designer (knowledge-content reusability and semantic web enabling) and the user (semantic search, knowledge navigation and knowledge dissemination). The approach is based on externalizing domain knowledge in the form of ontology-based knowledge networks (i.e. training scenarios serving specific training needs) so that it is made reusable.

  13. Business Process Design Method Based on Business Event Model for Enterprise Information System Integration

    NASA Astrophysics Data System (ADS)

    Kobayashi, Takashi; Komoda, Norihisa

    Traditional business process design methods, of which the use case is the most typical, provide no useful framework for designing the activity sequence. As a result, design efficiency and quality vary widely according to the designer's experience and skill. In this paper, to solve this problem, we propose a model of business events and their state transitions (a basic business event model) based on the language/action perspective, a result from the cognitive science domain. In business process design, using this model, we determine event occurrence conditions so that the events synchronize with one another. We also propose a design pattern for deciding the event occurrence conditions (a business event improvement strategy). Lastly, we apply the business process design method based on the business event model and the business event improvement strategy to a credit card issuing process and estimate its effect.

  14. The research on construction and application of machining process knowledge base

    NASA Astrophysics Data System (ADS)

    Zhao, Tan; Qiao, Lihong; Qie, Yifan; Guo, Kai

    2018-03-01

    To realize the application of knowledge in machining process design, and from the perspective of knowledge use in computer-aided process planning (CAPP), a hierarchical knowledge classification structure is established according to the characteristics of the mechanical engineering field. Machining process knowledge is expressed by means of production rules and object-oriented methods. Three kinds of knowledge base models are constructed according to this representation of machining process knowledge. The paper gives the definition and classification of machining process knowledge, the knowledge models, and the application flow of knowledge-base-driven process design, and the main steps of the machine tool selection decision are carried out as an application of the knowledge base.
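
    A minimal sketch of how production rules in such a knowledge base might drive a process-design decision is given below; the rule contents, part attributes, and machine-tool recommendations are invented for illustration and are not the knowledge base described in the paper.

```python
# Minimal sketch of how production rules in a machining-process knowledge base
# might drive a design decision (here: a machine-tool recommendation).
# The rule contents and part attributes are illustrative assumptions, not the
# actual knowledge base described in the paper.

part = {"feature": "hole", "diameter_mm": 12, "tolerance_grade": "IT7", "material": "steel"}

# Production rules: IF condition(part) THEN conclusion.
rules = [
    (lambda p: p["feature"] == "hole" and p["tolerance_grade"] in ("IT6", "IT7"),
     "drill then ream on a CNC drilling machine"),
    (lambda p: p["feature"] == "hole" and p["tolerance_grade"] not in ("IT6", "IT7"),
     "drill on a radial drilling machine"),
    (lambda p: p["feature"] == "plane",
     "mill on a vertical milling machine"),
]

def infer(part_facts):
    """Forward-chaining style: fire every rule whose condition holds."""
    return [conclusion for condition, conclusion in rules if condition(part_facts)]

print(infer(part))   # -> ['drill then ream on a CNC drilling machine']
```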

  15. Contingency theoretic methodology for agent-based web-oriented manufacturing systems

    NASA Astrophysics Data System (ADS)

    Durrett, John R.; Burnell, Lisa J.; Priest, John W.

    2000-12-01

    The development of distributed, agent-based, web-oriented, N-tier Information Systems (IS) must be supported by a design methodology capable of responding to the convergence of shifts in business process design, organizational structure, computing, and telecommunications infrastructures. We introduce a contingency theoretic model for the use of open, ubiquitous software infrastructure in the design of flexible organizational IS. Our basic premise is that developers should shift the way they view the software design process: from solving a problem to dynamically creating teams of software components. We postulate that developing effective, efficient, flexible, component-based distributed software requires reconceptualizing the current development model. The basic concepts of distributed software design are merged with the environment-causes-structure relationship from contingency theory; the task-uncertainty of organizational-information-processing relationships from information processing theory; and the concept of inter-process dependencies from coordination theory. Software processes are considered as employees, groups of processes as software teams, and distributed systems as software organizations. Design techniques already used in the design of flexible business processes and well researched in the domain of the organizational sciences are presented. Guidelines that can be utilized in the creation of component-based distributed software are discussed.

  16. A Co-Design Process Microanalysis: Stages and Facilitators of an Inquiry-Based and Technology-Enhanced Learning Scenario

    ERIC Educational Resources Information Center

    Barbera, Elena; Garcia, Iolanda; Fuertes-Alpiste, Marc

    2017-01-01

    This paper presents a case study of the co-design process for an online course on Sustainable Development (Degree in Tourism) involving the teacher, two students, and the project researchers. The co-design process was founded on an inquiry-based and technology-enhanced model that takes shape in a set of design principles. The research had two main…

  17. SLS Navigation Model-Based Design Approach

    NASA Technical Reports Server (NTRS)

    Oliver, T. Emerson; Anzalone, Evan; Geohagan, Kevin; Bernard, Bill; Park, Thomas

    2018-01-01

    The SLS Program chose to implement a Model-based Design and Model-based Requirements approach for managing component design information and system requirements. This approach differs from previous large-scale design efforts at Marshall Space Flight Center where design documentation alone conveyed information required for vehicle design and analysis and where extensive requirements sets were used to scope and constrain the design. The SLS Navigation Team has been responsible for the Program-controlled Design Math Models (DMMs) which describe and represent the performance of the Inertial Navigation System (INS) and the Rate Gyro Assemblies (RGAs) used by Guidance, Navigation, and Controls (GN&C). The SLS Navigation Team is also responsible for the navigation algorithms. The navigation algorithms are delivered for implementation on the flight hardware as a DMM. For the SLS Block 1-B design, the additional GPS Receiver hardware is managed as a DMM at the vehicle design level. This paper provides a discussion of the processes and methods used to engineer, design, and coordinate engineering trades and performance assessments using SLS practices as applied to the GN&C system, with a particular focus on the Navigation components. These include composing system requirements, requirements verification, model development, model verification and validation, and modeling and analysis approaches. The Model-based Design and Requirements approach does not reduce the effort associated with the design process versus previous processes used at Marshall Space Flight Center. Instead, the approach takes advantage of overlap between the requirements development and management process, and the design and analysis process by efficiently combining the control (i.e., the requirement) and the design mechanisms. The design mechanism is the representation of the component behavior and performance in design and analysis tools. The focus in the early design process shifts from the development and management of design requirements to the development of usable models, model requirements, and model verification and validation efforts. The models themselves are represented in C/C++ code and accompanying data files. Under the idealized process, potential ambiguity in specification is reduced because the model must be implementable versus a requirement which is not necessarily subject to this constraint. Further, the models are shown to emulate the hardware during validation. For models developed by the Navigation Team, a common interface/standalone environment was developed. The common environment allows for easy implementation in design and analysis tools. Mechanisms such as unit test cases ensure implementation as the developer intended. The model verification and validation process provides a very high level of component design insight. The origin and implementation of the SLS variant of Model-based Design is described from the perspective of the SLS Navigation Team. The format of the models and the requirements are described. The Model-based Design approach has many benefits but is not without potential complications. Key lessons learned associated with the implementation of the Model-based Design approach and process from infancy to verification and certification are discussed.

  18. From the Analysis of Work-Processes to Designing Competence-Based Occupational Standards and Vocational Curricula

    ERIC Educational Resources Information Center

    Tutlys, Vidmantas; Spöttl, Georg

    2017-01-01

    Purpose: This paper aims to explore methodological and institutional challenges on application of the work-process analysis approach in the design and development of competence-based occupational standards for Lithuania. Design/methodology/approach: The theoretical analysis is based on the review of scientific literature and the analysis of…

  19. Polymer based tunneling sensor

    NASA Technical Reports Server (NTRS)

    Wang, Jing (Inventor); Zhao, Yongjun (Inventor); Cui, Tianhong (Inventor)

    2006-01-01

    A process for fabricating a polymer-based circuit comprises the following steps. A mold of a design is formed through a lithography process. The design is transferred to a polymer substrate through a hot embossing process. A metal layer is then deposited over at least part of said design, and at least one electrical lead is connected to said metal layer.

  20. Seven-Step Problem-Based Learning in an Interaction Design Course

    ERIC Educational Resources Information Center

    Schultz, Nette; Christensen, Hans Peter

    2004-01-01

    The objective in this paper is the implementation of the highly structured seven-step problem-based learning (PBL) procedure as part of the learning process in a human-computer interaction (HCI) design course at the Technical University of Denmark, taking into account the common learning processes in PBL and the interaction design process. These…

  1. A novel surrogate-based approach for optimal design of electromagnetic-based circuits

    NASA Astrophysics Data System (ADS)

    Hassan, Abdel-Karim S. O.; Mohamed, Ahmed S. A.; Rabie, Azza A.; Etman, Ahmed S.

    2016-02-01

    A new geometric design centring approach for optimal design of central processing unit-intensive electromagnetic (EM)-based circuits is introduced. The approach uses norms related to the probability distribution of the circuit parameters to find distances from a point to the feasible region boundaries by solving nonlinear optimization problems. Based on these normed distances, the design centring problem is formulated as a max-min optimization problem. A convergent iterative boundary search technique is exploited to find the normed distances. To alleviate the computation cost associated with the EM-based circuits design cycle, space-mapping (SM) surrogates are used to create a sequence of iteratively updated feasible region approximations. In each SM feasible region approximation, the centring process using normed distances is implemented, leading to a better centre point. The process is repeated until a final design centre is attained. Practical examples are given to show the effectiveness of the new design centring method for EM-based circuits.
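
    A much-simplified picture of the normed-distance centring idea is sketched below: distances to the feasible-region boundary are found by bisection along a few directions, and the centre is moved to maximise the worst-case distance. The toy two-parameter feasible region, the direction set, and the use of an explicit constraint function in place of space-mapping surrogates of EM simulations are assumptions made purely for illustration.

```python
# Geometric design centring in miniature: maximise the smallest distance from a
# candidate centre to the feasible-region boundary. The two-parameter feasible
# region, the search directions, and the use of a cheap explicit model (instead
# of a space-mapping surrogate of an EM simulation) are simplifying assumptions
# for illustration only.
import numpy as np
from scipy.optimize import minimize

def feasible(x):
    """Toy feasibility test standing in for EM-simulation-based specifications."""
    return (x[0] + x[1] <= 3.0) and (x[0] >= 0.2) and (x[1] >= 0.2) and (x[0] * x[1] <= 1.8)

def boundary_distance(x, direction, t_max=5.0, iters=40):
    """Bisection along a direction to find the distance to the boundary."""
    if not feasible(x):
        return 0.0
    lo, hi = 0.0, t_max
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if feasible(x + mid * direction) else (lo, mid)
    return lo

directions = [np.array(d, dtype=float) for d in
              [(1, 0), (-1, 0), (0, 1), (0, -1), (1, 1), (-1, -1)]]
directions = [d / np.linalg.norm(d) for d in directions]

def negative_worst_case_distance(x):
    return -min(boundary_distance(np.asarray(x), d) for d in directions)

result = minimize(negative_worst_case_distance, x0=[0.5, 0.5], method="Nelder-Mead")
print("design centre:", result.x, "worst-case distance:", -result.fun)
```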

  2. [Comparison of two algorithms for development of design space-overlapping method and probability-based method].

    PubMed

    Shao, Jing-Yuan; Qu, Hai-Bin; Gong, Xing-Chu

    2018-05-01

    In this work, two algorithms for design space calculation, the overlapping method and the probability-based method, were compared using data collected from the extraction process of Codonopsis Radix as an example. In the probability-based method, experimental error was simulated to calculate the probability of reaching the standard. The effects of several parameters on the calculated design space were studied, including the number of simulations, the step length, and the acceptable probability threshold. For the extraction process of Codonopsis Radix, 10 000 simulations and a calculation step length of 0.02 led to a satisfactory design space. In general, the overlapping method is easy to understand and can be realized with several kinds of commercial software without writing programs, but it does not indicate the reliability of the process evaluation indexes when operating in the design space. The probability-based method is more complex to compute, but it quantifies the reliability with which the process indexes reach the standard at the acceptable probability threshold. In addition, the probability-based method avoids abrupt changes in probability at the edge of the design space. Therefore, the probability-based method is recommended for design space calculation. Copyright© by the Chinese Pharmaceutical Association.
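
    The probability-based algorithm can be illustrated with a small Monte Carlo sketch: at each grid point of the process parameters, simulated experimental error is added to a process model and the fraction of simulations meeting the specification is compared with the acceptable probability threshold. The stand-in model, error level, and specification below are assumptions; only the simulation count (10 000) and step length (0.02) echo the abstract.

```python
# Sketch of the probability-based design space idea: on a grid of candidate
# process parameters, simulate experimental error around a (hypothetical)
# process model and record the probability that the quality attribute meets its
# specification. The model, error level, and specification are illustrative
# assumptions, not the Codonopsis Radix extraction data from the paper.
import numpy as np

rng = np.random.default_rng(0)
n_sim = 10_000                       # simulation count, as in the abstract
step = 0.02                          # grid step length, as in the abstract

def predicted_yield(time_h, ratio):
    """Stand-in process model: extraction yield vs. time and solvent ratio."""
    return 0.30 + 0.25 * time_h + 0.01 * ratio - 0.08 * time_h ** 2

spec = 0.55                          # acceptance criterion on the yield
sigma = 0.03                         # assumed experimental error (std. dev.)
threshold = 0.90                     # acceptable probability of reaching the spec

times = np.arange(0.5, 2.0 + step, step)
ratios = np.arange(6.0, 12.0 + step * 10, step * 10)

design_space = []
for t in times:
    for r in ratios:
        noisy = predicted_yield(t, r) + rng.normal(0.0, sigma, n_sim)
        if (noisy >= spec).mean() >= threshold:
            design_space.append((round(t, 2), round(r, 1)))

print(f"{len(design_space)} grid points qualify for the design space")
```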

  3. Active pharmaceutical ingredient (API) production involving continuous processes--a process system engineering (PSE)-assisted design framework.

    PubMed

    Cervera-Padrell, Albert E; Skovby, Tommy; Kiil, Søren; Gani, Rafiqul; Gernaey, Krist V

    2012-10-01

    A systematic framework is proposed for the design of continuous pharmaceutical manufacturing processes. Specifically, the design framework focuses on organic chemistry based, active pharmaceutical ingredient (API) synthetic processes, but could potentially be extended to biocatalytic and fermentation-based products. The method exploits the synergic combination of continuous flow technologies (e.g., microfluidic techniques) and process systems engineering (PSE) methods and tools for faster process design and increased process understanding throughout the whole drug product and process development cycle. The design framework structures the many different and challenging design problems (e.g., solvent selection, reactor design, and design of separation and purification operations), driving the user from the initial drug discovery steps--where process knowledge is very limited--toward the detailed design and analysis. Examples from the literature of PSE methods and tools applied to pharmaceutical process design and novel pharmaceutical production technologies are provided along the text, assisting in the accumulation and interpretation of process knowledge. Different criteria are suggested for the selection of batch and continuous processes so that the whole design results in low capital and operational costs as well as low environmental footprint. The design framework has been applied to the retrofit of an existing batch-wise process used by H. Lundbeck A/S to produce an API: zuclopenthixol. Some of its batch operations were successfully converted into continuous mode, obtaining higher yields that allowed a significant simplification of the whole process. The material and environmental footprint of the process--evaluated through the process mass intensity index, that is, kg of material used per kg of product--was reduced to half of its initial value, with potential for further reduction. The case-study includes reaction steps typically used by the pharmaceutical industry featuring different characteristic reaction times, as well as L-L separation and distillation-based solvent exchange steps, and thus constitutes a good example of how the design framework can be useful to efficiently design novel or already existing API manufacturing processes taking advantage of continuous processes. Copyright © 2012 Elsevier B.V. All rights reserved.

  4. Improving the Aircraft Design Process Using Web-Based Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.; Follen, Gregory J. (Technical Monitor)

    2000-01-01

    Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.

  5. Improving the Aircraft Design Process Using Web-based Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.

    2003-01-01

    Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.

  6. The Design and Management of an Organisation's Lifelong Learning Curriculum

    ERIC Educational Resources Information Center

    Dealtry, Richard

    2009-01-01

    Purpose: The purpose of this paper is to examine the successful design and management of high performance work-based lifelong learning processes. Design: The paper summarises the process management practices and contextual parameters that are being applied in the successful design and management of high performance work based lifelong learning…

  7. Development of a Model for Measuring Scientific Processing Skills Based on Brain-Imaging Technology: Focused on the Experimental Design Process

    ERIC Educational Resources Information Center

    Lee, Il-Sun; Byeon, Jung-Ho; Kim, Young-shin; Kwon, Yong-Ju

    2014-01-01

    The purpose of this study was to develop a model for measuring experimental design ability based on functional magnetic resonance imaging (fMRI) during biological inquiry. More specifically, the researchers developed an experimental design task that measures experimental design ability. Using the developed experimental design task, they measured…

  8. Investigation of model-based physical design restrictions (Invited Paper)

    NASA Astrophysics Data System (ADS)

    Lucas, Kevin; Baron, Stanislas; Belledent, Jerome; Boone, Robert; Borjon, Amandine; Couderc, Christophe; Patterson, Kyle; Riviere-Cazaux, Lionel; Rody, Yves; Sundermann, Frank; Toublan, Olivier; Trouiller, Yorick; Urbani, Jean-Christophe; Wimmer, Karl

    2005-05-01

    As lithography and other patterning processes become more complex and more non-linear with each generation, the task of defining physical design rules necessarily increases in complexity as well. The goal of the physical design rules is to separate the physical layout structures that will yield well from those that will not. This is essentially a rule-based pre-silicon guarantee of layout correctness. However, the rapid increase in design rule requirement complexity has created logistical problems for both the design and process functions. Therefore, similar to the semiconductor industry's transition from rule-based to model-based optical proximity correction (OPC) due to increased patterning complexity, opportunities for improving physical design restrictions by implementing model-based physical design methods are evident. In this paper we analyze the possible need and applications for model-based physical design restrictions (MBPDR). We first analyze the traditional design rule evolution, development and usage methodologies for semiconductor manufacturers. Next we discuss examples of specific design rule challenges requiring new solution methods in the patterning regime of low K1 lithography and highly complex RET. We then evaluate possible working strategies for MBPDR in the process development and product design flows, including examples of recent model-based pre-silicon verification techniques. Finally we summarize with a proposed flow and key considerations for MBPDR implementation.

  9. Designing a Web-Based Science Learning Environment for Model-Based Collaborative Inquiry

    ERIC Educational Resources Information Center

    Sun, Daner; Looi, Chee-Kit

    2013-01-01

    The paper traces a research process in the design and development of a science learning environment called WiMVT (web-based inquirer with modeling and visualization technology). The WiMVT system is designed to help secondary school students build a sophisticated understanding of scientific conceptions, and the science inquiry process, as well as…

  10. Design Expert's Participation in Elementary Students' Collaborative Design Process

    ERIC Educational Resources Information Center

    Kangas, Kaiju; Seitamaa-Hakkarainen, Pirita; Hakkarainen, Kai

    2013-01-01

    The main goal of the present study was to provide insights into how disciplinary expertise might be infused into Design and Technology classrooms and how authentic processes based on professional design practices might be constructed. We describe elementary students' collaborative lamp designing process, where the leadership was provided by a…

  11. The role of optimization in the next generation of computer-based design tools

    NASA Technical Reports Server (NTRS)

    Rogan, J. Edward

    1989-01-01

    There is a close relationship between design optimization and the emerging new generation of computer-based tools for engineering design. With some notable exceptions, the development of these new tools has not taken full advantage of recent advances in numerical design optimization theory and practice. Recent work in the field of design process architecture has included an assessment of the impact of next-generation computer-based design tools on the design process. These results are summarized, and insights into the role of optimization in a design process based on these next-generation tools are presented. An example problem has been worked out to illustrate the application of this technique. The example problem - layout of an aircraft main landing gear - is one that is simple enough to be solved by many other techniques. Although the mathematical relationships describing the objective function and constraints for the landing gear layout problem can be written explicitly and are quite straightforward, an approximation technique has been used in the solution of this problem that can just as easily be applied to integrate supportability or producibility assessments using theory of measurement techniques into the design decision-making process.

  12. Consensus-Based Course Design and Implementation of Constructive Alignment Theory in a Power System Analysis Course

    ERIC Educational Resources Information Center

    Vanfretti, Luigi; Farrokhabadi, Mostafa

    2015-01-01

    This article presents the implementation of the constructive alignment theory (CAT) in a power system analysis course through a consensus-based course design process. The consensus-based design process involves both the instructor and graduate-level students and it aims to develop the CAT framework in a holistic manner with the goal of including…

  13. Research on animation design of growing plant based on 3D MAX technology

    NASA Astrophysics Data System (ADS)

    Chen, Yineng; Fang, Kui; Bu, Weiqiong; Zhang, Xiaoling; Lei, Menglong

    Because virtual plants place practical demands on the quality, imagery, and realism of animations of the plant growth process, this thesis designs such animation based on the mechanism and regularity of plant growth and proposes a design method based on 3D MAX technology. Repeated analysis and testing show that modeling, rendering, and animation fabrication are among the key technologies in the animation design process. On this basis, designers can subdivide the animation into five stages: seed germination, early plant growth, catagen, later growth, and blossom. This paper composites the animation of these five stages in the VP window to realize the complete 3D animation. Experimental results show that the animation achieves a rapid, visual, and realistic simulation of the plant growth process.

  14. Development of Conceptual Design Support Tool Founded on Formalization of Conceptual Design Process for Regenerative Life Support Systems

    NASA Astrophysics Data System (ADS)

    Miyajima, Hiroyuki; Yuhara, Naohiro

    Regenerative Life Support Systems (RLSS), which maintain human lives by recycling substances essential for living, are comprised of humans, plants, and material circulation systems. The plants supply food to the humans or reproduce water and gases by photosynthesis, while the material circulation systems physicochemically recycle and circulate substances disposed of by humans and plants. RLSS has attracted attention as manned space activities have shifted from short trips to long-term stays at bases such as a space station, a lunar base, or a Mars base. The present typical space base is the International Space Station (ISS), a manned experimental base for prolonged stays, where RLSS recycles only water and air. To accommodate prolonged and extended manned activity at future space bases, RLSS that use plants to provide food production and resource regeneration together are expected to be developed. The configuration of an RLSS should be designed to suit its particular mission, which may give rise to design requirements for RLSS with unprecedented configurations. Accordingly, it is necessary to establish a conceptual design method for generalized RLSS. It is difficult, however, to systematize the design process by analyzing previous designs because there are only a few ground experimental facilities, namely Japan's CEEF (Closed Ecology Experiment Facilities), the U.S. BIO-Plex (Bioregenerative Planetary Life Support Systems Test Complex), and Russia's BIOS3. For these reasons, a conceptual design method that does not rely on previous design examples is required for generalized RLSS. This study formalizes a conceptual design process and develops a conceptual design support tool for RLSS based on this design process.

  15. Creativity Processes of Students in the Design Studio

    ERIC Educational Resources Information Center

    Huber, Amy Mattingly; Leigh, Katharine E.; Tremblay, Kenneth R., Jr.

    2012-01-01

    The creative process is a multifaceted and dynamic path of thinking required to execute a project in design-based disciplines. The goal of this research was to test a model outlining the creative design process by investigating student experiences in a design project assignment. The study used an exploratory design to collect data from student…

  16. Integrating a Genetic Algorithm Into a Knowledge-Based System for Ordering Complex Design Processes

    NASA Technical Reports Server (NTRS)

    Rogers, James L.; McCulley, Collin M.; Bloebaum, Christina L.

    1996-01-01

    The design cycle associated with large engineering systems requires an initial decomposition of the complex system into design processes which are coupled through the transference of output data. Some of these design processes may be grouped into iterative subcycles. In analyzing or optimizing such a coupled system, it is essential to be able to determine the best ordering of the processes within these subcycles to reduce design cycle time and cost. Many decomposition approaches assume the capability is available to determine what design processes and couplings exist and what order of execution will be imposed during the design cycle. Unfortunately, this is often a complex problem and beyond the capabilities of a human design manager. A new feature, a genetic algorithm, has been added to DeMAID (Design Manager's Aid for Intelligent Decomposition) to allow the design manager to rapidly examine many different combinations of ordering processes in an iterative subcycle and to optimize the ordering based on cost, time, and iteration requirements. Two sample test cases are presented to show the effects of optimizing the ordering with a genetic algorithm.
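
    The ordering problem that the genetic algorithm in DeMAID addresses can be sketched in miniature: permutations of the processes in a subcycle are evolved to minimise the number of feedback couplings, which force iteration. The couplings, GA settings, and single-term objective below are illustrative assumptions; DeMAID's actual scoring also weighs time, cost, and iteration requirements.

```python
# Toy genetic algorithm for ordering coupled design processes so that as few
# couplings as possible point "backwards" (feedbacks force iteration). The
# couplings, GA settings, and cost function are illustrative assumptions;
# DeMAID's actual scoring also weighs time, cost, and iteration requirements.
import random

random.seed(1)
N = 6
# needs[i] = set of processes whose output process i consumes
needs = {0: {3}, 1: {0}, 2: {0, 1}, 3: {5}, 4: {2, 3}, 5: {1}}

def feedback_count(order):
    """Number of couplings whose supplier comes after its consumer."""
    position = {p: k for k, p in enumerate(order)}
    return sum(position[src] > position[dst] for dst in needs for src in needs[dst])

def crossover(a, b):
    """Order crossover: keep a slice of parent a, fill the rest in b's order."""
    i, j = sorted(random.sample(range(N), 2))
    middle = a[i:j]
    rest = [p for p in b if p not in middle]
    return rest[:i] + middle + rest[i:]

def mutate(order):
    i, j = random.sample(range(N), 2)
    order[i], order[j] = order[j], order[i]

population = [random.sample(range(N), N) for _ in range(30)]
for _ in range(60):                                   # generations
    population.sort(key=feedback_count)
    parents = population[:10]                         # simple truncation selection
    children = [crossover(random.choice(parents), random.choice(parents))
                for _ in range(20)]
    for child in children:
        if random.random() < 0.2:
            mutate(child)
    population = parents + children

best = min(population, key=feedback_count)
print("best ordering:", best, "feedback couplings:", feedback_count(best))
```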

  17. Investigation of Low-Reynolds-Number Rocket Nozzle Design Using PNS-Based Optimization Procedure

    NASA Technical Reports Server (NTRS)

    Hussaini, M. Moin; Korte, John J.

    1996-01-01

    An optimization approach to rocket nozzle design, based on computational fluid dynamics (CFD) methodology, is investigated for low-Reynolds-number cases. This study is undertaken to determine the benefits of this approach over those of classical design processes such as Rao's method. A CFD-based optimization procedure, using the parabolized Navier-Stokes (PNS) equations, is used to design conical and contoured axisymmetric nozzles. The advantage of this procedure is that it accounts for viscosity during the design process; other processes make an approximated boundary-layer correction after an inviscid design is created. Results showed significant improvement in the nozzle thrust coefficient over that of the baseline case; however, the unusual nozzle design necessitates further investigation of the accuracy of the PNS equations for modeling expanding flows with thick laminar boundary layers.

  18. A Meta-Analysis and Review of Holistic Face Processing

    PubMed Central

    Richler, Jennifer J.; Gauthier, Isabel

    2014-01-01

    The concept of holistic processing is a cornerstone of face recognition research, yet central questions related to holistic processing remain unanswered, and debates have thus far failed to reach a resolution despite accumulating empirical evidence. We argue that a considerable source of confusion in this literature stems from a methodological problem. Specifically, two different measures of holistic processing based on the composite paradigm (complete design and partial design) are used in the literature, but they often lead to qualitatively different results. First, we present a comprehensive review of the work that directly compares the two designs, and which clearly favors the complete design over the partial design. Second, we report a meta-analysis of holistic face processing according to both designs, and use this as further evidence for one design over the other. The meta-analysis effect size of holistic processing in the complete design is nearly three times that of the partial design. Effect sizes were not correlated between measures, consistent with the suggestion that they do not measure the same thing. Our meta-analysis also examines the correlation between conditions in the complete design of the composite task, and suggests that in an individual differences context, little is gained by including a misaligned baseline. Finally, we offer a comprehensive review of the state of knowledge about holistic processing based on evidence gathered from the measure we favor based on the first sections of our review—the complete design—and outline outstanding research questions in that new context. PMID:24956123

  19. Improvement of Organizational Performance and Instructional Design: An Analogy Based on General Principles of Natural Information Processing Systems

    ERIC Educational Resources Information Center

    Darabi, Aubteen; Kalyuga, Slava

    2012-01-01

    The process of improving organizational performance through designing systemic interventions has remarkable similarities to designing instruction for improving learners' performance. Both processes deal with subjects (learners and organizations correspondingly) with certain capabilities that are exposed to novel information designed for producing…

  20. Designing Educative Curriculum Materials: A Theoretically and Empirically Driven Process

    ERIC Educational Resources Information Center

    Davis, Elizabeth A.; Palincsar, Annemarie Sullivan; Arias, Anna Maria; Bismack, Amber Schultz; Marulis, Loren M.; Iwashyna, Stefanie K.

    2014-01-01

    In this article, the authors argue for a design process in the development of educative curriculum materials that is theoretically and empirically driven. Using a design-based research approach, they describe their design process for incorporating educative features intended to promote teacher learning into existing, high-quality curriculum…

  1. Manufacturing process and material selection in concurrent collaborative design of MEMS devices

    NASA Astrophysics Data System (ADS)

    Zha, Xuan F.; Du, H.

    2003-09-01

    In this paper we present a knowledge-intensive approach and system for selecting suitable manufacturing processes and materials for microelectromechanical systems (MEMS) devices in a concurrent collaborative design environment. Fundamental issues in MEMS manufacturing process and material selection, such as the concurrent design framework, manufacturing process and material hierarchies, and the selection strategy, are first addressed. Then, a fuzzy decision support scheme for a multi-criteria decision-making problem is proposed for estimating, ranking and selecting possible manufacturing processes, materials and their combinations. A Web-based prototype advisory system for MEMS manufacturing process and material selection, WebMEMS-MASS, is developed based on the client-knowledge server architecture and framework to help the designer find good processes and materials for MEMS devices. The system, as one of the important parts of an advanced simulation and modeling tool for MEMS design, is a concept-level process and material selection tool, which can be used as a standalone application or a Java applet via the Web. The running sessions of the system are inter-linked with webpages of tutorials and reference pages to explain the facets, fabrication processes and material choices, and the calculations and reasoning in selection are performed using process capability and material property data from a remote Web-based database and interactive knowledge base that can be maintained and updated via the Internet. The use of the developed system, including an operation scenario, use support, and integration with a MEMS collaborative design system, is presented. Finally, an illustrative example is provided.
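
    One plausible reading of the fuzzy multi-criteria scheme is a weighted aggregation of triangular fuzzy ratings followed by centroid defuzzification, sketched below. The candidate process/material pairs, criteria, weights, and ratings are invented for illustration and are not taken from WebMEMS-MASS.

```python
# Hedged sketch of multi-criteria ranking with triangular fuzzy ratings, in the
# spirit of the fuzzy decision-support scheme the abstract describes. The
# candidate process/material pairs, criteria, weights, and ratings are invented
# for illustration and are not taken from WebMEMS-MASS.

# Triangular fuzzy rating (low, mid, high) of each candidate on each criterion.
ratings = {
    "bulk micromachining + silicon": {
        "feature_resolution": (6, 7, 8), "cost": (5, 6, 7), "maturity": (8, 9, 10)},
    "surface micromachining + polysilicon": {
        "feature_resolution": (7, 8, 9), "cost": (4, 5, 6), "maturity": (7, 8, 9)},
    "LIGA + nickel": {
        "feature_resolution": (8, 9, 10), "cost": (2, 3, 4), "maturity": (4, 5, 6)},
}
weights = {"feature_resolution": 0.4, "cost": 0.35, "maturity": 0.25}

def weighted_fuzzy_score(candidate):
    """Weighted sum of triangular numbers, component-wise."""
    return tuple(
        sum(weights[c] * ratings[candidate][c][k] for c in weights)
        for k in range(3)
    )

def centroid(tri):
    """Defuzzify a triangular number by its centroid."""
    return sum(tri) / 3.0

ranked = sorted(ratings, key=lambda cand: centroid(weighted_fuzzy_score(cand)),
                reverse=True)
for cand in ranked:
    print(f"{cand}: {centroid(weighted_fuzzy_score(cand)):.2f}")
```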

  2. A meta-analysis and review of holistic face processing.

    PubMed

    Richler, Jennifer J; Gauthier, Isabel

    2014-09-01

    The concept of holistic processing is a cornerstone of face recognition research, yet central questions related to holistic processing remain unanswered, and debates have thus far failed to reach a resolution despite accumulating empirical evidence. We argue that a considerable source of confusion in this literature stems from a methodological problem. Specifically, 2 measures of holistic processing based on the composite paradigm (complete design and partial design) are used in the literature, but they often lead to qualitatively different results. First, we present a comprehensive review of the work that directly compares the 2 designs, and which clearly favors the complete design over the partial design. Second, we report a meta-analysis of holistic face processing according to both designs and use this as further evidence for one design over the other. The meta-analysis effect size of holistic processing in the complete design is nearly 3 times that of the partial design. Effect sizes were not correlated between measures, consistent with the suggestion that they do not measure the same thing. Our meta-analysis also examines the correlation between conditions in the complete design of the composite task, and suggests that in an individual differences context, little is gained by including a misaligned baseline. Finally, we offer a comprehensive review of the state of knowledge about holistic processing based on evidence gathered from the measure we favor based on the 1st sections of our review-the complete design-and outline outstanding research questions in that new context. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  3. Model-based design of experiments for cellular processes.

    PubMed

    Chakrabarty, Ankush; Buzzard, Gregery T; Rundell, Ann E

    2013-01-01

    Model-based design of experiments (MBDOE) assists in the planning of highly effective and efficient experiments. Although the foundations of this field are well-established, the application of these techniques to understand cellular processes is a fertile and rapidly advancing area as the community seeks to understand ever more complex cellular processes and systems. This review discusses the MBDOE paradigm along with applications and challenges within the context of cellular processes and systems. It also provides a brief tutorial on Fisher information matrix (FIM)-based and Bayesian experiment design methods along with an overview of existing software packages and computational advances that support MBDOE application and adoption within the Systems Biology community. As cell-based products and biologics progress into the commercial sector, it is anticipated that MBDOE will become an essential practice for design, quality control, and production. Copyright © 2013 Wiley Periodicals, Inc.
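
    A brief sketch of the FIM-based side of MBDOE: for a toy one-state decay model, candidate sampling schedules are compared by the determinant of the Fisher information matrix (D-optimality). The model, nominal parameters, noise level, and candidate schedules are illustrative assumptions, not an example from the review.

```python
# Brief sketch of FIM-based experiment design for a toy cellular process model
# y(t) = x0 * exp(-k * t): pick the sampling schedule whose Fisher information
# matrix has the largest determinant (D-optimality). The model, parameter
# values, noise level, and candidate schedules are illustrative assumptions.
import numpy as np

x0_true, k_true = 10.0, 0.5        # nominal parameter values
sigma = 0.2                        # assumed measurement noise (std. dev.)

def sensitivities(t):
    """d y / d theta for theta = (x0, k), evaluated at the nominal parameters."""
    e = np.exp(-k_true * t)
    return np.array([e, -x0_true * t * e])     # shape (2,)

def fisher_information(times):
    """FIM for independent Gaussian measurements at the given time points."""
    fim = np.zeros((2, 2))
    for t in times:
        s = sensitivities(t)
        fim += np.outer(s, s) / sigma**2
    return fim

candidate_designs = {
    "early only":   [0.5, 1.0, 1.5, 2.0],
    "spread out":   [0.5, 2.0, 4.0, 8.0],
    "late only":    [6.0, 7.0, 8.0, 9.0],
}
for name, times in candidate_designs.items():
    print(f"{name:12s} det(FIM) = {np.linalg.det(fisher_information(times)):.3e}")
```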

  4. Development of a Design Supporting System for Nano-Materials based on a Framework for Integrated Knowledge of Functioning-Manufacturing Process

    NASA Astrophysics Data System (ADS)

    Tarumi, Shinya; Kozaki, Kouji; Kitamura, Yoshinobu; Mizoguchi, Riichiro

    In the recent materials research, much work aims at realization of ``functional materials'' by changing structure and/or manufacturing process with nanotechnology. However, knowledge about the relationship among function, structure and manufacturing process is not well organized. So, material designers have to consider a lot of things at the same time. It would be very helpful for them to support their design process by a computer system. In this article, we discuss a conceptual design supporting system for nano-materials. Firstly, we consider a framework for representing functional structures and manufacturing processes of nano-materials with relationships among them. We expand our former framework for representing functional knowledge based on our investigation through discussion with experts of nano-materials. The extended framework has two features: 1) it represents functional structures and manufacturing processes comprehensively, 2) it expresses parameters of function and ways with their dependencies because they are important for material design. Next, we describe a conceptual design support system we developed based on the framework with its functionalities. Lastly, we evaluate the utility of our system in terms of functionality for design supports. For this purpose, we tried to represent two real examples of material design. And then we did an evaluation experiment on conceptual design of material using our system with the collaboration of domain experts.

  5. Design or "Design"--Envisioning a Future Design Education

    ERIC Educational Resources Information Center

    Sless, David

    2012-01-01

    Challenging the common grand vision of Design, this article considers "design" as a humble re-forming process based on evidence to substantiate its results. The designer is likened to a tinker who respects previous iterations of a design and seeks to retain what is useful while improving its performance. A design process is offered,…

  6. On the Development of a Computing Infrastructure that Facilitates IPPD from a Decision-Based Design Perspective

    NASA Technical Reports Server (NTRS)

    Hale, Mark A.; Craig, James I.; Mistree, Farrokh; Schrage, Daniel P.

    1995-01-01

    Integrated Product and Process Development (IPPD) embodies the simultaneous application of both system and quality engineering methods throughout an iterative design process. The use of IPPD results in the time-conscious, cost-saving development of engineering systems. Georgia Tech has proposed the development of an Integrated Design Engineering Simulator that will merge Integrated Product and Process Development with interdisciplinary analysis techniques and state-of-the-art computational technologies. To implement IPPD, a Decision-Based Design perspective is encapsulated in an approach that focuses on the role of the human designer in product development. The approach has two parts and is outlined in this paper. First, an architecture, called DREAMS, is being developed that facilitates design from a decision-based perspective. Second, a supporting computing infrastructure, called IMAGE, is being designed. The current status of development is given and future directions are outlined.

  7. Process-based organization design and hospital efficiency.

    PubMed

    Vera, Antonio; Kuntz, Ludwig

    2007-01-01

    The central idea of process-based organization design is that organizing a firm around core business processes leads to cost reductions and quality improvements. We investigated theoretically and empirically whether the implementation of a process-based organization design is advisable in hospitals. The data came from a database compiled by the Statistical Office of the German federal state of Rheinland-Pfalz and from a written questionnaire, which was sent to the chief executive officers (CEOs) of all 92 hospitals in this federal state. We used data envelopment analysis (DEA) to measure hospital efficiency, and factor analysis and regression analysis to test our hypothesis. Our principal finding is that a high degree of process-based organization has a moderate but significant positive effect on the efficiency of hospitals. The main implication is that hospitals should implement a process-based organization to improve their efficiency. However, to actually achieve positive effects on efficiency, it is of paramount importance to observe some implementation rules, in particular to mobilize physician participation and to create an adequate organizational culture.
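
    The efficiency measurement step can be illustrated with a compact input-oriented CCR DEA model solved as a linear program; the hospital input and output figures below are invented, and only the formulation is the point.

```python
# Compact sketch of input-oriented CCR data envelopment analysis (DEA), the
# efficiency measure the abstract mentions. The hospital input/output figures
# below are made up for illustration; only the LP formulation is the point.
import numpy as np
from scipy.optimize import linprog

# columns = hospitals (DMUs); rows = inputs (beds, staff) and outputs (cases)
X = np.array([[150.0, 400.0, 320.0, 520.0],      # beds
              [300.0, 900.0, 650.0, 1100.0]])    # full-time staff
Y = np.array([[9000.0, 26000.0, 21000.0, 30000.0]])  # treated cases per year

def ccr_efficiency(o):
    """Solve min theta s.t. the reference set dominates DMU o's inputs/outputs."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                       # minimise theta
    A_in = np.c_[-X[:, o], X]                         # sum lambda*x <= theta*x_o
    A_out = np.c_[np.zeros((s, 1)), -Y]               # sum lambda*y >= y_o
    A_ub = np.r_[A_in, A_out]
    b_ub = np.r_[np.zeros(m), -Y[:, o]]
    bounds = [(0, None)] * (n + 1)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[0]

for o in range(X.shape[1]):
    print(f"hospital {o}: efficiency = {ccr_efficiency(o):.3f}")
```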

  8. [Application of quality by design in granulation process for Ginkgo leaf tablet (Ⅲ): process control strategy based on design space].

    PubMed

    Cui, Xiang-Long; Xu, Bing; Sun, Fei; Dai, Sheng-Yun; Shi, Xin-Yuan; Qiao, Yan-Jiang

    2017-03-01

    In this paper, under the guidance of the quality by design (QbD) concept, a control strategy for the high shear wet granulation process of the ginkgo leaf tablet, based on the design space, was established to improve process controllability and product quality consistency. The median granule size (D50) and bulk density (Da) of the granules were identified as critical quality attributes (CQAs), and potential critical process parameters (pCPPs) were determined by failure mode and effects analysis (FMEA). A Plackett-Burman experimental design was used to screen the pCPPs, and the results demonstrated that the binder amount, the wet massing time and the wet mixing impeller speed were critical process parameters (CPPs). The design space of the high shear wet granulation process was developed within the pCPP ranges based on a Box-Behnken design and quadratic polynomial regression models. ANOVA showed that the P-values of the models were less than 0.05 and the P-values of the lack-of-fit tests were more than 0.1, indicating that the relationship between CQAs and CPPs could be well described by the mathematical models. D50 could be controlled within 170 to 500 μm, and the bulk density could be controlled within 0.30 to 0.44 g·cm⁻³, by using any CPP combination within the scope of the design space. Besides, granules produced with process parameters within the design space region could also meet the tensile strength requirement of the ginkgo leaf tablet. Copyright© by the Chinese Pharmaceutical Association.
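
    As a purely illustrative sketch of the kind of quadratic response-surface model that underlies such a design space (the factors stand for the three CPPs named above, but the data and coefficients are invented, not those of the granulation study), a second-order polynomial can be fitted by ordinary least squares and used to check whether a candidate parameter setting keeps a CQA within its specification:

```python
# Illustrative sketch only: fitting a second-order (quadratic) response surface
# relating one CQA (e.g. D50) to three coded CPPs (binder amount, wet massing
# time, impeller speed). Data are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(15, 3))             # 15 runs, 3 coded factors
d50 = 300 + 80*X[:, 0] + 40*X[:, 1] - 30*X[:, 2] \
      - 25*X[:, 0]**2 + 15*X[:, 0]*X[:, 1] + rng.normal(0, 10, 15)

def quad_terms(x):
    x1, x2, x3 = x
    return [1, x1, x2, x3, x1*x2, x1*x3, x2*x3, x1**2, x2**2, x3**2]

A = np.array([quad_terms(x) for x in X])
coef, *_ = np.linalg.lstsq(A, d50, rcond=None)   # least-squares fit of the model

def predict(x):
    return np.array(quad_terms(x)) @ coef

# Design-space style check: does a candidate CPP setting keep D50 within 170-500 um?
candidate = np.array([0.2, -0.5, 0.3])
print("predicted D50:", round(float(predict(candidate)), 1),
      "in spec:", 170 <= predict(candidate) <= 500)
```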

  9. A Framework for the Development of Automatic DFA Method to Minimize the Number of Components and Assembly Reorientations

    NASA Astrophysics Data System (ADS)

    Alfadhlani; Samadhi, T. M. A. Ari; Ma’ruf, Anas; Setiasyah Toha, Isa

    2018-03-01

    Assembly is a part of the manufacturing process that must be considered at the product design stage. Design for Assembly (DFA) is a method for evaluating a product design in order to make it simpler, easier and quicker to assemble, so that assembly cost is reduced. This article discusses a framework for developing a computer-based DFA method. The method is expected to aid the product designer in extracting data, evaluating the assembly process, and providing recommendations for product design improvement. Ideally, these three tasks are performed without interactive processing or user intervention, so that the product design evaluation can be done automatically. Input for the proposed framework is a 3D solid engineering drawing. Product design evaluation is performed by: minimizing the number of components; generating assembly sequence alternatives; selecting the best assembly sequence based on the minimum number of assembly reorientations; and providing suggestions for design improvement.

  10. Development of hybrid lifecycle cost estimating tool (HLCET) for manufacturing influenced design tradeoff

    NASA Astrophysics Data System (ADS)

    Sirirojvisuth, Apinut

    In complex aerospace system design, making an effective design decision requires multidisciplinary knowledge from both product and process perspectives. Integrating manufacturing considerations into the design process is most valuable during the early design stages since designers have more freedom to integrate new ideas when changes are relatively inexpensive in terms of time and effort. Several metrics related to manufacturability are cost, time, and manufacturing readiness level (MRL). Yet, there is a lack of structured methodology that quantifies how changes in the design decisions impact these metrics. As a result, a new set of integrated cost analysis tools is proposed in this study to quantify the impacts. Equally important is the capability to integrate this new cost tool into the existing design methodologies without sacrificing the agility and flexibility required during the early design phases. To demonstrate the applicability of this concept, a ModelCenter environment is used to develop a software architecture that represents the Integrated Product and Process Development (IPPD) methodology used in several aerospace system designs. The environment seamlessly integrates product and process analysis tools and makes an effective transition from one design phase to the next while retaining knowledge gained a priori. Then, an advanced cost estimating tool called the Hybrid Lifecycle Cost Estimating Tool (HLCET), a hybrid combination of weight-, process-, and activity-based estimating techniques, is integrated with the design framework. A new weight-based lifecycle cost model is created based on Tailored Cost Model (TCM) equations [3]. This lifecycle cost tool estimates the program cost based on vehicle component weights and programmatic assumptions. Additional high fidelity cost tools like process-based and activity-based cost analysis methods can be used to modify the baseline TCM result as more knowledge is accumulated over design iterations. Therefore, with this concept, the additional manufacturing knowledge can be used to identify a more accurate lifecycle cost and facilitate higher fidelity tradeoffs during conceptual and preliminary design. The Advanced Composite Cost Estimating Model (ACCEM) is employed as a process-based cost component to replace the original TCM result for the composite part production cost. The reason for the replacement is that TCM estimates production costs from part weights as a result of subtractive manufacturing of metallic origin, such as casting, forging, and machining processes. A complexity factor can sometimes be adjusted to reflect different types of metal and machine settings. The TCM assumption, however, gives erroneous results when applied to additive processes like those of composite manufacturing. Another innovative aspect of this research is the introduction of a work measurement technique called the Maynard Operation Sequence Technique (MOST) to be used, similarly to the Activity-Based Costing (ABC) approach, to estimate the manufacturing time of a part by breaking down the operations that occur during its production. ABC allows a realistic determination of the cost incurred in each activity, as opposed to using a traditional method of time estimation by analogy or using response surface equations from historical process data. The MOST concept provides a tailored study of an individual process typically required for a new, innovative design. Nevertheless, the MOST idea has some challenges, one of which is its requirement to build a new process from the ground up.
The process development requires a Subject Matter Expert (SME) in the manufacturing method of the particular design. The SME must also have a comprehensive understanding of the MOST system so that the correct parameters are chosen. In practice, these knowledge requirements may call for people from outside the design discipline and for prior training in MOST. To relieve this constraint, the study includes an entirely new sub-system architecture that comprises 1) a knowledge-based system to provide the required knowledge during process selection; and 2) a new user interface to guide the parameter selection when building the process using MOST. Also included in this study is a demonstration of how HLCET and its constituents can be integrated with the Georgia Tech Integrated Product and Process Development (IPPD) methodology. The applicability of this work is shown through a complex aerospace design example to gain insights into how manufacturing knowledge helps make better design decisions during the early stages. The setup process is explained, and its utility is demonstrated in a hypothetical fighter aircraft wing redesign. The evaluation of the system's effectiveness against existing methodologies is illustrated to conclude the thesis.

  11. Optimizing the availability of a buffered industrial process

    DOEpatents

    Martz, Jr., Harry F.; Hamada, Michael S.; Koehler, Arthur J.; Berg, Eric C.

    2004-08-24

    A computer-implemented process determines optimum configuration parameters for a buffered industrial process. A population size is initialized by randomly selecting a first set of design and operation values associated with subsystems and buffers of the buffered industrial process to form a set of operating parameters for each member of the population. An availability discrete event simulation (ADES) is performed on each member of the population to determine the product-based availability of each member. A new population is formed having members with a second set of design and operation values related to the first set of design and operation values through a genetic algorithm and the product-based availability determined by the ADES. Subsequent population members are then determined by iterating the genetic algorithm with product-based availability determined by ADES to form improved design and operation values from which the configuration parameters are selected for the buffered industrial process.
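
    A minimal sketch of the genetic-algorithm loop described in the abstract is given below; a toy availability function stands in for the availability discrete event simulation (ADES), and all names and numbers are illustrative rather than taken from the patent:

```python
# Minimal sketch of a GA over buffer design/operation values, with a toy
# availability function standing in for the ADES. Purely illustrative.
import random

N_BUFFERS, POP_SIZE, GENERATIONS = 3, 20, 30

def random_member():
    # one design/operation value per buffer (e.g. buffer capacity)
    return [random.randint(1, 50) for _ in range(N_BUFFERS)]

def availability(member):
    # placeholder for ADES: bigger buffers improve availability with
    # diminishing returns, capped at 1.0
    return min(1.0, 0.7 + 0.01 * sum(c ** 0.5 for c in member))

def crossover(a, b):
    cut = random.randrange(1, N_BUFFERS)
    return a[:cut] + b[cut:]

def mutate(member, rate=0.2):
    return [random.randint(1, 50) if random.random() < rate else c for c in member]

population = [random_member() for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    scored = sorted(population, key=availability, reverse=True)
    parents = scored[:POP_SIZE // 2]                     # selection
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP_SIZE - len(parents))]
    population = parents + children

best = max(population, key=availability)
print("best buffer configuration:", best, "availability:", round(availability(best), 3))
```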

  12. Information Design: A New Approach to Teaching Technical Writing Service Courses

    ERIC Educational Resources Information Center

    McKee, Candie DeLane

    2012-01-01

    This study used a needs assessment, process analysis, process design, and textbook design to develop a new process and new textbook, based on Cargile-Cook's layered literacies, Quesenbery's five qualities of usability, and Carliner's information design theories, for use in technical writing service learning courses. The needs assessment was based…

  13. Using pattern enumeration to accelerate process development and ramp yield

    NASA Astrophysics Data System (ADS)

    Zhuang, Linda; Pang, Jenny; Xu, Jessy; Tsai, Mengfeng; Wang, Amy; Zhang, Yifan; Sweis, Jason; Lai, Ya-Chieh; Ding, Hua

    2016-03-01

    During the setup phase of a new technology node, foundries do not initially have enough product chip designs to conduct exhaustive process development. Different operational teams use manually designed, simple test keys to set up their process flows and recipes. When the very first version of the design rule manual (DRM) is ready, foundries enter the process development phase, where new experimental design data are manually created based on these design rules. However, these IP/test keys contain very uniform or simple design structures. Such designs normally do not contain the critical design structures or process-unfriendly design patterns that pass design rule checks but are found to be less manufacturable. A method is therefore desired that generates, at the development stage, exhaustive test patterns allowed by the design rules in order to verify the gap between the design rules and the process. This paper presents a novel method for generating test key patterns that contain known problematic patterns as well as any constructs that designers could possibly draw based on the current design rules. The enumerated test key patterns will contain the most critical design structures allowed by any particular design rule. A layout profiling method is used for design chip analysis in order to find potential weak points on new incoming products, so that the fab can take preemptive action to avoid yield loss. This is achieved by comparing different products and leveraging the knowledge learned from previously manufactured chips to find possible yield detractors.

  14. Onboard FPGA-based SAR processing for future spaceborne systems

    NASA Technical Reports Server (NTRS)

    Le, Charles; Chan, Samuel; Cheng, Frank; Fang, Winston; Fischman, Mark; Hensley, Scott; Johnson, Robert; Jourdan, Michael; Marina, Miguel; Parham, Bruce

    2004-01-01

    We present a real-time, high-performance and fault-tolerant FPGA-based hardware architecture for the processing of synthetic aperture radar (SAR) images in future spaceborne systems. In particular, we will discuss the integrated design approach, from top-level algorithm specifications and system requirements, design methodology, functional verification and performance validation, down to hardware design and implementation.

  15. The Relationships between Problem Design and Learning Process in Problem-Based Learning Environments: Two Cases

    ERIC Educational Resources Information Center

    Hung, Woei; Mehl, Katherine; Holen, Jodi Bergland

    2013-01-01

    Some researchers have argued that the design of problems used in a Problem-based Learning (PBL) course or curriculum could have an impact on student learning cognitively or psychologically, such as students' self-directed learning process or engagement. To investigate the relationship between PBL problem design and students' self-directed learning…

  16. Hidden Realities inside PBL Design Processes: Is Consensus Design an Impossible Clash of Interest between the Individual and the Collective, and Is Architecture Its First Victim?

    ERIC Educational Resources Information Center

    Pihl, Ole

    2015-01-01

    How do architecture students experience the contradictions between the individual and the group at the Department of Architecture and Design of Aalborg University? The Problem-Based Learning model has been extensively applied to the department's degree programs in coherence with the Integrated Design Process, but is a group-based architecture and…

  17. Simplify to survive: prescriptive layouts ensure profitable scaling to 32nm and beyond

    NASA Astrophysics Data System (ADS)

    Liebmann, Lars; Pileggi, Larry; Hibbeler, Jason; Rovner, Vyacheslav; Jhaveri, Tejas; Northrop, Greg

    2009-03-01

    The time-to-market driven need to maintain concurrent process-design co-development, even in spite of discontinuous patterning, process, and device innovation is reiterated. The escalating design rule complexity resulting from increasing layout sensitivities in physical and electrical yield and the resulting risk to profitable technology scaling is reviewed. Shortcomings in traditional Design for Manufacturability (DfM) solutions are identified and contrasted to the highly successful integrated design-technology co-optimization used for SRAM and other memory arrays. The feasibility of extending memory-style design-technology co-optimization, based on a highly simplified layout environment, to logic chips is demonstrated. Layout density benefits, modeled patterning and electrical yield improvements, as well as substantially improved layout simplicity are quantified in a conventional versus template-based design comparison on a 65nm IBM PowerPC 405 microprocessor core. The adaptability of this highly regularized template-based design solution to different yield concerns and design styles is shown in the extension of this work to 32nm with an increased focus on interconnect redundancy. In closing, the work not covered in this paper, focused on the process side of the integrated process-design co-optimization, is introduced.

  18. Computer-based creativity enhanced conceptual design model for non-routine design of mechanical systems

    NASA Astrophysics Data System (ADS)

    Li, Yutong; Wang, Yuxin; Duffy, Alex H. B.

    2014-11-01

    Computer-based conceptual design for routine design has made great strides, yet non-routine design has not been given due attention and is still poorly automated. Considering that the function-behavior-structure (FBS) model is widely used for modeling the conceptual design process, a computer-based creativity-enhanced conceptual design model (CECD) for non-routine design of mechanical systems is presented. In the model, the leaf functions in the FBS model are decomposed into and represented with fine-grain basic operation actions (BOA), and the corresponding BOA set in the function domain is then constructed. Choosing building blocks from the database and expressing their multiple functions with BOAs, the BOA set in the structure domain is formed. Through rule-based dynamic partition of the BOA set in the function domain, many variants of regenerated functional schemes are generated. For enhancing the capability to introduce new design variables into the conceptual design process and to uncover more innovative physical structure schemes, an indirect function-structure matching strategy based on reconstructing combined structure schemes is adopted. By adjusting the tightness of the partition rules and the granularity of the divided BOA subsets, and by making full use of the main function and secondary functions of each basic structure in the process of reconstructing the physical structures, new design variables and variants are introduced into the physical structure scheme reconstruction process, and a great number of simpler physical structure schemes that accomplish the overall function organically are figured out. The creativity-enhanced conceptual design model presented has a strong capability for introducing new design variables in the function domain and uncovering simpler physical structures that accomplish the overall function, and it can therefore be utilized to solve non-routine conceptual design problems.

  19. Conceptual Chemical Process Design for Sustainability. ...

    EPA Pesticide Factsheets

    This chapter examines the sustainable design of chemical processes, with a focus on conceptual design, hierarchical and short-cut methods, and analyses of process sustainability for alternatives. The chapter describes a methodology for incorporating process sustainability analyses throughout the conceptual design. Hierarchical and short-cut decision-making methods will be used to approach sustainability. An example showing a sustainability-based evaluation of chlor-alkali production processes is presented with economic analysis and five pollutants described as emissions. These emissions are analyzed according to their human toxicity potential by ingestion using the Waste Reduction Algorithm and a method based on US Environmental Protection Agency reference doses, with the addition of biodegradation for suitable components. Among the emissions, mercury as an element will not biodegrade, and results show the importance of this pollutant to the potential toxicity results and therefore the sustainability of the process design. The dominance of mercury in determining the long-term toxicity results when energy use is included suggests that all process system evaluations should (re)consider the role of mercury and other non-/slow-degrading pollutants in sustainability analyses. The cycling of nondegrading pollutants through the biosphere suggests the need for a complete analysis based on the economic, environmental, and social aspects of sustainability. Chapter reviews

  20. Image processing system design for microcantilever-based optical readout infrared arrays

    NASA Astrophysics Data System (ADS)

    Tong, Qiang; Dong, Liquan; Zhao, Yuejin; Gong, Cheng; Liu, Xiaohua; Yu, Xiaomei; Yang, Lei; Liu, Weiyu

    2012-12-01

    Compared with traditional infrared imaging technology, the new type of optical-readout uncooled infrared imaging technology based on MEMS has many advantages, such as low cost, small size, and simple fabrication. In addition, theory indicates that the technology offers high thermal detection sensitivity, so it has very broad application prospects in the field of high-performance infrared detection. The paper mainly focuses on an image capturing and processing system for this new type of optical-readout uncooled infrared imaging technology based on MEMS. The image capturing and processing system consists of software and hardware. We build our image processing core hardware platform based on TI's high-performance DSP chip, the TMS320DM642, and then design our image capturing board based on the MT9P031, Micron's high-frame-rate, low-power-consumption CMOS sensor. Finally, we use Intel's network transceiver device, the LXT971A, to design the network output board. The software system is built on the real-time operating system DSP/BIOS. We design our video capture driver program based on TI's class-mini driver, and the network output program based on the NDK kit, for image capturing, processing and transmission. The experiment shows that the system has the advantages of high capture resolution and fast processing speed. The speed of the network transmission is up to 100 Mbps.

  1. [Establishment of design space for production process of traditional Chinese medicine preparation].

    PubMed

    Xu, Bing; Shi, Xin-Yuan; Qiao, Yan-Jiang; Wu, Zhi-Sheng; Lin, Zhao-Zhou

    2013-03-01

    The philosophy of quality by design (QbD) is now leading the change in the drug manufacturing mode from the conventional test-based approach to a science- and risk-based approach focusing on detailed research into, and understanding of, the production process. Along with the constantly deepening understanding of the manufacturing process, the design space will be determined, and the emphasis of quality control will shift from quality standards to the design space. Therefore, the establishment of the design space is a core step in the implementation of QbD, and it is of great importance to study methods for building the design space. This essay proposes the concept of a design space for the production process of traditional Chinese medicine (TCM) preparations, gives a systematic introduction to the concept of the design space, analyzes the feasibility and significance of building a design space for the production process of TCM preparations, and proposes study approaches, on the basis of examples that comply with the characteristics of TCM preparations, as well as future study orientations.

  2. A synthetic design environment for ship design

    NASA Technical Reports Server (NTRS)

    Chipman, Richard R.

    1995-01-01

    Rapid advances in computer science and information system technology have made possible the creation of synthetic design environments (SDE), which use virtual prototypes to increase the efficiency and agility of the design process. This next generation of computer-based design tools will rely heavily on simulation and advanced visualization techniques to enable integrated product and process teams to concurrently conceptualize, design, and test a product and its fabrication processes. This paper summarizes a successful demonstration of the feasibility of using a simulation-based design environment in the shipbuilding industry. As computer science and information science technologies have evolved, there have been many attempts to apply and integrate the new capabilities into systems for the improvement of the design process. We see the benefits of those efforts in the abundance of highly reliable, technologically complex products and services in the modern marketplace. Furthermore, the computer-based technologies have been so cost effective that the improvements embodied in modern products have been accompanied by lowered costs. Today the state of the art in computerized design has advanced so dramatically that the focus is no longer on merely improving design methodology; rather, the goal is to revolutionize the entire process by which complex products are conceived, designed, fabricated, tested, deployed, operated, maintained, refurbished and eventually decommissioned. By concurrently addressing all life-cycle issues, the basic decision-making process within an enterprise will be improved dramatically, leading to new levels of quality, innovation, efficiency, and customer responsiveness. By integrating functions and people within an enterprise, such systems will change the fundamental way American industries are organized, creating companies that are more competitive, creative, and productive.

  3. Quantification of construction waste prevented by BIM-based design validation: Case studies in South Korea.

    PubMed

    Won, Jongsung; Cheng, Jack C P; Lee, Ghang

    2016-03-01

    Waste generated in construction and demolition processes comprised around 50% of the solid waste in South Korea in 2013. Many cases show that design validation based on building information modeling (BIM) is an effective means to reduce the amount of construction waste, since construction waste is mainly generated due to improper design and unexpected changes in the design and construction phases. However, the amount of construction waste that could be avoided by adopting BIM-based design validation has been unknown. This paper aims to estimate the amount of construction waste prevented by a BIM-based design validation process based on the amount of construction waste that might be generated due to design errors. Two project cases in South Korea were studied in this paper, with 381 and 136 design errors, respectively, detected during the BIM-based design validation. Each design error was categorized according to its cause and the likelihood of detection before construction. The case studies show that BIM-based design validation could prevent 4.3-15.2% of the construction waste that might have been generated without using BIM. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. A Design Support Framework through Dynamic Deployment of Hypothesis and Verification in the Design Process

    NASA Astrophysics Data System (ADS)

    Nomaguch, Yutaka; Fujita, Kikuo

    This paper proposes a design support framework, named DRIFT (Design Rationale Integration Framework of Three layers), which dynamically captures and manages hypothesis and verification in the design process. The core of DRIFT is a three-layered design process model of action, model operation and argumentation. This model integrates various design support tools and captures the design operations performed on them. The action level captures the sequence of design operations. The model operation level captures the transition of design states, where each state records a design snapshot over the design tools. The argumentation level captures the process of setting problems and alternatives. The linkage of the three levels enables iterative hypothesis-and-verification processes to be captured and managed automatically and efficiently through design operations over design tools. In DRIFT, such a linkage is established through templates of design operations, which are derived from the patterns embedded in design tools such as Design-For-X (DFX) approaches, and design tools are integrated through an ontology-based representation of design concepts. An argumentation model, gIBIS (graphical Issue-Based Information System), is used for representing dependencies among problems and alternatives. A mechanism of TMS (Truth Maintenance System) is used for managing multiple hypothetical design stages. This paper also demonstrates a prototype implementation of DRIFT and its application to a simple design problem. It concludes with a discussion of some future issues.
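
    A rough sketch (an assumption-laden reading of the abstract, not the DRIFT implementation) of how the three layers could be linked as data structures is given below: design actions reference design-state snapshots, and both hang off gIBIS-style issues and alternatives whose verification status plays the TMS-like role described above.

```python
# Hypothetical sketch of a three-layer linkage: action, model operation and
# argumentation levels connected through simple dataclasses.
from dataclasses import dataclass, field
from typing import List

@dataclass
class DesignState:              # model operation level: snapshot over design tools
    snapshot_id: str
    parameters: dict

@dataclass
class DesignAction:             # action level: one operation in the sequence
    tool: str
    description: str
    resulting_state: DesignState

@dataclass
class Alternative:              # argumentation level (gIBIS-style position)
    label: str
    actions: List[DesignAction] = field(default_factory=list)
    verified: bool = False      # hypothesis verified or rejected (TMS-like status)

@dataclass
class Issue:                    # argumentation level (gIBIS-style issue)
    question: str
    alternatives: List[Alternative] = field(default_factory=list)

issue = Issue("Which gear ratio meets the torque requirement?")
alt = Alternative("ratio 3.2:1")
alt.actions.append(DesignAction("CAD", "update gear model",
                                DesignState("s1", {"ratio": 3.2})))
issue.alternatives.append(alt)
print(issue.question, "->", [a.label for a in issue.alternatives])
```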

  5. Moon Munchies: Human Exploration Project Engineering Design Challenge--A Standards-Based Elementary School Model Unit Guide--Design, Build, and Evaluate (Lessons 1-6). Engineering By Design: Advancing Technological Literacy--A Standards-Based Program Series. EP-2007-08-92-MSFC

    ERIC Educational Resources Information Center

    Weaver, Kim M.

    2005-01-01

    In this unit, elementary students design and build a lunar plant growth chamber using the Engineering Design Process. The purpose of the unit is to help students understand and apply the design process as it relates to plant growth on the moon. This guide includes six lessons, which meet a number of national standards and benchmarks in…

  6. Toward a More Flexible Web-Based Framework for Multidisciplinary Design

    NASA Technical Reports Server (NTRS)

    Rogers, J. L.; Salas, A. O.

    1999-01-01

    In today's competitive environment, both industry and government agencies are under pressure to reduce the time and cost of multidisciplinary design projects. New tools have been introduced to assist in this process by facilitating the integration of and communication among diverse disciplinary codes. One such tool, a framework for multidisciplinary design, is defined as a hardware-software architecture that enables integration, execution, and communication among diverse disciplinary processes. An examination of current frameworks reveals weaknesses in various areas, such as sequencing, monitoring, controlling, and displaying the design process. The objective of this research is to explore how Web technology can improve these areas of weakness and lead toward a more flexible framework. This article describes a Web-based system that optimizes and controls the execution sequence of design processes in addition to monitoring the project status and displaying the design results.

  7. Analyzing Team Based Engineering Design Process in Computer Supported Collaborative Learning

    ERIC Educational Resources Information Center

    Lee, Dong-Kuk; Lee, Eun-Sang

    2016-01-01

    The engineering design process has been largely implemented in a collaborative project format. Recently, technological advancement has helped collaborative problem solving processes such as engineering design to have efficient implementation using computers or online technology. In this study, we investigated college students' interaction and…

  8. Fashion sketch design by interactive genetic algorithms

    NASA Astrophysics Data System (ADS)

    Mok, P. Y.; Wang, X. X.; Xu, J.; Kwok, Y. L.

    2012-11-01

    Computer-aided design is vitally important for modern industry, particularly for the creative industries. The fashion industry faces intense challenges in shortening the product development process. In this paper, a methodology is proposed for sketch design based on interactive genetic algorithms. The sketch design system consists of a sketch design model, a database and a multi-stage sketch design engine. First, a sketch design model is developed based on knowledge of fashion design to describe fashion product characteristics by using parameters. Second, a database is built based on the proposed sketch design model to define general style elements. Third, a multi-stage sketch design engine is used to construct the design. Moreover, an interactive genetic algorithm (IGA) is used to accelerate the sketch design process. The experimental results demonstrate that the proposed method is effective in helping laypersons achieve satisfactory fashion design sketches.
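
    As a hedged illustration of the IGA idea (the parameter names, option counts and the stand-in rating function are all invented), garment sketches can be coded as vectors of style-element choices and evolved with the user's rating taking the place of a computed fitness:

```python
# Hypothetical sketch of an interactive genetic algorithm (IGA) loop for
# fashion sketches; a stand-in preference function plays the role of the
# designer's interactive rating.
import random

GENES = ["collar", "sleeve", "length", "silhouette", "pattern"]
N_OPTIONS = 5          # each style element has a few predefined options in a database

def random_sketch():
    return {g: random.randrange(N_OPTIONS) for g in GENES}

def user_rating(sketch):
    # In a real IGA the designer scores the rendered sketch (e.g. 1-10);
    # here a stand-in preference function plays that role.
    return 10 - abs(sketch["collar"] - 2) - abs(sketch["silhouette"] - 3)

def breed(a, b):
    child = {g: random.choice([a[g], b[g]]) for g in GENES}       # uniform crossover
    if random.random() < 0.3:                                     # mutation
        child[random.choice(GENES)] = random.randrange(N_OPTIONS)
    return child

population = [random_sketch() for _ in range(8)]
for generation in range(10):
    ranked = sorted(population, key=user_rating, reverse=True)
    parents = ranked[:4]
    population = parents + [breed(random.choice(parents), random.choice(parents))
                            for _ in range(4)]

print("preferred sketch:", max(population, key=user_rating))
```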

  9. Function-based design process for an intelligent ground vehicle vision system

    NASA Astrophysics Data System (ADS)

    Nagel, Robert L.; Perry, Kenneth L.; Stone, Robert B.; McAdams, Daniel A.

    2010-10-01

    An engineering design framework for an autonomous ground vehicle vision system is discussed. We present both the conceptual and physical design by following the design process, development and testing of an intelligent ground vehicle vision system constructed for the 2008 Intelligent Ground Vehicle Competition. During conceptual design, the requirements for the vision system are explored via functional and process analysis considering the flows into the vehicle and the transformations of those flows. The conceptual design phase concludes with a vision system design that is modular in both hardware and software and is based on a laser range finder and camera for visual perception. During physical design, prototypes are developed and tested independently, following the modular interfaces identified during conceptual design. Prototype models, once functional, are implemented into the final design. The final vision system design uses a ray-casting algorithm to process camera and laser range finder data and identify potential paths. The ray-casting algorithm is a single thread of the robot's multithreaded application. Other threads control motion, provide feedback, and process sensory data. Once integrated, both hardware and software testing are performed on the robot. We discuss the robot's performance and the lessons learned.
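
    The following is a simplified, hypothetical version of the ray-casting step for path identification, not the team's code: rays are cast from the robot across an occupancy grid (built, in the real system, from camera and laser range finder data), and headings whose rays stay clear are kept as candidate paths.

```python
# Simplified ray-casting over an occupancy grid; grid and geometry are invented.
import math

GRID = [  # 0 = free, 1 = obstacle; robot sits at the bottom-centre cell
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
    [0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0],
]
ROWS, COLS = len(GRID), len(GRID[0])
START = (3, 2)   # (row, col)

def ray_is_clear(angle_deg, max_range=4.0, step=0.25):
    """Walk along the ray in small steps; fail if it hits an occupied cell."""
    a = math.radians(angle_deg)
    r = 0.0
    while r < max_range:
        row = int(round(START[0] - r * math.cos(a)))   # forward = decreasing row
        col = int(round(START[1] + r * math.sin(a)))
        if not (0 <= row < ROWS and 0 <= col < COLS):
            return True                                # left the mapped area: open
        if GRID[row][col] == 1:
            return False
        r += step
    return True

candidate_headings = [a for a in range(-60, 61, 15) if ray_is_clear(a)]
print("clear headings (degrees):", candidate_headings)
```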

  10. Computer Aided Process Planning (CAPP): The User Interface for the Fabrication Module of the Rapid Design System

    DTIC Science & Technology

    1991-01-01

    The Fabrication Planning Module automatically creates a plan using information from the Feature Based Design Environment (FBDE) of the RDS. By using the user interface, the final process plan can be modified in many different ways, and the interface supports the review and modification of a process plan.

  11. It Takes a Village to Design a Course: Embedding a Librarian in Course Design

    ERIC Educational Resources Information Center

    Mudd, Alex; Summey, Terri; Upson, Matt

    2015-01-01

    Often associated with online learning, instructional design is a process utilized in efficiently designing training and instruction to help ensure effectiveness. Typically, the instructional systems design (ISD) process uses a team-based approach, consisting of an instructor, a facilitator, a designer and a subject matter expert. Although library…

  12. On-Line Critiques in Collaborative Design Studio

    ERIC Educational Resources Information Center

    Sagun, Aysu; Demirkan, Halime

    2009-01-01

    In this study, the Design Collaboration Model (DCM) was developed to provide a medium for the on-line collaboration of the design courses. The model was based on the situated and reflective practice characteristics of the design process. The segmentation method was used to analyse the design process observed both in the design diaries and the…

  13. [Optimize dropping process of Ginkgo biloba dropping pills by using design space approach].

    PubMed

    Shen, Ji-Chen; Wang, Qing-Qing; Chen, An; Pan, Fang-Lai; Gong, Xing-Chu; Qu, Hai-Bin

    2017-07-01

    In this paper, a design space approach was applied to optimize the dropping process of Ginkgo biloba dropping pills. Firstly, potential critical process parameters and potential process critical quality attributes were determined through literature research and pre-experiments. Secondly, experiments were carried out according to Box-Behnken design. Then the critical process parameters and critical quality attributes were determined based on the experimental results. Thirdly, second-order polynomial models were used to describe the quantitative relationships between critical process parameters and critical quality attributes. Finally, a probability-based design space was calculated and verified. The verification results showed that efficient production of Ginkgo biloba dropping pills can be guaranteed by operating within the design space parameters. The recommended operation ranges for the critical dropping process parameters of Ginkgo biloba dropping pills were as follows: dropping distance of 5.5-6.7 cm, and dropping speed of 59-60 drops per minute, providing a reference for industrial production of Ginkgo biloba dropping pills. Copyright© by the Chinese Pharmaceutical Association.
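
    A purely illustrative sketch of a probability-based design space calculation is shown below; the stand-in model, specification limits and uncertainty level are invented, not those of the dropping-pill study. For each combination of dropping distance and dropping speed, the probability that a quality attribute meets its specification is estimated by Monte Carlo sampling, and the points exceeding a chosen acceptance probability form the design space:

```python
# Illustrative probability-based design space via Monte Carlo; all numbers invented.
import numpy as np

rng = np.random.default_rng(1)

def predicted_cqa(distance_cm, speed_dpm, noise):
    # stand-in second-order model plus an uncertainty term
    return 80 + 2.0*distance_cm - 0.5*speed_dpm + 0.3*distance_cm**2 + noise

SPEC_MIN, SPEC_MAX = 70.0, 95.0
distances = np.linspace(4.0, 8.0, 9)        # dropping distance, cm
speeds = np.linspace(50, 70, 9)             # dropping speed, drops per minute

design_space = []
for d in distances:
    for s in speeds:
        noise = rng.normal(0, 3.0, size=2000)             # 2000 Monte Carlo draws
        y = predicted_cqa(d, s, noise)
        prob = np.mean((y >= SPEC_MIN) & (y <= SPEC_MAX))
        if prob >= 0.90:                                  # acceptance probability threshold
            design_space.append((round(d, 1), round(s, 1), round(float(prob), 2)))

print(f"{len(design_space)} operating points meet the 90% probability criterion")
```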

  14. Measuring and assessing maintainability at the end of high level design

    NASA Technical Reports Server (NTRS)

    Briand, Lionel C.; Morasca, Sandro; Basili, Victor R.

    1993-01-01

    Software architecture appears to be one of the main factors affecting software maintainability. Therefore, in order to be able to predict and assess maintainability early in the development process, we need to be able to measure the high-level design characteristics that affect the change process. To this end, we propose a measurement approach that is based on precise assumptions derived from the change process, is grounded in object-oriented design principles, and is partially language independent. We define metrics for cohesion, coupling, and visibility in order to capture the difficulty of isolating, understanding, designing and validating changes.

  15. On decentralized design: Rationale, dynamics, and effects on decision-making

    NASA Astrophysics Data System (ADS)

    Chanron, Vincent

    The focus of this dissertation is the design of complex systems, including engineering systems such as cars, airplanes, and satellites. Companies who design these systems are under constant pressure to design better products that meet customer expectations, and competition forces them to develop them faster. One of the responses of the industry to these conflicting challenges has been the decentralization of design responsibilities. The current lack of understanding of the dynamics of decentralized design processes is the main motivation for this research and places value on its descriptive base. The dissertation identifies the main reasons and the true benefits for companies to decentralize the design of their products. It also demonstrates the limitations of this approach by listing the relevant issues and problems created by the decentralization of decisions. Based on these observations, a game-theoretic approach to decentralized design is proposed to model the decisions made during the design process. The dynamics are modeled using mathematical formulations inspired by control theory. Building upon this formalism, the issue of convergence in decentralized design is analyzed: the equilibrium points of the design space are identified, and convergent and divergent patterns are recognized. This rigorous investigation of the design process provides motivation and support for proposing new approaches to decentralized design problems. Two methods are developed, which aim at improving the design process in two ways: decreasing the product development time, and increasing the optimality of the final design. The frames of these methods are inspired by eigenstructure decomposition and set-based design, respectively. The value of the research detailed within this dissertation is in the proposed methods, which are built upon the sound mathematical formalism developed. The contribution of this work is twofold: a rigorous investigation of the design process, and practical support to decision-making in decentralized environments.
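
    The convergence question can be illustrated with a toy example (not taken from the dissertation): two designers each minimize their own quadratic objective over their own variable, taking the other's latest value as fixed, and the best-response iteration converges or diverges depending on the coupling strength:

```python
# Toy best-response dynamics for two coupled designers; whether the iteration
# converges to an equilibrium depends on the coupling strength k.
def best_response_run(k, iterations=25):
    x1, x2 = 1.0, -1.0
    for _ in range(iterations):
        # designer 1 minimises (x1 - k*x2)^2      ->  best response x1 = k*x2
        x1 = k * x2
        # designer 2 minimises (x2 - k*x1 - 1)^2  ->  best response x2 = k*x1 + 1
        x2 = k * x1 + 1.0
    return x1, x2

for k in (0.5, 1.5):                      # |k| < 1 converges, |k| > 1 diverges
    x1, x2 = best_response_run(k)
    print(f"coupling k={k}: x1={x1:.3g}, x2={x2:.3g}")
```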

  16. Hafnium transistor process design for neural interfacing.

    PubMed

    Parent, David W; Basham, Eric J

    2009-01-01

    A design methodology is presented that uses 1-D process simulations of Metal Insulator Semiconductor (MIS) structures to design the threshold voltage of hafnium oxide based transistors used for neural recording. The methodology comprises 1-D analytical equations for threshold voltage specification and doping profiles, and 1-D MIS Technology Computer Aided Design (TCAD) to design a process that implements a specific threshold voltage, which minimized simulation time. The process was then verified with a 2-D process/electrical TCAD simulation. Hafnium oxide films (HfO) were grown and characterized for dielectric constant and fixed oxide charge at various annealing temperatures, two important design variables in threshold voltage design.
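
    For orientation, the threshold-voltage specification mentioned above is typically based on the textbook MIS relation below (stated here as general background; the paper's exact 1-D formulation may differ):

\[
V_T = V_{FB} + 2\phi_F + \frac{\sqrt{2\,q\,\varepsilon_{Si}\,N_A\,(2\phi_F)}}{C_{ox}},
\qquad
C_{ox} = \frac{\varepsilon_{ox}}{t_{ox}},
\qquad
V_{FB} = \phi_{ms} - \frac{Q_f}{C_{ox}},
\]

    so the measured dielectric constant enters through \(C_{ox}\), the measured fixed oxide charge \(Q_f\) enters through the flat-band voltage \(V_{FB}\), and the doping profile enters through the channel doping \(N_A\) and the Fermi potential \(\phi_F\).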

  17. Model of Values-Based Management Process in Schools: A Mixed Design Study

    ERIC Educational Resources Information Center

    Dogan, Soner

    2016-01-01

    The aim of this paper is to evaluate the school administrators' values-based management behaviours according to the teachers' perceptions and opinions and, accordingly, to build a model of values-based management process in schools. The study was conducted using explanatory design which is inclusive of both quantitative and qualitative methods.…

  18. Design of Composite Structures Using Knowledge-Based and Case Based Reasoning

    NASA Technical Reports Server (NTRS)

    Lambright, Jonathan Paul

    1996-01-01

    A method of using knowledge-based and case-based reasoning to assist designers during conceptual design tasks for composite structures was proposed. The cooperative use of heuristics, procedural knowledge, and previous similar design cases suggests a potential reduction in design cycle time and, ultimately, product lead time. The hypothesis of this work is that the design process for composite structures can be improved by using Case-Based Reasoning (CBR) and Knowledge-Based (KB) reasoning in the early design stages. The technique of using knowledge-based and case-based reasoning facilitates the gathering of disparate information into one location that is easily and readily available. The method suggests that the inclusion of downstream life-cycle issues in the conceptual design phase reduces the potential for defective and sub-optimal composite structures. Three industry experts were interviewed extensively. The experts provided design rules, previous design cases, and test problems. A knowledge-based reasoning system was developed using the CLIPS (C Language Integrated Production System) environment, and a case-based reasoning system was developed using the Design Memory Utility For Sharing Experiences (MUSE) environment. A Design Characteristic State (DCS) was used to document the design specifications, constraints, and problem areas using attribute-value pair relationships. The DCS provided consistent design information between the knowledge base and case base. Results indicated that the use of knowledge-based and case-based reasoning provided a robust design environment for composite structures. The knowledge base provided design guidance from well-defined rules and procedural knowledge. The case base provided suggestions on design and manufacturing techniques based on previous similar designs, as well as warnings of potential problems and pitfalls. The case base complemented the knowledge base and extended the problem-solving capability beyond the existence of limited well-defined rules. The findings indicated that the technique is most effective when used as a design aid and not as a tool to totally automate the composites design process. Other areas of application and implications for future research are discussed.
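
    A hedged sketch of the case-retrieval idea is given below: design cases are stored as attribute-value pairs in the spirit of the Design Characteristic State, and the closest previous design is retrieved with a simple similarity score. The attributes, cases and advisory notes are invented for illustration, not drawn from the MUSE case base:

```python
# Hypothetical attribute-value case base and nearest-case retrieval.
CASE_BASE = [
    {"part": "wing skin", "loading": "bending",  "layup": "quasi-isotropic",
     "cure": "autoclave", "note": "watch for warpage on thin sections"},
    {"part": "spar",      "loading": "bending",  "layup": "unidirectional-dominant",
     "cure": "autoclave", "note": "add ply drops gradually to avoid delamination"},
    {"part": "fairing",   "loading": "pressure", "layup": "sandwich",
     "cure": "oven",      "note": "core crush risk near tight radii"},
]

def similarity(query, case):
    # fraction of the query's attribute-value pairs matched by the stored case
    shared = [k for k in query if k in case]
    return sum(query[k] == case[k] for k in shared) / len(shared)

def retrieve(query):
    return max(CASE_BASE, key=lambda c: similarity(query, c))

new_design = {"part": "wing skin", "loading": "bending", "cure": "autoclave"}
best = retrieve(new_design)
print("most similar case:", best["part"], "->", best["note"])
```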

  19. The networked student: A design-based research case study of student constructed personal learning environments in a middle school science course

    NASA Astrophysics Data System (ADS)

    Drexler, Wendy

    This design-based research case study applied a networked learning approach to a seventh grade science class at a public school in the southeastern United States. Students adapted emerging Web applications to construct personal learning environments for in-depth scientific inquiry of poisonous and venomous life forms. The personal learning environments constructed used Application Programming Interface (API) widgets to access, organize, and synthesize content from a number of educational Internet resources and social network connections. This study examined the nature of personal learning environments, the processes students go through during construction, and the patterns that emerged. The project was documented from both an instructional and a student-design perspective. Findings revealed that students applied the processes of practicing digital responsibility; practicing digital literacy; organizing content; collaborating and socializing; and synthesizing and creating. These processes informed a model of the networked student that will serve as a framework for future instructional designs. A networked learning approach that incorporates these processes into future designs has implications for student learning, teacher roles, professional development, administrative policies, and delivery. This work is significant in that it shifts the focus from technology innovations based on tools to student empowerment based on the processes required to support learning. It affirms the need for greater attention to digital literacy and responsibility in K12 schools, as well as consideration of the skills students will need to achieve success in the 21st century. The design-based research case study provides a set of design principles for teachers to follow when facilitating student construction of personal learning environments.

  20. Design Process for Online Websites Created for Teaching Turkish as a Foreign Language in Web Based Environments

    ERIC Educational Resources Information Center

    Türker, Fatih Mehmet

    2016-01-01

    In today's world, where online learning environments have increased their efficiency in education and training, the design of the websites prepared for education and training purposes has become an important process. This study is about the teaching process of the online learning environments created to teach Turkish in web based environments, and…

  1. A design optimization process for Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Chamberlain, Robert G.; Fox, George; Duquette, William H.

    1990-01-01

    The Space Station Freedom Program is used to develop and implement a process for design optimization. Because the relative worth of arbitrary design concepts cannot be assessed directly, comparisons must be based on designs that provide the same performance from the point of view of station users; such designs can be compared in terms of life cycle cost. Since the technology required to produce a space station is widely dispersed, a decentralized optimization process is essential. A formulation of the optimization process is provided and the mathematical models designed to facilitate its implementation are described.

  2. Design-based research in designing the model for educating simulation facilitators.

    PubMed

    Koivisto, Jaana-Maija; Hannula, Leena; Bøje, Rikke Buus; Prescott, Stephen; Bland, Andrew; Rekola, Leena; Haho, Päivi

    2018-03-01

    The purpose of this article is to introduce the concept of design-based research, its appropriateness in creating education-based models, and to describe the process of developing such a model. The model was designed as part of the Nurse Educator Simulation based learning project, funded by the EU's Lifelong Learning program (2013-1-DK1-LEO05-07053). The project partners were VIA University College, Denmark, the University of Huddersfield, UK and Metropolia University of Applied Sciences, Finland. As an outcome of the development process, "the NESTLED model for educating simulation facilitators" (NESTLED model) was generated. This article also illustrates five design principles that could be applied to other pedagogies. Copyright © 2018 Elsevier Ltd. All rights reserved.

  3. Case-based reasoning in design: An apologia

    NASA Technical Reports Server (NTRS)

    Pulaski, Kirt

    1990-01-01

    Three positions are presented and defended: the process of generating solutions in problem solving is viewable as a design task; case-based reasoning is a strong method of problem solving; and a synergism exists between case-based reasoning and design problem solving.

  4. Using an integrative mock-up simulation approach for evidence-based evaluation of operating room design prototypes.

    PubMed

    Bayramzadeh, Sara; Joseph, Anjali; Allison, David; Shultz, Jonas; Abernathy, James

    2018-07-01

    This paper describes the process and tools developed as part of a multidisciplinary, collaborative, simulation-based approach for iterative design and evaluation of operating room (OR) prototypes. Full-scale physical mock-ups of healthcare spaces offer an opportunity to actively communicate with and engage multidisciplinary stakeholders in the design process. While mock-ups are increasingly being used in healthcare facility design projects, they are rarely evaluated in a manner that supports active user feedback and engagement. Researchers and architecture students worked closely with clinicians and architects to develop OR design prototypes and engaged clinical end-users in simulated scenarios. An evaluation toolkit was developed to compare design prototypes. The mock-up evaluation helped the team make key decisions about room size, the location of the OR table, intra-room zoning, and door locations. Structured, simulation-based mock-up evaluations conducted in the design process can help stakeholders visualize their future workspace and provide active feedback. Copyright © 2018 Elsevier Ltd. All rights reserved.

  5. Optimisation study of a vehicle bumper subsystem with fuzzy parameters

    NASA Astrophysics Data System (ADS)

    Farkas, L.; Moens, D.; Donders, S.; Vandepitte, D.

    2012-10-01

    This paper deals with the design and optimisation for crashworthiness of a vehicle bumper subsystem, which is a key scenario for vehicle component design. Automotive manufacturers and suppliers have to find optimal design solutions for such subsystems that comply with the conflicting requirements of the regulatory bodies regarding functional performance (safety and repairability) and regarding environmental impact (mass). For the bumper design challenge, an integrated methodology for multi-attribute design engineering of mechanical structures is set up. The integrated process captures the various tasks that are usually performed manually, thereby facilitating automated design iterations for optimisation. Subsequently, an optimisation process is applied that takes the effect of parametric uncertainties into account such that the system-level failure possibility is acceptable. This optimisation process is referred to as possibility-based design optimisation and integrates the fuzzy FE analysis applied for the uncertainty treatment in crash simulations. It is the counterpart of reliability-based design optimisation, which is used in a probabilistic context with statistically defined parameters (variabilities).
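
    A simplified illustration of the possibility-based idea (not the authors' fuzzy FE analysis) is sketched below: an uncertain parameter is described by a triangular fuzzy number, the response is evaluated on interval alpha-cuts, and the design is screened against an allowable limit; the monotone surrogate stands in for the crash simulation:

```python
# Alpha-cut screening of a fuzzy design parameter; all numbers are invented.
def alpha_cut(low, peak, high, alpha):
    """Interval of a triangular fuzzy number at membership level alpha."""
    return (low + alpha * (peak - low), high - alpha * (high - peak))

def peak_force(thickness_mm):
    # toy monotone surrogate for the bumper crash response
    return 12.0 + 9.0 * thickness_mm

LIMIT = 40.0                       # allowable peak force (arbitrary units)
fuzzy_thickness = (2.4, 2.8, 3.3)  # low / most likely / high, in mm

for alpha in (0.0, 0.5, 1.0):
    lo, hi = alpha_cut(*fuzzy_thickness, alpha)
    # the surrogate is monotone, so the response interval comes from the bounds
    f_lo, f_hi = peak_force(lo), peak_force(hi)
    print(f"alpha={alpha:.1f}: force in [{f_lo:.1f}, {f_hi:.1f}],",
          "acceptable" if f_hi <= LIMIT else "possible violation")
```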

  6. Integration of mask and silicon metrology in DFM

    NASA Astrophysics Data System (ADS)

    Matsuoka, Ryoichi; Mito, Hiroaki; Sugiyama, Akiyuki; Toyoda, Yasutaka

    2009-03-01

    We have developed a highly integrated method of mask and silicon metrology. The method adopts a metrology management system based on DBM (Design Based Metrology), that is, highly accurate contouring created by the edge detection algorithms used in mask CD-SEM and silicon CD-SEM. We have verified the accuracy, stability and reproducibility in integration experiments; the accuracy is comparable with that of mask and silicon CD-SEM metrology. In this report, we introduce the experimental results and the application. As design rules for semiconductor devices shrink, OPC (Optical Proximity Correction) becomes aggressively dense in RET (Resolution Enhancement Technology). However, from the viewpoint of DFM (Design for Manufacturability), the cost of data processing for advanced MDP (Mask Data Preparation) and mask production is a problem. Such a trade-off between RET and mask production is a big issue in the semiconductor market, especially in the mask business. In silicon device production, information sharing is not completely organized between the design and production sections: design data created with OPC and MDP should be linked to process control in production, but design data and process control data are optimized independently. We therefore provide a DFM solution: advanced integration of mask metrology and silicon metrology. The system we propose is composed of the following steps. 1) Design-based recipe creation: patterns are specified on the design data for metrology; this step is fully automated since it is interfaced with hot-spot coordinate information detected by various verification methods. 2) Design-based image acquisition: images of mask and silicon are acquired automatically by a recipe based on the pattern design for CD-SEM; this is a robust automated step because a wide range of design data is used for the image acquisition. 3) Contour profiling and GDS data generation: an image profiling process is applied to the acquired image based on the profiling method of the field-proven CD metrology algorithm; the detected edges are then converted to GDSII format, a standard format for design data, and utilized by various DFM systems such as simulation. By integrating the pattern shapes of mask and silicon formed during the manufacturing process into GDSII format, it becomes possible to bridge highly accurate pattern profile information over to the design field of various EDA systems. These steps are fully integrated with the design data and automated, and linking mask data and process control data allows bi-directional cross-probing between them. This method is a solution for total optimization that covers design, MDP, mask production and silicon device production, and is therefore regarded as a strategic DFM approach in semiconductor metrology.

  7. Statistical optimization of process parameters for lipase-catalyzed synthesis of triethanolamine-based esterquats using response surface methodology in 2-liter bioreactor.

    PubMed

    Masoumi, Hamid Reza Fard; Basri, Mahiran; Kassim, Anuar; Abdullah, Dzulkefly Kuang; Abdollahi, Yadollah; Abd Gani, Siti Salwa; Rezaee, Malahat

    2013-01-01

    Lipase-catalyzed production of a triethanolamine-based esterquat by esterification of oleic acid (OA) with triethanolamine (TEA) in n-hexane was performed in a 2 L stirred-tank reactor. A set of experiments was designed by central composite design for process modeling and statistical evaluation of the findings. Five independent process variables, including enzyme amount, reaction time, reaction temperature, substrate molar ratio of OA to TEA, and agitation speed, were studied under the given conditions designed by the Design Expert software. Experimental data were examined for normality before the data processing stage, and skewness and kurtosis indices were determined. The mathematical model developed was found to be adequate and statistically accurate in predicting the optimum conversion of product. Response surface methodology with central composite design gave the best performance in this study, and the methodology as a whole has proven to be adequate for the design and optimization of the enzymatic process.

  8. Risk-based process safety assessment and control measures design for offshore process facilities.

    PubMed

    Khan, Faisal I; Sadiq, Rehan; Husain, Tahir

    2002-09-02

    Process operation is the most hazardous activity, next to transportation and drilling operations, on an offshore oil and gas (OOG) platform. Past experience of onshore and offshore oil and gas activities has revealed that a small mishap in process operation might escalate into a catastrophe. This is of special concern on an OOG platform due to the limited space and compact geometry of the process area, reduced ventilation, and difficult escape routes. On an OOG platform, each extra control measure that is implemented not only occupies space on the platform and increases congestion but also adds extra load to the platform. Eventualities in OOG platform process operation can be avoided by incorporating the appropriate control measures at the early design stage. In this paper, the authors describe a methodology for risk-based process safety decision making for OOG activities. The methodology is applied to various offshore process units, that is, the compressor, separators, flash drum and driers of an OOG platform. Based on the risk potential, appropriate safety measures are designed for each unit. This paper also illustrates that implementation of the designed safety measures reduces the high fatal accident rate (FAR) values to an acceptable level.

  9. Designing for Diverse Learning: Case Study of Place-Based Learning in Design and Technologies Pre-Service Teacher Education

    ERIC Educational Resources Information Center

    Best, Marnie; MacGregor, Denise; Price, Deborah

    2017-01-01

    Place-based learning experiences in Design and Technologies education connect people and place with design processes and products. Drawing on place-based learning, this case study shares the experiences of eight final year pre-service Design and Technologies education students from the University of South Australia as they collaborated with…

  10. Risk based decision tool for space exploration missions

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila; Cornford, Steve; Moran, Terrence

    2003-01-01

    This paper presents an approach and corresponding tool to assess and analyze the risks involved in a mission during the pre-phase A design process. This approach is based on creating a risk template for each subsystem expert involved in the mission design process and defining appropriate interactions between the templates.

  11. Fragment-based design of kinase inhibitors: a practical guide.

    PubMed

    Erickson, Jon A

    2015-01-01

    Fragment-based drug design has become an important strategy for drug design and development over the last decade. It has been used with particular success in the development of kinase inhibitors, which are one of the most widely explored classes of drug targets today. The application of fragment-based methods to discovering and optimizing kinase inhibitors can be a complicated and daunting task; however, a general process has emerged that has been highly fruitful. Here, a practical outline of the fragment-based process used in kinase inhibitor design and development is laid out with specific examples. A guide to the overall process is provided, from initial discovery through fragment screening, including the difficulties in detection, to the computational methods available for optimizing the discovered fragments.

  12. Cognitive Activity-based Design Methodology for Novice Visual Communication Designers

    ERIC Educational Resources Information Center

    Kim, Hyunjung; Lee, Hyunju

    2016-01-01

    The notion of design thinking is becoming more concrete nowadays, as design researchers and practitioners study the thinking processes involved in design and employ the concept of design thinking to foster better solutions to complex and ill-defined problems. The goal of the present research is to develop a cognitive activity-based design…

  13. Power management and distribution considerations for a lunar base

    NASA Technical Reports Server (NTRS)

    Kenny, Barbara H.; Coleman, Anthony S.

    1991-01-01

    Design philosophies and technology needs for the power management and distribution (PMAD) portion of a lunar base power system are discussed. A process is described whereby mission planners may proceed from a knowledge of the PMAD functions and mission performance requirements to a definition of design options and technology needs. Current research efforts at the NASA LRC to meet the PMAD system needs for a lunar base are described. Based on the requirements, the lunar base PMAD is seen as best accomplished by a utility-like system, although with some additional demands, including autonomous operation and scheduling and accurate, predictive modeling during the design process.

  14. Process-based Cost Estimation for Ramjet/Scramjet Engines

    NASA Technical Reports Server (NTRS)

    Singh, Brijendra; Torres, Felix; Nesman, Miles; Reynolds, John

    2003-01-01

    Process-based cost estimation plays a key role in effecting cultural change that integrates distributed science, technology and engineering teams to rapidly create innovative and affordable products. Working together, NASA Glenn Research Center and Boeing Canoga Park have developed a methodology of process-based cost estimation bridging the methodologies of high-level parametric models and detailed bottom-up estimation. The NASA GRC/Boeing CP process-based cost model provides a probabilistic structure of layered cost drivers. High-level inputs characterize mission requirements, system performance, and relevant economic factors. Design alternatives are extracted from a standard, product-specific work breakdown structure to pre-load lower-level cost driver inputs and generate the cost-risk analysis. As the product design progresses and matures, the lower-level, more detailed cost drivers can be revisited and the projected variation of input values narrowed, thereby generating a progressively more accurate estimate of cost-risk. Incorporated into the process-based cost model are techniques for decision analysis, specifically the analytic hierarchy process (AHP) and functional utility analysis. Design alternatives may then be evaluated not just on cost-risk, but also on user-defined performance and schedule criteria. This implementation of full trade-study support contributes significantly to the realization of the integrated development environment. The process-based cost estimation model generates development and manufacturing cost estimates. The development team plans to expand the manufacturing process base from approximately 80 manufacturing processes to over 250 processes. Operation and support cost modeling is also envisioned. Process-based estimation considers the materials, resources, and processes in establishing cost-risk and, rather than depending on weight as an input, actually estimates weight along with cost and schedule.
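
    The analytic hierarchy process step mentioned above reduces to an eigenvector computation on a pairwise-comparison matrix. The sketch below is a generic illustration of that calculation only; the alternatives being compared and the judgment values are hypothetical and are not taken from the NASA GRC/Boeing CP model.

      import numpy as np

      def ahp_weights(pairwise):
          """Priority weights from a pairwise-comparison matrix via the
          principal eigenvector, plus Saaty's consistency ratio (CR)."""
          A = np.asarray(pairwise, dtype=float)
          n = A.shape[0]
          eigvals, eigvecs = np.linalg.eig(A)
          k = np.argmax(eigvals.real)
          w = np.abs(eigvecs[:, k].real)
          w /= w.sum()                                    # normalised priorities
          ci = (eigvals[k].real - n) / (n - 1)            # consistency index
          ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.12)   # random index
          return w, ci / ri

      # Hypothetical comparison of three design alternatives on one criterion
      # (Saaty 1-9 scale; A[i][j] = importance of alternative i relative to j).
      A = [[1, 3, 5],
           [1 / 3, 1, 2],
           [1 / 5, 1 / 2, 1]]
      w, cr = ahp_weights(A)
      print("priorities:", np.round(w, 3), " consistency ratio:", round(cr, 3))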

  15. Elementary teachers' mental models of engineering design processes: A comparison of two communities of practice

    NASA Astrophysics Data System (ADS)

    McMahon, Ann P.

    Educating K-12 students in the processes of design engineering is gaining popularity in public schools. Several states have adopted standards for engineering design despite the fact that no common agreement exists on what should be included in the K-12 engineering design process. Furthermore, little pre-service and in-service professional development exists that will prepare teachers to teach a design process that is fundamentally different from the science teaching process found in typical public schools. This study provides a glimpse into what teachers think happens in engineering design compared to articulated best practices in engineering design. Wenger's communities of practice work and van Dijk's multidisciplinary theory of mental models provide the theoretical bases for comparing the mental models of two groups of elementary teachers (one group that teaches engineering and one that does not) to the mental models of design engineers (including this engineer/researcher/educator and professionals described elsewhere). The elementary school teachers and this engineer/researcher/educator observed the design engineering process enacted by professionals, then answered questions designed to elicit their mental models of the process they saw in terms of how they would teach it to elementary students. The key finding is this: Both groups of teachers embedded the cognitive steps of the design process into the matrix of the social and emotional roles and skills of students. Conversely, the engineers embedded the social and emotional aspects of the design process into the matrix of the cognitive steps of the design process. In other words, teachers' mental models show that they perceive that students' social and emotional communicative roles and skills in the classroom drive their cognitive understandings of the engineering process, while the mental models of this engineer/researcher/educator and the engineers in the video show that we perceive that cognitive understandings of the engineering process drive the social and emotional roles and skills used in that process. This comparison of mental models with the process that professional designers use defines a problem space for future studies that investigate how to incorporate engineering practices into elementary classrooms. Recommendations for engineering curriculum development and teacher professional development based on this study are presented.

  16. Development of Multi-slice Analytical Tool to Support BIM-based Design Process

    NASA Astrophysics Data System (ADS)

    Atmodiwirjo, P.; Johanes, M.; Yatmo, Y. A.

    2017-03-01

    This paper describes the ongoing development of a computational tool to analyse architecture and interior space based on a multi-slice representation approach that is integrated with Building Information Modelling (BIM). Architecture and interior space are experienced as dynamic entities whose spatial properties may vary from one part of the space to another; therefore, the representation of space through standard architectural drawings is sometimes not sufficient. The representation of space as a series of slices with certain properties in each slice becomes important, so that the different characteristics in each part of the space can inform the design process. The analytical tool is developed for use as a stand-alone application that utilises data exported from a generic BIM modelling tool. The tool would be useful in assisting design development processes that apply BIM, particularly for the design of architecture and interior spaces that are experienced as continuous spaces. The tool allows the identification of how the spatial properties change dynamically throughout the space and allows the prediction of potential design problems. Integrating the multi-slice analytical tool into a BIM-based design process could thereby assist architects in generating better designs and avoiding unnecessary costs that are often caused by failure to identify problems during design development stages.

  17. A SAS-based solution to evaluate study design efficiency of phase I pediatric oncology trials via discrete event simulation.

    PubMed

    Barrett, Jeffrey S; Jayaraman, Bhuvana; Patel, Dimple; Skolnik, Jeffrey M

    2008-06-01

    Previous exploration of oncology study design efficiency has focused on Markov processes alone (probability-based events) without consideration for time dependencies. Barriers to study completion include time delays associated with patient accrual, inevaluability (IE), time to dose limiting toxicities (DLT) and administrative and review time. Discrete event simulation (DES) can incorporate probability-based assignment of DLT and IE frequency, correlated with cohort in the case of DLT, with time-based events defined by stochastic relationships. A SAS-based solution to examine study efficiency metrics and evaluate design modifications that would improve study efficiency is presented. Virtual patients are simulated with attributes defined from prior distributions of relevant patient characteristics. Study population datasets are read into SAS macros which select patients and enroll them into a study based on the specific design criteria if the study is open to enrollment. Waiting times, arrival times and time to study events are also sampled from prior distributions; post-processing of study simulations is provided within the decision macros and compared across designs in a separate post-processing algorithm. This solution is examined via comparison of the standard 3+3 decision rule relative to the "rolling 6" design, a newly proposed enrollment strategy for the phase I pediatric oncology setting.
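
    The authors' implementation is a set of SAS macros; the sketch below is not that code but a simplified Python rendering of the same discrete-event idea, combining exponential accrual delays, a fixed DLT observation window, and standard 3+3 escalation rules to estimate study duration by Monte Carlo. The accrual rate, evaluation window, and dose-level DLT probabilities are hypothetical.

      import numpy as np

      rng = np.random.default_rng(0)

      def simulate_3plus3(p_dlt, accrual_rate=0.1, eval_window=28.0):
          """Simplified 3+3 trial timeline: returns (MTD index, patients, days).
          p_dlt: DLT probability per dose level; accrual_rate: patients/day."""
          t, dose, n_patients = 0.0, 0, 0
          while dose < len(p_dlt):
              dlts = 0
              for stage in (1, 2):                       # initial cohort, optional expansion
                  t += rng.exponential(1.0 / accrual_rate, size=3).sum()   # accrue 3 patients
                  n_patients += 3
                  t += eval_window                       # observe the DLT window
                  dlts += rng.binomial(3, p_dlt[dose])
                  if stage == 1 and dlts == 0:
                      break                              # 0/3 DLTs -> escalate
                  if dlts >= 2:
                      return dose - 1, n_patients, t     # >=2 DLTs -> MTD is previous dose (-1: none)
                  if stage == 2:
                      break                              # <=1/6 DLTs -> escalate
              dose += 1
          return len(p_dlt) - 1, n_patients, t           # highest dose level was tolerated

      # Monte Carlo over many virtual trials with hypothetical DLT rates per dose
      results = [simulate_3plus3([0.05, 0.10, 0.25, 0.45]) for _ in range(2000)]
      print("median study duration (days):", np.median([r[2] for r in results]))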

  18. Model-Based PAT for Quality Management in Pharmaceuticals Freeze-Drying: State of the Art

    PubMed Central

    Fissore, Davide

    2017-01-01

    Model-based process analytical technologies can be used for the in-line control and optimization of a pharmaceutical freeze-drying process, as well as for the off-line design of the process, i.e., the identification of the optimal operating conditions. This paper aims at presenting the state of the art in this field, focusing particularly on three groups of systems, namely, those based on temperature measurement (i.e., the soft sensor), on chamber pressure measurement (i.e., the systems based on the pressure rise and pressure decrease tests), and on the sublimation flux estimate (i.e., tunable diode laser absorption spectroscopy and the valveless monitoring system). The application of these systems for in-line process optimization (e.g., using a model predictive control algorithm) and for achieving true quality by design (e.g., through the off-line calculation of the design space of the process) is presented and discussed. PMID:28224123

  19. An Open Computing Infrastructure that Facilitates Integrated Product and Process Development from a Decision-Based Perspective

    NASA Technical Reports Server (NTRS)

    Hale, Mark A.

    1996-01-01

    Computer applications for design have evolved rapidly over the past several decades, and significant payoffs are being achieved by organizations through reductions in design cycle times. These applications are overwhelmed by the requirements imposed during complex, open engineering systems design. Organizations are faced with a number of different methodologies, numerous legacy disciplinary tools, and a very large amount of data. Yet they are also faced with few interdisciplinary tools for design collaboration or methods for achieving the revolutionary product designs required to maintain a competitive advantage in the future. These organizations are looking for a software infrastructure that integrates current corporate design practices with newer simulation and solution techniques. Such an infrastructure must be robust to changes in both corporate needs and enabling technologies. In addition, this infrastructure must be user-friendly, modular and scalable. This need is the motivation for the research described in this dissertation. The research is focused on the development of an open computing infrastructure that facilitates product and process design. In addition, this research explicitly deals with human interactions during design through a model that focuses on the role of a designer as that of decision-maker. The research perspective here is taken from that of design as a discipline with a focus on Decision-Based Design, Theory of Languages, Information Science, and Integration Technology. Given this background, a Model of IPPD is developed and implemented along the lines of a traditional experimental procedure, with the steps of establishing context, formalizing a theory, building an apparatus, conducting an experiment, reviewing results, and providing recommendations. Based on this Model, Design Processes and Specification can be explored in a structured and implementable architecture. An architecture for exploring design called DREAMS (Developing Robust Engineering Analysis Models and Specifications) has been developed which supports the activities of both meta-design and actual design execution. This is accomplished through a systematic process which comprises the stages of Formulation, Translation, and Evaluation. During this process, elements from a Design Specification are integrated into Design Processes. In addition, a software infrastructure was developed and is called IMAGE (Intelligent Multidisciplinary Aircraft Generation Environment). This represents a virtual apparatus in the Design Experiment conducted in this research. IMAGE is an innovative architecture because it explicitly supports design-related activities. This is accomplished through a GUI-driven and Agent-based implementation of DREAMS. An HSCT design has been adopted from the Framework for Interdisciplinary Design Optimization (FIDO) and is implemented in IMAGE. This problem shows how Design Processes and Specification interact in a design system. In addition, the problem utilizes two different solution models concurrently: optimal and satisfying. The satisfying model allows for more design flexibility and allows a designer to maintain design freedom. As a result of following this experimental procedure, this infrastructure is an open system that is robust to changes in both corporate needs and computer technologies.
The development of this infrastructure leads to a number of significant intellectual contributions: 1) A new approach to implementing IPPD with the aid of a computer; 2) A formal Design Experiment; 3) A combined Process and Specification architecture that is language-based; 4) An infrastructure for exploring design; 5) An integration strategy for implementing computer resources; and 6) A seamless modeling language. The need for these contributions is emphasized by the demand by industry and government agencies for the development of these technologies.

  20. Propellant injection systems and processes

    NASA Technical Reports Server (NTRS)

    Ito, Jackson I.

    1995-01-01

    The previous 'Art of Injector Design' is maturing and merging with the more systematic 'Science of Combustion Device Analysis.' This technology can be based upon observation, correlation, experimentation and, ultimately, analytical modeling based upon basic engineering principles. This methodology is more systematic and far superior to the historical injector design process of 'Trial and Error' or blindly 'Copying Past Successes.' The benefit of such an approach is the ability to rank candidate design concepts by relative probability of success or technical risk across all the important combustion device design requirements and combustion process development risk categories before committing to an engine development program. Even if a single analytical design concept cannot be developed that is predicted to satisfy all requirements simultaneously, a series of risk-mitigating key enabling technologies can be identified for early resolution. Lower-cost subscale or laboratory experimentation to demonstrate proof of principle, critical instrumentation requirements, and design-discriminating test plans can be developed based on the physical insight provided by these analyses.

  1. Feasibility study of an Integrated Program for Aerospace vehicle Design (IPAD). Volume 1A: Summary

    NASA Technical Reports Server (NTRS)

    Miller, R. E., Jr.; Redhed, D. D.; Kawaguchi, A. S.; Hansen, S. D.; Southall, J. W.

    1973-01-01

    IPAD was defined as a total system oriented to the product design process. This total system was designed to recognize the product design process, individuals and their design process tasks, and the computer-based IPAD System to aid product design. Principal elements of the IPAD System include the host computer and its interactive system software, new executive and data management software, and an open-ended IPAD library of technical programs to match the intended product design process. The basic goal of the IPAD total system is to increase the productivity of the product design organization. Increases in individual productivity were feasible through automation and computer support of routine information handling. Such proven automation can directly decrease cost and flowtime in the product design process.

  2. Loads and low frequency dynamics - An ENVIRONET data base

    NASA Technical Reports Server (NTRS)

    Garba, John A.

    1988-01-01

    The loads and low frequency dynamics data base, part of Environet, is described with particular attention given to its development and contents. The objective of the data base is to provide the payload designer with design approaches and design data to meet STS safety requirements. Currently the data base consists of the following sections: abstract, scope, glossary, requirements, interaction with other environments, summary of the loads analysis process, design considerations, guidelines for payload design loads, information data base, and references.

  3. Process Analytical Technology (PAT): batch-to-batch reproducibility of fermentation processes by robust process operational design and control.

    PubMed

    Gnoth, S; Jenzsch, M; Simutis, R; Lübbert, A

    2007-10-31

    The Process Analytical Technology (PAT) initiative of the FDA is a reaction to the increasing discrepancy between the current possibilities in process supervision and control of pharmaceutical production processes and their current application in industrial manufacturing. With rigid approval practices based on standard operational procedures, adaptation of production reactors towards the state of the art was more or less inhibited for many years. Now PAT paves the way for continuous process and product improvements through improved process supervision based on knowledge-based data analysis, "Quality-by-Design" concepts, and, finally, feedback control. Examples of up-to-date implementations of this concept are presented. They are taken from one key group of processes in recombinant pharmaceutical protein manufacturing, the cultivation of genetically modified Escherichia coli bacteria.

  4. Shedding Light on Engineering Design

    ERIC Educational Resources Information Center

    Capobianco, Brenda M.; Nyquist, Chell; Tyrie, Nancy

    2013-01-01

    This article describes the steps incorporated to teach an engineering design process in a fifth-grade science classroom. The engineering design-based activity built on an existing scientific inquiry activity using UV light-detecting beads, purposefully creating a series of engineering design-based challenges around the investigation. The…

  5. Silicon production process evaluations

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Chemical engineering analyses involving the preliminary process design of a plant (1,000 metric tons/year capacity) to produce silicon via the technology under consideration were accomplished. Major activities in the chemical engineering analyses included base case conditions, reaction chemistry, process flowsheet, material balance, energy balance, property data, equipment design, major equipment list, and production labor, with the results carried forward for economic analysis. The process design package provided detailed data for the raw materials, utilities, major process equipment and production labor requirements necessary for polysilicon production in each process.

  6. A New Automated Design Method Based on Machine Learning for CMOS Analog Circuits

    NASA Astrophysics Data System (ADS)

    Moradi, Behzad; Mirzaei, Abdolreza

    2016-11-01

    A new simulation-based automated CMOS analog circuit design method, which applies a multi-objective non-Darwinian evolutionary algorithm based on the Learnable Evolution Model (LEM), is proposed in this article. The multi-objective property of this automated design of CMOS analog circuits is governed by a modified Strength Pareto Evolutionary Algorithm (SPEA) incorporated in the LEM algorithm presented here. LEM includes a machine learning method, such as decision trees, that makes a distinction between high- and low-fitness areas in the design space. The learning process can detect promising directions of evolution and lead to large steps in the evolution of the individuals. The learning phase shortens the evolution process and yields a remarkable reduction in the number of individual evaluations. The expert designer's circuit knowledge is applied in the design process in order to reduce the design space as well as the design time. Circuit evaluation is performed by the HSPICE simulator. In order to improve design accuracy, the bsim3v3 CMOS transistor model is adopted in the proposed design method. The proposed design method is tested on three different operational amplifier circuits, and its performance is verified by comparison with the evolutionary strategy algorithm and other similar methods.
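
    The core LEM idea, learning a classifier that separates high-fitness from low-fitness designs and using it to steer new candidates, can be illustrated in a few lines. The toy below is not the authors' SPEA/HSPICE flow: the fitness function stands in for a simulator call, and the parameter ranges and population settings are hypothetical.

      import numpy as np
      from sklearn.tree import DecisionTreeClassifier

      rng = np.random.default_rng(1)

      def fitness(x):
          """Hypothetical stand-in for a circuit performance metric (higher is
          better); a real flow would evaluate each candidate with HSPICE."""
          return -np.sum((x - 0.3) ** 2, axis=1)

      dim, pop_size = 6, 60                      # e.g. six normalised sizing parameters
      pop = rng.uniform(0, 1, (pop_size, dim))

      for gen in range(20):
          f = fitness(pop)
          hi, lo = np.quantile(f, 0.7), np.quantile(f, 0.3)
          X = np.vstack([pop[f >= hi], pop[f <= lo]])
          y = np.r_[np.ones((f >= hi).sum()), np.zeros((f <= lo).sum())]
          tree = DecisionTreeClassifier(max_depth=3).fit(X, y)   # learn high- vs low-fitness regions

          cand = rng.uniform(0, 1, (5 * pop_size, dim))          # propose random candidates
          good = cand[tree.predict(cand) == 1][:pop_size // 2]   # keep those predicted high-fitness
          keep = pop[np.argsort(f)[-(pop_size - len(good)):]]    # elitism for the rest
          pop = np.vstack([keep, good])

      print("best fitness found:", fitness(pop).max())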

  7. Cognitive Process as a Basis for Intelligent Retrieval Systems Design.

    ERIC Educational Resources Information Center

    Chen, Hsinchun; Dhar, Vasant

    1991-01-01

    Two studies of the cognitive processes involved in online document-based information retrieval were conducted. These studies led to the development of five computational models of online document retrieval which were incorporated into the design of an "intelligent" document-based retrieval system. Both the system and the broader implications of…

  8. A Team-Based Process for Designing Comprehensive, Integrated, Three-Tiered (CI3T) Models of Prevention: How Does My School-Site Leadership Team Design a CI3T Model?

    ERIC Educational Resources Information Center

    Lane, Kathleen Lynne; Oakes, Wendy Peia; Jenkins, Abbie; Menzies, Holly Mariah; Kalberg, Jemma Robertson

    2014-01-01

    Comprehensive, integrated, three-tiered models are context specific and developed by school-site teams according to the core values held by the school community. In this article, the authors provide a step-by-step, team-based process for designing comprehensive, integrated, three-tiered models of prevention that integrate academic, behavioral, and…

  9. Silicon production process evaluations

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The chemical engineering analysis of the preliminary process design of a process for producing solar cell grade silicon from dichlorosilane is presented. A plant to produce 1,000 MT/yr of silicon is analyzed. Progress and status for the plant design are reported for the primary activities of base case conditions (60 percent), reaction chemistry (50 percent), process flow diagram (35 percent), energy balance (10 percent), property data (10 percent) and equipment design (5 percent).

  10. Rethinking behavioral health processes by using design for six sigma.

    PubMed

    Lucas, Anthony G; Primus, Kelly; Kovach, Jamison V; Fredendall, Lawrence D

    2015-02-01

    Clinical evidence-based practices are strongly encouraged and commonly utilized in the behavioral health community. However, evidence-based practices that are related to quality improvement processes, such as Design for Six Sigma, are often not used in behavioral health care. This column describes the unique partnership formed between a behavioral health care provider in the greater Pittsburgh area, a nonprofit oversight and monitoring agency for behavioral health services, and academic researchers. The authors detail how the partnership used the multistep process outlined in Design for Six Sigma to completely redesign the provider's intake process. Implementation of the redesigned process increased access to care, decreased bad debt and uncollected funds, and improved cash flow--while consumer satisfaction remained high.

  11. Strategies for Success: Uncovering What Makes Students Successful in Design and Learning

    ERIC Educational Resources Information Center

    Apedoe, Xornam S.; Schunn, Christian D.

    2013-01-01

    While the purposes of design and science are often different, they share some key practices and processes. Design-based science learning, which combines the processes of engineering design with scientific inquiry, is one attempt to engage students in scientific reasoning via solving practical problems. Although research suggests that engaging…

  12. Model-based verification and validation of the SMAP uplink processes

    NASA Astrophysics Data System (ADS)

    Khan, M. O.; Dubos, G. F.; Tirona, J.; Standley, S.

    Model-Based Systems Engineering (MBSE) is being used increasingly within the spacecraft design community because of its benefits when compared to document-based approaches. As the complexity of projects expands dramatically with continually increasing computational power and technology infusion, the time and effort needed for verification and validation (V&V) increase geometrically. Using simulation to perform design validation with system-level models earlier in the life cycle stands to bridge the gap between design of the system (based on system-level requirements) and verifying those requirements/validating the system as a whole. This case study stands as an example of how a project can validate a system-level design earlier in the project life cycle than traditional V&V processes by using simulation on a system model. Specifically, this paper describes how simulation was added to a system model of the Soil Moisture Active-Passive (SMAP) mission's uplink process. Also discussed are the advantages and disadvantages of the methods employed and the lessons learned, which are intended to benefit future model-based and simulation-based development efforts.

  13. Improving mixing efficiency of a polymer micromixer by use of a plastic shim divider

    NASA Astrophysics Data System (ADS)

    Li, Lei; Lee, L. James; Castro, Jose M.; Yi, Allen Y.

    2010-03-01

    In this paper, a critical modification to a polymer-based, affordable split-and-recombination static micromixer is described. To evaluate the improvement, both the original and the modified design were carefully investigated using an experimental setup and a numerical modeling approach. The structure of the micromixer was designed to take advantage of the process capabilities of both ultraprecision micromachining and microinjection molding. Specifically, the original and the modified design were numerically simulated using the commercial finite element software ANSYS CFX to assist the redesign of the micromixer. The simulation results show that both designs are capable of performing mixing, while the modified design has a much improved performance. Mixing experiments with two different fluids, carried out using the original and the modified mixers, again showed significantly improved mixing uniformity for the latter. The measured mixing coefficient for the original design was 0.11, and for the improved design it was 0.065. The developed manufacturing process, based on ultraprecision machining and microinjection molding for device fabrication, has the advantages of high dimensional precision, low cost and manufacturing flexibility.
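
    The abstract does not define the mixing coefficient; one common choice is the coefficient of variation of the tracer concentration over an outlet cross-section, which is what the short sketch below computes from hypothetical sampled values (smaller means better mixed). The paper's exact metric and data may differ.

      import numpy as np

      def mixing_coefficient(c):
          """Coefficient-of-variation style mixing index: 0 means perfectly
          mixed, larger values mean poorer mixing."""
          c = np.asarray(c, dtype=float)
          return c.std() / c.mean()

      # Hypothetical normalised dye concentrations sampled across the outlet
      original_design = [0.95, 0.80, 0.62, 0.40, 0.30, 0.22, 0.55, 0.75]
      modified_design = [0.58, 0.52, 0.49, 0.47, 0.51, 0.46, 0.50, 0.53]
      print(round(mixing_coefficient(original_design), 3),
            round(mixing_coefficient(modified_design), 3))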

  14. Design and fabrication of a 1-DOF drive mode and 2-DOF sense mode micro-gyroscope using SU-8 based UV-LIGA process

    NASA Astrophysics Data System (ADS)

    Verma, Payal; Juneja, Sucheta; Savelyev, Dmitry A.; Khonina, Svetlana N.; Gopal, Ram

    2016-04-01

    This paper presents the design and fabrication of a 1-DOF (degree-of-freedom) drive mode and 2-DOF sense mode micro-gyroscope. It is an inherently robust structure and offers a high sense-mode frequency bandwidth. The proposed design utilizes resonance of the 1-DOF drive mode oscillator and employs a dynamic amplification concept in the sense modes to increase the sensitivity while maintaining robustness. The 2-DOF in the sense direction renders the device immune to process imperfections and environmental effects. The design is simulated using FEA software (CoventorWare®). The device is designed considering process compatibility with the SU-8 based UV-LIGA process, which is an economical fabrication technique. The complete fabrication process is presented along with SEM images of the fabricated device. The device has 9 µm thick nickel as the key structural layer, with an overall reduced key structure size of 2.2 mm by 2.1 mm.

  15. Design Process-System and Methodology of Design Research

    NASA Astrophysics Data System (ADS)

    Bashier, Fathi

    2017-10-01

    Studies have recognized the failure of the traditional design approach both in practice and in the studio. They showed that design problems today are too complex for the traditional approach to cope with and reflected a new interest in better quality design services in order to meet the challenges of our time. In the mid-1970s and early 1980s, there was a significant shift in focus within the field of design research towards the aim of creating a ‘design discipline’. The problem, as will be discussed, is the lack of an integrated theory of design knowledge that can explicitly describe the design process in a coherent way. As a consequence, the traditional approach fails to operate systematically, in a disciplinary manner. Addressing this problem is the primary goal of the research study in the design process currently being conducted in the research-based master studio at Wollega University, Ethiopia. The research study seeks to make a contribution towards a disciplinary approach through proper understanding of the mechanism of knowledge development within design process systems. This is the task of the ‘theory of design knowledge’. In this article the research project is introduced, and a model of the design process-system is developed in the studio as a research plan and a tool of design research at the same time. Based on data drawn from students’ research projects, the theory of design knowledge is developed and empirically verified through the research project.

  16. Centralized PI control for high dimensional multivariable systems based on equivalent transfer function.

    PubMed

    Luan, Xiaoli; Chen, Qiang; Liu, Fei

    2014-09-01

    This article presents a new scheme to design a full-matrix controller for high-dimensional multivariable processes based on equivalent transfer functions (ETF). Differing from existing ETF methods, the proposed ETF is derived directly by exploiting the relationship between the equivalent closed-loop transfer function and the inverse of the open-loop transfer function. Based on the obtained ETF, the full-matrix controller is designed utilizing existing PI tuning rules. The newly proposed ETF model can more accurately represent the original processes. Furthermore, the full-matrix centralized controller design method proposed in this paper is applicable to high-dimensional multivariable systems with satisfactory performance. Comparison with other multivariable controllers shows that the designed ETF-based controller is superior with respect to design complexity and achieved performance.
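
    The paper's derivation is not reproduced in the abstract; as a rough numerical illustration of the idea that the equivalent transfer function is tied to the inverse of the open-loop transfer function, the sketch below works only with a hypothetical steady-state gain matrix, takes the (i, j) equivalent gain as the reciprocal of the (j, i) element of the inverse gain matrix, and assigns purely illustrative PI gains from it. None of the numbers or tuning choices come from the article.

      import numpy as np

      # Hypothetical 3x3 steady-state gain matrix of a multivariable process
      G0 = np.array([[ 2.0, -0.5,  0.3],
                     [ 0.4,  1.5, -0.6],
                     [-0.2,  0.7,  1.8]])

      # Steady-state equivalent gains: reciprocal of the transposed inverse
      # of the open-loop gain matrix (an approximation, not the paper's ETF).
      ETF0 = 1.0 / np.linalg.inv(G0).T

      lam = 2.0                      # hypothetical closed-loop speed parameter
      Kp = 1.0 / (lam * ETF0)        # illustrative proportional gain matrix
      Ki = Kp / 10.0                 # illustrative integral gain matrix (fixed reset ratio)

      print(np.round(ETF0, 3))
      print(np.round(Kp, 3))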

  17. Pattern centric design based sensitive patterns and process monitor in manufacturing

    NASA Astrophysics Data System (ADS)

    Hsiang, Chingyun; Cheng, Guojie; Wu, Kechih

    2017-03-01

    As design rules migrate to smaller dimensions, process variation requirements become tighter than ever and challenge the limits of device yield. Masks, lithography, etching and other processes have to meet very tight specifications in order to keep defects and CD within the margins of the process window. Conventionally, inspection and metrology equipment is utilized to monitor and control wafer quality in-line. In high-throughput optical inspection, nuisance filtering and review classification become tedious, labor-intensive jobs in manufacturing. Certain high-resolution SEM images are taken to validate defects after optical inspection. These high-resolution SEM images capture not only the point highlighted by optical inspection but also its surrounding patterns. However, this pattern information is not well utilized in conventional quality control methods. Using complementary design-based pattern monitoring not only monitors and analyzes the variation of pattern sensitivity but also reduces nuisance and highlights defective patterns or killer defects. After grouping, in either single or multiple layers, systematic defects can be identified quickly in this flow. In this paper, we applied design-based pattern monitoring in different layers to monitor the impact of process variation on all kinds of patterns. First, the contour of the high-resolution SEM image is extracted and aligned to the design with offset adjustment and fine alignment [1]. Second, specified pattern rules are applied to the design clip area, the same size as the SEM image, to form POI (pattern of interest) areas. Third, the discrepancy between contour and design is measured for different pattern types in measurement blocks. Fourth, defective patterns are reported according to discrepancy detection criteria and pattern grouping [4]; reported pattern defects are ranked by count and by discrepancy severity. In this step, process-sensitive, highly repeatable systematic defects can be identified quickly. Through this design-based process pattern monitoring method, most optical inspection nuisances can be filtered out at the contour-to-design discrepancy measurement. Daily analysis results are stored in a database as references to compare with incoming data. The defective pattern library contains existing and known systematic defect patterns, which helps to catch and identify new pattern defects or process impacts. On the other hand, this defect pattern library provides extra valuable information for mask, pattern and defect verification, inspection care area generation, further OPC fixes, and process enhancement and investigation.
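
    The contour-to-design discrepancy measurement in the third step can be pictured as a nearest-edge distance check: for each extracted contour point, measure its distance to the closest design edge and flag the block if a tolerance is exceeded. The sketch below is only that picture, not the commercial tool's algorithm; the design rectangle, contour points, and tolerance are hypothetical.

      import numpy as np

      def point_segment_distance(p, a, b):
          """Distance from point p to segment a-b (2D arrays, nm)."""
          ab, ap = b - a, p - a
          t = np.clip(np.dot(ap, ab) / np.dot(ab, ab), 0.0, 1.0)
          return np.linalg.norm(p - (a + t * ab))

      def contour_discrepancy(contour_pts, design_edges):
          """Per-point distance from a SEM contour to the nearest design edge."""
          return np.array([min(point_segment_distance(p, a, b) for a, b in design_edges)
                           for p in contour_pts])

      # Hypothetical design polygon: a 100 nm x 60 nm rectangle given as edges
      corners = [np.array(v, float) for v in [(0, 0), (100, 0), (100, 60), (0, 60)]]
      design_edges = [(corners[i], corners[(i + 1) % 4]) for i in range(4)]

      # Hypothetical aligned contour points extracted from the SEM image
      contour = np.array([[2.0, 30.0], [50.0, -3.5], [98.0, 31.0], [49.0, 62.5]])

      d = contour_discrepancy(contour, design_edges)
      tolerance_nm = 3.0
      print("max discrepancy (nm):", d.max(), "-> flagged:", bool((d > tolerance_nm).any()))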

  18. Software synthesis using generic architectures

    NASA Technical Reports Server (NTRS)

    Bhansali, Sanjay

    1993-01-01

    A framework for synthesizing software systems based on abstracting software system designs and the design process is described. The result of such an abstraction process is a generic architecture and the process knowledge for customizing the architecture. The customization process knowledge is used to assist a designer in customizing the architecture, as opposed to completely automating the design of systems. Our approach is illustrated using an implemented example of a generic tracking architecture which was customized in two different domains. How the designs produced using KASE compare to the original designs of the two systems, along with current work and plans for extending KASE to other application areas, is also described.

  19. DFM flow by using combination between design based metrology system and model based verification at sub-50nm memory device

    NASA Astrophysics Data System (ADS)

    Kim, Cheol-kyun; Kim, Jungchan; Choi, Jaeseung; Yang, Hyunjo; Yim, Donggyu; Kim, Jinwoong

    2007-03-01

    As the minimum transistor length gets smaller, the variation and uniformity of transistor length seriously affect device performance. Therefore, the importance of optical proximity correction (OPC) and resolution enhancement technology (RET) cannot be overemphasized. However, the OPC process is regarded by some as a necessary evil for device performance. In fact, every group, including process and design, is interested in the whole-chip CD variation trend and CD uniformity, which represent the real wafer. Recently, design-based metrology systems have become capable of detecting differences between the design database and the wafer SEM image. Design-based metrology systems are able to extract information on whole-chip CD variation. According to the results, OPC abnormalities were identified and design feedback items were also disclosed. Other approaches are provided by EDA companies, such as model-based OPC verification. Model-based verification is performed for the full-chip area using a well-calibrated model. The objective of model-based verification is the prediction of potential weak points on the wafer and fast feedback to OPC and design before reticle fabrication. In order to achieve a robust design and sufficient device margin, an appropriate combination of a design-based metrology system and model-based verification tools is very important. Therefore, we evaluated a design-based metrology system and a matched model-based verification system to find the optimum combination of the two. In our study, a huge amount of data from wafer results is classified and analyzed by statistical methods and sorted into OPC feedback and design feedback items. Additionally, a novel DFM flow is proposed based on the combination of design-based metrology and model-based verification tools.

  20. Comparative effectiveness of colony-stimulating factors in febrile neutropenia prophylaxis: how results are affected by research design.

    PubMed

    Henk, Henry J; Li, Xiaoyan; Becker, Laura K; Xu, Hairong; Gong, Qi; Deeter, Robert G; Barron, Richard L

    2015-01-01

    To examine the impact of research design on results in two published comparative effectiveness studies. Guidelines for comparative effectiveness research have recommended incorporating the disease process into study design. Based on the recommendations, we develop a checklist of considerations and apply the checklist in a review of two published studies on the comparative effectiveness of colony-stimulating factors. Both studies used similar administrative claims data but different methods, which resulted in directionally different estimates. Major design differences between the two studies include whether the timing of the intervention in the disease process was identified and whether the study cohort and outcome assessment period were defined based on this temporal relationship. Disease process and timing of intervention should be incorporated into the design of comparative effectiveness studies.

  1. Innovating Method of Existing Mechanical Product Based on TRIZ Theory

    NASA Astrophysics Data System (ADS)

    Zhao, Cunyou; Shi, Dongyan; Wu, Han

    The main ways of product development are adaptive design and variant design based on existing products. In this paper, a conceptual design framework and its flow model for innovating products are put forward by combining conceptual design methods and TRIZ theory. A process system model of innovative design is constructed that includes requirement analysis, total function analysis and decomposition, engineering problem analysis, engineering problem solution finding, and preliminary design; this establishes the basis for the innovative design of existing products.

  2. The Changing Metropolitan Designation Process and Rural America

    ERIC Educational Resources Information Center

    Slifkin, Rebecca T.; Randolph, Randy; Ricketts, Thomas C.

    2004-01-01

    In June 2003, the Office of Management and Budget (OMB) released new county-based designations of Core Based Statistical Areas (CBSAs), replacing Metropolitan Statistical Area designations that were last revised in 1990. In this article, the new designations are briefly described, and counties that have changed classifications are identified.…

  3. Design of virtual simulation experiment based on key events

    NASA Astrophysics Data System (ADS)

    Zhong, Zheng; Zhou, Dongbo; Song, Lingxiu

    2018-06-01

    Considering the complex content of, and the lack of guidance in, virtual simulation experiments, key event technology from VR narrative theory was introduced into the virtual simulation experiment to enhance fidelity and vividness. Based on VR narrative technology, an event transition structure was designed to meet the needs of the experimental operation process, and an interactive event processing model was used to generate key events in the interactive scene. The experiment of "margin value of bees foraging", based on biological morphology, was taken as an example, and many objects, behaviors and other contents were reorganized. The result shows that this method can enhance the user's experience and ensure that the experimental process is complete and effective.

  4. Story-Based Pedagogical Agents: A Scaffolding Design Approach for the Process of Historical Inquiry in a Web-Based Self-Learning Environment

    ERIC Educational Resources Information Center

    Fujimoto, Toru

    2010-01-01

    The purpose of this research was to design and evaluate a web-based self-learning environment for historical inquiry embedded with different types of instructional support featuring story-based pedagogical agents. This research focused on designing a learning environment by integrating story-based instruction and pedagogical agents as a means to…

  5. Observer-based perturbation extremum seeking control with input constraints for direct-contact membrane distillation process

    NASA Astrophysics Data System (ADS)

    Eleiwi, Fadi; Laleg-Kirati, Taous Meriem

    2018-06-01

    An observer-based perturbation extremum seeking control is proposed for a direct-contact membrane distillation (DCMD) process. The process is described by a dynamic model based on a 2D advection-diffusion equation, with pump flow rates as process inputs. The objective of the controller is to optimise the trade-off between the permeate mass flux and the energy consumption of the pumps inside the process. Cases of single and multiple control inputs are considered through the use of only the feed pump flow rate or of both the feed and the permeate pump flow rates. A nonlinear Lyapunov-based observer is designed to provide an estimate of the temperature distribution over the designated domain of the DCMD process. Moreover, the control inputs are constrained with an anti-windup technique to remain within feasible, physical ranges. The performance of the proposed structure is analysed, and simulations based on real DCMD process parameters are provided for each control input.
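
    The perturbation-based extremum seeking loop at the heart of such a controller can be sketched on a toy static objective. The code below is not the authors' DCMD controller: the observer is omitted, the input constraint is reduced to simple clipping, and the objective (a flux benefit minus a pumping-energy cost as a function of feed flow rate), gains, and limits are all hypothetical.

      import numpy as np

      def J(u):
          """Hypothetical trade-off objective for feed flow rate u (L/min):
          permeate-flux benefit minus pumping-energy cost."""
          return 2.0 * np.sqrt(u) - 0.4 * u

      dt, a, omega, k = 0.05, 0.2, 5.0, 5.0      # step, perturbation amplitude/frequency, gain
      u_min, u_max = 0.5, 10.0                   # physical flow-rate limits
      u_hat, lp = 2.0, 0.0                       # parameter estimate, low-pass filter state

      for i in range(20000):
          t = i * dt
          u = np.clip(u_hat + a * np.sin(omega * t), u_min, u_max)   # perturbed, constrained input
          y = J(u)
          lp += dt * 5.0 * (y - lp)              # low-pass state; (y - lp) is the high-passed output
          grad_est = (y - lp) * np.sin(omega * t)                    # demodulation -> gradient estimate
          u_hat = np.clip(u_hat + dt * k * grad_est, u_min, u_max)   # integrate towards the optimum

      print("flow rate found:", round(u_hat, 2), "(analytic optimum is 6.25)")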

  6. On the design of henon and logistic map-based random number generator

    NASA Astrophysics Data System (ADS)

    Magfirawaty; Suryadi, M. T.; Ramli, Kalamullah

    2017-10-01

    The key sequence is one of the main elements of a cryptosystem. The True Random Number Generator (TRNG) method is one approach to generating the key sequence. The randomness sources of TRNGs are divided into three main groups, i.e. electrical-noise based, jitter based and chaos based. The chaos-based approach utilizes a nonlinear dynamic system (continuous time or discrete time) as an entropy source. In this study, a new design of TRNG based on a discrete-time chaotic system is proposed, which is then simulated in LabVIEW. The principle of the design consists of combining 2D and 1D chaotic systems. A mathematical model is implemented for numerical simulations. We used a comparator process as the harvesting method to obtain the series of random bits. Without any post-processing, the proposed design generated random bit sequences with high entropy values and passed all NIST 800-22 statistical tests.
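
    The abstract does not specify how the two maps are combined or how the comparator is defined, so the following is only an illustrative reading of the design: it iterates a Henon map (2D) and a logistic map (1D) and harvests one bit per step by comparing their normalized states. The map parameters, normalization, and seed are assumptions, and a purely deterministic software loop like this is a pseudo-random illustration rather than a true entropy source.

      import numpy as np

      def henon_logistic_bits(n_bits, seed=(0.1, 0.3, 0.4)):
          """Toy bit harvester combining a 2D Henon map and a 1D logistic map;
          a comparator on the two states yields one bit per iteration."""
          x, y, z = seed                          # Henon state (x, y) and logistic state z
          a, b, r = 1.4, 0.3, 3.99                # classical Henon and logistic parameters
          bits = np.empty(n_bits, dtype=np.uint8)
          for i in range(n_bits):
              x, y = 1.0 - a * x * x + y, b * x   # Henon map iteration
              z = r * z * (1.0 - z)               # logistic map iteration
              # comparator: map Henon x (roughly [-1.5, 1.5]) onto [0, 1] and compare with z
              bits[i] = 1 if (x + 1.5) / 3.0 > z else 0
          return bits

      bits = henon_logistic_bits(10000)
      print("fraction of ones:", bits.mean())     # a usable generator needs this close to 0.5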

  7. Exploring a Framework for Professional Development in Curriculum Innovation: Empowering Teachers for Designing Context-Based Chemistry Education

    NASA Astrophysics Data System (ADS)

    Stolk, Machiel J.; de Jong, Onno; Bulte, Astrid M. W.; Pilot, Albert

    2011-05-01

    Involving teachers in early stages of context-based curriculum innovations requires a professional development programme that actively engages teachers in the design of new context-based units. This study considers the implementation of a teacher professional development framework aiming to investigate processes of professional development. The framework is based on Galperin's theory of the internalisation of actions and it is operationalised into a professional development programme to empower chemistry teachers for designing new context-based units. The programme consists of the teaching of an educative context-based unit, followed by the designing of an outline of a new context-based unit. Six experienced chemistry teachers participated in the instructional meetings and practical teaching in their respective classrooms. Data were obtained from meetings, classroom discussions, and observations. The findings indicated that teachers became only partially empowered for designing a new context-based chemistry unit. Moreover, the process of professional development leading to teachers' empowerment was not carried out as intended. It is concluded that the elaboration of the framework needs improvement. The implications for a new programme are discussed.

  8. Effects of Spatial Experiences & Cognitive Styles in the Solution Process of Space-Based Design Problems in the First Year of Architectural Design Education

    ERIC Educational Resources Information Center

    Erkan Yazici, Yasemin

    2013-01-01

    There are many factors that influence designers in the architectural design process. Cognitive style, which varies according to the cognitive structure of persons, and spatial experience, which is created with spatial data acquired during life are two of these factors. Designers usually refer to their spatial experiences in order to find solutions…

  9. Robust Design of Sheet Metal Forming Process Based on Kriging Metamodel

    NASA Astrophysics Data System (ADS)

    Xie, Yanmin

    2011-08-01

    Nowadays, sheet metal forming process design is not a trivial task due to the complex issues to be taken into account (conflicting design goals, forming of complex shapes, and so on). Optimization methods have been widely applied in sheet metal forming, and proper design methods to reduce time and costs have to be developed, mostly based on computer-aided procedures. At the same time, variations during manufacturing processes may significantly influence final product quality, rendering optimal solutions non-robust. In this paper, a small design of experiments is conducted to investigate how the stochastic behavior of noise factors affects drawing quality. The finite element software LS-DYNA is used to simulate the complex sheet metal stamping processes. The Kriging metamodel is adopted to map the relation between the input process parameters and part quality. The robust design model for the sheet metal forming process integrates adaptive importance sampling with the Kriging model in order to minimize the impact of the variations and achieve reliable process parameters. In the adaptive sampling, an improved criterion is used to indicate where additional training samples can be added to improve the Kriging model. Nonlinear test functions and a square stamping example (NUMISHEET '93) are employed to verify the proposed method. The final results indicate the feasibility of the proposed method for multi-response robust design.
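
    The surrogate-plus-sampling idea can be pictured with a small sketch: fit a Kriging (Gaussian process) model to a handful of sampled runs, push Monte Carlo samples of the noise factor through the surrogate, and pick the design setting with the best mean-plus-spread score. This is not the paper's LS-DYNA workflow or its adaptive importance sampling criterion; the thinning function, variable ranges, and distributions below are hypothetical.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, ConstantKernel

      rng = np.random.default_rng(3)

      def thinning(bhf, mu):
          """Hypothetical stand-in for an FE result: maximum thinning (%) as a
          function of blank-holder force bhf (kN) and friction coefficient mu."""
          return 5.0 + 0.002 * (bhf - 120) ** 2 + 40.0 * (mu - 0.10) + 0.05 * bhf * mu

      # Small design of experiments over the design variable (bhf) and noise factor (mu)
      X = np.column_stack([rng.uniform(60, 200, 40), rng.uniform(0.08, 0.16, 40)])
      y = thinning(X[:, 0], X[:, 1])
      kriging = GaussianProcessRegressor(kernel=ConstantKernel() * RBF([30.0, 0.02]),
                                         normalize_y=True).fit(X, y)

      # Robust choice of bhf: minimise mean + 3*std of predicted thinning over
      # Monte Carlo samples of the noise factor mu
      bhf_grid = np.linspace(60, 200, 71)
      mu_mc = rng.normal(0.12, 0.01, 200)
      preds = [kriging.predict(np.column_stack([np.full_like(mu_mc, b), mu_mc]))
               for b in bhf_grid]
      scores = [p.mean() + 3.0 * p.std() for p in preds]
      print("robust blank-holder force (kN):", bhf_grid[int(np.argmin(scores))])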

  10. Programming and machining of complex parts based on CATIA solid modeling

    NASA Astrophysics Data System (ADS)

    Zhu, Xiurong

    2017-09-01

    The design, programming and simulated machining of complex parts using CATIA solid modeling are described, illustrating the importance of programming and process technology in the field of CNC machining. In the part design process, the working principle is first analyzed in depth, and then the dimensions and dimension chains are designed and connected to each other; back-calculation and a variety of other methods are then used to determine the final dimensions of the parts. Part materials were selected after careful study and repeated testing, with 6061 aluminum alloy as the final choice. According to the actual conditions of the machining site, various factors in the machining process must be considered comprehensively. The simulation should be based on the actual machining, not only on the shape. The results can be used as a reference for machining.

  11. Peer review in design: Understanding the impact of collaboration on the review process and student perception

    NASA Astrophysics Data System (ADS)

    Mandala, Mahender Arjun

    A cornerstone of design and design education is frequent situated feedback. With increasing class sizes, and shrinking financial and human resources, providing rich feedback to students becomes increasingly difficult. In the field of writing, web-based peer review--the process of utilizing equal status learners within a class to provide feedback to each other on their work using networked computing systems--has been shown to be a reliable and valid source of feedback in addition to improving student learning. Designers communicate in myriad ways, using the many languages of design and combining visual and descriptive information. This complex discourse of design intent makes peer reviews by design students ambiguous and often not helpful to the receivers of this feedback. Furthermore, engaging students in the review process itself is often difficult. Teams can complement individual diversity and may assist novice designers in collectively resolving complex tasks. However, teams often incur production losses and may be impacted by individual biases. In the current work, we look at utilizing a collaborative team of reviewers, working collectively and synchronously, in generating web-based peer reviews in a sophomore engineering design class. Students participated in a cross-over design, conducting peer reviews as individuals and as collaborative teams in parallel sequences. Raters coded the generated feedback on the basis of its appropriateness and accuracy. Self-report surveys and passive observation of teams conducting reviews captured student opinion on the process, its value, and the contrasting experience they had conducting team and individual reviews. We found team reviews generated better quality feedback in comparison to individual reviews. Furthermore, students preferred conducting reviews in teams, finding the process 'fun' and engaging. We observed several learning benefits of using collaboration in reviewing, including improved understanding of the assessment criteria, roles and expectations, and increased team reflection. These results provide insight into how to improve the review process for instructors and researchers, and form a basis for future research work in this area. With respect to facilitating the peer review process in design-based classrooms, we also present recommendations for effective review system design and classroom implementation, supported by research and practical experience.

  12. A meta-model based approach for rapid formability estimation of continuous fibre reinforced components

    NASA Astrophysics Data System (ADS)

    Zimmerling, Clemens; Dörr, Dominik; Henning, Frank; Kärger, Luise

    2018-05-01

    Due to their high mechanical performance, continuous fibre reinforced plastics (CoFRP) are becoming increasingly important for load bearing structures. In many cases, manufacturing CoFRPs comprises a forming process of textiles. To predict and optimise the forming behaviour of a component, numerical simulations are applied. However, for maximum part quality, both the geometry and the process parameters must match in mutual regard, which in turn requires numerous numerically expensive optimisation iterations. In both textile and metal forming, a lot of research has focused on determining optimum process parameters, whilst regarding the geometry as invariable. In this work, a meta-model based approach at component level is proposed that provides a rapid estimation of the formability for variable geometries based on pre-sampled, physics-based draping data. Initially, a geometry recognition algorithm scans the geometry and extracts a set of doubly-curved regions with relevant geometry parameters. If the relevant parameter space is not part of an underlying database, additional samples are drawn via Finite-Element draping simulations according to a suitable design table for computer experiments. Time-saving parallel runs of the physical simulations accelerate the data acquisition. Ultimately, a Gaussian regression meta-model is built from the database. The method is demonstrated on a box-shaped generic structure. The predicted results are in good agreement with physics-based draping simulations. Since evaluations of the established meta-model are numerically inexpensive, any further design exploration (e.g. robustness analysis or design optimisation) can be performed in a short time. It is expected that the proposed method also offers great potential for future applications along virtual process chains: for each process step along the chain, a meta-model can be set up to predict the impact of design variations on manufacturability and part performance. Thus, the method is considered to facilitate a lean and economic part and process design under consideration of manufacturing effects.

  13. Cancer-Related Fatigue in Post-Treatment Cancer Survivors: Theory-Based Development of a Web-Based Intervention

    PubMed Central

    Walsh, Jane C; Groarke, AnnMarie; Moss-Morris, Rona; Morrissey, Eimear; McGuire, Brian E

    2017-01-01

    Background Cancer-related fatigue (CrF) is the most common and disruptive symptom experienced by cancer survivors. We aimed to develop a theory-based, interactive Web-based intervention designed to facilitate self-management and enhance coping with CrF following cancer treatment. Objective The aim of our study was to outline the rationale, decision-making processes, methods, and findings which led to the development of a Web-based intervention to be tested in a feasibility trial. This paper outlines the process and method of development of the intervention. Methods An extensive review of the literature and qualitative research was conducted to establish a therapeutic approach for this intervention, based on theory. The psychological principles used in the development process are outlined, and we also clarify hypothesized causal mechanisms. We describe decision-making processes involved in the development of the content of the intervention, input from the target patient group and stakeholders, the design of the website features, and the initial user testing of the website. Results The cocreation of the intervention with the experts and service users allowed the design team to ensure that an acceptable intervention was developed. This evidence-based Web-based program is the first intervention of its kind based on self-regulation model theory, with the primary aim of targeting the representations of fatigue and enhancing self-management of CrF, specifically. Conclusions This research sought to integrate psychological theory, existing evidence of effective interventions, empirically derived principles of Web design, and the views of potential users into the systematic planning and design of the intervention of an easy-to-use website for cancer survivors. PMID:28676465

  14. Identifying Complex Cultural Interactions in the Instructional Design Process: A Case Study of a Cross-Border, Cross-Sector Training for Innovation Program

    ERIC Educational Resources Information Center

    Russell, L. Roxanne; Kinuthia, Wanjira L.; Lokey-Vega, Anissa; Tsang-Kosma, Winnie; Madathany, Reeny

    2013-01-01

    The purpose of this research is to identify complex cultural dynamics in the instructional design process of a cross-sector, cross-border training environment by applying Young's (2009) Culture-Based Model (CBM) as a theoretical framework and taxonomy for description of the instructional design process under the conditions of one case. This…

  15. Additive Manufacturing Design Considerations for Liquid Engine Components

    NASA Technical Reports Server (NTRS)

    Whitten, Dave; Hissam, Andy; Baker, Kevin; Rice, Darron

    2014-01-01

    The Marshall Space Flight Center's Propulsion Systems Department has gained significant experience in the last year designing, building, and testing liquid engine components using additive manufacturing. The department has developed valve, duct, turbo-machinery, and combustion device components using this technology. Many valuable lessons were learned during this process, and these lessons will be the focus of this presentation. We will present criteria for selecting part candidates for additive manufacturing. Some part characteristics are 'tailor made' for this process, and selecting the right parts for the process is the first step to maximizing productivity gains. We will also present specific lessons we learned about feature geometry that can and cannot be produced using additive manufacturing machines. Most liquid engine components were made using a two-step process: the base part was made using additive manufacturing, and then traditional machining processes were used to produce the final part. The presentation will describe design accommodations needed to make the base part and lessons we learned about which features could be built directly and which require the final machining process. Tolerance capabilities, surface finish, and material thickness allowances will also be covered. Additive manufacturing can produce internal passages that cannot be made using traditional approaches. It can also save a significant amount of manpower by reducing part count and leveraging model-based design and analysis techniques. Information will be shared about performance enhancements and design efficiencies we experienced for certain categories of engine parts.

  16. Spaceborne computer executive routine functional design specification. Volume 2: Computer executive design for space station/base

    NASA Technical Reports Server (NTRS)

    Kennedy, J. R.; Fitzpatrick, W. S.

    1971-01-01

    The computer executive functional system design concepts derived from study of the Space Station/Base are presented. Information Management System hardware configuration as directly influencing the executive design is reviewed. The hardware configuration and generic executive design requirements are considered in detail in a previous report (System Configuration and Executive Requirements Specifications for Reusable Shuttle and Space Station/Base, 9/25/70). This report defines basic system primitives and delineates processes and process control. Supervisor states are considered for describing basic multiprogramming and multiprocessing systems. A high-level computer executive including control of scheduling, allocation of resources, system interactions, and real-time supervisory functions is defined. The description is oriented to provide a baseline for a functional simulation of the computer executive system.

  17. The Scenario-Based Engineering Process (SEP): a user-centered approach for the development of health care systems.

    PubMed

    Harbison, K; Kelly, J; Burnell, L; Silva, J

    1995-01-01

    The Scenario-based Engineering Process (SEP) is a user-focused methodology for large and complex system design. This process supports new application development from requirements analysis with domain models to component selection, design and modification, implementation, integration, and archival placement. It is built upon object-oriented methodologies, domain modeling strategies, and scenario-based techniques to provide an analysis process for mapping application requirements to available components. We are using SEP in the health care applications that we are developing. The process has already achieved success in the manufacturing and military domains and is being adopted by many organizations. SEP should prove viable in any domain containing scenarios that can be decomposed into tasks.

  18. Simulation-based MDP verification for leading-edge masks

    NASA Astrophysics Data System (ADS)

    Su, Bo; Syrel, Oleg; Pomerantsev, Michael; Hagiwara, Kazuyuki; Pearman, Ryan; Pang, Leo; Fujimara, Aki

    2017-07-01

    For IC design starts below the 20 nm technology node, the assist features on photomasks shrink well below 60 nm, and the printed patterns of those features on masks written by VSB eBeam writers start to show large deviations from the mask designs. Traditional geometry-based fracturing starts to show large errors for those small features. As a result, other mask data preparation (MDP) methods have become available and adopted, such as rule-based Mask Process Correction (MPC), model-based MPC and eventually model-based MDP. The new MDP methods may place shot edges slightly differently from the target to compensate for mask process effects, so that the final patterns on a mask are much closer to the design (which can be viewed as the ideal mask), especially for those assist features. Such an alteration generally produces better masks that are closer to the intended mask design. Traditional XOR-based MDP verification cannot detect problems caused by eBeam effects. Much like model-based OPC verification, which became a necessity for OPC a decade ago, we see the same trend in MDP today. A simulation-based MDP verification solution requires a GPU-accelerated computational geometry engine with simulation capabilities. To have a meaningful simulation-based mask check, a good mask process model is needed. The TrueModel® system is a field-tested physical mask model developed by D2S. The GPU-accelerated D2S Computational Design Platform (CDP) is used to run simulation-based mask checks, as well as model-based MDP. In addition to simulation-based checks such as mask EPE or dose margin, geometry-based rules are also available to detect quality issues such as slivers or CD splits. Dose-margin-related hotspots can also be detected by setting a correct detection threshold. In this paper, we will demonstrate GPU acceleration for geometry processing and give examples of mask check results and performance data. GPU acceleration is necessary to make simulation-based MDP verification acceptable.

  19. Manufacturing process design for multi commodities in agriculture

    NASA Astrophysics Data System (ADS)

    Prasetyawan, Yudha; Santosa, Andrian Henry

    2017-06-01

    High-potential commodities within particular agricultural sectors should yield the maximum benefit attainable by both local farmers and business players. In several cases, the business players are small and medium enterprises (SMEs), which have limited resources to perform the value-adding processing of local commodities into marketable products. Typical weaknesses of SMEs include manual production processes with low productivity, limited capacity to maintain prices, and unattractive packaging due to conventional production. Agricultural commodities are commonly processed into several products such as flour, chips, crackers, oil, juice, and other products. This research began with data collection through interviews, particularly to obtain the perspectives of SMEs as the business players. Subsequently, the information was processed using Quality Function Deployment (QFD) to build the House of Quality from the first to the fourth level. A proposed design resulting from the QFD was produced, evaluated with the Technology Assessment Model (TAM), and refined into a revised design. Finally, the revised design was analyzed from a financial perspective to obtain the cost structure covering investment, operations, maintenance, and labour. The machine that performs the manufacturing process, resulting from the revised design, was prototyped and tested to determine the initial production process. The designed manufacturing process offers a Net Present Value (NPV) of IDR 337,897,651, compared with IDR 9,491,522 for the existing process with similar production input.
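    As a hedged illustration of the financial comparison reported above, the sketch below computes Net Present Value for two hypothetical cash-flow streams. The cash flows and discount rate are assumptions; only the idea of comparing a proposed process against the existing one by NPV is taken from the abstract.

    ```python
    # Minimal NPV sketch (not from the paper): cash-flow figures and the
    # discount rate are illustrative assumptions.
    def npv(rate, cash_flows):
        """Net Present Value, where cash_flows[0] is the initial
        (usually negative) investment at t = 0."""
        return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

    discount_rate = 0.10                                  # assumed annual rate
    proposed = [-150_000_000] + [120_000_000] * 5         # investment + yearly net income (IDR)
    existing = [-20_000_000] + [7_000_000] * 5

    print("NPV proposed process:", round(npv(discount_rate, proposed)))
    print("NPV existing process:", round(npv(discount_rate, existing)))
    ```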

  20. The Artist and Architect: Creativity and Innovation through Role-Based Design

    ERIC Educational Resources Information Center

    Miller, Charles; Hokanson, Brad

    2009-01-01

    This article is the second installment in a four-part "Educational Technology" series exploring a contemporary perspective on the process of instructional design. In this article, the authors reintroduce the framework of Role-Based Design (RBD) and describe practical strategies for its integration into design workflow. Next, they examine their…

  1. Establishment of a Digital Knowledge Conversion Architecture Design Learning with High User Acceptance

    ERIC Educational Resources Information Center

    Wu, Yun-Wu; Weng, Apollo; Weng, Kuo-Hua

    2017-01-01

    The purpose of this study is to design a knowledge conversion and management digital learning system for architecture design learning, helping students to share, extract, use and create their design knowledge through web-based interactive activities based on socialization, internalization, combination and externalization process in addition to…

  2. Mobile Technology and CAD Technology Integration in Teaching Architectural Design Process for Producing Creative Product

    ERIC Educational Resources Information Center

    Bin Hassan, Isham Shah; Ismail, Mohd Arif; Mustafa, Ramlee

    2011-01-01

    The purpose of this research is to examine the effect of integrating mobile and CAD technology on teaching the architectural design process to Malaysian polytechnic architecture students in producing a creative product. The website is set up based on Carroll's minimalist theory, while mobile and CAD technology integration is based on Brown and…

  3. Operational concepts and implementation strategies for the design configuration management process.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trauth, Sharon Lee

    2007-05-01

    This report describes operational concepts and implementation strategies for the Design Configuration Management Process (DCMP). It presents a process-based systems engineering model for the successful configuration management of the products generated during the operation of the design organization as a business entity. The DCMP model focuses on Pro/E and associated activities and information. It can serve as the framework for interconnecting all essential aspects of the product design business. A design operation scenario offers a sense of how to do business at a time when DCMP is second nature within the design organization.

  4. California State Library: Processing Center Design and Specifications. Volume I, System Description and Input Processing.

    ERIC Educational Resources Information Center

    Sherman, Don; Shoffner, Ralph M.

    The scope of the California State Library-Processing Center (CSL-PC) project is to develop the design and specifications for a computerized technical processing center to provide services to a network of participating California libraries. Immediate objectives are: (1) retrospective conversion of card catalogs to a machine-form data base,…

  5. Unified modeling language and design of a case-based retrieval system in medical imaging.

    PubMed Central

    LeBozec, C.; Jaulent, M. C.; Zapletal, E.; Degoulet, P.

    1998-01-01

    One goal of artificial intelligence research into case-based reasoning (CBR) systems is to develop approaches for designing useful and practical interactive case-based environments. Explaining each step of the design of the case base and of the retrieval process is critical for the application of case-based systems to the real world. We describe herein our approach to the design of IDEM--Images and Diagnosis from Examples in Medicine--a medical image case-based retrieval system for pathologists. Our approach is based on the expressiveness of an object-oriented modeling language standard: the Unified Modeling Language (UML). We created a set of diagrams in UML notation illustrating the steps of the CBR methodology we used. The key aspect of this approach was selecting the relevant objects of the system according to user requirements and enabling visualization of the cases and of the components of the case retrieval process. Further evaluation of the expressiveness of the design document is required, but UML seems to be a promising formalism that improves communication between developers and users. PMID:9929346

  6. Unified modeling language and design of a case-based retrieval system in medical imaging.

    PubMed

    LeBozec, C; Jaulent, M C; Zapletal, E; Degoulet, P

    1998-01-01

    One goal of artificial intelligence research into case-based reasoning (CBR) systems is to develop approaches for designing useful and practical interactive case-based environments. Explaining each step of the design of the case base and of the retrieval process is critical for the application of case-based systems to the real world. We describe herein our approach to the design of IDEM--Images and Diagnosis from Examples in Medicine--a medical image case-based retrieval system for pathologists. Our approach is based on the expressiveness of an object-oriented modeling language standard: the Unified Modeling Language (UML). We created a set of diagrams in UML notation illustrating the steps of the CBR methodology we used. The key aspect of this approach was selecting the relevant objects of the system according to user requirements and enabling visualization of the cases and of the components of the case retrieval process. Further evaluation of the expressiveness of the design document is required, but UML seems to be a promising formalism that improves communication between developers and users.

  7. Advanced software development workstation. Knowledge base design: Design of knowledge base for flight planning application

    NASA Technical Reports Server (NTRS)

    Izygon, Michel E.

    1992-01-01

    The development process of the knowledge base for the generation of Test Libraries for Mission Operations Computer (MOC) Command Support focused on a series of information-gathering interviews. These knowledge capture sessions supported the development of a prototype for evaluating the capabilities of INTUIT on such an application. The prototype includes functions related to POCC (Payload Operation Control Center) processing. It prompts the end users for input through a series of panels and then generates the Meds associated with the initialization and the update of hazardous command tables for a POCC Processing TLIB.

  8. Integrating ergonomics in design processes: a case study within an engineering consultancy firm.

    PubMed

    Sørensen, Lene Bjerg; Broberg, Ole

    2012-01-01

    This paper reports on a case study within an engineering consultancy firm, where engineering designers and ergonomists were working together on the design of a new hospital sterile processing plant. The objective of the paper is to gain a better understanding of the premises for integrating ergonomics into engineering design processes and of how different factors either promote or limit the integration. Based on a grounded theory approach, a model illustrating these factors is developed, and different hypotheses about how these factors promote and/or limit the integration of ergonomics into design processes are presented along with the model.

  9. Preparing Instructional Designers for Game-Based Learning: Part III. Game Design as a Collaborative Process

    ERIC Educational Resources Information Center

    Hirumi, Atsusi; Appelman, Bob; Rieber, Lloyd; Van Eck, Richard

    2010-01-01

    In this three part series, four professors who teach graduate level courses on the design of instructional video games discuss their perspectives on preparing instructional designers to optimize game-based learning. Part I set the context for the series and one of four panelists discussed what he believes instructional designers should know about…

  10. System Identification of a Heaving Point Absorber: Design of Experiment and Device Modeling

    DOE PAGES

    Bacelli, Giorgio; Coe, Ryan; Patterson, David; ...

    2017-04-01

    Empirically based modeling is an essential aspect of design for a wave energy converter. These models are used in structural, mechanical and control design processes, as well as for performance prediction. The design of experiments and the methods used to produce models from collected data have a strong impact on the quality of the model. This study considers the system identification and model validation process based on data collected from a wave tank test of a model-scale wave energy converter. Experimental design and data processing techniques based on general system identification procedures are discussed and compared with the practices often followed for wave tank testing. The general system identification processes are shown to have a number of advantages. The experimental data is then used to produce multiple models for the dynamics of the device. These models are validated and their performance is compared against one another. Furthermore, while most models of wave energy converters use a formulation with wave elevation as an input, this study shows that a model using a hull pressure sensor to incorporate the wave excitation phenomenon has better accuracy.
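    The following sketch is only a generic illustration of the system-identification idea discussed above, not the study's actual procedure: an ARX-style model relating a measured excitation (e.g., hull pressure) to a measured response is fitted by linear least squares on synthetic data, with assumed model orders and signal names.

    ```python
    # Illustrative sketch only: ARX-style least-squares fit on synthetic data.
    # Signal names, model orders and the generating system are assumptions.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 2000
    u = rng.standard_normal(n)                    # measured excitation (input)
    y = np.zeros(n)                               # measured response (output)
    for k in range(2, n):                         # synthetic 2nd-order system
        y[k] = 1.5 * y[k-1] - 0.7 * y[k-2] + 0.5 * u[k-1] + 0.05 * rng.standard_normal()

    na, nb = 2, 2                                 # assumed model orders
    rows, targets = [], []
    for k in range(max(na, nb), n):
        rows.append([-y[k-1], -y[k-2], u[k-1], u[k-2]])
        targets.append(y[k])
    theta, *_ = np.linalg.lstsq(np.array(rows), np.array(targets), rcond=None)
    print("identified ARX parameters [a1, a2, b1, b2]:", np.round(theta, 3))
    ```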

  11. System Identification of a Heaving Point Absorber: Design of Experiment and Device Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bacelli, Giorgio; Coe, Ryan; Patterson, David

    Empirically based modeling is an essential aspect of design for a wave energy converter. These models are used in structural, mechanical and control design processes, as well as for performance prediction. The design of experiments and the methods used to produce models from collected data have a strong impact on the quality of the model. This study considers the system identification and model validation process based on data collected from a wave tank test of a model-scale wave energy converter. Experimental design and data processing techniques based on general system identification procedures are discussed and compared with the practices often followed for wave tank testing. The general system identification processes are shown to have a number of advantages. The experimental data is then used to produce multiple models for the dynamics of the device. These models are validated and their performance is compared against one another. Furthermore, while most models of wave energy converters use a formulation with wave elevation as an input, this study shows that a model using a hull pressure sensor to incorporate the wave excitation phenomenon has better accuracy.

  12. A Model-Based Approach to Developing Your Mission Operations System

    NASA Technical Reports Server (NTRS)

    Smith, Robert R.; Schimmels, Kathryn A.; Lock, Patricia D; Valerio, Charlene P.

    2014-01-01

    Model-Based System Engineering (MBSE) is an increasingly popular methodology for designing complex engineering systems. As the use of MBSE has grown, it has begun to be applied to systems that are less hardware-based and more people- and process-based. We describe our approach to incorporating MBSE as a way to streamline development, and how to build a model consisting of core resources, such as requirements and interfaces, that can be adapted and used by new and upcoming projects. By comparing traditional Mission Operations System (MOS) system engineering with an MOS designed via a model, we will demonstrate the benefits to be obtained by incorporating MBSE in system engineering design processes.

  13. Effects of Cloud-Based m-Learning on Student Creative Performance in Engineering Design

    ERIC Educational Resources Information Center

    Chang, Yu-Shan; Chen, Si-Yi; Yu, Kuang-Chao; Chu, Yih-Hsien; Chien, Yu-Hung

    2017-01-01

    This study explored the effects of cloud-based m-learning on students' creative processes and products in engineering design. A nonequivalent pretest-posttest design was adopted, and 62 university students from Taipei City, Taiwan, were recruited as research participants in the study. The results showed that cloud-based m-learning had a positive…

  14. Defining and Building an Enriched Learning and Information Environment.

    ERIC Educational Resources Information Center

    Goodrum, David A.; And Others

    1993-01-01

    Discusses the development of an Enriched Learning and Information Environment (ELIE). Highlights include technology-based and theory-based frameworks for defining ELIEs; a socio-technical definition; a conceptual prototype; a participatory design process, including iterative design through rapid prototyping; and design issues for technology…

  15. Students as Game Designers vs. "Just" Players: Comparison of Two Different Approaches to Location-Based Games Implementation into School Curricula

    ERIC Educational Resources Information Center

    Slussareff, Michaela; Bohácková, Petra

    2016-01-01

    This paper compares two kinds of educational treatment within location-based game approach; learning by playing a location-based game and learning by designing a location-based game. Two parallel elementary school classes were included in our study (N = 27; age 14-15). The "designers" class took part in the whole process of game design…

  16. Designing for Temporal Awareness: The Role of Temporality in Time-Critical Medical Teamwork

    PubMed Central

    Kusunoki, Diana S.; Sarcevic, Aleksandra

    2016-01-01

    This paper describes the role of temporal information in emergency medical teamwork and how time-based features can be designed to support the temporal awareness of clinicians in this fast-paced and dynamic environment. Engagement in iterative design activities with clinicians over the course of two years revealed a strong need for time-based features and mechanisms, including timestamps for tasks based on absolute time and automatic stopclocks measuring time by counting up since task performance. We describe in detail the aspects of temporal awareness central to clinicians’ awareness needs and then provide examples of how we addressed these needs through the design of a shared information display. As an outcome of this process, we define four types of time representation techniques to facilitate the design of time-based features: (1) timestamps based on absolute time, (2) timestamps relative to the process start time, (3) time since task performance, and (4) time until the next required task. PMID:27478880
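    The four time-representation techniques listed above can be illustrated with a small sketch; the event names and intervals below are invented and only stand in for the shared display described in the paper.

    ```python
    # Minimal sketch of the four time-representation techniques, assuming
    # simple datetime bookkeeping; events and intervals are illustrative.
    from datetime import datetime, timedelta

    process_start = datetime(2016, 3, 1, 14, 0, 0)
    now = datetime(2016, 3, 1, 14, 12, 30)
    last_task_done = datetime(2016, 3, 1, 14, 7, 0)     # e.g., last medication dose
    next_task_interval = timedelta(minutes=3)           # assumed protocol interval

    absolute_timestamp = last_task_done.strftime("%H:%M:%S")          # (1) absolute time
    relative_timestamp = last_task_done - process_start               # (2) relative to process start
    time_since_task = now - last_task_done                            # (3) counting up since task
    time_until_next = (last_task_done + next_task_interval) - now     # (4) until next required task

    print(absolute_timestamp, relative_timestamp, time_since_task, time_until_next)
    ```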

  17. The assessment of eco-design with a comprehensive index incorporating environmental impact and economic profit

    NASA Astrophysics Data System (ADS)

    Yang, Shuo; Fu, Yun; Wang, Xiuteng; Xu, Bingsheng; Li, Zheng

    2017-11-01

    Eco-design is an advanced design approach which plays an important part in the national innovation project and serves as a key point for the successful transformation of the supply structure. However, the practical implementation of pro-environmental designs and technologies often faces a dilemma: some processes can effectively control their emissions to protect the environment at relatively high costs, while others pursue individual profit by ignoring possible adverse environmental impacts. Thus, the assessment of the eco-design process must be carried out based on a comprehensive consideration of the economic and environmental aspects. Presently, the assessment systems in China are unable to fully reflect new environmental technologies regarding their innovative features or performance. Most of the assessment systems adopt scoring methods based on the judgments of experts, which are easy to use but somewhat subjective. The assessment method presented in this paper includes an environmental impact (EI) assessment based on LCA principles and willingness-to-pay theory, and an economic profit (EP) assessment mainly based on market price. The results of the assessment take the form of EI/EP, which evaluates the targeted process from a combined perspective of environmental and economic performance. A case study was carried out on the utilization process of coal fly ash, which indicates that the proposed method can compare different technical processes in an effective and objective manner and provide explicit and insightful suggestions for decision making.
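    A minimal sketch of the EI/EP idea is given below, assuming invented impact categories, willingness-to-pay weights, and process figures; only the construction of the index as monetised environmental impact divided by economic profit follows the abstract.

    ```python
    # Illustrative EI/EP sketch: categories, weights and figures are invented.
    def ei_ep_index(impacts, willingness_to_pay, revenue, cost):
        """Return EI/EP: monetised environmental impact per unit of profit."""
        ei = sum(impacts[cat] * willingness_to_pay[cat] for cat in impacts)
        ep = revenue - cost
        return ei / ep

    wtp = {"CO2_t": 30.0, "SO2_t": 1200.0, "water_m3": 0.5}          # assumed CNY per unit
    fly_ash_process = {"CO2_t": 120.0, "SO2_t": 0.8, "water_m3": 900.0}

    index = ei_ep_index(fly_ash_process, wtp, revenue=250_000.0, cost=180_000.0)
    print(f"EI/EP = {index:.3f}  (lower is better)")
    ```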

  18. Failure detection system design methodology. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Chow, E. Y.

    1980-01-01

    The design of a failure detection and identification system consists of designing a robust residual generation process and a high-performance decision-making process. The design of these two processes is examined separately. Residual generation is based on analytical redundancy. Redundancy relations that are insensitive to modelling errors and noise effects are important for designing robust residual generation processes. The characterization of the concept of analytical redundancy in terms of a generalized parity space provides a framework in which a systematic approach to the determination of robust redundancy relations is developed. The Bayesian approach is adopted for the design of high-performance decision processes. The FDI decision problem is formulated as a Bayes sequential decision problem. Since the optimal decision rule is incomputable, a methodology for designing suboptimal rules is proposed. A numerical algorithm is developed to facilitate the design and performance evaluation of suboptimal rules.
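    The two stages discussed above can be caricatured in a short sketch: a residual generated from an analytical-redundancy relation between two like sensors, followed by a greatly simplified threshold decision standing in for the Bayes sequential rule. All signals and numbers are assumptions.

    ```python
    # Toy FDI sketch: redundancy-based residual plus a simple threshold test
    # (a stand-in for the Bayes sequential decision rule, not equivalent to it).
    import numpy as np

    rng = np.random.default_rng(1)
    n = 300
    true_speed = np.full(n, 10.0)
    sensor_a = true_speed + 0.1 * rng.standard_normal(n)
    sensor_b = true_speed + 0.1 * rng.standard_normal(n)
    sensor_b[200:] += 1.0                      # inject a bias failure at sample 200

    # Redundancy relation: two sensors measuring the same quantity should agree,
    # so their difference is a residual that stays near zero under no failure.
    residual = sensor_a - sensor_b

    # Simplified decision process: flag a failure when a moving average of the
    # residual magnitude exceeds a threshold chosen from the noise level.
    window, threshold = 20, 0.5
    smoothed = np.convolve(np.abs(residual), np.ones(window) / window, mode="same")
    alarm = int(np.argmax(smoothed > threshold))
    print("failure declared at sample:", alarm if smoothed[alarm] > threshold else None)
    ```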

  19. Method and apparatus for decoupled thermo-catalytic pollution control

    DOEpatents

    Tabatabaie-Raissi, Ali; Muradov, Nazim Z.; Martin, Eric

    2006-07-11

    A new method for the design and scale-up of thermocatalytic processes is disclosed. The method is based on optimizing process energetics by decoupling the process energetics from the DRE for target contaminants. The technique is applicable to high-temperature thermocatalytic reactor design and scale-up. The method is based on the implementation of polymeric and other low-pressure-drop supports for the thermocatalytic media, as well as multifunctional catalytic media, in conjunction with a novel rotating fluidized particle bed reactor.

  20. Design of a superconducting 28 GHz ion source magnet for FRIB using a shell-based support structure

    DOE PAGES

    Felice, H.; Rochepault, E.; Hafalia, R.; ...

    2014-12-05

    The Superconducting Magnet Program at the Lawrence Berkeley National Laboratory (LBNL) is completing the design of a 28 GHz NbTi ion source magnet for the Facility for Rare Isotope Beams (FRIB). The design parameters are based on the parameters of the ECR ion source VENUS in operation at LBNL since 2002 featuring a sextupole-in-solenoids configuration. Whereas most of the magnet components (such as conductor, magnetic design, protection scheme) remain very similar to the VENUS magnet components, the support structure of the FRIB ion source uses a different concept. A shell-based support structure using bladders and keys is implemented in the design allowing fine tuning of the sextupole preload and reversibility of the magnet assembly process. As part of the design work, conductor insulation scheme, coil fabrication processes and assembly procedures are also explored to optimize performance. We present the main features of the design emphasizing the integrated design approach used at LBNL to achieve this result.

  1. High-performance data processing using distributed computing on the SOLIS project

    NASA Astrophysics Data System (ADS)

    Wampler, Stephen

    2002-12-01

    The SOLIS solar telescope collects data at a high rate, resulting in 500 GB of raw data each day. The SOLIS Data Handling System (DHS) has been designed to quickly process this data down to 156 GB of reduced data. The DHS design uses pools of distributed reduction processes that are allocated to different observations as needed. A farm of 10 dual-cpu Linux boxes contains the pools of reduction processes. Control is through CORBA and data is stored on a fibre channel storage area network (SAN). Three other Linux boxes are responsible for pulling data from the instruments using SAN-based ringbuffers. Control applications are Java-based while the reduction processes are written in C++. This paper presents the overall design of the SOLIS DHS and provides details on the approach used to control the pooled reduction processes. The various strategies used to manage the high data rates are also covered.
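    The sketch below conveys only the general idea of a pool of reduction workers being allocated to observations; it uses Python's standard multiprocessing module rather than the CORBA, C++ and SAN machinery of the actual DHS, and the sizes and names are illustrative.

    ```python
    # Schematic only: a pool of worker processes allocated to incoming
    # observations; the reduction step and data sizes are placeholders.
    from multiprocessing import Pool

    def reduce_observation(obs_id):
        # Placeholder for a reduction step (calibration, demodulation, ...).
        raw_size_gb = 0.5
        reduced_size_gb = raw_size_gb * 156.0 / 500.0   # overall 500 GB -> 156 GB ratio
        return obs_id, reduced_size_gb

    if __name__ == "__main__":
        observations = [f"obs_{k:04d}" for k in range(100)]
        with Pool(processes=10) as pool:                # e.g., one worker per farm node
            for obs_id, size in pool.imap_unordered(reduce_observation, observations):
                print(f"{obs_id}: reduced to {size:.3f} GB")
    ```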

  2. People-oriented Information Visualization Design

    NASA Astrophysics Data System (ADS)

    Chen, Zhiyong; Zhang, Bolun

    2018-04-01

    With the rapid development and continuous progress of science and technology in the 21st century, human society has entered the era of information and big data, and lifestyles and aesthetic systems have changed accordingly, so the emerging field of information visualization is increasingly popular. Information visualization design is the process of visualizing all kinds of complex information and data so that information can be absorbed quickly and time saved. Along with the development of information visualization, information design has also attracted growing attention, and emotional, people-oriented design is an indispensable part of information design. This paper probes information visualization design through an emotional analysis of information design, based on the social context of people-oriented experience, from the perspective of art design. Information visualization design is explored and discussed on the basis of the three levels of emotional information design: the instinct level, the behavior level and the reflective level.

  3. A Web-Based Monitoring System for Multidisciplinary Design Projects

    NASA Technical Reports Server (NTRS)

    Rogers, James L.; Salas, Andrea O.; Weston, Robert P.

    1998-01-01

    In today's competitive environment, both industry and government agencies are under pressure to reduce the time and cost of multidisciplinary design projects. New tools have been introduced to assist in this process by facilitating the integration of and communication among diverse disciplinary codes. One such tool, a framework for multidisciplinary computational environments, is defined as a hardware and software architecture that enables integration, execution, and communication among diverse disciplinary processes. An examination of current frameworks reveals weaknesses in various areas, such as sequencing, displaying, monitoring, and controlling the design process. The objective of this research is to explore how Web technology, integrated with an existing framework, can improve these areas of weakness. This paper describes a Web-based system that optimizes and controls the execution sequence of design processes; and monitors the project status and results. The three-stage evolution of the system with increasingly complex problems demonstrates the feasibility of this approach.

  4. A Web-Based System for Monitoring and Controlling Multidisciplinary Design Projects

    NASA Technical Reports Server (NTRS)

    Salas, Andrea O.; Rogers, James L.

    1997-01-01

    In today's competitive environment, both industry and government agencies are under enormous pressure to reduce the time and cost of multidisciplinary design projects. A number of frameworks have been introduced to assist in this process by facilitating the integration of and communication among diverse disciplinary codes. An examination of current frameworks reveals weaknesses in various areas such as sequencing, displaying, monitoring, and controlling the design process. The objective of this research is to explore how Web technology, in conjunction with an existing framework, can improve these areas of weakness. This paper describes a system that executes a sequence of programs, monitors and controls the design process through a Web-based interface, and visualizes intermediate and final results through the use of Java(Tm) applets. A small sample problem, which includes nine processes with two analysis programs that are coupled to an optimizer, is used to demonstrate the feasibility of this approach.

  5. Operationally efficient propulsion system study (OEPSS) data book. Volume 7; Launch Operations Index (LOI) Design Features and Options

    NASA Technical Reports Server (NTRS)

    Ziese, James M.

    1992-01-01

    A figure-of-merit design tool was developed that allows the operability of a propulsion system design to be measured. This Launch Operations Index (LOI) relates Operations Efficiency to System Complexity. The figure of merit can be used by conceptual designers to compare different propulsion system designs based on their impact on launch operations. The LOI will improve the design process by ensuring that direct launch operations experience provides necessary feedback to the design process.

  6. CANISTER HANDLING FACILITY DESCRIPTION DOCUMENT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J.F. Beesley

    The purpose of this facility description document (FDD) is to establish requirements and associated bases that drive the design of the Canister Handling Facility (CHF), which will allow the design effort to proceed to license application. This FDD will be revised at strategic points as the design matures. This FDD identifies the requirements and describes the facility design, as it currently exists, with emphasis on attributes of the design provided to meet the requirements. This FDD is an engineering tool for design control; accordingly, the primary audience and users are design engineers. This FDD is part of an iterative design process. It leads the design process with regard to the flowdown of upper tier requirements onto the facility. Knowledge of these requirements is essential in performing the design process. The FDD follows the design with regard to the description of the facility. The description provided in this FDD reflects the current results of the design process.

  7. Design sensitivity analysis and optimization tool (DSO) for sizing design applications

    NASA Technical Reports Server (NTRS)

    Chang, Kuang-Hua; Choi, Kyung K.; Perng, Jyh-Hwa

    1992-01-01

    The DSO tool, a structural design software system that provides the designer with a graphics-based menu-driven design environment to perform easy design optimization for general applications, is presented. Three design stages, preprocessing, design sensitivity analysis, and postprocessing, are implemented in the DSO to allow the designer to carry out the design process systematically. A framework, including data base, user interface, foundation class, and remote module, has been designed and implemented to facilitate software development for the DSO. A number of dedicated commercial software/packages have been integrated in the DSO to support the design procedures. Instead of parameterizing an FEM, design parameters are defined on a geometric model associated with physical quantities, and the continuum design sensitivity analysis theory is implemented to compute design sensitivity coefficients using postprocessing data from the analysis codes. A tracked vehicle road wheel is given as a sizing design application to demonstrate the DSO's easy and convenient design optimization process.

  8. On-chip temperature-based digital signal processing for customized wireless microcontroller

    NASA Astrophysics Data System (ADS)

    Farhah Razanah Faezal, Siti; Isa, Mohd Nazrin Md; Harun, Azizi; Nizam Mohyar, Shaiful; Bahari Jambek, Asral

    2017-11-01

    Increases in die size and power density in system-on-chip (SoC) designs have brought thermal issues into the system. Uneven heating and increasing on-chip temperature offsets have become major factors that can limit system performance. This paper presents the design and simulation of temperature-based digital signal processing for a modern system-on-chip design using Verilog HDL. This design yields continuous monitoring of temperature and reacts to specified conditions. The simulation of the system was done with Altera Quartus software v. 14. With the system above, the microcontroller can achieve nominal power dissipation and operate within the temperature range owing to the incorporation of an interrupt-based system.

  9. Using task analysis to improve the requirements elicitation in health information system.

    PubMed

    Teixeira, Leonor; Ferreira, Carlos; Santos, Beatriz Sousa

    2007-01-01

    This paper describes the application of task analysis within the design process of a Web-based information system for managing clinical information in hemophilia care, in order to improve the requirements elicitation and, consequently, to validate the domain model obtained in a previous phase of the design process (system analysis). The use of task analysis in this case proved to be a practical and efficient way to improve the requirements engineering process by involving users in the design process.

  10. Design Of Computer Based Test Using The Unified Modeling Language

    NASA Astrophysics Data System (ADS)

    Tedyyana, Agus; Danuri; Lidyawati

    2017-12-01

    Admission selection at Politeknik Negeri Bengkalis through interest and talent search (PMDK), the joint admission test for state polytechnics (SB-UMPN) and the independent route (UM-Polbeng) was conducted using paper-based tests (PBT). The paper-based test model has several weaknesses: excessive paper use, leaking of the questions to the public, and manipulation of the test results. This research aimed to create a computer-based test (CBT) model using the Unified Modeling Language (UML), which consists of use case diagrams, activity diagrams and sequence diagrams. During the design of the application, attention was paid to protecting the test questions through encryption and decryption before they are shown; the RSA cryptography algorithm was used for this purpose. The questions drawn from the question bank were then randomized using the Fisher-Yates shuffle method. The network architecture used in the computer-based test application was a client-server model over a local area network (LAN). The result of the design was a computer-based test application for admission selection at Politeknik Negeri Bengkalis.
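    A minimal sketch of the Fisher-Yates shuffle mentioned above is given below; the question bank is invented, and the RSA-based protection of the questions is not shown.

    ```python
    # Minimal Fisher-Yates shuffle sketch for randomizing a question bank.
    import random

    def fisher_yates_shuffle(items, rng=random):
        """In-place Fisher-Yates shuffle: each permutation is equally likely."""
        for i in range(len(items) - 1, 0, -1):
            j = rng.randint(0, i)            # pick a position in items[0..i]
            items[i], items[j] = items[j], items[i]
        return items

    question_bank = [f"Q{n}" for n in range(1, 11)]
    print(fisher_yates_shuffle(question_bank))
    ```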

  11. Teacher-Led Design of an Adaptive Learning Environment

    ERIC Educational Resources Information Center

    Mavroudi, Anna; Hadzilacos, Thanasis; Kalles, Dimitris; Gregoriades, Andreas

    2016-01-01

    This paper discusses a requirements engineering process that exemplifies teacher-led design in the case of an envisioned system for adaptive learning. Such a design poses various challenges and still remains an open research issue in the field of adaptive learning. Starting from a scenario-based elicitation method, the whole process was highly…

  12. Reexamining the Implied Role of the Designer

    ERIC Educational Resources Information Center

    Gibbons, A. S.; Merrill, P. F.; Swan, R.; Campbell, J. O.; Christensen, E.; Insalaco, M.; Wilcken, W.

    2008-01-01

    Designers of technology-based instruction wrestle with the possibility that their product will seem cold or mechanical to the user. Given the current understanding of learning as a social process, it is no longer reasonable to restrict the learner's participation in the process of instruction to trivial interactions. We believe that designs will…

  13. Distributed Group Design Process: Lessons Learned.

    ERIC Educational Resources Information Center

    Eseryel, Deniz; Ganesan, Radha

    A typical Web-based training development team consists of a project manager, an instructional designer, a subject-matter expert, a graphic artist, and a Web programmer. The typical scenario involves team members working together in the same setting during the entire design and development process. What happens when the team is distributed, that is…

  14. Using a contextualized sensemaking model for interaction design: A case study of tumor contouring.

    PubMed

    Aselmaa, Anet; van Herk, Marcel; Laprie, Anne; Nestle, Ursula; Götz, Irina; Wiedenmann, Nicole; Schimek-Jasch, Tanja; Picaud, Francois; Syrykh, Charlotte; Cagetti, Leonel V; Jolnerovski, Maria; Song, Yu; Goossens, Richard H M

    2017-01-01

    Sensemaking theories help designers understand the cognitive processes of a user when he/she performs a complicated task. This paper introduces a two-step approach of incorporating sensemaking support within the design of health information systems by: (1) modeling the sensemaking process of physicians while performing a task, and (2) identifying software interaction design requirements that support sensemaking based on this model. The two-step approach is presented based on a case study of the tumor contouring clinical task for radiotherapy planning. In the first step of the approach, a contextualized sensemaking model was developed to describe the sensemaking process based on the goal, the workflow and the context of the task. In the second step, based on a research software prototype, an experiment was conducted where three contouring tasks were performed by eight physicians respectively. Four types of navigation interactions and five types of interaction sequence patterns were identified by analyzing the gathered interaction log data from those twenty-four cases. Further in-depth study on each of the navigation interactions and interaction sequence patterns in relation to the contextualized sensemaking model revealed five main areas for design improvements to increase sensemaking support. Outcomes of the case study indicate that the proposed two-step approach was beneficial for gaining a deeper understanding of the sensemaking process during the task, as well as for identifying design requirements for better sensemaking support. Copyright © 2016. Published by Elsevier Inc.

  15. Modeling Web-Based Educational Systems: Process Design Teaching Model

    ERIC Educational Resources Information Center

    Rokou, Franca Pantano; Rokou, Elena; Rokos, Yannis

    2004-01-01

    Using modeling languages is essential to the construction of educational systems based on software engineering principles and methods. Furthermore, the instructional design is undoubtedly the cornerstone of the design and development of educational systems. Although several methodologies and languages have been proposed for the specification of…

  16. Space shuttle recommendations based on aircraft maintenance experience

    NASA Technical Reports Server (NTRS)

    Spears, J. M.; Fox, C. L.

    1972-01-01

    Space shuttle design recommendations based on aircraft maintenance experience are developed. The recommendations are specifically applied to the landing gear system, nondestructive inspection techniques, hydraulic system design, materials and processes, and program support.

  17. Design of lunar base observatories

    NASA Technical Reports Server (NTRS)

    Johnson, Stewart W.

    1988-01-01

    Several recently suggested concepts for conducting astronomy from a lunar base are cited. Then, the process and sequence of events that will be required to design an observatory to be emplaced on the Moon are examined.

  18. Using Analytics to Transform a Problem-Based Case Library: An Educational Design Research Approach

    ERIC Educational Resources Information Center

    Schmidt, Matthew; Tawfik, Andrew A.

    2018-01-01

    This article describes the iterative design, development, and evaluation of a case-based learning environment focusing on an ill-structured sales management problem. We discuss our processes and situate them within the broader framework of educational design research. The learning environment evolved over the course of three design phases. A…

  19. High School Students' Use of Paper-Based and Internet-Based Information Sources in the Engineering Design Process

    ERIC Educational Resources Information Center

    Pieper, Jon; Mentzer, Nathan

    2013-01-01

    Mentzer and Becker (2011) and Becker and Mentzer (2012) demonstrated that high school students engaged in engineering design problems spent more time accessing information and spent more time designing when provided with Internet access. They studied high school students engaged in an engineering design challenge. The two studies attempted to…

  20. A Web-Based Visualization and Animation Platform for Digital Logic Design

    ERIC Educational Resources Information Center

    Shoufan, Abdulhadi; Lu, Zheng; Huss, Sorin A.

    2015-01-01

    This paper presents a web-based education platform for the visualization and animation of the digital logic design process. This includes the design of combinatorial circuits using logic gates, multiplexers, decoders, and look-up-tables as well as the design of finite state machines. Various configurations of finite state machines can be selected…

  1. Design and development of a layer-based additive manufacturing process for the realization of metal parts of designed mesostructure

    NASA Astrophysics Data System (ADS)

    Williams, Christopher Bryant

    Low-density cellular materials, metallic bodies with gaseous voids, are a unique class of materials that are characterized by their high strength, low mass, good energy absorption characteristics, and good thermal and acoustic insulation properties. In an effort to take advantage of this entire suite of positive mechanical traits, designers are tailoring the cellular mesostructure for multiple design objectives. Unfortunately, existing cellular material manufacturing technologies limit the design space as they are limited to certain part mesostructures, material types, and macrostructures. The opportunity to improve the design of existing products and the ability to reap the benefits of cellular materials in new applications are the driving forces behind this research. As such, the primary research goal of this work is to design, embody, and analyze a manufacturing process that provides a designer the ability to specify the material type, material composition, void morphology, and mesostructure topology for any conceivable part geometry. The accomplishment of this goal is achieved in three phases of research: (1) Design---Following a systematic design process and a rigorous selection exercise, a layer-based additive manufacturing process is designed that is capable of meeting the unique requirements of fabricating cellular material geometry. Specifically, metal parts of designed mesostructure are fabricated via three-dimensional printing of metal oxide ceramic powder followed by post-processing in a reducing atmosphere. (2) Embodiment---The primary research hypothesis is verified through the use of the designed manufacturing process chain to successfully realize metal parts of designed mesostructure. (3) Modeling & Evaluation---The designed manufacturing process is modeled in this final research phase so as to increase understanding of experimental results and to establish a foundation for future analytical modeling research. In addition to an analysis of the physics of primitive creation and an investigation of failure modes during the layered fabrication of thin trusses, build time and cost models are presented in order to verify claims of the process's economic benefits. The main contribution of this research is the embodiment of a novel manner for realizing metal parts of designed mesostructure.

  2. A Range Finding Protocol to Support Design for Transcriptomics Experimentation: Examples of In-Vitro and In-Vivo Murine UV Exposure

    PubMed Central

    van Oostrom, Conny T.; Jonker, Martijs J.; de Jong, Mark; Dekker, Rob J.; Rauwerda, Han; Ensink, Wim A.; de Vries, Annemieke; Breit, Timo M.

    2014-01-01

    In transcriptomics research, design for experimentation by carefully considering biological, technological, practical and statistical aspects is very important, because the experimental design space is essentially limitless. Usually, the ranges of variable biological parameters of the design space are based on common practices and in turn on phenotypic endpoints. However, specific sub-cellular processes might only be partially reflected by phenotypic endpoints or outside the associated parameter range. Here, we provide a generic protocol for range finding in design for transcriptomics experimentation based on small-scale gene-expression experiments to help in the search for the right location in the design space by analyzing the activity of already known genes of relevant molecular mechanisms. Two examples illustrate the applicability: in-vitro UV-C exposure of mouse embryonic fibroblasts and in-vivo UV-B exposure of mouse skin. Our pragmatic approach is based on: framing a specific biological question and associated gene-set, performing a wide-ranged experiment without replication, eliminating potentially non-relevant genes, and determining the experimental ‘sweet spot’ by gene-set enrichment plus dose-response correlation analysis. Examination of many cellular processes that are related to UV response, such as DNA repair and cell-cycle arrest, revealed that basically each cellular (sub-) process is active at its own specific spot(s) in the experimental design space. Hence, the use of range finding, based on an affordable protocol like this, enables researchers to conveniently identify the ‘sweet spot’ for their cellular process of interest in an experimental design space and might have far-reaching implications for experimental standardization. PMID:24823911
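    As a hedged illustration of the dose-response correlation component of the protocol, the sketch below scores a few invented genes by the correlation between exposure dose and expression; the gene names, doses, and values are assumptions, not data from the study.

    ```python
    # Illustrative sketch only: per-gene dose-response correlation of the kind
    # the protocol combines with gene-set enrichment to locate the 'sweet spot'.
    import numpy as np

    uv_doses = np.array([0.0, 2.0, 4.0, 8.0, 16.0])          # assumed exposure range
    expression = {                                            # one profile per gene
        "Cdkn1a": np.array([1.0, 1.8, 2.9, 4.1, 5.0]),        # clear dose response
        "Xpc":    np.array([1.0, 1.2, 1.9, 2.4, 2.6]),
        "Gapdh":  np.array([1.0, 1.1, 0.9, 1.0, 1.1]),        # housekeeping, flat
    }

    for gene, values in expression.items():
        r = np.corrcoef(uv_doses, values)[0, 1]
        print(f"{gene}: dose-response correlation r = {r:.2f}")
    ```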

  3. A Principled Approach to the Specification of System Architectures for Space Missions

    NASA Technical Reports Server (NTRS)

    McKelvin, Mark L. Jr.; Castillo, Robert; Bonanne, Kevin; Bonnici, Michael; Cox, Brian; Gibson, Corrina; Leon, Juan P.; Gomez-Mustafa, Jose; Jimenez, Alejandro; Madni, Azad

    2015-01-01

    Modern space systems are increasing in complexity and scale at an unprecedented pace. Consequently, innovative methods, processes, and tools are needed to cope with the increasing complexity of architecting these systems. A key systems challenge in practice is the ability to scale processes, methods, and tools used to architect complex space systems. Traditionally, the process for specifying space system architectures has largely relied on capturing the system architecture in informal descriptions that are often embedded within loosely coupled design documents and domain expertise. Such informal descriptions often lead to misunderstandings between design teams, ambiguous specifications, difficulty in maintaining consistency as the architecture evolves throughout the system development life cycle, and costly design iterations. Therefore, traditional methods are becoming increasingly inefficient to cope with ever-increasing system complexity. We apply the principles of component-based design and platform-based design to the development of the system architecture for a practical space system to demonstrate feasibility of our approach using SysML. Our results show that we are able to apply a systematic design method to manage system complexity, thus enabling effective data management, semantic coherence and traceability across different levels of abstraction in the design chain. Just as important, our approach enables interoperability among heterogeneous tools in a concurrent engineering model based design environment.

  4. Implementation of a Web-Based Collaborative Process Planning System

    NASA Astrophysics Data System (ADS)

    Wang, Huifen; Liu, Tingting; Qiao, Li; Huang, Shuangxi

    Under a networked manufacturing environment, all phases of product manufacturing, involving design, process planning, machining and assembling, may be accomplished collaboratively by different enterprises; even different manufacturing stages of the same part may be finished collaboratively by different enterprises. Based on the self-developed networked manufacturing platform eCWS (e-Cooperative Work System), a multi-agent-based system framework for collaborative process planning is proposed. In accordance with the requirements of collaborative process planning, shared resources provided by cooperating enterprises in the course of collaboration are classified into seven classes. A reconfigurable and extendable resource object model is then built. Decision-making strategy is also studied in this paper. Finally, a collaborative process planning system, e-CAPP, is developed and applied. It provides strong support for distributed designers to collaboratively plan and optimize product processes through the network.

  5. Pattern database applications from design to manufacturing

    NASA Astrophysics Data System (ADS)

    Zhuang, Linda; Zhu, Annie; Zhang, Yifan; Sweis, Jason; Lai, Ya-Chieh

    2017-03-01

    Pattern-based approaches are becoming more common and popular as the industry moves to advanced technology nodes. At the beginning of a new technology node, a library of process weak-point patterns for physical and electrical verification starts to build up and is used to prevent known hotspots from recurring in new designs. The pattern set is then expanded to create test keys for process development in order to verify manufacturing capability and pre-check new tape-out designs for any potential yield detractors. As the database grows, the adoption of pattern-based approaches has expanded from design flows to technology development and is now also needed for mass-production purposes. This paper will present the complete downstream working flows of a design pattern database (PDB). This pattern-based data analysis flow covers different applications across different functional teams: generating enhancement kits to improve design manufacturability, populating new test design data based on previous learning, generating analysis data to improve mass-production efficiency, and in-line control of manufacturing equipment to check machine status consistency across different fab sites.

  6. The Acquisition of Integrated Science Process Skills in a Web-Based Learning Environment

    ERIC Educational Resources Information Center

    Saat, Rohaida Mohd

    2004-01-01

    Web-based learning is becoming prevalent in science learning. Some use specially designed programs, while others use materials available on the Internet. This qualitative case study examined the process of acquisition of integrated science process skills, particularly the skill of controlling variables, in a web-based learning environment among…

  7. Discrete State Change Model of Manufacturing Quality to Aid Assembly Process Design

    NASA Astrophysics Data System (ADS)

    Koga, Tsuyoshi; Aoyama, Kazuhiro

    This paper proposes a representation model of the quality state change in an assembly process that can be used in a computer-aided process design system. In order to formalize the state change of the manufacturing quality in the assembly process, the functions, operations, and quality changes in the assembly process are represented as a network model that can simulate discrete events. This paper also develops a design method for the assembly process. The design method calculates the space of quality state change and outputs a better assembly process (better operations and better sequences) that can be used to obtain the intended quality state of the final product. A computational redesigning algorithm of the assembly process that considers the manufacturing quality is developed. The proposed method can be used to design an improved manufacturing process by simulating the quality state change. A prototype system for planning an assembly process is implemented and applied to the design of an auto-breaker assembly process. The result of the design example indicates that the proposed assembly process planning method outputs a better manufacturing scenario based on the simulation of the quality state change.
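    The sketch below is a loose illustration, not the authors' model: assembly operations are represented as discrete events that transform a quality state, so that two operation sequences can be compared; the operations and quality attributes are invented.

    ```python
    # Minimal sketch of a discrete quality-state-change simulation over an
    # assembly sequence; operations and their effects are invented.
    def align_with_jig(state):          # re-references the part: error is reset
        return {**state, "positional_error_mm": 0.02}

    def press_fit_bearing(state):       # adds a small positional disturbance
        return {**state, "positional_error_mm": round(state["positional_error_mm"] + 0.05, 3)}

    def tighten_bolts(state):
        return {**state, "clamping_force_ok": True}

    def simulate(sequence, state):
        """Propagate the quality state through a sequence of operations."""
        for operation in sequence:
            state = operation(state)
        return state

    initial = {"positional_error_mm": 0.10, "clamping_force_ok": False}
    plan_a = [align_with_jig, press_fit_bearing, tighten_bolts]
    plan_b = [press_fit_bearing, align_with_jig, tighten_bolts]
    print("plan A:", simulate(plan_a, initial))    # press-fit after alignment: 0.07 mm
    print("plan B:", simulate(plan_b, initial))    # alignment last: 0.02 mm
    ```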

  8. Combining the Old and the New: Designing a Curriculum Based on the Taba Model and the Global Scale of English

    ERIC Educational Resources Information Center

    Aydin, Belgin; Unver, Meral Melek; Alan, Bülent; Saglam, Sercan

    2017-01-01

    This paper explains the process of designing a curriculum based on the Taba Model and the Global Scale of English (GSE) in an intensive language education program. The Taba Model emphasizing the involvement of the teachers and the learners in the curriculum development process was combined with the GSE, a psychometric tool measuring language…

  9. The Effects of Integrating Mobile and CAD Technology in Teaching Design Process for Malaysian Polytechnic Architecture Student in Producing Creative Product

    ERIC Educational Resources Information Center

    Hassan, Isham Shah; Ismail, Mohd Arif; Mustapha, Ramlee

    2010-01-01

    The purpose of this research is to examine the effect of integrating the digital media such as mobile and CAD technology on designing process of Malaysian polytechnic architecture students in producing a creative product. A website is developed based on Caroll's minimal theory, while mobile and CAD technology integration is based on Brown and…

  10. Natural Resource Information System, design analysis

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The computer-based system stores, processes, and displays map data relating to natural resources. The system was designed on the basis of requirements established in a user survey and an analysis of decision flow. The design analysis effort is described, and the rationale behind major design decisions, including map processing, cell vs. polygon, choice of classification systems, mapping accuracy, system hardware, and software language is summarized.

  11. Virtual Reality Based Collaborative Design by Children with High-Functioning Autism: Design-Based Flexibility, Identity, and Norm Construction

    ERIC Educational Resources Information Center

    Ke, Fengfeng; Lee, Sungwoong

    2016-01-01

    This exploratory case study examined the process and potential impact of collaborative architectural design and construction in an OpenSimulator-based virtual reality (VR) on the social skills development of children with high-functioning autism (HFA). Two children with a formal medical diagnosis of HFA and one typically developing peer, aged…

  12. Design-Based Research and Video Game Based Learning: Developing the Educational Video Game "Citizen Science"

    ERIC Educational Resources Information Center

    Gaydos, Matthew J.

    2013-01-01

    This paper presents a series of studies detailing the research and development of the educational science video game "Citizen Science." It documents the design process, beginning with the initial grant and ending with a case study of two teachers who used the game in their classrooms. Following a design-based research approach, this…

  13. Development and Validation of a Web-Based Module to Teach Metacognitive Learning Strategies to Students in Higher Education

    ERIC Educational Resources Information Center

    Singh, Oma B.

    2009-01-01

    This study used a design-based research (DBR) methodology to examine how an Instructional Systems Design (ISD) process such as ADDIE (Analysis, Design, Development, Implementation, Evaluation) can be employed to develop a web-based module to teach metacognitive learning strategies to students in higher education. The goal of the study was…

  14. CAD/CAM interface design of excimer laser micro-processing system

    NASA Astrophysics Data System (ADS)

    Jing, Liang; Chen, Tao; Zuo, Tiechuan

    2005-12-01

    CAD/CAM technology has recently been adopted in the field of laser processing. Before the CAD/CAM interface was designed, the excimer laser micro-processing system recognized only G-code instructions, and designing a part directly in G code is hard for users: efficiency is low and the probability of error is high. Using the secondary development interface of AutoCAD with Visual Basic, an application was developed to extract the information of each entity in a drawing and convert it into that entity's processing parameters. An additional function was added to the existing control software to interpret these per-entity processing parameters and enable continuous processing of the graphic. With this CAD/CAM interface, users can design a part in AutoCAD instead of writing G instructions, which sharply shortens the design cycle and helps ensure that the processing parameters of the part are correct and unambiguous. Processing of a complex novel bio-chip has been realized with this new capability.
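
    As a rough illustration of the entity-to-parameter conversion described above (not the authors' Visual Basic implementation), the following Python sketch maps hypothetical drawing entities to per-entity laser processing parameters; the entity fields, layer names and parameter values are assumptions.

      # Minimal sketch: convert CAD drawing entities into per-entity laser
      # processing parameters, in the spirit of the CAD/CAM interface above.
      # Entity fields and parameter values are illustrative assumptions.

      from dataclasses import dataclass

      @dataclass
      class Entity:
          kind: str      # e.g. "LINE", "CIRCLE", "POLYLINE"
          points: list   # geometry as (x, y) tuples
          layer: str     # layer name used to select process settings

      # Hypothetical mapping from drawing layer to laser parameters.
      LAYER_SETTINGS = {
          "CUT":  {"pulse_energy_mJ": 5.0, "feed_mm_s": 2.0, "passes": 3},
          "MARK": {"pulse_energy_mJ": 1.0, "feed_mm_s": 10.0, "passes": 1},
      }

      def entity_to_parameters(entity: Entity) -> dict:
          """Attach processing parameters to one entity's geometry."""
          settings = LAYER_SETTINGS.get(entity.layer, LAYER_SETTINGS["MARK"])
          return {"kind": entity.kind, "path": entity.points, **settings}

      if __name__ == "__main__":
          drawing = [
              Entity("LINE", [(0, 0), (10, 0)], "CUT"),
              Entity("CIRCLE", [(5, 5)], "MARK"),
          ]
          for step in [entity_to_parameters(e) for e in drawing]:
              print(step)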

  15. Process Materialization Using Templates and Rules to Design Flexible Process Models

    NASA Astrophysics Data System (ADS)

    Kumar, Akhil; Yao, Wen

    The main idea in this paper is to show how flexible processes can be designed by combining generic process templates and business rules. We instantiate a process by applying rules to specific case data and running a materialization algorithm. The customized process instance is then executed in an existing workflow engine. We present an architecture and also give an algorithm for process materialization. The rules are written in a logic-based language like Prolog. Our focus is on capturing deeper process knowledge and achieving a holistic approach to robust process design that encompasses control flow, resources and data, as well as making it easier to accommodate changes to business policy.
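
    A minimal sketch of the template-plus-rules idea, assuming a toy order-handling template and two made-up business rules (the paper's rules are written in a Prolog-like language; Python stands in here purely for illustration):

      # Minimal sketch of rule-based process materialization: a generic template
      # is specialized for one case by applying business rules to the case data.
      # Template steps, rule conditions and case fields are assumptions.

      TEMPLATE = ["receive_order", "check_credit", "approve", "ship", "invoice"]

      # Each rule: (condition on case data, transformation of the step list).
      RULES = [
          (lambda case: case["amount"] < 500,
           lambda steps: [s for s in steps if s != "check_credit"]),  # skip credit check
          (lambda case: case["customer_type"] == "vip",
           lambda steps: [s for s in steps if s != "approve"]),       # auto-approve VIPs
      ]

      def materialize(template, rules, case):
          """Apply every applicable rule to the template for this case."""
          steps = list(template)
          for condition, transform in rules:
              if condition(case):
                  steps = transform(steps)
          return steps

      if __name__ == "__main__":
          case = {"amount": 250, "customer_type": "regular"}
          print(materialize(TEMPLATE, RULES, case))
          # -> ['receive_order', 'approve', 'ship', 'invoice']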

  16. Automating Relational Database Design for Microcomputer Users.

    ERIC Educational Resources Information Center

    Pu, Hao-Che

    1991-01-01

    Discusses issues involved in automating the relational database design process for microcomputer users and presents a prototype of a microcomputer-based system (RA, Relation Assistant) that is based on expert systems technology and helps avoid database maintenance problems. Relational database design is explained and the importance of easy input…

  17. Students' Construction of External Representations in Design-Based Learning Situations

    ERIC Educational Resources Information Center

    de Vries, Erica

    2006-01-01

    This article develops a theoretical framework for the study of students' construction of mixed multiple external representations in design-based learning situations involving an adaptation of professional tasks and tools to a classroom setting. The framework draws on research on professional design processes and on learning with multiple external…

  18. Web-Based Learning Design Tool

    ERIC Educational Resources Information Center

    Bruno, F. B.; Silva, T. L. K.; Silva, R. P.; Teixeira, F. G.

    2012-01-01

    Purpose: The purpose of this paper is to propose a web-based tool that enables the development and provision of learning designs and its reuse and re-contextualization as generative learning objects, aimed at developing educational materials. Design/methodology/approach: The use of learning objects can facilitate the process of production and…

  19. Chang'E-3 data pre-processing system based on scientific workflow

    NASA Astrophysics Data System (ADS)

    Tan, Xu; Liu, Jianjun; Wang, Yuanyuan; Yan, Wei; Zhang, Xiaoxia; Li, Chunlai

    2016-04-01

    The Chang'E-3 (CE3) mission has obtained a huge amount of lunar scientific data. Data pre-processing is an important segment of the CE3 ground research and application system. With a dramatic increase in the demand for data research and application, a Chang'E-3 data pre-processing system (CEDPS) based on scientific workflow is proposed, with the aim of making scientists more flexible and productive by automating data-driven processing. The system should allow the planning, conduct and control of the data processing procedure, including the ability to describe a data processing task (define the input and output data, the data relationships, the sequence of tasks, the communication between tasks, the mathematical formulas, and the relationship between tasks and data) and to execute tasks automatically. Accordingly, how a task is described is the key to the flexibility of the system. We design a workflow designer, a visual environment for capturing processes as workflows, and discuss its three-level model: (1) the data relationships are established through a product tree; (2) the process model is constructed as a directed acyclic graph (DAG), in which a set of workflow constructs, including Sequence, Loop, Merge and Fork, can be composed with one another; and (3) to reduce the complexity of modelling mathematical formulas with the DAG, semantic modelling based on MathML is adopted. On top of that, we present how the CE3 data are processed with CEDPS.
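
    As a small illustration of the DAG-based process model mentioned above (not the CEDPS implementation itself), the following Python sketch orders and runs workflow tasks once their prerequisites are complete; the task names and dependencies are invented.

      # Minimal sketch of executing a data pre-processing workflow expressed as a
      # directed acyclic graph (DAG): tasks run once all prerequisites are done.
      # Task names and dependencies are illustrative, not the actual CE3 flow.

      from collections import deque

      def topological_order(deps):
          """deps: {task: set of prerequisite tasks}. Returns an executable order."""
          indegree = {t: len(p) for t, p in deps.items()}
          dependents = {t: [] for t in deps}
          for t, prereqs in deps.items():
              for p in prereqs:
                  dependents[p].append(t)
          ready = deque(t for t, d in indegree.items() if d == 0)
          order = []
          while ready:
              t = ready.popleft()
              order.append(t)
              for nxt in dependents[t]:
                  indegree[nxt] -= 1
                  if indegree[nxt] == 0:
                      ready.append(nxt)
          if len(order) != len(deps):
              raise ValueError("cycle detected: not a DAG")
          return order

      if __name__ == "__main__":
          workflow = {
              "unpack_raw": set(),
              "radiometric_correction": {"unpack_raw"},
              "geometric_correction": {"unpack_raw"},
              "generate_product": {"radiometric_correction", "geometric_correction"},
          }
          for task in topological_order(workflow):
              print("run", task)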

  20. System importance measures: A new approach to resilient systems-of-systems

    NASA Astrophysics Data System (ADS)

    Uday, Payuna

    Resilience is the ability to withstand and recover rapidly from disruptions. While this attribute has been the focus of research in several fields, in the case of system-of-systems (SoSs), addressing resilience is particularly interesting and challenging. As infrastructure SoSs, such as power, transportation, and communication networks, grow in complexity and interconnectivity, measuring and improving the resilience of these SoSs is vital in terms of safety and providing uninterrupted services. The characteristics of systems-of-systems make analysis and design of resilience challenging. However, these features also offer opportunities to make SoSs resilient using unconventional methods. In this research, we present a new approach to the process of resilience design. The core idea behind the proposed design process is a set of system importance measures (SIMs) that identify systems crucial to overall resilience. Using the results from the SIMs, we determine appropriate strategies from a list of design principles to improve SoS resilience. The main contribution of this research is the development of an aid to design that provides specific guidance on where and how resources need to be targeted. Based on the needs of an SoS, decision-makers can iterate through the design process to identify a set of practical and effective design improvements. We use two case studies to demonstrate how the SIM-based design process can inform decision-making in the context of SoS resilience. The first case study focuses on a naval warfare SoS and describes how the resilience framework can leverage existing simulation models to support end-to-end design. We proceed through stages of the design approach using an agent-based model (ABM) that enables us to demonstrate how simulation tools and analytical models help determine the necessary inputs for the design process and, subsequently, inform decision-making regarding SoS resilience. The second case study considers the urban transportation network in Boston. This case study focuses on interpreting the results of the resilience framework and on describing how they can be used to guide design choices in large infrastructure networks. We use different resilience maps to highlight the range of design-related information that can be obtained from the framework. Specific advantages of the SIM-based resilience design include: (1) incorporates SoS-specific features within existing risk-based design processes - the SIMs determine the relative importance of different systems based on their impacts on SoS-level performance, and suggestions for resilience improvement draw from design options that leverage SoS-specific characteristics, such as the ability to adapt quickly (for instance, adding new systems or re-tasking existing ones) and to provide partial recovery of performance in the aftermath of a disruption; (2) allows rapid understanding of different areas of concern within the SoS - the visual nature of the resilience map (a key outcome of the SIM analysis) provides a useful way to summarize the current resilience of the SoS as well as point to key systems of concern; and (3) provides a platform for multiple analysts and decision-makers to study, modify, discuss and document options for SoS.
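
    A minimal sketch of one way a system importance measure of this flavour could be computed, assuming a toy SoS connectivity graph and a simple performance proxy (fraction of system pairs still connected); neither the network nor the metric is taken from the thesis.

      # Minimal sketch: drop each system from an SoS connectivity graph and record
      # the loss in a simple performance proxy. The network and the proxy are
      # assumptions, not the SIMs defined in the thesis.

      from itertools import combinations

      def connected_pairs_fraction(nodes, edges):
          """Fraction of node pairs that remain connected (union-find)."""
          parent = {n: n for n in nodes}
          def find(x):
              while parent[x] != x:
                  parent[x] = parent[parent[x]]
                  x = parent[x]
              return x
          for a, b in edges:
              parent[find(a)] = find(b)
          pairs = list(combinations(nodes, 2))
          if not pairs:
              return 1.0
          return sum(find(a) == find(b) for a, b in pairs) / len(pairs)

      def importance(nodes, edges):
          """Performance drop caused by losing each system, one at a time."""
          base = connected_pairs_fraction(nodes, edges)
          scores = {}
          for n in nodes:
              rest = [m for m in nodes if m != n]
              kept = [(a, b) for a, b in edges if n not in (a, b)]
              scores[n] = base - connected_pairs_fraction(rest, kept)
          return scores

      if __name__ == "__main__":
          nodes = ["A", "B", "C", "D"]
          edges = [("A", "B"), ("B", "C"), ("C", "D")]
          print(importance(nodes, edges))   # B and C come out as the most critical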

  1. Participatory Research as One Piece of the Puzzle: A Systematic Review of Consumer Involvement in Design of Technology-Based Youth Mental Health and Well-Being Interventions.

    PubMed

    Orlowski, Simone Kate; Lawn, Sharon; Venning, Anthony; Winsall, Megan; Jones, Gabrielle M; Wyld, Kaisha; Damarell, Raechel A; Antezana, Gaston; Schrader, Geoffrey; Smith, David; Collin, Philippa; Bidargaddi, Niranjan

    2015-07-09

    Despite the potential of technology-based mental health interventions for young people, limited uptake and/or adherence is a significant challenge. It is thought that involving young people in the development and delivery of services designed for them leads to better engagement. Further research is required to understand the role of participatory approaches in design of technology-based mental health and well-being interventions for youth. To investigate consumer involvement processes and associated outcomes from studies using participatory methods in development of technology-based mental health and well-being interventions for youth. Fifteen electronic databases, using both resource-specific subject headings and text words, were searched describing 2 broad concepts-participatory research and mental health/illness. Grey literature was accessed via Google Advanced search, and relevant conference Web sites and reference lists were also searched. A first screening of titles/abstracts eliminated irrelevant citations and documents. The remaining citations were screened by a second reviewer. Full text articles were double screened. All projects employing participatory research processes in development and/or design of (ICT/digital) technology-based youth mental health and well-being interventions were included. No date restrictions were applied; English language only. Data on consumer involvement, research and design process, and outcomes were extracted via framework analysis. A total of 6210 studies were reviewed, 38 full articles retrieved, and 17 included in this study. It was found that consumer participation was predominantly consultative and consumerist in nature and involved design specification and intervention development, and usability/pilot testing. Sustainable participation was difficult to achieve. Projects reported clear dichotomies around designer/researcher and consumer assumptions of effective and acceptable interventions. It was not possible to determine the impact of participatory research on intervention effectiveness due to lack of outcome data. Planning for or having pre-existing implementation sites assisted implementation. The review also revealed a lack of theory-based design and process evaluation. Consumer consultations helped shape intervention design. However, with little evidence of outcomes and a lack of implementation following piloting, the value of participatory research remains unclear.

  2. Participatory Research as One Piece of the Puzzle: A Systematic Review of Consumer Involvement in Design of Technology-Based Youth Mental Health and Well-Being Interventions

    PubMed Central

    Lawn, Sharon; Venning, Anthony; Winsall, Megan; Jones, Gabrielle M; Wyld, Kaisha; Damarell, Raechel A; Antezana, Gaston; Schrader, Geoffrey; Smith, David; Collin, Philippa; Bidargaddi, Niranjan

    2015-01-01

    Background Despite the potential of technology-based mental health interventions for young people, limited uptake and/or adherence is a significant challenge. It is thought that involving young people in the development and delivery of services designed for them leads to better engagement. Further research is required to understand the role of participatory approaches in design of technology-based mental health and well-being interventions for youth. Objective To investigate consumer involvement processes and associated outcomes from studies using participatory methods in development of technology-based mental health and well-being interventions for youth. Methods Fifteen electronic databases, using both resource-specific subject headings and text words, were searched describing 2 broad concepts-participatory research and mental health/illness. Grey literature was accessed via Google Advanced search, and relevant conference Web sites and reference lists were also searched. A first screening of titles/abstracts eliminated irrelevant citations and documents. The remaining citations were screened by a second reviewer. Full text articles were double screened. All projects employing participatory research processes in development and/or design of (ICT/digital) technology-based youth mental health and well-being interventions were included. No date restrictions were applied; English language only. Data on consumer involvement, research and design process, and outcomes were extracted via framework analysis. Results A total of 6210 studies were reviewed, 38 full articles retrieved, and 17 included in this study. It was found that consumer participation was predominantly consultative and consumerist in nature and involved design specification and intervention development, and usability/pilot testing. Sustainable participation was difficult to achieve. Projects reported clear dichotomies around designer/researcher and consumer assumptions of effective and acceptable interventions. It was not possible to determine the impact of participatory research on intervention effectiveness due to lack of outcome data. Planning for or having pre-existing implementation sites assisted implementation. The review also revealed a lack of theory-based design and process evaluation. Conclusions Consumer consultations helped shape intervention design. However, with little evidence of outcomes and a lack of implementation following piloting, the value of participatory research remains unclear. PMID:27025279

  3. The Design and Development of Test Platform for Wheat Precision Seeding Based on Image Processing Techniques

    NASA Astrophysics Data System (ADS)

    Li, Qing; Lin, Haibo; Xiu, Yu-Feng; Wang, Ruixue; Yi, Chuijie

    The test platform for wheat precision seeding based on image processing techniques is designed to support development of a wheat precision seed metering device with high efficiency and precision. Using image processing techniques, the platform gathers images of seeds (wheat) falling from the seed metering device onto the conveyor belt. These data are then processed and analyzed to calculate the qualified rate, the reseeding rate, the leakage (missed) sowing rate, etc. This paper introduces the overall structure and design parameters of the platform and the hardware and software of the image acquisition system, as well as the method of seed identification and seed-spacing measurement based on image thresholding and locating each seed's center. Analysis of the experimental results shows that the measurement error is less than ±1 mm.
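
    A minimal sketch of the thresholding-and-centroid steps outlined above, assuming OpenCV 4, dark seeds on a bright belt, a placeholder image file and threshold value, and no pixel-to-millimetre calibration:

      # Minimal sketch: threshold the belt image, find each seed's center, and
      # measure seed spacing along the belt direction. File name, threshold value
      # and the lack of calibration are placeholders/assumptions.

      import cv2
      import numpy as np

      def seed_centers(gray):
          # Dark seeds on a bright belt: inverted threshold, then blob contours.
          _, binary = cv2.threshold(gray, 100, 255, cv2.THRESH_BINARY_INV)
          contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                         cv2.CHAIN_APPROX_SIMPLE)   # OpenCV 4 signature
          centers = []
          for c in contours:
              m = cv2.moments(c)
              if m["m00"] > 0:
                  centers.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
          return centers

      if __name__ == "__main__":
          image = cv2.imread("belt_frame.png", cv2.IMREAD_GRAYSCALE)  # placeholder file
          centers = sorted(seed_centers(image))          # sort along belt (x) direction
          spacings = np.diff([x for x, _ in centers])    # seed-to-seed spacing in pixels
          print("seeds:", len(centers), "mean spacing (px):", spacings.mean())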

  4. The jABC Approach to Rigorous Collaborative Development of SCM Applications

    NASA Astrophysics Data System (ADS)

    Hörmann, Martina; Margaria, Tiziana; Mender, Thomas; Nagel, Ralf; Steffen, Bernhard; Trinh, Hong

    Our approach to the model-driven collaborative design of IKEA's P3 Delivery Management Process uses the jABC [9] for model driven mediation and choreography to complement a RUP-based (Rational Unified Process) development process. jABC is a framework for service development based on Lightweight Process Coordination. Users (product developers and system/software designers) easily develop services and applications by composing reusable building-blocks into (flow-) graph structures that can be animated, analyzed, simulated, verified, executed, and compiled. This way of handling the collaborative design of complex embedded systems has proven to be effective and adequate for the cooperation of non-programmers and non-technical people, which is the focus of this contribution, and it is now being rolled out in the operative practice.

  5. Modelling and simulation of a robotic work cell

    NASA Astrophysics Data System (ADS)

    Sękala, A.; Gwiazda, A.; Kost, G.; Banaś, W.

    2017-08-01

    The subject of the considerations presented in this work is the design and simulation of a robotic work cell. Designing robotic cells is a process of synergistically combining components into groups, combining these groups into specific, larger work units, or dividing large work units into smaller ones. Combination or division is carried out according to the need to realize the objectives assumed for these units. The design process is based on an integrated approach, which makes it possible to take into consideration all the elements needed in this process. Each element of the design process can be an independent design agent that pursues its own objectives.

  6. Design optimization of single mixed refrigerant LNG process using a hybrid modified coordinate descent algorithm

    NASA Astrophysics Data System (ADS)

    Qyyum, Muhammad Abdul; Long, Nguyen Van Duc; Minh, Le Quang; Lee, Moonyong

    2018-01-01

    Design optimization of the single mixed refrigerant (SMR) natural gas liquefaction (LNG) process involves highly non-linear interactions between decision variables, constraints, and the objective function. These non-linear interactions lead to irreversibilities, which deteriorate the energy efficiency of the LNG process. In this study, a simple and highly efficient hybrid modified coordinate descent (HMCD) algorithm was proposed to cope with the optimization of the natural gas liquefaction process. The single mixed refrigerant process was modeled in Aspen Hysys® and then connected to a Microsoft Visual Studio environment. The proposed optimization algorithm provided an improved result compared to other existing methodologies in finding the optimal condition of the complex mixed refrigerant natural gas liquefaction process. By applying the proposed algorithm, the SMR process can be designed with a specific compression power of 0.2555 kW, which is equivalent to a 44.3% energy saving compared to the base case. Furthermore, the coefficient of performance (COP) can be enhanced by up to 34.7% compared to the base case. The proposed optimization algorithm provides a deep understanding of the optimization of the liquefaction process from both technical and numerical perspectives. In addition, the HMCD algorithm can be applied to any mixed-refrigerant-based liquefaction process in the natural gas industry.
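
    As a rough illustration of the coordinate-descent idea underlying the HMCD algorithm (not the hybrid algorithm itself, and with a stand-in objective function instead of the Aspen Hysys process model), a plain coordinate-descent loop might look like this:

      # Minimal sketch of plain coordinate descent: optimize one decision variable
      # at a time while holding the others fixed, shrinking the step when stuck.
      # The objective is a stand-in, not the SMR process model.

      def coordinate_descent(objective, x0, step=0.5, shrink=0.5, iters=50):
          x = list(x0)
          for _ in range(iters):
              improved = False
              for i in range(len(x)):
                  for delta in (+step, -step):
                      trial = list(x)
                      trial[i] += delta
                      if objective(trial) < objective(x):
                          x = trial
                          improved = True
              if not improved:
                  step *= shrink          # refine the search when no move helps
          return x

      if __name__ == "__main__":
          # Stand-in objective with a known minimum at (1, -2, 3).
          f = lambda v: (v[0] - 1) ** 2 + (v[1] + 2) ** 2 + (v[2] - 3) ** 2
          print(coordinate_descent(f, [0.0, 0.0, 0.0]))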

  7. Layout Design of Human-Machine Interaction Interface of Cabin Based on Cognitive Ergonomics and GA-ACA.

    PubMed

    Deng, Li; Wang, Guohua; Yu, Suihuai

    2016-01-01

    In order to consider the psychological cognitive characteristics affecting operating comfort and realize the automatic layout design, cognitive ergonomics and GA-ACA (genetic algorithm and ant colony algorithm) were introduced into the layout design of human-machine interaction interface. First, from the perspective of cognitive psychology, according to the information processing process, the cognitive model of human-machine interaction interface was established. Then, the human cognitive characteristics were analyzed, and the layout principles of human-machine interaction interface were summarized as the constraints in layout design. Again, the expression form of fitness function, pheromone, and heuristic information for the layout optimization of cabin was studied. The layout design model of human-machine interaction interface was established based on GA-ACA. At last, a layout design system was developed based on this model. For validation, the human-machine interaction interface layout design of drilling rig control room was taken as an example, and the optimization result showed the feasibility and effectiveness of the proposed method.

  8. Layout Design of Human-Machine Interaction Interface of Cabin Based on Cognitive Ergonomics and GA-ACA

    PubMed Central

    Deng, Li; Wang, Guohua; Yu, Suihuai

    2016-01-01

    In order to consider the psychological cognitive characteristics affecting operating comfort and realize the automatic layout design, cognitive ergonomics and GA-ACA (genetic algorithm and ant colony algorithm) were introduced into the layout design of human-machine interaction interface. First, from the perspective of cognitive psychology, according to the information processing process, the cognitive model of human-machine interaction interface was established. Then, the human cognitive characteristics were analyzed, and the layout principles of human-machine interaction interface were summarized as the constraints in layout design. Again, the expression form of fitness function, pheromone, and heuristic information for the layout optimization of cabin was studied. The layout design model of human-machine interaction interface was established based on GA-ACA. At last, a layout design system was developed based on this model. For validation, the human-machine interaction interface layout design of drilling rig control room was taken as an example, and the optimization result showed the feasibility and effectiveness of the proposed method. PMID:26884745

  9. Design of the storage location based on the ABC analyses

    NASA Astrophysics Data System (ADS)

    Jemelka, Milan; Chramcov, Bronislav; Kříž, Pavel

    2016-06-01

    The paper focuses on process efficiency and saving storage costs. Maintaining inventory through a putaway strategy takes personnel time and costs money. The aim is to control inventory in the best way. The ABC classification, based on Vilfredo Pareto's theory, is used for the design of the warehouse layout. The new design of storage locations reduces the travel distance of fork-lifts and total costs, and it increases the efficiency of inventory processes. The suggested solutions and an evaluation of the achieved results are described in detail. The proposed solutions were implemented in real warehouse operation.
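
    A minimal sketch of the ABC (Pareto) classification behind such a layout design, with made-up picking frequencies and the common 80%/95% class boundaries as assumptions:

      # Minimal sketch: rank items by picking frequency and assign ABC classes by
      # cumulative share. Item data and class boundaries are illustrative.

      def abc_classify(picks, a_cut=0.80, b_cut=0.95):
          """picks: {item: picks per period}. Returns {item: 'A'|'B'|'C'}."""
          total = sum(picks.values())
          classes, cumulative = {}, 0.0
          for item, count in sorted(picks.items(), key=lambda kv: kv[1], reverse=True):
              cumulative += count / total
              classes[item] = "A" if cumulative <= a_cut else "B" if cumulative <= b_cut else "C"
          return classes

      if __name__ == "__main__":
          picks = {"SKU1": 500, "SKU2": 250, "SKU3": 120, "SKU4": 80, "SKU5": 50}
          # A-class items would then get the storage locations closest to dispatch.
          print(abc_classify(picks))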

  10. Design and evaluation of a wireless sensor network based aircraft strength testing system.

    PubMed

    Wu, Jian; Yuan, Shenfang; Zhou, Genyuan; Ji, Sai; Wang, Zilong; Wang, Yang

    2009-01-01

    The verification of aerospace structures, including full-scale fatigue and static test programs, is essential for structure strength design and evaluation. However, the current overall ground strength testing systems employ a large number of wires for communication among sensors and data acquisition facilities. The centralized data processing makes test programs lack efficiency and intelligence. Wireless sensor network (WSN) technology might be expected to address the limitations of cable-based aeronautical ground testing systems. This paper presents a wireless sensor network based aircraft strength testing (AST) system design and its evaluation on a real aircraft specimen. In this paper, a miniature, high-precision, and shock-proof wireless sensor node is designed for multi-channel strain gauge signal conditioning and monitoring. A cluster-star network topology protocol and application layer interface are designed in detail. To verify the functionality of the designed wireless sensor network for strength testing capability, a multi-point WSN based AST system is developed for static testing of a real aircraft undercarriage. Based on the designed wireless sensor nodes, the wireless sensor network is deployed to gather, process, and transmit strain gauge signals and monitor results under different static test loads. This paper shows the efficiency of the wireless sensor network based AST system, compared to a conventional AST system.

  11. Design and Evaluation of a Wireless Sensor Network Based Aircraft Strength Testing System

    PubMed Central

    Wu, Jian; Yuan, Shenfang; Zhou, Genyuan; Ji, Sai; Wang, Zilong; Wang, Yang

    2009-01-01

    The verification of aerospace structures, including full-scale fatigue and static test programs, is essential for structure strength design and evaluation. However, the current overall ground strength testing systems employ a large number of wires for communication among sensors and data acquisition facilities. The centralized data processing makes test programs lack efficiency and intelligence. Wireless sensor network (WSN) technology might be expected to address the limitations of cable-based aeronautical ground testing systems. This paper presents a wireless sensor network based aircraft strength testing (AST) system design and its evaluation on a real aircraft specimen. In this paper, a miniature, high-precision, and shock-proof wireless sensor node is designed for multi-channel strain gauge signal conditioning and monitoring. A cluster-star network topology protocol and application layer interface are designed in detail. To verify the functionality of the designed wireless sensor network for strength testing capability, a multi-point WSN based AST system is developed for static testing of a real aircraft undercarriage. Based on the designed wireless sensor nodes, the wireless sensor network is deployed to gather, process, and transmit strain gauge signals and monitor results under different static test loads. This paper shows the efficiency of the wireless sensor network based AST system, compared to a conventional AST system. PMID:22408521

  12. Study on Capturing Functional Requirements of the New Product Based on Evolution

    NASA Astrophysics Data System (ADS)

    Liu, Fang; Song, Liya; Bai, Zhonghang; Zhang, Peng

    In order to survive in an increasingly competitive global marketplace, it is important for corporations to forecast the evolutionary direction of new products rapidly and effectively. Most products in the world are developed based on the design of existing products. In product design, capturing functional requirements is a key step. Function continuously evolves, driven by the evolution of needs and technologies, so the functional requirements of a new product can be forecast from the functions of the existing product. Eight laws of function evolution are put forward in this paper. A process model for capturing the functional requirements of a new product based on function evolution is proposed, and an example illustrates the design process.

  13. User-centric design of a personal assistance robot (FRASIER) for active aging.

    PubMed

    Padir, Taşkin; Skorinko, Jeanine; Dimitrov, Velin

    2015-01-01

    We present our preliminary results from the design process for developing the Worcester Polytechnic Institute's personal assistance robot, FRASIER, as an intelligent service robot for enabling active aging. The robot capabilities include vision-based object detection, tracking the user and help with carrying heavy items such as grocery bags or cafeteria trays. This work-in-progress report outlines our motivation and approach to developing the next generation of service robots for the elderly. Our main contribution in this paper is the development of a set of specifications based on the adopted user-centered design process, and realization of the prototype system designed to meet these specifications.

  14. A Web Centric Architecture for Deploying Multi-Disciplinary Engineering Design Processes

    NASA Technical Reports Server (NTRS)

    Woyak, Scott; Kim, Hongman; Mullins, James; Sobieszczanski-Sobieski, Jaroslaw

    2004-01-01

    There are continuous needs for engineering organizations to improve their design process. Current state of the art techniques use computational simulations to predict design performance, and optimize it through advanced design methods. These tools have been used mostly by individual engineers. This paper presents an architecture for achieving results at an organization level beyond individual level. The next set of gains in process improvement will come from improving the effective use of computers and software within a whole organization, not just for an individual. The architecture takes advantage of state of the art capabilities to produce a Web based system to carry engineering design into the future. To illustrate deployment of the architecture, a case study for implementing advanced multidisciplinary design optimization processes such as Bi-Level Integrated System Synthesis is discussed. Another example for rolling-out a design process for Design for Six Sigma is also described. Each example explains how an organization can effectively infuse engineering practice with new design methods and retain the knowledge over time.

  15. The design of multi-core DSP parallel model based on message passing and multi-level pipeline

    NASA Astrophysics Data System (ADS)

    Niu, Jingyu; Hu, Jian; He, Wenjing; Meng, Fanrong; Li, Chuanrong

    2017-10-01

    Currently, the design of embedded signal processing systems is often based on a specific application, but this approach is not conducive to the rapid development of signal processing technology. In this paper, a parallel processing model architecture based on a multi-core DSP platform is designed; it is mainly suitable for complex algorithms composed of different modules. The model combines the ideas of multi-level pipeline parallelism and message passing and draws on the advantages of the mainstream multi-core DSP models (the Master-Slave model and the Data Flow model), so that it achieves better performance. A three-dimensional image generation algorithm is used to validate the efficiency of the proposed model by comparison with the Master-Slave and Data Flow models.
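
    As a rough sketch of the message-passing pipeline idea (threads and queues stand in for DSP cores and inter-core messaging, and the stage functions are placeholders rather than the paper's 3-D image generation modules):

      # Minimal sketch of a multi-stage pipeline with message passing: each stage
      # runs as its own worker and passes frames downstream through queues.

      import threading, queue

      SENTINEL = None   # marks the end of the frame stream

      def stage(func, inbox, outbox):
          while True:
              item = inbox.get()
              if item is SENTINEL:
                  outbox.put(SENTINEL)
                  break
              outbox.put(func(item))

      if __name__ == "__main__":
          q0, q1, q2 = queue.Queue(), queue.Queue(), queue.Queue()
          # Two pipelined stages; real code would map these onto separate cores.
          threading.Thread(target=stage, args=(lambda x: x * 2, q0, q1)).start()
          threading.Thread(target=stage, args=(lambda x: x + 1, q1, q2)).start()
          for frame in range(5):
              q0.put(frame)
          q0.put(SENTINEL)
          while (result := q2.get()) is not SENTINEL:
              print("processed frame ->", result)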

  16. Lithography hotspot discovery at 70nm DRAM 300mm fab: process window qualification using design base binning

    NASA Astrophysics Data System (ADS)

    Chen, Daniel; Chen, Damian; Yen, Ray; Cheng, Mingjen; Lan, Andy; Ghaskadvi, Rajesh

    2008-11-01

    Identifying hotspots, structures that limit the lithography process window, becomes increasingly important as the industry relies heavily on RET to print sub-wavelength designs. KLA-Tencor's patented Process Window Qualification (PWQ) methodology has been used for this purpose in various fabs. The PWQ methodology has three key advantages: (a) the PWQ layout, which provides the best sensitivity; (b) design-based binning, for pattern repeater analysis; and (c) intelligent sampling, for the best DOI sampling rate. This paper evaluates two different analysis strategies for SEM review sampling successfully deployed at Inotera Memories, Inc. We propose a new approach combining location repeaters and pattern repeaters. Based on a recent case study, the new sampling flow reduces the data analysis and sampling time from 6 hours to 1.5 hours while maintaining the maximum DOI sample rate.

  17. Interferometric architectures based All-Optical logic design methods and their implementations

    NASA Astrophysics Data System (ADS)

    Singh, Karamdeep; Kaur, Gurmeet

    2015-06-01

    All-Optical Signal Processing is an emerging technology that can avoid the costly optical-electronic-optical (O-E-O) conversions which are usually compulsory in traditional electronic signal processing systems, thus greatly enhancing the operating bit rate, with added advantages such as electromagnetic interference immunity and low power consumption. In order to implement complex signal processing tasks, All-Optical logic gates are required as backbone elements. This review describes advances in the field of All-Optical logic design methods based on interferometric architectures such as the Mach-Zehnder Interferometer (MZI), Sagnac interferometers and the Ultrafast Non-Linear Interferometer (UNI). All-Optical logic implementations for realizing arithmetic and signal processing applications based on each interferometric arrangement are also presented in a categorized manner.

  18. Efficiency improvement of technological preparation of power equipment manufacturing

    NASA Astrophysics Data System (ADS)

    Milukov, I. A.; Rogalev, A. N.; Sokolov, V. P.; Shevchenko, I. V.

    2017-11-01

    The competitiveness of power equipment primarily depends on speeding up the development and mastering of new equipment and technologies and on enhancing the organisation and management of design, manufacturing and operation. Current political, technological and economic conditions create an acute need to change the strategy and tactics of process planning, while at the same time addressing equipment maintenance together with improvement of its efficiency and compatibility with domestically produced components. In order to solve these problems, using computer-aided process planning systems for process design at all stages of the power equipment life cycle is economically viable. Computer-aided process planning is developed for the purpose of improving process planning by using mathematical methods and optimisation of design and management processes on the basis of CALS technologies, which allows for simultaneous process design, process planning organisation and management based on mathematical and physical modelling of interrelated design objects and the production system. An integration of computer-aided systems providing the interaction of information and material processes at all stages of the product life cycle is proposed as an effective solution to the challenges of new equipment design and process planning.

  19. A Case Study on Collective Cognition and Operation in Team-Based Computer Game Design by Middle-School Children

    ERIC Educational Resources Information Center

    Ke, Fengfeng; Im, Tami

    2014-01-01

    This case study examined team-based computer-game design efforts by children with diverse abilities to explore the nature of their collective design actions and cognitive processes. Ten teams of middle-school children, with a high percentage of minority students, participated in a 6-weeks, computer-assisted math-game-design program. Essential…

  20. Simplified analytical model and balanced design approach for light-weight wood-based structural panel in bending

    Treesearch

    Jinghao Li; John F. Hunt; Shaoqin Gong; Zhiyong Cai

    2016-01-01

    This paper presents a simplified analytical model and balanced design approach for modeling lightweight wood-based structural panels in bending. Because many design parameters are required as input to the finite element analysis (FEA) model during the preliminary design process and optimization, an equivalent method was developed to analyze the mechanical...

  1. Open Source Software and Design-Based Research Symbiosis in Developing 3D Virtual Learning Environments: Examples from the iSocial Project

    ERIC Educational Resources Information Center

    Schmidt, Matthew; Galyen, Krista; Laffey, James; Babiuch, Ryan; Schmidt, Carla

    2014-01-01

    Design-based research (DBR) and open source software are both acknowledged as potentially productive ways for advancing learning technologies. These approaches have practical benefits for the design and development process and for building and leveraging community to augment and sustain design and development. This report presents a case study of…

  2. Development of a Dynamically Configurable,Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation

    NASA Technical Reports Server (NTRS)

    Afjeh, Abdollah A.; Reed, John A.

    2003-01-01

    This research is aimed at developing a new and advanced simulation framework that will significantly improve the overall efficiency of aerospace systems design and development. This objective will be accomplished through an innovative integration of object-oriented and Web-based technologies with both new and proven simulation methodologies. The basic approach involves three major areas of research: aerospace system and component representation using a hierarchical object-oriented component model, which enables the use of multimodels and enforces component interoperability; a collaborative software environment that streamlines the process of developing, sharing and integrating aerospace design and analysis models; and development of a distributed infrastructure which enables Web-based exchange of models to simplify the collaborative design process and to support computationally intensive aerospace design and analysis processes. Research for the first year dealt with the design of the basic architecture and supporting infrastructure, an initial implementation of that design, and a demonstration of its application to an example aircraft engine system simulation.

  3. Increasing the efficiency of designing hemming processes by using an element-based metamodel approach

    NASA Astrophysics Data System (ADS)

    Kaiser, C.; Roll, K.; Volk, W.

    2017-09-01

    In the automotive industry, the manufacturing of automotive outer panels requires hemming processes in which two sheet metal parts are joined together by bending the flange of the outer part over the inner part. Because of decreasing development times and the steadily growing number of vehicle derivatives, an efficient digital product and process validation is necessary. Commonly used simulations, which are based on the finite element method, demand significant modelling effort, which results in disadvantages especially in the early product development phase. To increase the efficiency of designing hemming processes this paper presents a hemming-specific metamodel approach. The approach includes a part analysis in which the outline of the automotive outer panels is initially split into individual segments. By doing a parametrization of each of the segments and assigning basic geometric shapes, the outline of the part is approximated. Based on this, the hemming parameters such as flange length, roll-in, wrinkling and plastic strains are calculated for each of the geometric basic shapes by performing a meta-model-based segmental product validation. The metamodel is based on an element similar formulation that includes a reference dataset of various geometric basic shapes. A random automotive outer panel can now be analysed and optimized based on the hemming-specific database. By implementing this approach into a planning system, an efficient optimization of designing hemming processes will be enabled. Furthermore, valuable time and cost benefits can be realized in a vehicle’s development process.

  4. Requirements for company-wide management

    NASA Technical Reports Server (NTRS)

    Southall, J. W.

    1980-01-01

    Computing system requirements were developed for company-wide management of information and computer programs in an engineering data processing environment. The requirements are essential to the successful implementation of a computer-based engineering data management system; they exceed the capabilities provided by the commercially available data base management systems. These requirements were derived from a study entitled The Design Process, which was prepared by design engineers experienced in development of aerospace products.

  5. A combined approach of simulation and analytic hierarchy process in assessing production facility layouts

    NASA Astrophysics Data System (ADS)

    Ramli, Razamin; Cheng, Kok-Min

    2014-07-01

    One of the important areas of concern for obtaining a competitive level of productivity in a manufacturing system is the layout design and material transportation (conveyor) system. However, changes in customers' requirements have triggered the need to design alternatives to the manufacturing layout of the existing production floor. Hence, this paper discusses effective alternatives for the process layout, specifically the conveyor system layout. Two alternative designs for the conveyor system were proposed with the aims of increasing production output and minimizing space allocation. The first proposed layout includes the installation of a conveyor oven in the relevant manufacturing room based on priority, and the second omits the conveyor oven. Simulation was employed to design the new facility layouts, and simulation experiments were conducted to understand the performance of each conveyor layout design based on operational characteristics, including predicting the output of each layout. Utilizing the Analytic Hierarchy Process (AHP), the new and improved layout designs were assessed before the final selection was made; for comparison, the existing conveyor system layout was included in the assessment. The relevant criteria in this layout design problem were identified as (i) space usage of each design, (ii) operator utilization rates, (iii) return on investment (ROI) of the layout, and (iv) output of the layout. In the final stage of the AHP analysis, the overall priority of each alternative layout was obtained, and the selection for final use by management was made based on the highest priority value. Such efficient planning and design of the facility layout in a particular manufacturing setting can minimize material handling cost, minimize overall production time, minimize investment in equipment, and optimize utilization of space.
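
    A minimal sketch of the AHP step described above, deriving criteria weights from a pairwise comparison matrix and ranking the layout alternatives; the comparison judgments and alternative scores are made-up examples, not the study's data:

      # Minimal sketch of AHP: principal-eigenvector weights for the criteria,
      # then an overall priority for each layout alternative.

      import numpy as np

      def ahp_weights(pairwise):
          """Priority weights from a pairwise comparison matrix (principal eigenvector)."""
          values, vectors = np.linalg.eig(pairwise)
          principal = np.real(vectors[:, np.argmax(np.real(values))])
          weights = np.abs(principal)
          return weights / weights.sum()

      if __name__ == "__main__":
          # Criteria: space usage, operator utilization, ROI, output (Saaty 1-9 scale).
          comparisons = np.array([
              [1,   1/3, 1/5, 1/7],
              [3,   1,   1/3, 1/5],
              [5,   3,   1,   1/3],
              [7,   5,   3,   1  ],
          ], dtype=float)
          w = ahp_weights(comparisons)
          # Alternative scores per criterion (rows: layouts, columns: criteria).
          scores = np.array([
              [0.5, 0.3, 0.2, 0.3],   # existing layout
              [0.3, 0.4, 0.5, 0.4],   # design with conveyor oven
              [0.2, 0.3, 0.3, 0.3],   # design without conveyor oven
          ])
          print("criteria weights:", np.round(w, 3))
          print("overall priorities:", np.round(scores @ w, 3))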

  6. Design and development of a film-based intervention about teenage men and unintended pregnancy: applying the Medical Research Council framework in practice.

    PubMed

    Aventin, Áine; Lohan, Maria; O'Halloran, Peter; Henderson, Marion

    2015-04-01

    Following the UK Medical Research Council's (MRC) guidelines for the development and evaluation of complex interventions, this study aimed to design, develop and optimise an educational intervention about young men and unintended teenage pregnancy based around an interactive film. The process involved identification of the relevant evidence base, development of a theoretical understanding of the phenomenon of unintended teenage pregnancy in relation to young men, and exploratory mixed methods research. The result was an evidence-based, theory-informed, user-endorsed intervention designed to meet the much neglected pregnancy education needs of teenage men and intended to increase both boys' and girls' intentions to avoid an unplanned pregnancy during adolescence. In prioritising the development phase, this paper addresses a gap in the literature on the processes of research-informed intervention design. It illustrates the application of the MRC guidelines in practice while offering a critique and additional guidance to programme developers on the MRC prescribed processes of developing interventions. Key lessons learned were: (1) know and engage the target population and engage gatekeepers in addressing contextual complexities; (2) know the targeted behaviours and model a process of change; and (3) look beyond development to evaluation and implementation. Copyright © 2014 The Authors. Published by Elsevier Ltd.. All rights reserved.

  7. Design of a Model-Based Online Management Information System for Interlibrary Loan Networks.

    ERIC Educational Resources Information Center

    Rouse, Sandra H.; Rouse, William B.

    1979-01-01

    Discusses the design of a model-based management information system in terms of mathematical/statistical, information processing, and human factors issues and presents a prototype system for interlibrary loan networks. (Author/CWM)

  8. lean-ISD.

    ERIC Educational Resources Information Center

    Wallace, Guy W.

    2001-01-01

    Explains lean instructional systems design/development (ISD) as it relates to curriculum architecture design, based on Japan's lean production system. Discusses performance-based systems; ISD models; processes for organizational training and development; curriculum architecture to support job performance; and modular curriculum development. (LRW)

  9. Optimization of phase feeding of starter, grower, and finisher diets for male broilers by mixture experimental design: forty-eight-day production period.

    PubMed

    Roush, W B; Boykin, D; Branton, S L

    2004-08-01

    A mixture experiment, a variant of response surface methodology, was designed to determine the proportion of time to feed broiler starter (23% protein), grower (20% protein), and finisher (18% protein) diets to optimize production and processing variables based on a total production time of 48 d. Mixture designs are useful for proportion problems where the components of the experiment (i.e., length of time the diets were fed) add up to a unity (48 d). The experiment was conducted with day-old male Ross x Ross broiler chicks. The birds were placed 50 birds per pen in each of 60 pens. The experimental design was a 10-point augmented simplex-centroid (ASC) design with 6 replicates of each point. Each design point represented the portion(s) of the 48 d that each of the diets was fed. Formulation of the diets was based on NRC standards. At 49 d, each pen of birds was evaluated for production data including BW, feed conversion, and cost of feed consumed. Then, 6 birds were randomly selected from each pen for processing data. Processing variables included live weight, hot carcass weight, dressing percentage, fat pad percentage, and breast yield (pectoralis major and pectoralis minor weights). Production and processing data were fit to simplex regression models. Model terms determined not to be significant (P > 0.05) were removed. The models were found to be statistically adequate for analysis of the response surfaces. A compromise solution was calculated based on optimal constraints designated for the production and processing data. The results indicated that broilers fed a starter and finisher diet for 30 and 18 d, respectively, would meet the production and processing constraints. Trace plots showed that the production and processing variables were not very sensitive to the grower diet.
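
    For illustration, the 10 design points of a three-component augmented simplex-centroid design can be generated and scaled to the 48-d period as below; the point coordinates follow the standard ASC construction, and the day values are simply the proportions multiplied by 48 (they are not the paper's reported optimum):

      # Minimal sketch of the 10-point augmented simplex-centroid layout for three
      # mixture components (starter, grower, finisher), scaled to 48 d.

      from fractions import Fraction as F

      def augmented_simplex_centroid():
          pts = [(F(1), F(0), F(0)), (F(0), F(1), F(0)), (F(0), F(0), F(1))]               # vertices
          pts += [(F(1, 2), F(1, 2), F(0)), (F(1, 2), F(0), F(1, 2)), (F(0), F(1, 2), F(1, 2))]  # edge midpoints
          pts += [(F(1, 3), F(1, 3), F(1, 3))]                                             # overall centroid
          pts += [(F(2, 3), F(1, 6), F(1, 6)), (F(1, 6), F(2, 3), F(1, 6)), (F(1, 6), F(1, 6), F(2, 3))]  # axial points
          return pts

      if __name__ == "__main__":
          for starter, grower, finisher in augmented_simplex_centroid():
              days = [round(float(p) * 48, 1) for p in (starter, grower, finisher)]
              print(f"starter {days[0]:5.1f} d  grower {days[1]:5.1f} d  finisher {days[2]:5.1f} d")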

  10. Thermal Catalytic Oxidation of Airborne Contaminants by a Reactor Using Ultra-Short Channel Length, Monolithic Catalyst Substrates

    NASA Technical Reports Server (NTRS)

    Perry, J. L.; Tomes, K. M.; Tatara, J. D.

    2005-01-01

    Contaminated air, whether in a crewed spacecraft cabin or in terrestrial work and living spaces, is a pervasive problem affecting human health, performance, and well-being. The need for highly effective, economical air quality processes spans a wide range of terrestrial and space flight applications. Typically, air quality control relies on adsorption-based processes; most industrial packed-bed adsorption processes use activated carbon. Once saturated, the carbon is either dumped or regenerated. In either case, the dumped carbon and concentrated waste streams constitute a hazardous waste that must be handled safely while minimizing environmental impact. Thermal catalytic oxidation processes designed to address waste handling issues are moving to the forefront of cleaner air quality control and process gas decontamination. Careful consideration in designing the catalyst substrate and reactor can lead to more complete contaminant destruction and poisoning resistance. Maintenance improvements leading to reduced waste handling and process downtime can also be realized. Performance of a prototype thermal catalytic reactor based on an ultra-short channel length, monolith catalyst substrate design, under a variety of process flow and contaminant loading conditions, is discussed.

  11. A 3D Human-Machine Integrated Design and Analysis Framework for Squat Exercises with a Smith Machine.

    PubMed

    Lee, Haerin; Jung, Moonki; Lee, Ki-Kwang; Lee, Sang Hun

    2017-02-06

    In this paper, we propose a three-dimensional design and evaluation framework and process based on a probabilistic-based motion synthesis algorithm and biomechanical analysis system for the design of the Smith machine and squat training programs. Moreover, we implemented a prototype system to validate the proposed framework. The framework consists of an integrated human-machine-environment model as well as a squat motion synthesis system and biomechanical analysis system. In the design and evaluation process, we created an integrated model in which interactions between a human body and machine or the ground are modeled as joints with constraints at contact points. Next, we generated Smith squat motion using the motion synthesis program based on a Gaussian process regression algorithm with a set of given values for independent variables. Then, using the biomechanical analysis system, we simulated joint moments and muscle activities from the input of the integrated model and squat motion. We validated the model and algorithm through physical experiments measuring the electromyography (EMG) signals, ground forces, and squat motions as well as through a biomechanical simulation of muscle forces. The proposed approach enables the incorporation of biomechanics in the design process and reduces the need for physical experiments and prototypes in the development of training programs and new Smith machines.
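
    A minimal sketch of the Gaussian process regression building block mentioned above, assuming scikit-learn and a synthetic one-input, one-output example (real motion synthesis regresses full joint trajectories against several independent variables):

      # Minimal sketch: fit a GP regression model that maps a made-up "barbell
      # load" input to one joint angle at one instant of the squat, then predict
      # with an uncertainty band for a new load. Data are synthetic.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, WhiteKernel

      rng = np.random.default_rng(0)
      load = np.linspace(20, 100, 15).reshape(-1, 1)                    # kg, training inputs
      knee_angle = 90 + 0.2 * load.ravel() + rng.normal(0, 1.0, 15)     # deg, synthetic output

      gpr = GaussianProcessRegressor(kernel=RBF(length_scale=20.0) + WhiteKernel(1.0),
                                     normalize_y=True)
      gpr.fit(load, knee_angle)

      new_load = np.array([[75.0]])
      mean, std = gpr.predict(new_load, return_std=True)
      print(f"predicted knee angle at 75 kg: {mean[0]:.1f} deg +/- {std[0]:.1f}")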

  12. A 3D Human-Machine Integrated Design and Analysis Framework for Squat Exercises with a Smith Machine

    PubMed Central

    Lee, Haerin; Jung, Moonki; Lee, Ki-Kwang; Lee, Sang Hun

    2017-01-01

    In this paper, we propose a three-dimensional design and evaluation framework and process based on a probabilistic-based motion synthesis algorithm and biomechanical analysis system for the design of the Smith machine and squat training programs. Moreover, we implemented a prototype system to validate the proposed framework. The framework consists of an integrated human–machine–environment model as well as a squat motion synthesis system and biomechanical analysis system. In the design and evaluation process, we created an integrated model in which interactions between a human body and machine or the ground are modeled as joints with constraints at contact points. Next, we generated Smith squat motion using the motion synthesis program based on a Gaussian process regression algorithm with a set of given values for independent variables. Then, using the biomechanical analysis system, we simulated joint moments and muscle activities from the input of the integrated model and squat motion. We validated the model and algorithm through physical experiments measuring the electromyography (EMG) signals, ground forces, and squat motions as well as through a biomechanical simulation of muscle forces. The proposed approach enables the incorporation of biomechanics in the design process and reduces the need for physical experiments and prototypes in the development of training programs and new Smith machines. PMID:28178184

  13. Accounting for Proof Test Data in a Reliability Based Design Optimization Framework

    NASA Technical Reports Server (NTRS)

    Ventor, Gerharad; Scotti, Stephen J.

    2012-01-01

    This paper investigates the use of proof (or acceptance) test data during the reliability based design optimization of structural components. It is assumed that every component will be proof tested and that the component will only enter into service if it passes the proof test. The goal is to reduce the component weight, while maintaining high reliability, by exploiting the proof test results during the design process. The proposed procedure results in the simultaneous design of the structural component and the proof test itself and provides the designer with direct control over the probability of failing the proof test. The procedure is illustrated using two analytical example problems and the results indicate that significant weight savings are possible when exploiting the proof test results during the design process.
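
    As a rough illustration of why proof-test information helps, the following Monte Carlo sketch compares the in-service failure probability with and without conditioning on a passed proof test; the normal strength and load distributions and the proof load are invented numbers, not the paper's structural models:

      # Minimal sketch: estimate failure probability of components conditioned on
      # having passed a proof test, versus the unconditioned population.

      import numpy as np

      rng = np.random.default_rng(1)
      n = 1_000_000
      strength = rng.normal(100.0, 10.0, n)     # component strength (arbitrary units)
      service_load = rng.normal(60.0, 15.0, n)  # random in-service load
      proof_load = 90.0                         # every component is tested to this load

      passed = strength > proof_load                      # survivors enter service
      fails_all = np.mean(strength < service_load)
      fails_passed = np.mean(strength[passed] < service_load[passed])

      print(f"pass rate:                 {passed.mean():.3f}")
      print(f"failure prob, no proof:    {fails_all:.5f}")
      print(f"failure prob, after proof: {fails_passed:.5f}")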

  14. Self-optimisation and model-based design of experiments for developing a C-H activation flow process.

    PubMed

    Echtermeyer, Alexander; Amar, Yehia; Zakrzewski, Jacek; Lapkin, Alexei

    2017-01-01

    A recently described C(sp3)-H activation reaction to synthesise aziridines was used as a model reaction to demonstrate the methodology of developing a process model using model-based design of experiments (MBDoE) and self-optimisation approaches in flow. The two approaches are compared in terms of experimental efficiency. The self-optimisation approach required the least number of experiments to reach the specified objectives of cost and product yield, whereas the MBDoE approach enabled a rapid generation of a process model.

  15. Low-SWaP coincidence processing for Geiger-mode LIDAR video

    NASA Astrophysics Data System (ADS)

    Schultz, Steven E.; Cervino, Noel P.; Kurtz, Zachary D.; Brown, Myron Z.

    2015-05-01

    Photon-counting Geiger-mode lidar detector arrays provide a promising approach for producing three-dimensional (3D) video at full motion video (FMV) data rates, resolution, and image size from long ranges. However, coincidence processing required to filter raw photon counts is computationally expensive, generally requiring significant size, weight, and power (SWaP) and also time. In this paper, we describe a laboratory test-bed developed to assess the feasibility of low-SWaP, real-time processing for 3D FMV based on Geiger-mode lidar. First, we examine a design based on field programmable gate arrays (FPGA) and demonstrate proof-of-concept results. Then we examine a design based on a first-of-its-kind embedded graphical processing unit (GPU) and compare performance with the FPGA. Results indicate feasibility of real-time Geiger-mode lidar processing for 3D FMV and also suggest utility for real-time onboard processing for mapping lidar systems.
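
    Coincidence processing exploits the fact that photon returns from a real surface pile up in the same range bin across many pulses, while noise counts do not. The sketch below shows only that thresholding idea on synthetic counts; the pulse count, bin count, and threshold are assumptions, and it is unrelated to the FPGA and GPU implementations described above.

```python
# Coincidence-processing sketch: keep range bins where the photon-count
# histogram accumulated over many pulses exceeds a threshold, suppressing
# uncorrelated noise. Pulse count, bin count, and threshold are assumptions.
import numpy as np

rng = np.random.default_rng(2)
n_pulses, n_bins = 200, 1024
true_surface_bin = 400

# Noise: a few random timing bins fire on every pulse.
noise_detections = rng.integers(0, n_bins, size=(n_pulses, 8))
# Signal: ~30% of pulses also return a photon from the surface (+/- 1 bin jitter).
signal_pulses = rng.random(n_pulses) < 0.3
signal_bins = true_surface_bin + rng.integers(-1, 2, size=signal_pulses.sum())

counts = np.zeros(n_bins, dtype=int)
np.add.at(counts, noise_detections.ravel(), 1)
np.add.at(counts, signal_bins, 1)

threshold = 12                          # coincidence threshold (assumed)
candidates = np.flatnonzero(counts > threshold)
print("bins passing coincidence threshold:", candidates)
print("estimated surface bin:", candidates[counts[candidates].argmax()])
```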

  16. Designing a process for executing projects under an international agreement

    NASA Technical Reports Server (NTRS)

    Mohan, S. N.

    2003-01-01

    Projects executed under an international agreement require special arrangements in order to operate within confines of regulations issued by the State Department and the Commerce Department. In order to communicate enterprise-level guidance and procedural information uniformly to projects based on interpretations that carry the weight of institutional authority, a process was developed. This paper provides a script for designing processes in general, using this particular process for context. While the context is incidental, the method described is applicable to any process in general. The paper will expound on novel features utilized for dissemination of the procedural details over the Internet following such process design.

  17. EVALUATING POLLUTION PREVENTION PROGRESS (P2P) III: AN ENVIRONMENTAL TOOL FOR SCREENING IN PRODUCT LIFE CYCLE ASSESSMENT AND CHEMICAL PROCESS DESIGN

    EPA Science Inventory

    P2P is a computer-based tool that supports the comparison of process and product alternatives in terms of environmental impacts. This tool provides screening-level information for use in process design and in product LCA. Twenty one impact categories and data for approximately ...

  18. An Exploratory Study of Cost Engineering in Axiomatic Design: Creation of the Cost Model Based on an FR-DP Map

    NASA Technical Reports Server (NTRS)

    Lee, Taesik; Jeziorek, Peter

    2004-01-01

    Large complex projects cost large sums of money throughout their life cycle for a variety of reasons and causes. For such large programs, the credible estimation of the project cost, a quick assessment of the cost of making changes, and the management of the project budget with effective cost reduction determine the viability of the project. Cost engineering that deals with these issues requires a rigorous method and systematic processes. This paper introduces a logical framework to enable effective cost engineering. The framework is built upon the Axiomatic Design process. The structure in the Axiomatic Design process provides a good foundation to closely tie engineering design and cost information together. The cost framework presented in this paper is a systematic link between the functional domain (FRs), physical domain (DPs), cost domain (CUs), and a task/process-based model. The FR-DP map relates a system's functional requirements to design solutions across all levels and branches of the decomposition hierarchy. DPs are mapped into CUs, which provides a means to estimate the cost of design solutions - DPs - from the cost of the physical entities in the system - CUs. The task/process model describes the iterative process of developing each of the CUs, and is used to estimate the cost of CUs. By linking the four domains, this framework provides superior traceability from requirements to cost information.
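
    The DP-to-CU mapping can be pictured as a simple roll-up: the cost of satisfying a functional requirement is the summed cost of the cost units reached through its design parameters. The toy mapping below is invented purely to illustrate that traceability; the paper's framework is far richer.

```python
# Toy roll-up of cost from cost units (CUs) through design parameters (DPs)
# to functional requirements (FRs). All names and numbers are invented.
FR_TO_DP = {
    "FR1 hold pressure": ["DP1 vessel shell"],
    "FR2 control temperature": ["DP2 heater", "DP3 controller"],
}
DP_TO_CU = {
    "DP1 vessel shell": ["CU steel plate", "CU welding"],
    "DP2 heater": ["CU heater element"],
    "DP3 controller": ["CU PLC", "CU wiring"],
}
CU_COST = {  # cost of each physical entity or task, arbitrary units
    "CU steel plate": 1200.0, "CU welding": 300.0,
    "CU heater element": 450.0, "CU PLC": 800.0, "CU wiring": 150.0,
}

def cost_of_fr(fr: str) -> float:
    """Roll cost up from CUs through DPs to a functional requirement."""
    return sum(CU_COST[cu] for dp in FR_TO_DP[fr] for cu in DP_TO_CU[dp])

for fr in FR_TO_DP:
    print(f"{fr}: {cost_of_fr(fr):.0f}")
print("total:", sum(cost_of_fr(fr) for fr in FR_TO_DP))
```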

  19. A novel methodology for building robust design rules by using design based metrology (DBM)

    NASA Astrophysics Data System (ADS)

    Lee, Myeongdong; Choi, Seiryung; Choi, Jinwoo; Kim, Jeahyun; Sung, Hyunju; Yeo, Hyunyoung; Shim, Myoungseob; Jin, Gyoyoung; Chung, Eunseung; Roh, Yonghan

    2013-03-01

    This paper addresses a methodology for building robust design rules by using design based metrology (DBM). The conventional method for building design rules has used a simulation tool and a simple-pattern spider mask. At the early stage of a device, the accuracy of the simulation tool is poor, and the evaluation of the simple-pattern spider mask is rather subjective because it depends on the experiential judgment of an engineer. In this work, we designed a huge number of pattern situations including various 1D and 2D design structures. In order to overcome the difficulty of inspecting many types of patterns, we introduced Design Based Metrology (DBM) of Nano Geometry Research, Inc., with which these mass patterns could be inspected at high speed. We also carried out quantitative analysis on PWQ silicon data to estimate process variability. Our methodology demonstrates high speed and accuracy for building design rules. All of the test patterns were inspected within a few hours. Mass silicon data were handled by statistical processing rather than personal judgment. From the results, robust design rules were successfully verified and extracted. Finally, we found that our methodology is appropriate for building robust design rules.

  20. Bidding-based autonomous process planning and scheduling

    NASA Astrophysics Data System (ADS)

    Gu, Peihua; Balasubramanian, Sivaram; Norrie, Douglas H.

    1995-08-01

    Improving productivity through computer integrated manufacturing systems (CIMS) and concurrent engineering requires that the islands of automation in an enterprise be completely integrated. The first step in this direction is to integrate design, process planning, and scheduling. This can be achieved through a bidding-based process planning approach. The product is represented in a STEP model with detailed design and administrative information including design specifications, batch size, and due dates. Upon arrival at the manufacturing facility, the product is registered with the shop floor manager, which is essentially a coordinating agent. The shop floor manager broadcasts the product's requirements to the machines. The shop contains autonomous machines that have knowledge about their functionality, capabilities, tooling, and schedule. Each machine has its own process planner and responds to the product's request in a way that is consistent with its capabilities and capacities. When more than one machine offers certain process(es) for the same requirements, they enter into negotiation. Based on processing time, due date, and cost, one of the machines wins the contract. The successful machine updates its schedule and advises the product to request raw material for processing. The concept was implemented using a multi-agent system with the task decomposition and planning achieved through contract nets. Examples are included to illustrate the approach.
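
    The bidding step can be sketched as follows: the shop floor manager broadcasts a required process, each capable machine returns a bid built from its processing time, cost, and due-date feasibility, and the lowest-scoring feasible bid wins. The machine data and scoring weights below are invented; this is only a schematic of the contract-net idea.

```python
# Schematic of the bidding step: the shop floor manager broadcasts a required
# process, capable machines bid, and the best feasible bid wins. Machine data
# and scoring weights are invented.
from __future__ import annotations
from dataclasses import dataclass

@dataclass
class Bid:
    machine: str
    processing_time_h: float
    cost: float
    can_meet_due_date: bool

def broadcast(requirement: str, machines: dict) -> list[Bid]:
    """Every machine offering the required process returns a bid."""
    return [Bid(name, *info[requirement]) for name, info in machines.items()
            if requirement in info]

def award(bids: list[Bid], w_time: float = 1.0, w_cost: float = 0.02) -> Bid:
    """Award the contract to the lowest-scoring bid that meets the due date."""
    feasible = [b for b in bids if b.can_meet_due_date]
    return min(feasible, key=lambda b: w_time * b.processing_time_h + w_cost * b.cost)

machines = {
    "mill_A":  {"milling": (2.0, 120.0, True), "drilling": (0.5, 30.0, True)},
    "mill_B":  {"milling": (1.5, 160.0, True)},
    "lathe_C": {"turning": (1.0, 80.0, True)},
}

winner = award(broadcast("milling", machines))
print("winning bid:", winner)
```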

  1. Indigenous lunar construction materials

    NASA Technical Reports Server (NTRS)

    Rogers, Wayne; Sture, Stein

    1991-01-01

    The objectives are the following: to investigate the feasibility of the use of local lunar resources for construction of a lunar base structure; to develop a material processing method and integrate the method with design and construction of a pressurized habitation structure; to estimate specifications of the support equipment necessary for material processing and construction; and to provide parameters for systems models of lunar base constructions, supply, and operations. The topics are presented in viewgraph form and include the following: comparison of various lunar structures; guidelines for material processing methods; cast lunar regolith; examples of cast basalt components; cast regolith process; processing equipment; mechanical properties of cast basalt; material properties and structural design; and future work.

  2. Revising a Design Course from a Lecture Approach to a Project-Based Learning Approach

    ERIC Educational Resources Information Center

    Kunberger, Tanya

    2013-01-01

    In order to develop the evaluative skills necessary for successful performance of design, a senior, Geotechnical Engineering course was revised to immerse students in the complexity of the design process utilising a project-based learning (PBL) approach to instruction. The student-centred approach stresses self-directed group learning, which…

  3. Project-Based Learning and Design-Focused Projects to Motivate Secondary Mathematics Students

    ERIC Educational Resources Information Center

    Remijan, Kelly W.

    2017-01-01

    This article illustrates how mathematics teachers can develop design-focused projects, related to project-based learning, to motivate secondary mathematics students. With first-hand experience as a secondary mathematics teacher, I provide a series of steps related to the engineering design process, which are helpful to teachers in developing…

  4. A Space-Based Point Design for Global Coherent Doppler Wind Lidar Profiling Matched to the Recent NASA/NOAA Draft Science Requirements

    NASA Technical Reports Server (NTRS)

    Kavaya, Michael J.; Emmitt, G. David; Frehlich, Rod G.; Amzajerdian, Farzin; Singh, Upendra N.

    2002-01-01

    An end-to-end point design, including lidar, orbit, scanning, atmospheric, and data processing parameters, for space-based global profiling of atmospheric wind will be presented. The point design attempts to match the recent NASA/NOAA draft science requirements for wind measurement.

  5. Cancer-Related Fatigue in Post-Treatment Cancer Survivors: Theory-Based Development of a Web-Based Intervention.

    PubMed

    Corbett, Teresa; Walsh, Jane C; Groarke, AnnMarie; Moss-Morris, Rona; Morrissey, Eimear; McGuire, Brian E

    2017-07-04

    Cancer-related fatigue (CrF) is the most common and disruptive symptom experienced by cancer survivors. We aimed to develop a theory-based, interactive Web-based intervention designed to facilitate self-management and enhance coping with CrF following cancer treatment. The aim of our study was to outline the rationale, decision-making processes, methods, and findings which led to the development of a Web-based intervention to be tested in a feasibility trial. This paper outlines the process and method of development of the intervention. An extensive review of the literature and qualitative research was conducted to establish a therapeutic approach for this intervention, based on theory. The psychological principles used in the development process are outlined, and we also clarify hypothesized causal mechanisms. We describe decision-making processes involved in the development of the content of the intervention, input from the target patient group and stakeholders, the design of the website features, and the initial user testing of the website. The cocreation of the intervention with the experts and service users allowed the design team to ensure that an acceptable intervention was developed. This evidence-based Web-based program is the first intervention of its kind based on self-regulation model theory, with the primary aim of targeting the representations of fatigue and enhancing self-management of CrF, specifically. This research sought to integrate psychological theory, existing evidence of effective interventions, empirically derived principles of Web design, and the views of potential users into the systematic planning and design of the intervention of an easy-to-use website for cancer survivors. ©Teresa Corbett, Jane C Walsh, AnnMarie Groarke, Rona Moss-Morris, Eimear Morrissey, Brian E McGuire. Originally published in JMIR Cancer (http://cancer.jmir.org), 04.07.2017.

  6. The Computer Aided Aircraft-design Package (CAAP)

    NASA Technical Reports Server (NTRS)

    Yalif, Guy U.

    1994-01-01

    The preliminary design of an aircraft is a complex, labor-intensive, and creative process. Since the 1970's, many computer programs have been written to help automate preliminary airplane design. Time and resource analyses have identified 'a substantial decrease in project duration with the introduction of an automated design capability'. Proof-of-concept studies have been completed which establish 'a foundation for a computer-based airframe design capability'. Unfortunately, today's design codes exist in many different languages on many, often expensive, hardware platforms. Through the use of a module-based system architecture, the Computer aided Aircraft-design Package (CAAP) will eventually bring together many of the most useful features of existing programs. Through the use of an expert system, it will add an additional feature that could be described as indispensable to entry level engineers and students: the incorporation of 'expert' knowledge into the automated design process.

  7. Design and development of data acquisition system based on WeChat hardware

    NASA Astrophysics Data System (ADS)

    Wang, Zhitao; Ding, Lei

    2018-06-01

    A data acquisition system based on WeChat hardware provides a route to making data acquisition more widespread and practical. The whole system is based on the WeChat hardware platform, where the hardware part is developed on the DA14580 development board and the software part is based on Alibaba Cloud. We designed a service module, logic processing module, data processing module and database module. The communication between hardware and software uses the AirSync protocol. We tested this system by collecting temperature and humidity data, and the result shows that the system can acquire the temperature and humidity in real time according to the settings.

  8. A novel approach of ensuring layout regularity correct by construction in advanced technologies

    NASA Astrophysics Data System (ADS)

    Ahmed, Shafquat Jahan; Vaderiya, Yagnesh; Gupta, Radhika; Parthasarathy, Chittoor; Marin, Jean-Claude; Robert, Frederic

    2017-03-01

    In advanced technology nodes, layout regularity has become a mandatory prerequisite to create robust designs that are less sensitive to variations in the manufacturing process, in order to improve yield and minimize electrical variability. In this paper we describe a method for designing regular full-custom layouts based on design and process co-optimization. The method includes various design rule checks that can be used on-the-fly during leaf-cell layout development. We extract a Layout Regularity Index (LRI) from the layouts based on the jogs, alignments and pitches used in the design for any given metal layer. The Regularity Index of a layout is a direct indicator of manufacturing yield and is used to compare the relative health of different layout blocks in terms of process friendliness. The method has been deployed for 28 nm and 40 nm technology nodes for Memory IP and is being extended to other IPs (IO, standard-cell). We have quantified the gain of layout regularity with the deployed method on printability and electrical characteristics by process-variation (PV) band simulation analysis and have achieved up to a 5 nm reduction in PV band.

  9. Ares Upper Stage Processes to Implement Model Based Design - Going Paperless

    NASA Technical Reports Server (NTRS)

    Gregory, Melanie

    2012-01-01

    Computer-Aided Design (CAD) has all but replaced the drafting board for design work. Increased productivity and accuracy should be natural outcomes of using CAD. Going from paper drawings only, to paper drawings based on CAD models, to CAD models and no drawings, or Model Based Design (MBD), is a natural progression in today's world. There are many advantages to MBD over traditional design methods. To make the most of those advantages, standards should be in place and the proper foundation should be laid prior to transitioning to MBD. However, without a full understanding of the implications of MBD and the proper control of the data, the advantages are greatly diminished. Transitioning from a paper design world to an electronic design world means re-thinking how information gets controlled at its origin and distributed from one point to another. It means design methodology is critical, especially for large projects. It means preparation of standardized parts and processes as well as strong communication between all parties in order to maximize the benefits of MBD.

  10. Lignocellulosic Biomass to Ethanol Process Design and Economics Utilizing Co-Current Dilute Acid Prehydrolysis and Enzymatic Hydrolysis for Corn Stover

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aden, A.; Ruth, M.; Ibsen, K.

    This report is an update of NREL's ongoing process design and economic analyses of processes related to developing ethanol from lignocellulosic feedstocks. The U.S. Department of Energy (DOE) is promoting the development of ethanol from lignocellulosic feedstocks as an alternative to conventional petroleum-based transportation fuels. DOE funds both fundamental and applied research in this area and needs a method for predicting cost benefits of many research proposals. To that end, the National Renewable Energy Laboratory (NREL) has modeled many potential process designs and estimated the economics of each process during the last 20 years. This report is an update of the ongoing process design and economic analyses at NREL. We envision updating this process design report at regular intervals; the purpose being to ensure that the process design incorporates all new data from NREL research, DOE funded research and other sources, and that the equipment costs are reasonable and consistent with good engineering practice for plants of this type. For the non-research areas this means using equipment and process approaches as they are currently used in industrial applications. For the last report, published in 1999, NREL performed a complete review and update of the process design and economic model for the biomass-to-ethanol process utilizing co-current dilute acid prehydrolysis with simultaneous saccharification (enzymatic) and co-fermentation. The process design included the core technologies being researched by the DOE: prehydrolysis, simultaneous saccharification and co-fermentation, and cellulase enzyme production. In addition, all ancillary areas--feed handling, product recovery and purification, wastewater treatment (WWT), lignin combustor and boiler-turbogenerator, and utilities--were included. NREL engaged Delta-T Corporation (Delta-T) to assist in the process design evaluation, the process equipment costing, and overall plant integration. The process design and costing for the lignin combustor and boiler turbogenerator was reviewed by Reaction Engineering Inc. (REI) and Merrick & Company reviewed the wastewater treatment. Since then, NREL has engaged Harris Group (Harris) to perform vendor testing, process design, and costing of critical equipment identified during earlier work. This included solid/liquid separation and pretreatment reactor design and costing. Corn stover handling was also investigated to support DOE's decision to focus on corn stover as a feedstock for lignocellulosic ethanol. Working with Harris, process design and costing for these areas were improved through vendor designs, costing, and vendor testing in some cases. In addition to this work, enzyme costs were adjusted to reflect collaborative work between NREL and enzyme manufacturers (Genencor International and Novozymes Biotech) to provide a delivered enzyme for lignocellulosic feedstocks. This report is the culmination of our work and represents an updated process design and cost basis for the process using a corn stover feedstock. The process design and economic model are useful for predicting the cost benefits of proposed research. Proposed research results can be translated into modifications of the process design, and the economic impact can be assessed. This allows DOE, NREL, and other researchers to set priorities on future research with an understanding of potential reductions to the ethanol production cost. To be economically viable, ethanol production costs must be below market values for ethanol. DOE has chosen a target ethanol selling price of $1.07 per gallon as a goal for 2010. The conceptual design and costs presented here are based on a 2010 plant start-up date. The key research targets required to achieve this design and the $1.07 value are discussed in the report.
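
    The economics described above ultimately rest on a discounted-cash-flow calculation that backs out the ethanol selling price at which the plant breaks even. The sketch below shows that calculation in its simplest form, using a capital recovery factor and invented plant parameters; NREL's actual model is far more detailed (itemized capital and operating costs, taxes, depreciation, and so on).

```python
# Simplified minimum-selling-price calculation in the spirit of a discounted
# cash flow analysis: the price at which annualized capital plus operating
# cost is just recovered. All plant parameters are invented placeholders.
capex = 200e6            # total capital investment, $
opex = 45e6              # annual operating cost, $/yr
volume = 60e6            # annual ethanol production, gal/yr
rate = 0.10              # discount rate
life = 20                # plant life, years

# Capital recovery factor converts CAPEX into an equivalent annual cost.
crf = rate * (1 + rate) ** life / ((1 + rate) ** life - 1)
min_selling_price = (opex + capex * crf) / volume
print(f"minimum ethanol selling price: ${min_selling_price:.2f}/gal")
```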

  11. Transitioning from conceptual design to construction performance specification

    NASA Astrophysics Data System (ADS)

    Jeffers, Paul; Warner, Mark; Craig, Simon; Hubbard, Robert; Marshall, Heather

    2012-09-01

    On successful completion of a conceptual design review by a funding agency or customer, there is a transition phase before construction contracts can be placed. The nature of this transition phase depends on the Project's approach to construction and the particular subsystem being considered. There are generically two approaches: project retention of design authority and issuance of build-to-print contracts, or issuance of subsystem performance specifications with controlled interfaces. This paper relates to the latter, where a proof of concept (conceptual or reference design) is translated into performance-based sub-system specifications for competitive tender. This translation is not a straightforward process and there are a number of different issues to consider in the process. This paper deals primarily with the Telescope Mount and Enclosure subsystems. The main subjects considered in this paper are:
    • Typical status of design at Conceptual Design Review compared with the desired status of Specifications and Interface Control Documents at Request for Quotation.
    • Options for capture and tracking of system requirements flow-down from science/operating requirements and sub-system requirements, and functional requirements derived from the reference design.
    • Requirements that may come specifically from the contracting approach.
    • Methods for effective use of reference design work without compromising a performance-based specification.
    • Management of the project team's expectations relating to design.
    • Effects on cost estimates from reference design to actual.
    This paper is based on experience and lessons learned through this process on both the VISTA and the ATST projects.

  12. Improving Students' Revision of Physics Concepts through ICT-Based Co-construction and Prescriptive Tutoring

    NASA Astrophysics Data System (ADS)

    Soong, Benson; Mercer, Neil

    2011-05-01

    In this paper, we describe and discuss an information and communication technology (ICT)-based intervention designed to improve secondary school students' revision (in contrast to learning) of physics concepts. We show that students' engagement in joint activities via our ICT-based intervention can provide them (and their teachers) with insights into their knowledge base and thought processes, thereby aiding a remedial process we call prescriptive tutoring. Utilising a design-based research methodology, our intervention is currently being implemented and evaluated in a public secondary school in Singapore. Statistical analysis of pre- and post-intervention test scores from the first iteration of our design experiment show that students in the experimental group significantly out-performed students in both the control and alternate intervention groups. In addition, qualitative data obtained from the students from a focus group session, individual interviews and responses to our survey questions reveal that they became more comfortable with the intervention only after they appreciated how the intervention was designed to help them.

  13. Navy Medical Information Storage and Retrieval System: Navy MEDISTARS. TR-1-71-Part 2, Manual of Indexing Terms; First Edition.

    ERIC Educational Resources Information Center

    Ramsey-Klee, Diane M.

    A computer-based information storage and retrieval system was designed and implemented for processing Navy neuropsychiatric case history reports. The system design objectives were to produce a dynamic and flexible medical information processing tool. The system that was designed has been given the name NAVY MEDical Information STorage and…

  14. Providing Guidance in Virtual Lab Experimentation: The Case of an Experiment Design Tool

    ERIC Educational Resources Information Center

    Efstathiou, Charalampos; Hovardas, Tasos; Xenofontos, Nikoletta A.; Zacharia, Zacharias C.; deJong, Ton; Anjewierden, Anjo; van Riesen, Siswa A. N.

    2018-01-01

    The present study employed a quasi-experimental design to assess a computer-based tool, which was intended to scaffold the task of designing experiments when using a virtual lab for the process of experimentation. In particular, we assessed the impact of this tool on primary school students' cognitive processes and inquiry skills before and after…

  15. A Government/Industry Summary of the Design Analysis Methods for Vibrations (DAMVIBS) Program

    NASA Technical Reports Server (NTRS)

    Kvaternik, Raymond G. (Compiler)

    1993-01-01

    The NASA Langley Research Center in 1984 initiated a rotorcraft structural dynamics program, designated DAMVIBS (Design Analysis Methods for VIBrationS), with the objective of establishing the technology base needed by the rotorcraft industry for developing an advanced finite-element-based dynamics design analysis capability for vibrations. An assessment of the program showed that the DAMVIBS Program has resulted in notable technical achievements and major changes in industrial design practice, all of which have significantly advanced the industry's capability to use and rely on finite-element-based dynamics analyses during the design process.

  16. Developing Engineering and Science Process Skills Using Design Software in an Elementary Education

    NASA Astrophysics Data System (ADS)

    Fusco, Christopher

    This paper examines the development of process skills through an engineering design approach to instruction in an elementary lesson that combines Science, Technology, Engineering, and Math (STEM). The study took place with 25 fifth graders in a public, suburban school district. Students worked in groups of five to design and construct model bridges based on research involving bridge building design software. The assessment was framed around individual student success as well as overall group processing skills. These skills were assessed through an engineering design packet rubric (student work), student surveys of learning gains, observation field notes, and pre- and post-assessment data. The results indicate that students can successfully utilize design software to inform the construction of model bridges, develop science process skills through problem-based learning, and understand academic concepts through a design project. The final result of this study shows that design engineering is effective for developing cooperative learning skills. The study suggests that an engineering program offered as an elective or as part of the mandatory curriculum could be beneficial for developing students' critical thinking and inter- and intra-personal skills, along with increasing their understanding and awareness of scientific phenomena. In conclusion, combining a design approach to instruction with STEM can increase efficiency in these areas, generate meaningful learning, and influence student attitudes throughout their education.

  17. The Evolution of Inquiry Activities in the Akamai Observatory Short Course, 2004-2009

    NASA Astrophysics Data System (ADS)

    Rice, E. L.; McElwain, M.; Sonnett, S.; Rafelski, M.

    2010-12-01

    The Akamai Observatory Short Course (AOSC) is a five-day course of activities designed to prepare college students majoring in science, technology, engineering, and mathematics (STEM) fields for internships at observatories on the Big Island of Hawai'i. The design and implementation of inquiry-based activities in the AOSC have evolved considerably over the six years of the course. The content goals have always focused on the basic understanding of light and optics necessary to understand telescopes, but the scientific process goals gradually evolved to reflect the increasingly recognized importance of engineering design skills for successful observatory internships. In 2004 the inquiry-based activities were limited to one well-established Color, Light, and Spectra activity. In subsequent years more activities were customized and expanded upon to reflect the learners' diverse academic backgrounds, the developing goals of the short course, and feedback from internship hosts. The most recent inquiry, the Design and Build a Telescope activity, engaged students in designing and building a simple telescope, emphasizing science and engineering process skills in addition to science content. This activity was influenced by the Mission Design activity, added in 2006, that incorporated the application of inquiry-based learning to the engineering design process and allowed students to draw upon their diverse prior knowledge and experience. In this paper we describe the inquiry-based activities in the AOSC in the context of its year-to-year evolution, including the conceptual and pragmatic changes to the short course that influenced the evolution.

  18. Development of a Web-Based Health Care Intervention for Patients With Heart Disease: Lessons Learned From a Participatory Design Study

    PubMed Central

    2017-01-01

    Background The use of telemedicine technologies in health care has increased substantially, together with a growing interest in participatory design methods when developing telemedicine approaches. Objective We present lessons learned from a case study involving patients with heart disease and health care professionals in the development of a personalized Web-based health care intervention. Methods We used a participatory design approach inspired by the method for feasibility studies in software development. We collected qualitative data using multiple methods in 3 workshops and analyzed the data using thematic analysis. Participants were 7 patients with diagnosis of heart disease, 2 nurses, 1 physician, 2 systems architects, 3 moderators, and 3 observers. Results We present findings in 2 parts. (1) Outcomes of the participatory design process: users gave valuable feedback on ease of use of the platforms’ tracking tools, platform design, terminology, and insights into patients’ monitoring needs, information and communication technologies skills, and preferences for self-management tools. (2) Experiences from the participatory design process: patients and health care professionals contributed different perspectives, with the patients using an experience-based approach and the health care professionals using a more attitude-based approach. Conclusions The essential lessons learned concern planning and organization of workshops, including the finding that patients engaged actively and willingly in a participatory design process, whereas it was more challenging to include and engage health care professionals. PMID:28526674

  19. Aerospace Engineering Systems and the Advanced Design Technologies Testbed Experience

    NASA Technical Reports Server (NTRS)

    VanDalsem, William R.; Livingston, Mary E.; Melton, John E.; Torres, Francisco J.; Stremel, Paul M.

    1999-01-01

    Continuous improvement of aerospace product development processes is a driving requirement across much of the aerospace community. As up to 90% of the cost of an aerospace product is committed during the first 10% of the development cycle, there is a strong emphasis on capturing, creating, and communicating better information (both requirements and performance) early in the product development process. The community has responded by pursuing the development of computer-based systems designed to enhance the decision-making capabilities of product development individuals and teams. Recently, the historical foci on sharing the geometrical representation and on configuration management are being augmented: 1) Physics-based analysis tools for filling the design space database; 2) Distributed computational resources to reduce response time and cost; 3) Web-based technologies to relieve machine-dependence; and 4) Artificial intelligence technologies to accelerate processes and reduce process variability. The Advanced Design Technologies Testbed (ADTT) activity at NASA Ames Research Center was initiated to study the strengths and weaknesses of the technologies supporting each of these trends, as well as the overall impact of the combination of these trends on a product development event. Lessons learned and recommendations for future activities are reported.

  20. Advanced microgrid design and analysis for forward operating bases

    NASA Astrophysics Data System (ADS)

    Reasoner, Jonathan

    This thesis takes a holistic approach to creating an improved electric power generation system for a forward operating base (FOB) of the future through the design of an isolated microgrid. After an extensive literature search, this thesis found a need for drastic improvement of the FOB power system. A thorough design process analyzed FOB demand, researched demand-side management improvements, evaluated various generation sources and energy storage options, and performed a HOMER™ discrete optimization to determine the best microgrid design. Further sensitivity analysis was performed to see how changing parameters would affect the outcome. Lastly, this research also looks at some of the challenges associated with incorporating a design which relies heavily on inverter-based generation sources, and gives possible solutions to help make a renewable-energy-powered microgrid a reality. While this thesis uses a FOB as the case study, the process and discussion can be adapted to aid in the design of an off-grid small-scale power grid which utilizes high penetration levels of renewable energy.
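
    The discrete design search described above can be caricatured with a brute-force sizing sweep: enumerate candidate photovoltaic and battery sizes, simulate a simple daily dispatch against an assumed load profile, and keep the cheapest combination with no unmet load. Everything in the sketch (load shape, solar profile, genset cap, costs) is an invented placeholder; HOMER's component models and economics are far more detailed.

```python
# Brute-force sizing sweep in the spirit of a discrete microgrid design
# search: enumerate PV and battery sizes, simulate one representative day of
# dispatch against an assumed FOB load, and keep the cheapest design with no
# unmet load. All profiles, limits, and costs are invented placeholders.
import numpy as np

hours = np.arange(24)
load_kw = 80 + 40 * np.sin((hours - 14) * np.pi / 12).clip(0)     # assumed load, kW
solar_frac = np.clip(np.sin((hours - 6) * np.pi / 12), 0, None)   # per-kW PV output

def simulate(pv_kw, batt_kwh):
    """Return (unmet energy, diesel energy) in kWh for one day of dispatch."""
    soc, unmet, diesel = batt_kwh, 0.0, 0.0
    for h in hours:
        net = pv_kw * solar_frac[h] - load_kw[h]    # + surplus / - deficit
        if net >= 0:
            soc = min(batt_kwh, soc + net)          # charge, excess curtailed
        else:
            draw = min(soc, -net)                   # battery discharges first
            soc -= draw
            shortfall = -net - draw
            diesel += min(shortfall, 60.0)          # 60 kW genset cap (assumed)
            unmet += max(shortfall - 60.0, 0.0)
    return unmet, diesel

best = None
for pv in range(0, 401, 50):
    for batt in range(0, 801, 100):
        unmet, diesel = simulate(pv, batt)
        if unmet > 0:
            continue
        # Rough score: capital cost plus one year of fuel (arbitrary prices).
        cost = 1500 * pv + 400 * batt + 365 * 0.9 * diesel
        if best is None or cost < best[0]:
            best = (cost, pv, batt)

print(f"cheapest feasible design: PV {best[1]} kW, battery {best[2]} kWh, "
      f"score ${best[0]:,.0f}")
```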

  1. Development of Integrated Modular Avionics Application Based on Simulink and XtratuM

    NASA Astrophysics Data System (ADS)

    Fons-Albert, Borja; Usach-Molina, Hector; Vila-Carbo, Joan; Crespo-Lorente, Alfons

    2013-08-01

    This paper presents an integral approach for designing avionics applications that meets the requirements for software development and execution in this application domain. Software design follows the Model-Based Design process and is performed in Simulink. This approach allows easy and quick testbench development and helps satisfy DO-178B requirements through the use of proper tools. The software execution platform is based on XtratuM, a minimal bare-metal hypervisor designed in our research group. XtratuM provides support for IMA-SP (Integrated Modular Avionics for Space) architectures. This approach allows the code generated from a Simulink model to be executed on top of Lithos as a XtratuM partition. Lithos is an ARINC-653-compliant RTOS for XtratuM. The paper concentrates on how to smoothly port Simulink designs to XtratuM, solving problems such as application partitioning, automatic code generation, real-time tasking, and interfacing. This process is illustrated with an autopilot design test using a flight simulator.

  2. Silicon production process evaluations

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Chemical engineering analysis was continued for the HSC process (Hemlock Semiconductor Corporation) in which solar cell silicon is produced in a 1,000 MT/yr plant. Progress and status are reported for the primary engineering activities involved in the preliminary process engineering design of the plant: base case conditions (96%), reaction chemistry (96%), process flow diagram (85%), material balance (85%), energy balance (60%), property data (60%), equipment design (40%), major equipment list (30%) and labor requirements (10%). Engineering design of the second distillation column (D-02, TCS column) in the process was completed. The design is based on a 97% recovery of the light key (TCS, trichlorosilane) in the distillate and a 97% recovery of the heavy key (TET, silicon tetrachloride) in the bottoms. At a reflux ratio of 2, the specified recovery of TCS and TET is achieved with 20 trays (equilibrium stages, N = 20). Respective feed tray locations are 9, 12, and 15 (NF1 = 9, NF2 = 12, and NF3 = 15). A total condenser is used for the distillation, which is conducted at a pressure of 90 psia.
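
    The recovery specifications quoted above fix the column's overall material balance before any tray-to-tray calculation is done. The sketch below shows that component balance for an assumed feed rate and composition; the feed values are placeholders, not the report's figures.

```python
# Component material balance implied by the recovery specifications: 97% of
# the light key (TCS) to the distillate and 97% of the heavy key (TET) to the
# bottoms. The feed rate and composition are assumed placeholders.
feed = {"TCS": 60.0, "TET": 40.0}          # kmol/h of each component (assumed)
rec_light_to_distillate = 0.97
rec_heavy_to_bottoms = 0.97

distillate = {
    "TCS": rec_light_to_distillate * feed["TCS"],
    "TET": (1.0 - rec_heavy_to_bottoms) * feed["TET"],
}
bottoms = {c: feed[c] - distillate[c] for c in feed}

D, B = sum(distillate.values()), sum(bottoms.values())
print(f"distillate: {D:.1f} kmol/h, x_TCS = {distillate['TCS'] / D:.3f}")
print(f"bottoms:    {B:.1f} kmol/h, x_TET = {bottoms['TET'] / B:.3f}")
```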

  3. Silicon production process evaluations

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Chemical engineering analysis of the HSC process (Hemlock Semiconductor Corporation) for producing silicon from dichlorosilane in a 1,000 MT/yr plant was continued. Progress and status for the chemical engineering analysis of the HSC process are reported for the primary process design engineering activities: base case conditions (85%), reaction chemistry (85%), process flow diagram (60%), material balance (60%), energy balance (30%), property data (30%), equipment design (20%) and major equipment list (10%). Engineering design of the initial distillation column (D-01, stripper column) in the process was initiated. The function of the distillation column is to remove volatile gases (such as hydrogen and nitrogen) which are dissolved in liquid chlorosilanes. Initial specifications and results for the distillation column design are reported including the variation of tray requirements (equilibrium stages) with reflux ratio for the distillation.

  4. A novel double loop control model design for chemical unstable processes.

    PubMed

    Cong, Er-Ding; Hu, Ming-Hui; Tu, Shan-Tung; Xuan, Fu-Zhen; Shao, Hui-He

    2014-03-01

    In this manuscript, based on the Smith predictor control scheme for unstable processes in industry, an improved double-loop control model is proposed for chemical unstable processes. The inner loop stabilizes the unstable (or integrating) process and transforms it into a stable first-order-plus-dead-time process. The outer loop enhances the set-point response, and a disturbance controller is designed to improve the disturbance response. The improved control system is simple and has a clear physical meaning, and its characteristic equation makes stabilization easy to achieve. Three controllers are designed separately in the improved scheme; each controller is easy to design and gives good control performance for its respective closed-loop transfer function. The robust stability of the proposed control scheme is analyzed. Finally, case studies illustrate that the improved method can give better system performance than existing design methods. © 2013 ISA. Published by ISA. All rights reserved.
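
    The essence of the double-loop idea is that an inner feedback loop first stabilizes the unstable process so that an outer loop can then be tuned for set-point tracking. The discrete-time sketch below illustrates this on an unstable first-order process with an inner proportional loop and an outer PI controller; the process model and gains are invented, dead time is omitted, and it does not reproduce the paper's tuning rules.

```python
# Discrete-time sketch of the double-loop idea: an inner proportional loop
# stabilizes an unstable first-order process, and an outer PI loop tracks the
# set point. Process parameters and gains are invented; dead time is omitted.
import numpy as np

a, b = 0.5, 1.0          # unstable process: dx/dt = a*x + b*u  (a > 0)
k_inner = 2.0            # inner loop gain; stabilizing because b*k_inner > a
kp, ki = 1.5, 0.8        # outer PI gains

dt, t_end = 0.01, 20.0
n = int(t_end / dt)
x, integ, setpoint = 0.0, 0.0, 1.0
history = np.zeros(n)

for k in range(n):
    e = setpoint - x                 # outer-loop error
    integ += e * dt
    v = kp * e + ki * integ          # outer PI output
    u = v - k_inner * x              # inner stabilizing loop
    x += dt * (a * x + b * u)        # Euler step of the process
    history[k] = x

print("final value:", round(history[-1], 3))
print("peak value:", round(history.max(), 3))
```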

  5. The Impact of Inquiry Based Instruction on Science Process Skills and Self-Efficacy Perceptions of Pre-Service Science Teachers at a University Level Biology Laboratory

    ERIC Educational Resources Information Center

    Sen, Ceylan; Sezen Vekli, Gülsah

    2016-01-01

    The aim of this study is to determine the influence of inquiry-based teaching approach on pre-service science teachers' laboratory self-efficacy perceptions and scientific process skills. The quasi experimental model with pre-test-post-test control group design was used as an experimental design in this research. The sample of this study included…

  6. Design Tool Using a New Optimization Method Based on a Stochastic Process

    NASA Astrophysics Data System (ADS)

    Yoshida, Hiroaki; Yamaguchi, Katsuhito; Ishikawa, Yoshio

    Conventional optimization methods are based on a deterministic approach, since their purpose is to find an exact solution. However, such methods depend on the initial conditions and risk becoming trapped in local solutions. In this paper, we propose a new optimization method based on the concept of path integrals used in quantum mechanics. The method obtains a solution as an expected value (stochastic average) using a stochastic process. The advantages of this method are that it is not affected by initial conditions and does not require experience-based tuning techniques. We applied the new optimization method to a hang glider design. In this problem, both the hang glider design and its flight trajectory were optimized. The numerical results show that the performance of the method is sufficient for practical use.
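
    The flavour of "solution as a stochastic average" can be conveyed with a generic sampling loop: draw candidates from a search distribution, weight them by a Boltzmann-like factor of the objective, and take the weighted average as the new estimate. The sketch below does this for a simple multimodal test function; it is an analogue of the idea, not the authors' path-integral formulation, and the schedule and sample counts are arbitrary.

```python
# Generic sketch of taking the solution as a stochastic (weighted) average:
# sample candidates, weight them by exp(-f/T), and average. This mimics the
# flavour of the path-integral idea but is not the authors' formulation; the
# test function, schedule, and sample counts are arbitrary.
import numpy as np

def f(x):
    """Multimodal 1-D test function; global minimum near x = 2.2."""
    return (x - 2.0) ** 2 + 1.5 * np.sin(5.0 * x)

rng = np.random.default_rng(3)
mean, spread = 0.0, 4.0                      # initial search distribution

for temperature in np.geomspace(2.0, 0.05, 30):
    samples = rng.normal(mean, spread, 500)
    weights = np.exp(-(f(samples) - f(samples).min()) / temperature)
    weights /= weights.sum()
    mean = float(np.sum(weights * samples))  # solution as a stochastic average
    spread = max(float(np.sqrt(np.sum(weights * (samples - mean) ** 2))), 0.05)

print("estimated minimizer:", round(mean, 3), "  f(x):", round(float(f(mean)), 3))
```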

  7. Designing a Web-Based Science Learning Environment for Model-Based Collaborative Inquiry

    NASA Astrophysics Data System (ADS)

    Sun, Daner; Looi, Chee-Kit

    2013-02-01

    The paper traces a research process in the design and development of a science learning environment called WiMVT (web-based inquirer with modeling and visualization technology). The WiMVT system is designed to help secondary school students build a sophisticated understanding of scientific conceptions and the science inquiry process, as well as develop critical learning skills, through a model-based collaborative inquiry approach. It is intended to support collaborative inquiry, real-time social interaction, and progressive modeling, and to provide multiple sources of scaffolding for students. We first discuss the theoretical underpinnings for synthesizing the WiMVT design framework, introduce the components and features of the system, and describe the proposed work flow of WiMVT instruction. We also elucidate our research approach that supports the development of the system. Finally, the findings of a pilot study are briefly presented to demonstrate the potential learning efficacy of the WiMVT implementation in science learning. Implications are drawn on how to improve the existing system, refine teaching strategies and provide feedback to researchers, designers and teachers. This pilot study informs designers like us on how to narrow the gap between the learning environment's intended design and its actual usage in the classroom.

  8. Pilot scale system for the production of palm-based Monoester-OH

    NASA Astrophysics Data System (ADS)

    Ngah, Muhammad Syukri; Badri, Khairiah Haji

    2016-11-01

    A mechanically agitated reactor vessel of moderate scale (500 L) has been developed. This vessel was constructed to produce palm-based polyurethane polyol with a maximum capacity of 400 L, to accommodate the demand required for a marketing trial run as part of the intended commercialization. The chemistry background of the process design was thoroughly studied. The esterification and condensation batch process was retained from the laboratory scale. Only RBD palm kernel oil was used in this study. This paper describes the engineering design for the reactor vessel development, from the stoichiometric equations for the production process through the detailed engineering, including equipment selection and fabrication, in order to meet the design specifications and objectives.

  9. A Process-Philosophical Understanding of Organizational Learning as "Wayfinding": Process, Practices and Sensitivity to Environmental Affordances

    ERIC Educational Resources Information Center

    Chia, Robert

    2017-01-01

    Purpose: This paper aims to articulate a practice-based, non-cognitivist approach to organizational learning. Design/methodology/approach: This paper explores the potential contribution of a process-based "practice turn" in social theory for understanding organizational learning. Findings: In complex, turbulent environments, robust…

  10. DCL System Research Using Advanced Approaches for Land-based or Ship-based Real-Time Recognition and Localization of Marine Mammals

    DTIC Science & Technology

    2012-09-30

    recognition. Algorithm design and statistical analysis and feature analysis. Post-Doctoral Associate, Cornell University, Bioacoustics Research...short. The HPC-ADA was designed based on fielded systems [1-4, 6] that offer a variety of desirable attributes, specifically dynamic resource...The software package was designed to utilize parallel and distributed processing for running recognition and other advanced algorithms. DeLMA

  11. Practicing universal design to actual hand tool design process.

    PubMed

    Lin, Kai-Chieh; Wu, Chih-Fu

    2015-09-01

    UD evaluation principles are difficult to implement in product design. This study proposes a methodology for implementing UD in the design process through user participation. The original UD principles and user experience are used to develop the evaluation items, and differences between product types are considered. Factor analysis and Quantification Theory Type I were used to eliminate evaluation items considered inappropriate and to examine the relationship between evaluation items and product design factors. Product design specifications were established for verification. The results showed that converting user evaluations into crucial design verification factors through a generalized evaluation scale based on product attributes, together with applying these design factors in product design, can improve users' UD evaluation. The design process of this study is expected to contribute to user-centered UD application. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  12. A PBOM configuration and management method based on templates

    NASA Astrophysics Data System (ADS)

    Guo, Kai; Qiao, Lihong; Qie, Yifan

    2018-03-01

    The design of the Process Bill of Materials (PBOM) plays a pivotal role in product development. The requirements of PBOM configuration design and management for complex products are analysed in this paper, including the reuse of configuration procedures and the pressing need to manage the huge quantity of product-family PBOM data. Based on this analysis, a function framework for PBOM configuration and management has been established. Configuration templates and modules are defined in the framework to support the customization and reuse of the configuration process. The configuration process of a detection sensor PBOM is presented as an illustrative case at the end. Rapid and agile PBOM configuration and management can be achieved using the template-based method, which is of vital significance for improving development efficiency for complex products.

  13. Event (error and near-miss) reporting and learning system for process improvement in radiation oncology.

    PubMed

    Mutic, Sasa; Brame, R Scott; Oddiraju, Swetha; Parikh, Parag; Westfall, Melisa A; Hopkins, Merilee L; Medina, Angel D; Danieley, Jonathan C; Michalski, Jeff M; El Naqa, Issam M; Low, Daniel A; Wu, Bin

    2010-09-01

    The value of near-miss and error reporting processes in many industries is well appreciated and typically can be supported with data that have been collected over time. While it is generally accepted that such processes are important in the radiation therapy (RT) setting, studies analyzing the effects of organized reporting and process improvement systems on operation and patient safety in individual clinics remain scarce. The purpose of this work is to report on the design and long-term use of an electronic reporting system in a RT department and compare it to the paper-based reporting system it replaced. A specifically designed web-based system was designed for reporting of individual events in RT and clinically implemented in 2007. An event was defined as any occurrence that could have, or had, resulted in a deviation in the delivery of patient care. The aim of the system was to support process improvement in patient care and safety. The reporting tool was designed so individual events could be quickly and easily reported without disrupting clinical work. This was very important because the system use was voluntary. The spectrum of reported deviations extended from minor workflow issues (e.g., scheduling) to errors in treatment delivery. Reports were categorized based on functional area, type, and severity of an event. The events were processed and analyzed by a formal process improvement group that used the data and the statistics collected through the web-based tool for guidance in reengineering clinical processes. The reporting trends for the first 24 months with the electronic system were compared to the events that were reported in the same clinic with a paper-based system over a seven-year period. The reporting system and the process improvement structure resulted in increased event reporting, improved event communication, and improved identification of clinical areas which needed process and safety improvements. The reported data were also useful for the evaluation of corrective measures and recognition of ineffective measures and efforts. The electronic system was relatively well accepted by personnel and resulted in minimal disruption of clinical work. Event reporting in the quarters with the fewest number of reported events, though voluntary, was almost four times greater than the most events reported in any one quarter with the paper-based system and remained consistent from the inception of the process through the date of this report. However, the acceptance was not universal, validating the need for improved education regarding reporting processes and systematic approaches to reporting culture development. Specially designed electronic event reporting systems in a radiotherapy setting can provide valuable data for process and patient safety improvement and are more effective reporting mechanisms than paper-based systems. Additional work is needed to develop methods that can more effectively utilize reported data for process improvement, including the development of standardized event taxonomy and a classification system for RT.

  14. Developing a workstation-based, real-time simulation for rapid handling qualities evaluations during design

    NASA Technical Reports Server (NTRS)

    Anderson, Frederick; Biezad, Daniel J.

    1994-01-01

    This paper describes the Rapid Aircraft DynamIcs AssessmeNt (RADIAN) project - an integration of the Aircraft SYNThesis (ACSYNT) design code with the USAF DATCOM code that estimates stability derivatives. Both of these codes are available to universities. These programs are then linked to flight simulation and flight controller synthesis tools, and the resulting design is evaluated on a graphics workstation. The entire process reduces the preliminary design time by an order of magnitude and provides an initial handling qualities evaluation of the design coupled to a control law. The integrated design process is applicable both to conventional aircraft taken from current textbooks and to unconventional designs emphasizing agility and propulsive control of attitude. The interactive and concurrent nature of the design process has been well received by industry and by design engineers at NASA. The process is being implemented into the design curriculum and is being used by students who view it as a significant advance over prior methods.

  15. Application of ASP Technology to Realize the Online Administrative License of the earthquake in Hunan Province

    NASA Astrophysics Data System (ADS)

    Tang, Hongliang; Kang, Chengxu; Tian, Youping

    2018-01-01

    Realizing online handling of administrative approvals in the earthquake sector is an important measure to improve work efficiency and public convenience. Based on an analysis of the characteristics and processes of administrative licensing in the earthquake industry, this paper proposes an online processing model based on ASP technology and an online processing system based on the B/S architecture, and presents the design and implementation methods. The application of the system shows that the system is simple in design and complete in function, can be used on platforms such as computers and mobile phones, and is practical and forward-looking.

  16. Evidence-Based and Value-Based Decision Making About Healthcare Design: An Economic Evaluation of the Safety and Quality Outcomes.

    PubMed

    Zadeh, Rana; Sadatsafavi, Hessam; Xue, Ryan

    2015-01-01

    This study describes a vision and framework that can facilitate the implementation of the evidence-based design (EBD) scientific knowledge base into the process of the design, construction, and operation of healthcare facilities and clarify the related safety and quality outcomes for the stakeholders. The proposed framework pairs EBD with value-driven decision making and aims to improve communication among stakeholders by providing a common analytical language. Recent EBD research indicates that the design and operation of healthcare facilities contribute to an organization's operational success by improving safety, quality, and efficiency. However, because little information is available about the financial returns of evidence-based investments, such investments are readily eliminated during the capital-investment decision-making process. To model the proposed framework, we used engineering economy tools to evaluate the return on investment in six successful cases, identified by a literature review, in which facility design and operation interventions resulted in reductions in hospital-acquired infections, patient falls, staff injuries, and patient anxiety. In the evidence-based cases, the calculated net present values, internal rates of return, and payback periods indicated that the long-term benefits of the interventions substantially outweighed the intervention costs. This article explains a framework for developing a research-based and value-based communication language on specific interventions across the planning, design and construction, operation, and evaluation stages. Evidence-based and value-based design frameworks can be applied to communicate the life-cycle costs and savings of EBD interventions to stakeholders, thereby contributing to more informed decision making and the optimization of healthcare infrastructure. © The Author(s) 2015.
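
    The engineering-economy calculations mentioned above (net present value, internal rate of return, payback period) are standard and easy to reproduce. The sketch below applies them to an invented cash-flow profile for an EBD intervention; the figures are placeholders, not values from the six reviewed cases.

```python
# Engineering-economy sketch: NPV, IRR, and simple payback for a hypothetical
# design intervention (an upfront facility investment that avoids costs each
# year). All figures are invented placeholders.
import numpy as np

investment = 250_000.0                 # year-0 cost of the intervention
annual_savings = 60_000.0              # avoided costs per year (assumed)
years, discount_rate = 10, 0.07

cash_flows = np.array([-investment] + [annual_savings] * years)

npv = sum(cf / (1 + discount_rate) ** t for t, cf in enumerate(cash_flows))

# IRR: the rate that makes NPV zero, found from the roots of the cash-flow
# polynomial in (1 + r); the year-0 term carries the highest power.
roots = np.roots(cash_flows)
irr = max(r.real for r in roots if abs(r.imag) < 1e-9 and r.real > 0) - 1.0

print(f"NPV at {discount_rate:.0%}: ${npv:,.0f}")
print(f"IRR: {irr:.1%}")
print(f"simple payback: {investment / annual_savings:.1f} years")
```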

  17. Taguchi experimental design to determine the taste quality characteristic of candied carrot

    NASA Astrophysics Data System (ADS)

    Ekawati, Y.; Hapsari, A. A.

    2018-03-01

    Robust parameter design is used to design products that are robust to noise factors so that product performance meets the target and delivers better quality. In the process of designing and developing an innovative candied carrot product, robust parameter design is carried out using the Taguchi method. The method is used to determine an optimal quality design based on the process and the composition of product ingredients that are in accordance with consumer needs and requirements. According to the identification of consumer needs in previous research, the quality dimensions that need to be assessed are the taste and texture of the product; the quality dimension assessed in this research is limited to taste. Organoleptic testing is used for this assessment, specifically hedonic testing, which makes the assessment based on consumer preferences. The data processing uses mean and signal-to-noise ratio calculations and optimal level setting to determine the optimal process and composition of product ingredients. The optimal settings are verified with confirmation experiments to show that the proposed product matches consumer needs and requirements. The result of this research is the identification of the factors that affect the product's taste and the optimal quality of the product according to the Taguchi method.
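
    The mean and signal-to-noise calculations can be illustrated with a small sketch: for each run of an orthogonal array, compute the mean hedonic score and the larger-the-better S/N ratio, then average the S/N ratio by factor level to pick the optimal setting of each factor. The L4 array, factor names, and scores below are invented; the study's actual factors and data are not given here.

```python
# Taguchi-style analysis sketch: per-run mean and larger-the-better S/N ratio
# for an L4(2^3) orthogonal array, then S/N averaged by factor level to pick
# the optimal setting. The factors, array assignment, and hedonic scores are
# invented for illustration.
import numpy as np

L4 = np.array([[1, 1, 1],          # 3 factors at 2 levels each
               [1, 2, 2],
               [2, 1, 2],
               [2, 2, 1]])

scores = np.array([[5, 6, 5, 6],   # hedonic taste scores from 4 panelists per run
                   [6, 6, 7, 6],
                   [4, 5, 4, 5],
                   [6, 7, 6, 7]], dtype=float)

means = scores.mean(axis=1)
sn = -10.0 * np.log10(np.mean(1.0 / scores**2, axis=1))   # larger-the-better S/N

print("run means:      ", np.round(means, 2))
print("S/N ratios (dB):", np.round(sn, 2))

for j, factor in enumerate(["sugar ratio", "drying time", "blanching"]):
    level_sn = [sn[L4[:, j] == lvl].mean() for lvl in (1, 2)]
    best = int(np.argmax(level_sn)) + 1
    print(f"{factor}: S/N level 1 = {level_sn[0]:.2f}, level 2 = {level_sn[1]:.2f} "
          f"-> choose level {best}")
```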

  18. Semantic Service Design for Collaborative Business Processes in Internetworked Enterprises

    NASA Astrophysics Data System (ADS)

    Bianchini, Devis; Cappiello, Cinzia; de Antonellis, Valeria; Pernici, Barbara

    Modern collaborating enterprises can be seen as borderless organizations whose processes are dynamically transformed and integrated with the ones of their partners (Internetworked Enterprises, IE), thus enabling the design of collaborative business processes. The adoption of Semantic Web and service-oriented technologies for implementing collaboration in such distributed and heterogeneous environments promises significant benefits. IE can model their own processes independently by using the Software as a Service paradigm (SaaS). Each enterprise maintains a catalog of available services and these can be shared across IE and reused to build up complex collaborative processes. Moreover, each enterprise can adopt its own terminology and concepts to describe business processes and component services. This brings requirements to manage semantic heterogeneity in process descriptions which are distributed across different enterprise systems. To enable effective service-based collaboration, IEs have to standardize their process descriptions and model them through component services using the same approach and principles. For enabling collaborative business processes across IE, services should be designed following an homogeneous approach, possibly maintaining a uniform level of granularity. In the paper we propose an ontology-based semantic modeling approach apt to enrich and reconcile semantics of process descriptions to facilitate process knowledge management and to enable semantic service design (by discovery, reuse and integration of process elements/constructs). The approach brings together Semantic Web technologies, techniques in process modeling, ontology building and semantic matching in order to provide a comprehensive semantic modeling framework.

  19. Designing Public Schools.

    ERIC Educational Resources Information Center

    Connections, 2002

    2002-01-01

    Presents an interview with Steven Bingler, an expert on community-based planning and design, about the design of public schools. Topics include the contribution of architecture to student learning, mega- versus small schools, the authentic economics of design decisions, and the role of the community in the design process. (EV)

  20. Determination of mechanical properties for cement-treated aggregate base : final report.

    DOT National Transportation Integrated Search

    2017-06-01

    The Virginia Department of Transportation (VDOT) currently follows pavement design procedures for all new and rehabilitated pavements based on the 1993 AASHTO Guide for Design of Pavement Structures. VDOT's Materials Division is in the process of i...

  1. Power Terminal Communication Access Network Monitoring System Scheme Based on Design Patterns

    NASA Astrophysics Data System (ADS)

    Yan, Shengchao; Wu, Desheng; Zhu, Jiang

    2018-01-01

    To realize a pattern-based design for the terminal communication monitoring system, this paper introduces manager-workers and tasks-workers design patterns, built on common design patterns such as factory method, chain of responsibility, and facade. Using these patterns, a communication monitoring system that combines module groups for networking communication, business data processing and peripheral support has been designed successfully. These patterns give the system great flexibility and scalability and improve the systematic structure of the pattern-based design.
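
    A minimal sketch of the manager-workers idea named above, using a thread pool fed from a shared queue; the task and worker names are illustrative and not taken from the paper.

      import queue
      import threading

      def worker(task_queue, results):
          """Worker: repeatedly pull a monitoring task from the queue and process it."""
          while True:
              task = task_queue.get()
              if task is None:          # sentinel from the manager: shut down
                  task_queue.task_done()
                  break
              results.append(f"processed {task}")
              task_queue.task_done()

      def manager(tasks, n_workers=3):
          """Manager: dispatch tasks to a pool of workers and collect the results."""
          task_queue, results = queue.Queue(), []
          threads = [threading.Thread(target=worker, args=(task_queue, results))
                     for _ in range(n_workers)]
          for t in threads:
              t.start()
          for task in tasks:
              task_queue.put(task)
          for _ in threads:             # one shutdown sentinel per worker
              task_queue.put(None)
          task_queue.join()
          for t in threads:
              t.join()
          return results

      print(manager([f"terminal-{i} status frame" for i in range(10)]))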

  2. Developing and commercializing sustainable new wood products : a process for identifying viable products.

    Treesearch

    Gordon A. Enk; Stuart L. Hart

    2003-01-01

    A process was designed to evaluate the sustainability and potential marketability of USDA Forest Service patented technologies. The process was designed and tested jointly by the University of North Carolina, the University of Michigan, Partners for Strategic Change, and the USDA Forest Service. Two technologies were evaluated: a fiber-based product and a wood fiber/...

  3. Study of the 5E Instructional Model to Improve the Instructional Design Process of Novice Teachers

    ERIC Educational Resources Information Center

    Hu, Jiuhua; Gao, Chong; Liu, Yang

    2017-01-01

    This study investigated the effects of 5E instructional model on the teaching processes of novice teachers. First, we conducted a teaching design training project based on the 5E model for 40 novice teachers, and compared pre-texts of the teachers' teaching process from before the training with post-texts obtained immediately following the…

  4. Using a Systematic Conceptual Model for a Process Evaluation of a Middle School Obesity Risk-Reduction Nutrition Curriculum Intervention: "Choice, Control & Change"

    ERIC Educational Resources Information Center

    Lee, Heewon; Contento, Isobel R.; Koch, Pamela

    2013-01-01

    Objective: To use and review a conceptual model of process evaluation and to examine the implementation of a nutrition education curriculum, "Choice, Control & Change", designed to promote dietary and physical activity behaviors that reduce obesity risk. Design: A process evaluation study based on a systematic conceptual model. Setting: Five…

  5. Conditional analysis of mixed Poisson processes with baseline counts: implications for trial design and analysis.

    PubMed

    Cook, Richard J; Wei, Wei

    2003-07-01

    The design of clinical trials is typically based on marginal comparisons of a primary response under two or more treatments. The considerable gains in efficiency afforded by models conditional on one or more baseline responses have been extensively studied for Gaussian models. The purpose of this article is to present methods for the design and analysis of clinical trials in which the response is a count or a point process and a corresponding baseline count is available prior to randomization. The methods are based on a conditional negative binomial model for the response given the baseline count and can be used to examine the effect of introducing selection criteria on power and sample size requirements. We show that designs based on this approach are more efficient than those proposed by McMahon et al. (1994).
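
    A minimal simulation sketch of the kind of conditional analysis described above, assuming a gamma-mixed Poisson (negative binomial) data-generating process; the parameter values are invented, and the model is fit with statsmodels rather than the authors' own software.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(1)
      n = 400
      treat = rng.integers(0, 2, size=n)                 # randomized treatment arm
      frailty = rng.gamma(shape=2.0, scale=0.5, size=n)  # subject-level mixing (mixed Poisson)
      baseline = rng.poisson(4.0 * frailty)              # count observed before randomization
      response = rng.poisson(4.0 * frailty * np.exp(-0.4 * treat))  # on-study count

      # Conditional analysis: regress the response on treatment and the baseline count.
      X = sm.add_constant(np.column_stack([treat, np.log1p(baseline)]))
      fit = sm.GLM(response, X, family=sm.families.NegativeBinomial()).fit()
      print(fit.params)   # the treatment coefficient estimates the log rate ratio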

  6. Supporting virtual enterprise design by a web-based information model

    NASA Astrophysics Data System (ADS)

    Li, Dong; Barn, Balbir; McKay, Alison; de Pennington, Alan

    2001-10-01

    Developments in IT and its applications have led to significant changes in business processes. To pursue agility, flexibility and the best service to customers, enterprises focus on their core competence and dynamically build relationships with partners to form virtual enterprises as customer-driven temporary demand chains/networks. Building the networked enterprise requires responsively interactive decisions instead of a single-direction partner selection process. The benefits and risks of the combination should be systematically analysed, and aggregated information about the value-adding abilities and risks of networks needs to be derived from the interactions of all partners. In this research, a hierarchical information model for assessing partnerships when designing virtual enterprises was developed. Internet techniques have been applied to the evaluation process so that interactive decisions can be visualised and made responsively during the design process. The assessment is based on a process that allows each partner to respond to the requirements of the virtual enterprise by planning its operational process as a bidder. The assessment then produces an aggregated value representing the prospects of the combination of partners given the current bidding. The final design is the combination of partners with the greatest total value-adding capability and the lowest risk.
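
    A toy sketch of the final selection step described above, picking the partner combination with the highest aggregate value and lowest risk; the partner names, scores and the simple value-minus-risk aggregation are all invented for illustration.

      from itertools import product

      # Invented bids: (value-adding score, risk score) per candidate partner, per role.
      bids = {
          "logistics":   {"L1": (8, 3), "L2": (6, 1)},
          "manufacture": {"M1": (9, 4), "M2": (7, 2)},
          "design":      {"D1": (7, 2), "D2": (8, 5)},
      }
      RISK_WEIGHT = 1.0   # trade-off between value and risk (illustrative)

      def aggregate(combo):
          value = sum(bids[role][name][0] for role, name in combo)
          risk = sum(bids[role][name][1] for role, name in combo)
          return value - RISK_WEIGHT * risk

      combos = [tuple(zip(bids, choice)) for choice in product(*(bids[r] for r in bids))]
      best = max(combos, key=aggregate)
      print("best combination:", dict(best), "score:", aggregate(best))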

  7. Design of an MR image processing module on an FPGA chip

    NASA Astrophysics Data System (ADS)

    Li, Limin; Wyrwicz, Alice M.

    2015-06-01

    We describe the design and implementation of an image processing module on a single-chip Field-Programmable Gate Array (FPGA) for real-time image processing. We also demonstrate that through graphical coding the design work can be greatly simplified. The processing module is based on a 2D FFT core. Our design is distinguished from previously reported designs in two respects. No off-chip hardware resources are required, which increases portability of the core. Direct matrix transposition usually required for execution of 2D FFT is completely avoided using our newly-designed address generation unit, which saves considerable on-chip block RAMs and clock cycles. The image processing module was tested by reconstructing multi-slice MR images from both phantom and animal data. The tests on static data show that the processing module is capable of reconstructing 128 × 128 images at speed of 400 frames/second. The tests on simulated real-time streaming data demonstrate that the module works properly under the timing conditions necessary for MRI experiments.
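
    The module's core operation, 2D-FFT reconstruction of k-space data, can be sketched in a few lines of NumPy; this is a software reference for the hardware pipeline, with synthetic k-space data standing in for the scanner output.

      import numpy as np

      # Synthetic 128x128 "k-space" data standing in for one acquired MR slice.
      rng = np.random.default_rng(0)
      image_truth = np.zeros((128, 128))
      image_truth[48:80, 40:88] = 1.0                      # simple rectangular phantom
      kspace = np.fft.fftshift(np.fft.fft2(image_truth))   # what the scanner would record
      kspace += 0.01 * (rng.standard_normal(kspace.shape) + 1j * rng.standard_normal(kspace.shape))

      # Reconstruction: inverse 2D FFT of the (re-centered) k-space data.
      reconstruction = np.abs(np.fft.ifft2(np.fft.ifftshift(kspace)))
      print(reconstruction.shape, reconstruction.max())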

  8. Design of an MR image processing module on an FPGA chip

    PubMed Central

    Li, Limin; Wyrwicz, Alice M.

    2015-01-01

    We describe the design and implementation of an image processing module on a single-chip Field-Programmable Gate Array (FPGA) for real-time image processing. We also demonstrate that through graphical coding the design work can be greatly simplified. The processing module is based on a 2D FFT core. Our design is distinguished from previously reported designs in two respects. No off-chip hardware resources are required, which increases portability of the core. Direct matrix transposition usually required for execution of 2D FFT is completely avoided using our newly-designed address generation unit, which saves considerable on-chip block RAMs and clock cycles. The image processing module was tested by reconstructing multi-slice MR images from both phantom and animal data. The tests on static data show that the processing module is capable of reconstructing 128 × 128 images at speed of 400 frames/second. The tests on simulated real-time streaming data demonstrate that the module works properly under the timing conditions necessary for MRI experiments. PMID:25909646

  9. Differentiating location- and distance-based processes in memory for time: an ERP study.

    PubMed

    Curran, Tim; Friedman, William J

    2003-09-01

    Memory for the time of events may benefit from reconstructive, location-based, and distance-based processes, but these processes are difficult to dissociate with behavioral methods. Neuropsychological research has emphasized the contribution of prefrontal brain mechanisms to memory for time but has not clearly differentiated location- from distance-based processing. The present experiment recorded event-related brain potentials (ERPs) while subjects completed two different temporal memory tests, designed to emphasize either location- or distance-based processing. The subjects' reports of location-based versus distance-based strategies and the reaction time pattern validated our experimental manipulation. Late (800-1,800 msec) frontal ERP effects were related to location-based processing. The results provide support for a two-process theory of memory for time and suggest that frontal memory mechanisms are specifically related to reconstructive, location-based processing.

  10. Landing Gear Integration in Aircraft Conceptual Design. Revision

    NASA Technical Reports Server (NTRS)

    Chai, Sonny T.; Mason, William H.

    1997-01-01

    The design of the landing gear is one of the more fundamental aspects of aircraft design. The design and integration process encompasses numerous engineering disciplines, e.g., structure, weights, runway design, and economics, and has become extremely sophisticated in the last few decades. Although the design process is well-documented, no attempt has been made until now to develop a design methodology that can be used within an automated environment. As a result, the process remains a key responsibility of the configuration designer and is largely experience-based and graphically oriented. However, as industry and government try to incorporate multidisciplinary design optimization (MDO) methods in the conceptual design phase, the need for a more systematic procedure has become apparent. The development of an MDO-capable design methodology as described in this work is focused on providing the conceptual designer with tools to help automate the disciplinary analyses, i.e., geometry, kinematics, flotation, and weight. Documented design procedures and analyses were examined to determine their applicability and to ensure compliance with current practices and regulations. Using the latest information obtained from industry during an initial industry survey, the analyses were in turn modified and expanded to accommodate the design criteria associated with advanced large subsonic transports. Algorithms were then developed based on the updated analysis procedures to be incorporated into existing MDO codes.

  11. Design-Based Learning for Biology: Genetic Engineering Experience Improves Understanding of Gene Expression

    ERIC Educational Resources Information Center

    Ellefson, Michelle R.; Brinker, Rebecca A.; Vernacchio, Vincent J.; Schunn, Christian D.

    2008-01-01

    Gene expression is a difficult topic for students to learn and comprehend, at least partially because it involves various biochemical structures and processes occurring at the microscopic level. Designer Bacteria, a design-based learning (DBL) unit for high-school students, applies principles of DBL to the teaching of gene expression. Throughout…

  12. Designing a Strategic Plan through an Emerging Knowledge Generation Process: The ATM Experience

    ERIC Educational Resources Information Center

    Zanotti, Francesco

    2012-01-01

    Purpose: The aim of this contribution is to describe a new methodology for designing strategic plans and how it was implemented by ATM, a public transportation agency based in Milan, Italy. Design/methodology/approach: This methodology is founded on a new system theory, called "quantum systemics". It is based on models and metaphors both…

  13. Web-Based Tools for Designing and Developing Teaching Materials for Integration of Information Technology into Instruction

    ERIC Educational Resources Information Center

    Chang, Kuo-En; Sung, Yao-Ting; Hou, Huei-Tse

    2006-01-01

    Educational software for teachers is an important, yet usually ignored, link for integrating information technology into classroom instruction. This study builds a web-based teaching material design and development system. The process in the system is divided into four stages, analysis, design, development, and practice. Eight junior high school…

  14. Effectiveness of Facebook Based Learning to Enhance Creativity among Islamic Studies Students by Employing Isman Instructional Design Model

    ERIC Educational Resources Information Center

    Alias, Norlidah; Siraj, Saedah; Daud, Mohd Khairul Azman Md; Hussin, Zaharah

    2013-01-01

    The study examines the effectiveness of Facebook based learning to enhance creativity among Islamic Studies students in the secondary educational setting in Malaysia. It describes the design process by employing the Isman Instructional Design Model. A quantitative study was carried out using experimental method and background survey. The…

  15. Using Design-Based Research in Informal Environments

    ERIC Educational Resources Information Center

    Reisman, Molly

    2008-01-01

    Design-Based Research (DBR) has been a tool of the learning sciences since the early 1990s, used as a way to improve and study learning environments. Using an iterative process of design with the goal of refining theories of learning, researchers and educators now use DBR to identify "how" to make a learning environment work. They then draw…

  16. Developing a Web 2.0-Based System with User-Authored Content for Community Use and Teacher Education

    ERIC Educational Resources Information Center

    Cifuentes, Lauren; Sharp, Amy; Bulu, Sanser; Benz, Mike; Stough, Laura M.

    2010-01-01

    We report on an investigation into the design, development, implementation, and evaluation of an informational and instructional Website in order to generate guidelines for instructional designers of read/write Web environments. We describe the process of design and development research, the problem addressed, the theory-based solution, and the…

  17. An Evaluation-Driven Design Approach to Develop Learning Environments Based on Full-Body Interaction

    ERIC Educational Resources Information Center

    Malinverni, Laura; Schaper, Marie-Monique; Pares, Narcís

    2016-01-01

    The development of learning environments based on full-body interaction has become an increasingly important field of research in recent years. However, the design and evaluation strategies currently used present some significant limitations. Two major shortcomings are: the inadequate involvement of children in the design process and a lack of…

  18. Experimental Learning Enhancing Improvisation Skills

    ERIC Educational Resources Information Center

    Pereira Christopoulos, Tania; Wilner, Adriana; Trindade Bestetti, Maria Luisa

    2016-01-01

    Purpose: This study aims to present improvisation training and experimentation as an alternative method to deal with unexpected events in which structured processes do not seem to work. Design/Methodology/Approach: Based on the literature of sensemaking and improvisation, the study designs a framework and process model of experimental learning…

  19. GREENER CHEMICAL PROCESS DESIGN ALTERNATIVES ARE REVEALED USING THE WASTE REDUCTION DECISION SUPPORT SYSTEM (WAR DSS)

    EPA Science Inventory

    The Waste Reduction Decision Support System (WAR DSS) is a Java-based software product providing comprehensive modeling of potential adverse environmental impacts (PEI) predicted to result from newly designed or redesigned chemical manufacturing processes. The purpose of this so...

  20. Implementing an Antibiotic Stewardship Information System to Improve Hospital Infection Control: A Co-Design Process.

    PubMed

    Maia, Mélanie R; Simões, Alexandra; Lapão, Luís V

    2018-01-01

    HAITooL information system design and implementation were based on Design Science Research Methodology, ensuring full participation, in close collaboration, of researchers and a multidisciplinary team of healthcare professionals. HAITooL enables effective monitoring of antibiotic resistance and antibiotic use and provides an antibiotic prescription decision-support system for clinicians, strengthening patient safety procedures. The design, development and implementation process revealed benefits in organizational and behavioral change, with significant success. Leadership commitment, the multidisciplinary team and, above all, the engagement of informaticians were crucial to the implementation process. Participants' motivation and the delivery and evolution of the final product depend on that.

  1. Indigenous lunar construction materials

    NASA Technical Reports Server (NTRS)

    Rogers, Wayne P.; Sture, Stein

    1991-01-01

    The utilization of local resources for the construction and operation of a lunar base can significantly reduce the cost of transporting materials and supplies from Earth. The feasibility of processing lunar regolith to form construction materials and structural components is investigated. A preliminary review of potential processing methods such as sintering, hot-pressing, liquification, and cast basalt techniques, was completed. The processing method proposed is a variation on the cast basalt technique. It involves liquification of the regolith at 1200-1300 C, casting the liquid into a form, and controlled cooling. While the process temperature is higher than that for sintering or hot-pressing (1000-1100 C), this method is expected to yield a true engineering material with low variability in properties, high strength, and the potential to form large structural components. A scenario for this processing method was integrated with a design for a representative lunar base structure and potential construction techniques. The lunar shelter design is for a modular, segmented, pressurized, hemispherical dome which could serve as habitation and laboratory space. Based on this design, estimates of requirements for power, processing equipment, and construction equipment were made. This proposed combination of material processing method, structural design, and support requirements will help to establish the feasibility of lunar base construction using indigenous materials. Future work will refine the steps of the processing method. Specific areas where more information is needed are: furnace characteristics in vacuum; heat transfer during liquification; viscosity, pouring and forming behavior of molten regolith; design of high temperature forms; heat transfer during cooling; recrystallization of basalt; and refinement of estimates of elastic moduli, compressive and tensile strength, thermal expansion coefficient, thermal conductivity, and heat capacity. The preliminary design of the lunar shelter showed us that joining is a critical technology needed for building a structure from large segments. The problem of joining is important to the design of any structure that is not completely prefabricated. It is especially important when the structure is subjected to tensile loading by an internal pressure. For a lunar shelter constructed from large segments the joints between these large segments must be strong, and they must permit automated construction. With a cast basalt building material which is brittle, there is the additional problem of connecting the joint with the material and avoiding stress concentration that would cause failure. Thus, a well-defined project which we intend to pursue during this coming year is the design of joints for cast basalt structural elements.

  2. Dendritic cells for active immunotherapy: optimizing design and manufacture in order to develop commercially and clinically viable products.

    PubMed

    Nicolette, C A; Healey, D; Tcherepanova, I; Whelton, P; Monesmith, T; Coombs, L; Finke, L H; Whiteside, T; Miesowicz, F

    2007-09-27

    Dendritic cell (DC) active immunotherapy is potentially efficacious in a broad array of malignant disease settings. However, challenges remain in optimizing DC-based therapy for maximum clinical efficacy within manufacturing processes that permit quality control and scale-up of consistent products. In this review we discuss the critical issues that must be addressed in order to optimize DC-based product design and manufacture, and highlight the DC based platforms currently addressing these issues. Variables in DC-based product design include the type of antigenic payload used, DC maturation steps and activation processes, and functional assays. Issues to consider in development include: (a) minimizing the invasiveness of patient biological material collection; (b) minimizing handling and manipulations of tissue at the clinical site; (c) centralized product manufacturing and standardized processing and capacity for commercial-scale production; (d) rapid product release turnaround time; (e) the ability to manufacture sufficient product from limited starting material; and (f) standardized release criteria for DC phenotype and function. Improvements in the design and manufacture of DC products have resulted in a handful of promising leads currently in clinical development.

  3. Integrating ergonomics into engineering design: the role of objects.

    PubMed

    Hall-Andersen, Lene Bjerg; Broberg, Ole

    2014-05-01

    The objective of this study was to explore the role of objects in integrating ergonomic knowledge in engineering design processes. An engineering design case was analyzed using the theoretical concepts of boundary objects and intermediary objects: Boundary objects facilitate collaboration between different knowledge domains, while the aim of an intermediary object is to circulate knowledge and thus produce a distant effect. Adjustable layout drawings served as boundary objects and had a positive impact on the dialog between an ergonomist and designers. An ergonomic guideline document was identified as an intermediary object. However, when the ergonomic guidelines were circulated in the design process, only some of the guidelines were transferred to the design of the sterile processing plant. Based on these findings, recommendations for working with objects in design processes are included. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  4. Structural design/margin assessment

    NASA Technical Reports Server (NTRS)

    Ryan, R. S.

    1993-01-01

    Determining structural design inputs and the structural margins following design completion is one of the major activities in space exploration. The end result is a statement of these margins as stability, safety factors on ultimate and yield stresses, fracture limits (fracture control), fatigue lifetime, reuse criteria, operational criteria and procedures, stability factors, deflections, clearance, handling criteria, etc. The process is normally called a load cycle and is time consuming, very complex, and involves much more than structures. The key to successful structural design is the proper implementation of the process. It depends on many factors: leadership and management of the process, adequate analysis and testing tools, data basing, communications, people skills, and training. This process and the various factors involved are discussed.

  5. Developing teaching process for enhancing students' mathematical problem solving in the 21st century through STEM education

    NASA Astrophysics Data System (ADS)

    Prawvichien, Sutthaporn; Siripun, Kulpatsorn; Yuenyong, Chokchai

    2018-01-01

    STEM education can provide the context for students' learning in the 21st century. Mathematical problem solving requires a context that simulates real life in order to give students experience of the power of mathematics in the world around them. This study aimed to develop a teaching process for enhancing students' mathematical problem solving in the 21st century through STEM education. The paper clarifies STEM learning activities about graph theory organized around the six steps of the engineering design process: identifying a challenge, exploring ideas, designing and planning, doing and developing, testing and evaluating, and presenting the solution. The learning activities start from the identify-a-challenge stage, which presents a flooding situation in the northern part of Thailand and sets the students the task of developing routes that move people away from the flooded areas as fast as possible. The explore-ideas stage provides activities that help students acquire the knowledge base needed for designing possible solutions; this knowledge base focuses on measurement, geometry, graph theory, and mathematical processes. The design-and-plan stage asks students to model the city based on the map and then propose possible routes. The doing-and-developing stage asks students to develop the routes based on their model. The test-and-evaluate stage asks students to clarify how to test and evaluate the possible routes, and then to test them. The present-the-solution stage asks students to present the whole process of designing the routes. Then, the paper discusses how these learning activities could enhance students' mathematical problem solving. The paper may have implications for STEM education in school settings.
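
    The evacuation-route task above is essentially a shortest-path problem on a graph; a minimal Dijkstra sketch is shown below, with an invented toy road network rather than the Thai map used in the activity.

      import heapq

      def shortest_route(graph, start, goal):
          """Dijkstra's algorithm: returns (total travel time, route) on a weighted graph."""
          queue = [(0, start, [start])]
          visited = set()
          while queue:
              cost, node, path = heapq.heappop(queue)
              if node == goal:
                  return cost, path
              if node in visited:
                  continue
              visited.add(node)
              for neighbour, weight in graph[node].items():
                  if neighbour not in visited:
                      heapq.heappush(queue, (cost + weight, neighbour, path + [neighbour]))
          return float("inf"), []

      # Invented road network: travel times (minutes) between points in a flooded area.
      roads = {
          "village": {"bridge": 12, "school": 7},
          "bridge": {"village": 12, "shelter": 9},
          "school": {"village": 7, "bridge": 4, "shelter": 15},
          "shelter": {"bridge": 9, "school": 15},
      }
      print(shortest_route(roads, "village", "shelter"))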

  6. Multi-objective LQR with optimum weight selection to design FOPID controllers for delayed fractional order processes.

    PubMed

    Das, Saptarshi; Pan, Indranil; Das, Shantanu

    2015-09-01

    An optimal trade-off design for a fractional-order (FO) PID controller is proposed with a Linear Quadratic Regulator (LQR) based technique using two conflicting time-domain objectives. A class of delayed FO systems with a single non-integer-order element, exhibiting both sluggish and oscillatory open-loop responses, is controlled here. The FO time-delay processes are handled within a multi-objective optimization (MOO) formalism of LQR-based FOPID design. A comparison is made between two contemporary approaches to stabilizing time-delay systems within LQR. The MOO control design methodology yields the Pareto-optimal trade-off solutions between the tracking performance and the total variation (TV) of the control signal. Tuning rules are formed for the optimal LQR-FOPID controller parameters, using the median of the non-dominated Pareto solutions, to handle delayed FO processes. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
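
    The non-dominated (Pareto) filtering on which the tuning rules are based can be illustrated with a short sketch; the candidate controllers and their two objective values below are invented, and this is not the paper's optimization algorithm itself.

      import numpy as np

      def pareto_front(points):
          """Return the non-dominated points when both objectives are minimized."""
          points = np.asarray(points, dtype=float)
          keep = []
          for i, p in enumerate(points):
              dominated = any(
                  np.all(q <= p) and np.any(q < p)
                  for j, q in enumerate(points) if j != i
              )
              if not dominated:
                  keep.append(p)
          return np.array(keep)

      # Invented candidates: (tracking error index, total variation of control signal).
      candidates = [(1.8, 0.9), (1.2, 1.6), (2.5, 0.5), (1.2, 1.2), (3.0, 2.0)]
      front = pareto_front(candidates)
      print(front)
      print("coordinate-wise median of the front:", np.median(front, axis=0))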

  7. A real-time MTFC algorithm of space remote-sensing camera based on FPGA

    NASA Astrophysics Data System (ADS)

    Zhao, Liting; Huang, Gang; Lin, Zhe

    2018-01-01

    A real-time MTF compensation (MTFC) algorithm for a space remote-sensing camera, based on an FPGA, was designed. The algorithm provides real-time image processing to enhance image clarity while the remote-sensing camera is running on orbit. The image restoration algorithm adopts a modular design. The on-orbit MTF measurement module calculates the edge spread function, the line spread function, the ESF difference operation, the normalized MTF and the MTFC parameters. The MTFC image filtering and noise suppression module implements the filtering algorithm and effectively suppresses noise. System Generator was used to design the image processing algorithms, simplifying the system design structure and the redesign process. The image gray gradient, point sharpness, edge contrast and mid-to-high frequency content were enhanced, and the SNR of the restored image decreased by less than 1 dB compared with the original image. The image restoration system can be widely used in various fields.
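
    A software sketch of the MTF-compensation idea (not the FPGA implementation): divide the image spectrum by a modeled MTF, with a small regularization term to keep noise amplification bounded; the Gaussian MTF model and its width are assumptions made for illustration.

      import numpy as np

      def mtfc(image, sigma=0.35, eps=0.05):
          """Frequency-domain MTF compensation with simple regularization (illustrative)."""
          ny, nx = image.shape
          fy = np.fft.fftfreq(ny)[:, None]
          fx = np.fft.fftfreq(nx)[None, :]
          mtf = np.exp(-(fx**2 + fy**2) / (2 * sigma**2))            # assumed Gaussian system MTF
          spectrum = np.fft.fft2(image)
          restored = np.fft.ifft2(spectrum * mtf / (mtf**2 + eps))   # Wiener-like inverse filter
          return np.real(restored)

      blurred = np.random.default_rng(0).random((128, 128))  # placeholder for a sensor image
      print(mtfc(blurred).shape)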

  8. Goal-Based Learning: Conceptual Design "Jump-Start" Workbook.

    ERIC Educational Resources Information Center

    Montgomery, Joel R.

    This workbook explains the process of using the goal-based learning (GBL) approach to accelerating performance change to design an education or training program. The first half of the workbook, which focuses on the nature and benefits of GBL, discusses the following topics: shifting the focus of education; differences between lecture-based and…

  9. Quality Matters™: An Educational Input in an Ongoing Design-Based Research Project

    ERIC Educational Resources Information Center

    Adair, Deborah; Shattuck, Kay

    2015-01-01

    Quality Matters (QM) has been transforming established best practices and online education-based research into an applicable, scalable course level improvement process for the last decade. In this article, the authors describe QM as an ongoing design-based research project and an educational input for improving online education.

  10. When paradigms collide at the road rail interface: evaluation of a sociotechnical systems theory design toolkit for cognitive work analysis.

    PubMed

    Read, Gemma J M; Salmon, Paul M; Lenné, Michael G

    2016-09-01

    The Cognitive Work Analysis Design Toolkit (CWA-DT) is a recently developed approach that provides guidance and tools to assist in applying the outputs of CWA to design processes to incorporate the values and principles of sociotechnical systems theory. In this paper, the CWA-DT is evaluated based on an application to improve safety at rail level crossings. The evaluation considered the extent to which the CWA-DT met pre-defined methodological criteria and aligned with sociotechnical values and principles. Both process and outcome measures were taken based on the ratings of workshop participants and human factors experts. Overall, workshop participants were positive about the process and indicated that it met the methodological criteria and sociotechnical values. However, expert ratings suggested that the CWA-DT achieved only limited success in producing RLX designs that fully aligned with the sociotechnical approach. Discussion about the appropriateness of the sociotechnical approach in a public safety context is provided. Practitioner Summary: Human factors and ergonomics practitioners need evidence of the effectiveness of methods. A design toolkit for cognitive work analysis, incorporating values and principles from sociotechnical systems theory, was applied to create innovative designs for rail level crossings. Evaluation results based on the application are provided and discussed.

  11. Risk Informed Design as Part of the Systems Engineering Process

    NASA Technical Reports Server (NTRS)

    Deckert, George

    2010-01-01

    This slide presentation reviews the importance of Risk Informed Design (RID) as an important feature of the systems engineering process. RID is based on the principle that risk is a design commodity such as mass, volume, cost or power. It also reviews Probabilistic Risk Assessment (PRA) as it is used in the product life cycle in the development of NASA's Constellation Program.

  12. Maximizing the Impact of Program Evaluation: A Discrepancy-Based Process for Educational Program Evaluation.

    ERIC Educational Resources Information Center

    Cantor, Jeffrey A.

    This paper describes a formative/summative process for educational program evaluation, which is appropriate for higher education programs and is based on M. Provus' Discrepancy Evaluation Model and the principles of instructional design. The Discrepancy Based Methodology for Educational Program Evaluation facilitates systematic and detailed…

  13. Structural design of composite rotor blades with consideration of manufacturability, durability, and manufacturing uncertainties

    NASA Astrophysics Data System (ADS)

    Li, Leihong

    A modular structural design methodology for composite blades is developed. This design method can be used to design composite rotor blades with sophisticated geometric cross-sections. The method hierarchically decomposes the highly coupled interdisciplinary rotor analysis into global and local levels. At the global level, aeroelastic response analysis and rotor trim are conducted based on multi-body dynamic models. At the local level, variational asymptotic beam sectional analysis methods are used for the equivalent one-dimensional beam properties. Compared with the traditional design methodology, the proposed method is more efficient and accurate. The proposed method is then used to study three design problems that have not been investigated before. The first is adding manufacturing constraints to the design optimization. The introduction of manufacturing constraints complicates the optimization process; however, a design with manufacturing constraints benefits the manufacturing process and reduces the risk of violating major performance constraints. Next, a new design procedure for structural design against fatigue failure is proposed. This procedure combines fatigue analysis with the optimization process. The durability or fatigue analysis employs a strength-based model, and the design is subject to stiffness, frequency, and durability constraints. Finally, the impacts of manufacturing uncertainty on rotor blade aeroelastic behavior are investigated, and a probabilistic design method is proposed to control the impacts of uncertainty on blade structural performance. The uncertainty factors include dimensions, shapes, material properties, and service loads.

  14. [Research and realization of signal processing algorithms based on FPGA in digital ophthalmic ultrasonography imaging].

    PubMed

    Fang, Simin; Zhou, Sheng; Wang, Xiaochun; Ye, Qingsheng; Tian, Ling; Ji, Jianjun; Wang, Yanqun

    2015-01-01

    To design and improve signal processing algorithms for ophthalmic ultrasonography based on FPGA, three signal processing modules were implemented using the Verilog HDL hardware description language in Quartus II: a fully parallel distributed dynamic filter, digital quadrature demodulation, and logarithmic compression. Compared with the original system, the hardware cost is reduced, the image is clearer and contains more information about the deep eyeball, and the depth of detection increases from 5 cm to 6 cm. The new algorithms meet the design requirements and optimize the system, effectively improving the image quality of the existing equipment.
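
    The quadrature demodulation and logarithmic compression stages can be prototyped in software before committing to Verilog; the sketch below uses SciPy on a synthetic RF line, with the carrier frequency, sampling rate and filter length chosen arbitrarily for illustration.

      import numpy as np
      from scipy.signal import firwin, lfilter

      fs, f0 = 40e6, 10e6                 # assumed sampling rate and transducer frequency (Hz)
      t = np.arange(2000) / fs
      rf = np.sin(2 * np.pi * f0 * t) * np.exp(-((t - 25e-6) ** 2) / (2 * (3e-6) ** 2))  # toy echo

      # Quadrature demodulation: mix down to baseband, then low-pass filter I and Q.
      lp = firwin(numtaps=64, cutoff=2e6, fs=fs)
      i = lfilter(lp, 1.0, rf * np.cos(2 * np.pi * f0 * t))
      q = lfilter(lp, 1.0, rf * -np.sin(2 * np.pi * f0 * t))
      envelope = np.sqrt(i**2 + q**2)

      # Logarithmic compression to fit the display dynamic range.
      log_env = 20 * np.log10(envelope / envelope.max() + 1e-6)
      print(log_env.min(), log_env.max())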

  15. C-A1-03: Considerations in the Design and Use of an Oracle-based Virtual Data Warehouse

    PubMed Central

    Bredfeldt, Christine; McFarland, Lela

    2011-01-01

    Background/Aims The amount of clinical data available for research is growing exponentially. As it grows, increasing the efficiency of both data storage and data access becomes critical. Relational database management systems (rDBMS) such as Oracle are ideal solutions for managing longitudinal clinical data because they support large-scale data storage and highly efficient data retrieval. In addition, they can greatly simplify the management of large data warehouses, including security management and regular data refreshes. However, the HMORN Virtual Data Warehouse (VDW) was originally designed based on SAS datasets, and this design choice has a number of implications for both the design and use of an Oracle-based VDW. From a design standpoint, VDW tables are designed as flat SAS datasets, which do not take full advantage of Oracle indexing capabilities. From a data retrieval standpoint, standard VDW SAS scripts do not take advantage of SAS pass-through SQL capabilities to enable Oracle to perform the processing required to narrow datasets to the population of interest. Methods Beginning in 2009, the research department at Kaiser Permanente in the Mid-Atlantic States (KPMA) has developed an Oracle-based VDW according to the HMORN v3 specifications. In order to take advantage of the strengths of relational databases, KPMA introduced an interface layer to the VDW data, using views to provide access to standardized VDW variables. In addition, KPMA has developed SAS programs that provide access to SQL pass-through processing for first-pass data extraction into SAS VDW datasets for processing by standard VDW scripts. Results We discuss both the design and performance considerations specific to the KPMA Oracle-based VDW. We benchmarked performance of the Oracle-based VDW using both standard VDW scripts and an initial pre-processing layer to evaluate speed and accuracy of data return. Conclusions Adapting the VDW for deployment in an Oracle environment required minor changes to the underlying structure of the data. Further modifications of the underlying data structure would lead to performance enhancements. Maximally efficient data access for standard VDW scripts requires an extra step that involves restricting the data to the population of interest at the data server level prior to standard processing.

  16. Methodology for balancing design and process tradeoffs for deep-subwavelength technologies

    NASA Astrophysics Data System (ADS)

    Graur, Ioana; Wagner, Tina; Ryan, Deborah; Chidambarrao, Dureseti; Kumaraswamy, Anand; Bickford, Jeanne; Styduhar, Mark; Wang, Lee

    2011-04-01

    For process development of deep-subwavelength technologies, it has become accepted practice to use model-based simulation to predict systematic and parametric failures. Increasingly, these techniques are being used by designers to ensure layout manufacturability, as an alternative to, or complement to, restrictive design rules. The benefit of model-based simulation tools in the design environment is that manufacturability problems are addressed in a design-aware way by making appropriate trade-offs, e.g., between overall chip density and manufacturing cost and yield. The paper shows how library elements and the full ASIC design flow benefit from eliminating hot spots and improving design robustness early in the design cycle. It demonstrates a path to yield optimization and first time right designs implemented in leading edge technologies. The approach described herein identifies those areas in the design that could benefit from being fixed early, leading to design updates and avoiding later design churn by careful selection of design sensitivities. This paper shows how to achieve this goal by using simulation tools incorporating various models from sparse to rigorously physical, pattern detection and pattern matching, checking and validating failure thresholds.

  17. Towards Methodologies for Building Knowledge-Based Instructional Systems.

    ERIC Educational Resources Information Center

    Duchastel, Philippe

    1992-01-01

    Examines the processes involved in building instructional systems that are based on artificial intelligence and hypermedia technologies. Traditional instructional systems design methodology is discussed; design issues including system architecture and learning strategies are addressed; and a new methodology for building knowledge-based…

  18. DEVELOPMENT OF A RATIONALLY BASED DESIGN PROTOCOL FOR THE ULTRAVIOLET LIGHT DISINFECTION PROCESS

    EPA Science Inventory

    A protocol is demonstrated for the design and evaluation of ultraviolet (UV) disinfection systems based on a mathematical model. The disinfection model incorporates the system's physical dimensions, the residence time distribution of the reactor and dispersion characteristics, th...

  19. Platform based design of EAP transducers in Danfoss PolyPower A/S

    NASA Astrophysics Data System (ADS)

    Sarban, Rahimullah; Gudlaugsson, Tómas V.

    2013-04-01

    Electroactive polymer (EAP) technology has gained increasing focus in research communities over the last two decades. Research within the field of EAP has so far been mainly focused on material improvements, characterization, modeling and developing demonstrators. As the EAP technology matures, the need for a new area of research, namely product development, emerges. Product development can be based on an isolated design and production for a single product, or on platform design, where a product family is developed. In platform design, families of products exploit the commonality of platform modules while satisfying a variety of different market segments. A platform-based approach has the primary benefits of cost efficiency and short lead time to market when new products emerge. Product development based on EAP technology is challenging both technologically and from a production and processing point of view. Both the technological and processing challenges need to be addressed before a successful implementation of EAP technology into products. Based on this need, Danfoss PolyPower A/S launched, in 2011, an EAP platform project in collaboration with three Danish universities and three commercial organizations. The aim of the project is to develop platform-based designs and a product family for EAP components to be used in a variety of applications. This paper presents the structure of the platform project as a whole and specifically the platform-based designs of EAP transducers. The underlying technologies, essential for EAP transducers, are also presented. Conceptual designs and solutions for the concepts are presented as well.

  20. Optimal design of geodesically stiffened composite cylindrical shells

    NASA Technical Reports Server (NTRS)

    Gendron, G.; Guerdal, Z.

    1992-01-01

    An optimization system based on the finite element code Computational Structural Mechanics (CSM) Testbed and the optimization program Automated Design Synthesis (ADS) is described. The optimization system can be used to obtain minimum-weight designs of composite stiffened structures. Ply thicknesses, ply orientations, and stiffener heights can be used as design variables. Buckling, displacement, and material failure constraints can be imposed on the design. The system is used to conduct a design study of geodesically stiffened shells. For comparison purposes, optimal designs of unstiffened shells and shells stiffened by rings and stringers are also obtained. Trends in the design of geodesically stiffened shells are identified. An approach to include local stress concentrations during the design optimization process is then presented. The method is based on a global/local analysis technique. It employs spline interpolation functions to determine displacements and rotations from a global model, which are used as 'boundary conditions' for the local model. The organization of the strategy in the context of an optimization process is described. The method is validated with an example.

  1. Automated recognition of helium speech. Phase I: Investigation of microprocessor based analysis/synthesis system

    NASA Astrophysics Data System (ADS)

    Jelinek, H. J.

    1986-01-01

    This is the Final Report of Electronic Design Associates on its Phase I SBIR project. The purpose of this project is to develop a method for correcting helium speech, as experienced in diver-surface communication. The goal of the Phase I study was to design, prototype, and evaluate a real time helium speech corrector system based upon digital signal processing techniques. The general approach was to develop hardware (an IBM PC board) to digitize helium speech and software (a LAMBDA computer based simulation) to translate the speech. As planned in the study proposal, this initial prototype may now be used to assess expected performance from a self contained real time system which uses an identical algorithm. The Final Report details the work carried out to produce the prototype system. Four major project tasks were: a signal processing scheme for converting helium speech to normal sounding speech was generated. The signal processing scheme was simulated on a general purpose (LAMDA) computer. Actual helium speech was supplied to the simulation and the converted speech was generated. An IBM-PC based 14 bit data Input/Output board was designed and built. A bibliography of references on speech processing was generated.

  2. Fatigue Behavior of Computer-Aided Design/Computer-Assisted Manufacture Ceramic Abutments as a Function of Design and Ceramics Processing.

    PubMed

    Kelly, J Robert; Rungruanganunt, Patchnee

    2016-01-01

    Zirconia is being widely used, at times apparently by simply copying a metal design into ceramic. Structurally, ceramics are sensitive to both design and processing (fabrication) details. The aim of this work was to examine four computer-aided design/computer-assisted manufacture (CAD/CAM) abutments using a modified International Standards Organization (ISO) implant fatigue protocol to determine performance as a function of design and processing. Two full zirconia and two hybrid (Ti-based) abutments (n = 12 each) were tested wet at 15 Hz at a variety of loads to failure. Failure probability distributions were examined at each load, and when found to be the same, data from all loads were combined for lifetime analysis from accelerated to clinical conditions. Two distinctly different failure modes were found for both full zirconia and Ti-based abutments. One of these for zirconia has been reported clinically in the literature, and one for the Ti-based abutments has been reported anecdotally. The ISO protocol modification in this study forced failures in the abutments; no implant bodies failed. Extrapolated cycles for 10% failure at 70 N were: full zirconia, Atlantis 2 × 10^7 and Straumann 3 × 10^7; and Ti-based, Glidewell 1 × 10^6 and Nobel 1 × 10^21. Under accelerated conditions (200 N), performance differed significantly: Straumann clearly outperformed Astra (t test, P = .013), and the Glidewell Ti-base abutment also outperformed Atlantis zirconia at 200 N (Nobel ran-out; t test, P = .035). The modified ISO protocol in this study produced failures that were seen clinically. The manufacture matters; differences in design and fabrication that influence performance cannot be discerned clinically.

  3. Design optimization for active twist rotor blades

    NASA Astrophysics Data System (ADS)

    Mok, Ji Won

    This dissertation introduces the process of optimizing active twist rotor blades in the presence of embedded anisotropic piezo-composite actuators. Optimum design of active twist blades is a complex task, since it involves a rich design space with tightly coupled design variables. The study presents the development of an optimization framework for active helicopter rotor blade cross-sectional design. This optimization framework allows for exploring a rich and highly nonlinear design space in order to optimize the active twist rotor blades. Different analytical components are combined in the framework: cross-sectional analysis (UM/VABS), an automated mesh generator, a beam solver (DYMORE), a three-dimensional local strain recovery module, and a gradient-based optimizer within MATLAB. Through the mathematical optimization problem, the static twist actuation performance of a blade is maximized while satisfying a series of blade constraints. These constraints are associated with locations of the center of gravity and elastic axis, blade mass per unit span, fundamental rotating blade frequencies, and the blade strength based on local three-dimensional strain fields under worst loading conditions. Through pre-processing, limitations of the proposed process have been studied. When limitations were detected, resolution strategies were proposed. These include mesh overlapping, element distortion, trailing edge tab modeling, electrode modeling and foam implementation of the mesh generator, and the initial-point sensitivity of the current optimization scheme. Examples demonstrate the effectiveness of this process. Optimization studies were performed on the NASA/Army/MIT ATR blade case. Even though that design was built and showed a significant impact on vibration reduction, the proposed optimization process showed that the design could be improved significantly. The second example, based on a model scale of the AH-64D Apache blade, emphasized the capability of this framework to explore the nonlinear design space of a complex planform. Especially for this case, detailed design is carried out to make the actual blade manufacturable. The proposed optimization framework is shown to be an effective tool to design high-authority active twist blades to reduce vibration in future helicopter rotor blades.

  4. Design of penicillin fermentation process simulation system

    NASA Astrophysics Data System (ADS)

    Qi, Xiaoyu; Yuan, Zhonghu; Qi, Xiaoxuan; Zhang, Wenqi

    2011-10-01

    Real-time monitoring of batch processes attracts increasing attention, since it can ensure safety and provide products with consistent quality. The design of a simulation system for batch process fault diagnosis is therefore of great significance. In this paper, penicillin fermentation, a typical non-linear, dynamic, multi-stage batch production process, is taken as the research object. A visual human-machine interactive simulation software system based on the Windows operating system is developed. The simulation system can provide an effective platform for research on batch process fault diagnosis.

  5. Students Matter: Quality Measurements in Online Courses

    ERIC Educational Resources Information Center

    Unal, Zafer; Unal, Aslihan

    2016-01-01

    Quality Matters (QM) is a peer review process designed to certify the quality of online courses and online components. It has generated widespread interest and received national recognition for its peer-based approach to quality assurance and continuous improvement in online education. While the entire QM online course design process is…

  6. Experience-based design: from redesigning the system around the patient to co-designing services with the patient.

    PubMed

    Bate, Paul; Robert, Glenn

    2006-10-01

    Involving patients in service improvement and listening and responding to what they say has played a key part in the redesign of healthcare processes over the past five years and more. Patients and users have attended stakeholder events, participated in discovery interviews, completed surveys, mapped healthcare processes and even designed new hospitals with healthcare staff. However, to date efforts have not necessarily focused on the patient's experience, beyond asking what was good and what was not. Questions were not asked to find out details of what the experience was or should be like ("experience" being different from "attitudes") and the information then systematically used to co-design services with patients. Knowledge of the experience, held only by the patient, is unique and precious. In this paper, attention is drawn to the burgeoning discipline of the design sciences and experience-based design, in which the traditional view of the user as a passive recipient of a product or service has begun to give way to the new view of users as integral to the improvement and innovation process.

  7. Improving tablet coating robustness by selecting critical process parameters from retrospective data.

    PubMed

    Galí, A; García-Montoya, E; Ascaso, M; Pérez-Lozano, P; Ticó, J R; Miñarro, M; Suñé-Negre, J M

    2016-09-01

    Although tablet coating processes are widely used in the pharmaceutical industry, they often lack adequate robustness, and scale-up can be challenging because minor changes in parameters can lead to varying quality results. The aim of this work was to select critical process parameters (CPP) using retrospective data for a commercial product and to establish a design of experiments (DoE) that would improve the robustness of the coating process. A retrospective analysis of data from 36 commercial batches was performed. Batches were selected based on the quality results generated during batch release, some of which revealed quality deviations concerning the appearance of the coated tablets. The product is already marketed and belongs to the portfolio of a multinational pharmaceutical company. The Statgraphics 5.1 software was used for data processing to determine the critical process parameters and to propose new working ranges. This study confirms that it is possible to determine the critical process parameters and create design spaces based on retrospective data from commercial batches; this type of analysis thus becomes a tool for optimizing the robustness of existing processes. Our results show that a design space can be established with minimal investment in experiments, since current commercial batch data are processed statistically.

  8. A self-study of designing and implementing an inquiry-based chemistry course for elementary education majors

    NASA Astrophysics Data System (ADS)

    Larson, Teresa

    2011-12-01

    This self-study examines my experiences with implementing an inquiry-based version of a chemistry course (Chemistry 299) designed for elementary education majors. The inquiry-based curriculum design and teaching strategies that I implement in Chemistry 299 are the focus of this study. Since my previous education and professional experiences were in the physical sciences, I position myself in this study as a scientist who engages in self-study as a form of professional development for the purpose of developing an inquiry-based curriculum and instructional practices. My research provides an inside perspective of the curriculum development process. This process involves implementing the inquiry-oriented ideas and knowledge I acquired in my graduate studies to design the curriculum and influence my teaching practice. My analysis of the curriculum and my instruction is guided by two questions: What are the strengths and weaknesses of the inquiry-based Chemistry 299 curriculum design? What does the process of developing my inquiry-based teaching practice entail and what makes it challenging? Schwab's (1973) The Practical 3: Translation into Curriculum serves as the theoretical framework for this study because of the emphasis Schwab places on combining theoretical and practical knowledge in the curriculum development process and because of the way he characterizes the curriculum. The findings in this study are separated into curriculum and instruction domains. First, the Chemistry 299 curriculum was designed to make the epistemological practices of scientists "accessible" to students by emphasizing epistemic development with respect to their ideas about scientific inquiry and science learning. Using student learning as a gauge for progress, I identify specific design elements that developed transferable inquiry skills as a means to support scientific literacy and pre-service teacher education. Second, the instruction-related findings built upon the insight I gained through my analysis of the curriculum. The data reveal four areas of inner conflict I dealt with throughout the study that related to underlying beliefs I held about science teaching and learning. The implications of the study position the Chemistry 299 curriculum in the field and speak to issues related to developing science courses for elementary education majors and professional development for scientists.

  9. Process Cost Modeling for Multi-Disciplinary Design Optimization

    NASA Technical Reports Server (NTRS)

    Bao, Han P.; Freeman, William (Technical Monitor)

    2002-01-01

    For early design concepts, the conventional approach to cost is normally some kind of parametric weight-based cost model. There is now ample evidence that this approach can be misleading and inaccurate. By the nature of its development, a parametric cost model requires historical data and is valid only if the new design is analogous to those for which the model was derived. Advanced aerospace vehicles have no historical production data and are nowhere near the vehicles of the past. Using an existing weight-based cost model would only lead to errors and distortions of the true production cost. This report outlines the development of a process-based cost model in which the physical elements of the vehicle are costed according to a first-order dynamics model. This theoretical cost model, first advocated by early work at MIT, has been expanded to cover the basic structures of an advanced aerospace vehicle. Elemental costs based on the geometry of the design can be summed up to provide an overall estimation of the total production cost for a design configuration. This capability to directly link any design configuration to realistic cost estimation is a key requirement for high payoff MDO problems. Another important consideration in this report is the handling of part or product complexity. Here the concept of cost modulus is introduced to take into account variability due to different materials, sizes, shapes, precision of fabrication, and equipment requirements. The most important implication of the development of the proposed process-based cost model is that different design configurations can now be quickly related to their cost estimates in a seamless calculation process easily implemented on any spreadsheet tool. In successive sections, the report addresses the issues of cost modeling as follows. First, an introduction is presented to provide the background for the research work. Next, a quick review of cost estimation techniques is made with the intention to highlight their inappropriateness for what is really needed at the conceptual phase of the design process. The First-Order Process Velocity Cost Model (FOPV) is discussed at length in the next section. This is followed by an application of the FOPV cost model to a generic wing. For designs that have no precedence as far as acquisition costs are concerned, cost data derived from the FOPV cost model may not be accurate enough because of new requirements for shape complexity, material, equipment and precision/tolerance. The concept of Cost Modulus is introduced at this point to compensate for these new burdens on the basic processes. This is treated in section 5. The cost of a design must be conveniently linked to its CAD representation. The interfacing of CAD models and spreadsheets containing the cost equations is the subject of the next section, section 6. The last section of the report is a summary of the progress made so far, and the anticipated research work to be achieved in the future.
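
    A purely illustrative sketch of the elemental cost roll-up idea described above; the part list, process velocities, labor rate and cost moduli are invented, and this is not the report's actual First-Order Process Velocity equations.

      # Illustrative elemental cost roll-up: each structural element is costed from a
      # nominal processing velocity, then scaled by a "cost modulus" capturing extra
      # burdens (material, shape complexity, precision).  All numbers are invented.
      elements = [
          # (name, size measure, process velocity in size-units/hour, cost modulus)
          ("wing skin panel", 120.0, 15.0, 1.4),
          ("spar cap",         40.0,  8.0, 1.1),
          ("rib set",          60.0, 10.0, 1.8),
      ]
      LABOR_RATE = 95.0   # $/hour, hypothetical

      def element_cost(size, velocity, modulus, rate=LABOR_RATE):
          """Base cost = processing time * rate, scaled by the element's cost modulus."""
          return (size / velocity) * rate * modulus

      total = sum(element_cost(s, v, m) for _, s, v, m in elements)
      print(f"estimated production cost: ${total:,.0f}")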

  10. Multi-Step Usage of in Vivo Models During Rational Drug Design and Discovery

    PubMed Central

    Williams, Charles H.; Hong, Charles C.

    2011-01-01

    In this article we propose a systematic development method for rational drug design while reviewing paradigms in industry, and emerging techniques and technologies in the field. Although the process of drug development today has been accelerated by the emergence of computational methodologies, it is a herculean challenge requiring exorbitant resources and often fails to yield clinically viable results. The current paradigm of target-based drug design is often misguided and tends to yield compounds that have poor absorption, distribution, metabolism, excretion, and toxicology (ADMET) properties. Therefore, an in vivo organism-based approach allowing for a multidisciplinary inquiry into potent and selective molecules is an excellent place to begin rational drug design. We will review how organisms like the zebrafish and Caenorhabditis elegans can not only be starting points, but can be used at various steps of the drug development process from target identification to pre-clinical trial models. This systems biology-based approach, paired with the power of computational biology, genetics and developmental biology, provides a methodological framework to avoid the pitfalls of traditional target-based drug design. PMID:21731440

  11. From laptop to benchtop to bedside: Structure-based Drug Design on Protein Targets

    PubMed Central

    Chen, Lu; Morrow, John K.; Tran, Hoang T.; Phatak, Sharangdhar S.; Du-Cuny, Lei; Zhang, Shuxing

    2013-01-01

    As an important aspect of computer-aided drug design, structure-based drug design brought a new horizon to pharmaceutical development. This in silico method permeates all aspects of drug discovery today, including lead identification, lead optimization, ADMET prediction and drug repurposing. Structure-based drug design has resulted in fruitful successes in drug discovery targeting protein-ligand and protein-protein interactions. Meanwhile, challenges, notably low accuracy and combinatorial issues, may also cause failures. In this review, state-of-the-art techniques for protein modeling (e.g. structure prediction, modeling protein flexibility, etc.), hit identification/optimization (e.g. molecular docking, focused library design, fragment-based design, molecular dynamics, etc.), and polypharmacology design will be discussed. We will explore how structure-based techniques can facilitate the drug discovery process and interplay with other experimental approaches. PMID:22316152

  12. Teaching Ethics as Design

    ERIC Educational Resources Information Center

    Kirkman, Robert; Fu, Katherine; Lee, Bumsoo

    2017-01-01

    This paper introduces an approach to teaching ethics as design in a new course entitled Design Ethics, team-taught by a philosopher and an engineer/designer. The course follows a problem-based learning model in which groups of students work through the phases of the design process on a project for a local client, considering the design values and…

  13. Harnessing the Potential of Additive Manufacturing

    DTIC Science & Technology

    2016-12-01

    ...manufacturing age, which is dominated by standards for materials, processes and process control. Conventional manufacturing is based upon a design that is ... documented either in a drawing or a computer-aided design (CAD) file. The manufacturing team then develops a documented public or private process for ... (Defense AT&L: November-December 2016; Bill Decker is director of Technology ...)

  14. Process Development for the Design and Manufacturing of Personalizable Mouth Sticks.

    PubMed

    Berger, Veronika M; Pölzer, Stephan; Nussbaum, Gerhard; Ernst, Waltraud; Major, Zoltan

    2017-01-01

    To increase the independence of people with reduced hand/arm functionality, a process to generate personalizable mouth sticks was developed based on the participatory design principle. In a web tool, anybody can choose the geometry and the materials of their mouthpiece, stick and tip. Manufacturing techniques (e.g. 3D printing) and materials used in the process are discussed and evaluated.

  15. The Tool for Designing Engineering Systems Using a New Optimization Method Based on a Stochastic Process

    NASA Astrophysics Data System (ADS)

    Yoshida, Hiroaki; Yamaguchi, Katsuhito; Ishikawa, Yoshio

    Conventional optimization methods are based on a deterministic approach, since their purpose is to find an exact solution. However, these methods depend on the initial conditions and risk falling into local solutions. In this paper, we propose a new optimization method based on the concept of the path integral method used in quantum mechanics. The method obtains a solution as an expected value (stochastic average) using a stochastic process. The advantages of this method are that it is not affected by initial conditions and does not require experience-based tuning techniques. We applied the new optimization method to the design of a hang glider. In this problem, not only the hang glider design but also its flight trajectory were optimized. The numerical calculation results showed that the method has sufficient performance.
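
    A minimal sketch of the central idea of taking the solution as an expected value: candidate designs are sampled around the current estimate, weighted by exp(-cost/T), and averaged. This is an illustrative reading of a stochastic-average optimizer, not the authors' algorithm.

```python
import numpy as np

def stochastic_average_minimize(cost, x0, sigma=1.0, temperature=0.1,
                                n_samples=2000, n_iter=50, seed=0):
    """Estimate a minimizer as a Boltzmann-weighted average of random samples.

    Instead of following a single deterministic trajectory, the "solution"
    is the expected value of sampled candidates weighted by exp(-cost/T),
    which makes the result insensitive to the starting point.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        samples = x + sigma * rng.standard_normal((n_samples, x.size))
        costs = np.apply_along_axis(cost, 1, samples)
        weights = np.exp(-(costs - costs.min()) / temperature)
        weights /= weights.sum()
        x = weights @ samples          # stochastic (weighted) average
        sigma *= 0.95                  # gradually focus the search
    return x

# Toy usage: a multimodal cost surface where gradient methods may get stuck.
cost = lambda v: np.sum(v**2) + 2.0 * np.sum(np.cos(3.0 * v))
print(stochastic_average_minimize(cost, x0=[3.0, -2.5]))
```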

  16. Eight-step method to build the clinical content of an evidence-based care pathway: the case for COPD exacerbation

    PubMed Central

    2012-01-01

    Background Optimization of the clinical care process by integration of evidence-based knowledge is one of the active components in care pathways. When studying the impact of a care pathway by using a cluster-randomized design, standardization of the care pathway intervention is crucial. This methodology paper describes the development of the clinical content of an evidence-based care pathway for in-hospital management of chronic obstructive pulmonary disease (COPD) exacerbation in the context of a cluster-randomized controlled trial (cRCT) on care pathway effectiveness. Methods The clinical content of a care pathway for COPD exacerbation was developed based on recognized process design and guideline development methods. Subsequently, based on the COPD case study, a generalized eight-step method was designed to support the development of the clinical content of an evidence-based care pathway. Results A set of 38 evidence-based key interventions and a set of 24 process and 15 outcome indicators were developed in eight different steps. Nine Belgian multidisciplinary teams piloted both the set of key interventions and indicators. The key intervention set was judged by the teams as being valid and clinically applicable. In addition, the pilot study showed that the indicators were feasible for the involved clinicians and patients. Conclusions The set of 38 key interventions and the set of process and outcome indicators were found to be appropriate for the development and standardization of the clinical content of the COPD care pathway in the context of a cRCT on pathway effectiveness. The developed eight-step method may facilitate multidisciplinary teams caring for other patient populations in designing the clinical content of their future care pathways. PMID:23190552

  17. Bridging CALL & HCI: Input from Participatory Design

    ERIC Educational Resources Information Center

    Cardenas-Claros, Monica S.; Gruba, Paul A.

    2010-01-01

    Participatory design (PD), or the collaboration between software engineers and end users throughout the design process, may help improve CALL design practices. In this case study, four ESL learners, a software designer, and a language teacher created and evaluated a series of paper prototypes concerning help options in computer-based second…

  18. The Process of Designing for Learning: Understanding University Teachers' Design Work

    ERIC Educational Resources Information Center

    Bennett, Sue; Agostinho, Shirley; Lockyer, Lori

    2017-01-01

    Interest in how to support the design work of university teachers has led to research and development initiatives that include technology-based design-support tools, online repositories, and technical specifications. Despite these initiatives, remarkably little is known about the design work that university teachers actually do. This paper…

  19. [Research advances in secondary development of Chinese patent medicines based on quality by design concept].

    PubMed

    Gong, Xing-Chu; Chen, Teng; Qu, Hai-Bin

    2017-03-01

    Quality by design (QbD) concept is an advanced pharmaceutical quality control concept. The application of the QbD concept in the research and development of pharmaceutical processes of traditional Chinese medicines (TCM) mainly contains five parts, including the definition of critical processes and their evaluation criteria, the determination of critical process parameters and critical material attributes, the establishment of quantitative models, the development of design space, as well as the application and continuous improvement of the control strategy. In this work, recent research advances in QbD concept implementation methods in the secondary development of Chinese patent medicines were reviewed, and five promising fields for the implementation of the QbD concept were pointed out, including the research and development of TCM new drugs and Chinese medicine granules for formulation, modeling of pharmaceutical processes, development of control strategy based on industrial big data, strengthening the research of process scale-up (amplification) rules, and the development of new pharmaceutical equipment. Copyright© by the Chinese Pharmaceutical Association.

  20. How do Small Groups Promote Behaviour Change? An Integrative Conceptual Review of Explanatory Mechanisms.

    PubMed

    Borek, Aleksandra J; Abraham, Charles

    2018-03-01

    Small groups are used to promote health, well-being, and personal change by altering members' perceptions, beliefs, expectations, and behaviour patterns. An extensive cross-disciplinary literature has articulated and tested theories explaining how such groups develop, function, and facilitate change. Yet these theoretical understandings are rarely applied in the development, description, and evaluation of health-promotion, group-based, behaviour-change interventions. Medline database, library catalogues, search engines, specific journals and reference lists were searched for relevant texts. Texts were reviewed for explanatory concepts or theories describing change processes in groups, which were integrated into the developing conceptual structure. This was designed to be a parsimonious conceptual framework that could be applied to design and delivery. Five categories of interacting processes and concepts were identified and defined: (1) group development processes, (2) dynamic group processes, (3) social change processes, (4) personal change processes, and (5) group design and operating parameters. Each of these categories encompasses a variety of theorised mechanisms explaining individual change in small groups. The final conceptual model, together with the design issues and practical recommendations derived from it, provides a practical basis for linking research and theory explaining group functioning to optimal design of group-based, behaviour-change interventions. © 2018 The Authors. Applied Psychology: Health and Well-Being published by John Wiley & Sons Ltd on behalf of International Association of Applied Psychology.

  1. Hadoop-based implementation of processing medical diagnostic records for visual patient system

    NASA Astrophysics Data System (ADS)

    Yang, Yuanyuan; Shi, Liehang; Xie, Zhe; Zhang, Jianguo

    2018-03-01

    We introduced the Visual Patient (VP) concept and a method to visually represent and index patient imaging diagnostic records (IDR) at last year's SPIE Medical Imaging conference (SPIE MI 2017); this enables a doctor to review a large amount of a patient's IDR in a limited appointed time slot. In this paper, we present a new approach to designing the data processing architecture of the VP system (VPS) to acquire, process and store various kinds of IDR and build a VP instance for each patient in a hospital environment, based on a Hadoop distributed processing structure. We designed this system architecture, called the Medical Information Processing System (MIPS), as a combination of the Hadoop batch processing architecture and the Storm stream processing architecture. The MIPS implements parallel processing of various kinds of clinical data with high efficiency, drawing from disparate hospital information systems such as PACS, RIS, LIS and HIS.
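
    The combination of a batch layer and a stream layer described here follows the general "lambda architecture" pattern. The sketch below shows that pattern in plain Python for orientation only; the actual MIPS is built on Hadoop and Storm, and the record fields and function names here are hypothetical.

```python
from collections import defaultdict

# Hypothetical record: (patient_id, modality, timestamp)
batch_view = defaultdict(list)      # rebuilt periodically from the full archive
realtime_view = defaultdict(list)   # updated continuously from the live feed

def batch_rebuild(archived_records):
    """Batch layer (Hadoop's role): recompute the complete per-patient index."""
    batch_view.clear()
    for patient_id, modality, ts in archived_records:
        batch_view[patient_id].append((ts, modality))

def on_new_record(record):
    """Speed layer (Storm's role): fold each new record into the realtime view."""
    patient_id, modality, ts = record
    realtime_view[patient_id].append((ts, modality))

def query_patient_timeline(patient_id):
    """Serving layer: merge batch and realtime views for the VP timeline."""
    merged = batch_view[patient_id] + realtime_view[patient_id]
    return sorted(merged)

# Usage with toy data
batch_rebuild([("P001", "CT", 1), ("P001", "MR", 2)])
on_new_record(("P001", "US", 3))
print(query_patient_timeline("P001"))
```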

  2. [Construction of NIRS-based process analytical system for production of salvianolic acid for injection and relative discussion].

    PubMed

    Zhang, Lei; Yue, Hong-Shui; Ju, Ai-Chun; Ye, Zheng-Liang

    2016-10-01

    Currently, near infrared spectroscopy (NIRS) has been considered an efficient tool for achieving process analytical technology (PAT) in the manufacture of traditional Chinese medicine (TCM) products. In this article, the NIRS-based process analytical system for the production of salvianolic acid for injection was introduced. The design of the process analytical system was described in detail, including the selection of monitored processes and testing mode, and potential risks that should be avoided. Moreover, the development of the related technologies was also presented, which contained the establishment of the monitoring methods for the elution of polyamide resin and macroporous resin chromatography processes, as well as the rapid analysis method for finished products. Based on the authors' experience of research and work, several issues in the application of NIRS to process monitoring and control in TCM production were then raised, and some potential solutions were also discussed. The issues include building the technical team for the process analytical system, the design of the process analytical system in the manufacture of TCM products, standardization of the NIRS-based analytical methods, and improving the management of the process analytical system. Finally, the prospect for the application of NIRS in the TCM industry was put forward. Copyright© by the Chinese Pharmaceutical Association.
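
    NIRS monitoring methods of this kind are usually built on multivariate calibration against a reference assay. The abstract does not state the algorithm used; partial least squares (PLS) regression is simply a common choice, shown below on synthetic spectra with hypothetical variable names.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-in for NIR spectra (rows) and a reference assay value,
# e.g. an off-line concentration measurement for the monitored elution step.
rng = np.random.default_rng(1)
n_samples, n_wavelengths = 120, 200
concentration = rng.uniform(0.5, 5.0, n_samples)
pure_spectrum = np.exp(-0.5 * ((np.arange(n_wavelengths) - 80) / 15.0) ** 2)
spectra = np.outer(concentration, pure_spectrum) + 0.02 * rng.standard_normal(
    (n_samples, n_wavelengths))

X_train, X_test, y_train, y_test = train_test_split(
    spectra, concentration, test_size=0.25, random_state=0)

model = PLSRegression(n_components=3)   # latent variables chosen by validation
model.fit(X_train, y_train)
print("R^2 on held-out spectra:", model.score(X_test, y_test))
```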

  3. Case study: Lockheed-Georgia Company integrated design process

    NASA Technical Reports Server (NTRS)

    Waldrop, C. T.

    1980-01-01

    A case study of the development of an Integrated Design Process is presented. The approach taken in preparing for the development of an integrated design process includes some of the IPAD approaches such as developing a Design Process Model, cataloging Technical Program Elements (TPE's), and examining data characteristics and interfaces between contiguous TPE's. The implementation plan is based on an incremental development of capabilities over a period of time with each step directed toward, and consistent with, the final architecture of a total integrated system. Because of time schedules and different computer hardware, this system will not be the same as the final IPAD release; however, many IPAD concepts will no doubt prove applicable as the best approach. Full advantage will be taken of the IPAD development experience. A scenario that could be typical for many companies, even outside the aerospace industry, in developing an integrated design process for an IPAD-type environment is represented.

  4. Monitoring Agents for Assisting NASA Engineers with Shuttle Ground Processing

    NASA Technical Reports Server (NTRS)

    Semmel, Glenn S.; Davis, Steven R.; Leucht, Kurt W.; Rowe, Danil A.; Smith, Kevin E.; Boeloeni, Ladislau

    2005-01-01

    The Spaceport Processing Systems Branch at NASA Kennedy Space Center has designed, developed, and deployed a rule-based agent to monitor the Space Shuttle's ground processing telemetry stream. The NASA Engineering Shuttle Telemetry Agent increases situational awareness for system and hardware engineers during ground processing of the Shuttle's subsystems. The agent provides autonomous monitoring of the telemetry stream and automatically alerts system engineers when user defined conditions are satisfied. Efficiency and safety are improved through increased automation. Sandia National Labs' Java Expert System Shell is employed as the agent's rule engine. The shell's predicate logic lends itself well to capturing the heuristics and specifying the engineering rules within this domain. The declarative paradigm of the rule-based agent yields a highly modular and scalable design spanning multiple subsystems of the Shuttle. Several hundred monitoring rules have been written thus far with corresponding notifications sent to Shuttle engineers. This chapter discusses the rule-based telemetry agent used for Space Shuttle ground processing. We present the problem domain along with design and development considerations such as information modeling, knowledge capture, and the deployment of the product. We also present ongoing work with other condition monitoring agents.
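
    The agent itself is written in Java on Sandia's Jess rule engine; purely as an illustration of the shape of rule-based telemetry monitoring (condition predicates over a telemetry snapshot that trigger notifications), the sketch below uses hypothetical measurement names and thresholds.

```python
# Minimal rule-based monitor: each rule is a predicate over the latest
# telemetry snapshot plus a message sent when the predicate is satisfied.
rules = [
    (lambda t: t["hyd_pressure_psi"] < 2800, "Hydraulic pressure below limit"),
    (lambda t: t["fuel_cell_temp_f"] > 250, "Fuel cell temperature high"),
    (lambda t: t["gse_power_on"] and not t["cooling_active"],
     "GSE powered without active cooling"),
]

def evaluate(telemetry, notify=print):
    """Fire every rule whose condition holds for this telemetry snapshot."""
    for condition, message in rules:
        if condition(telemetry):
            notify(f"ALERT: {message} -- snapshot={telemetry}")

# Usage with a fabricated snapshot
evaluate({"hyd_pressure_psi": 2650, "fuel_cell_temp_f": 180,
          "gse_power_on": True, "cooling_active": False})
```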

  5. Facilitating the Learning Process in Design-Based Learning Practices: An Investigation of Teachers' Actions in Supervising Students

    ERIC Educational Resources Information Center

    Gómez Puente, S. M.; van Eijck, M.; Jochems, W.

    2013-01-01

    Background: In research on design-based learning (DBL), inadequate attention is paid to the role the teacher plays in supervising students in gathering and applying knowledge to design artifacts, systems, and innovative solutions in higher education. Purpose: In this study, we examine whether teacher actions we previously identified in the DBL…

  6. The Interface Design and the Usability Testing of a Fossilization Web-Based Learning Environment

    ERIC Educational Resources Information Center

    Wang, Shiang-Kwei; Yang, Chiachi

    2005-01-01

    This article describes practical issues related to the design and the development of a Web-Based Learning Environment (Web-LE) for high school students. The purpose of the Fossilization Web-LE was to help students understand the process of fossilization, which is a complex phenomenon and is affected by many factors. The instructional design team…

  7. Design, Participation, and Social Change: What Design in Grassroots Spaces Can Teach Learning Scientists

    ERIC Educational Resources Information Center

    Zavala, Miguel

    2016-01-01

    While a science of design (and theory of learning) is certainly useful in design-based research, a participatory design research framework presents an opening for learning scientists to rethink design and learning as processes. Grounded in the autoethnographic investigation of a grassroots organization's design of a local campaign, the author…

  8. The wave-based substructuring approach for the efficient description of interface dynamics in substructuring

    NASA Astrophysics Data System (ADS)

    Donders, S.; Pluymers, B.; Ragnarsson, P.; Hadjit, R.; Desmet, W.

    2010-04-01

    In the vehicle design process, design decisions are more and more based on virtual prototypes. Due to competitive and regulatory pressure, vehicle manufacturers are forced to improve product quality, to reduce time-to-market and to launch an increasing number of design variants on the global market. To speed up the design iteration process, substructuring and component mode synthesis (CMS) methods are commonly used, involving the analysis of substructure models and the synthesis of the substructure analysis results. Substructuring and CMS enable efficient decentralized collaboration across departments and allow engineers to benefit from the availability of parallel computing environments. However, traditional CMS methods become prohibitively inefficient when substructures are coupled along large interfaces, i.e. with a large number of degrees of freedom (DOFs) at the interface between substructures. The reason is that the analysis of substructures involves the calculation of a number of enrichment vectors, one for each interface DOF. Since large interfaces are common in vehicles (e.g. the continuous line connections between the body and the windshield, roof or floor), this interface bottleneck poses a clear limitation in the vehicle noise, vibration and harshness (NVH) design process. Therefore there is a need to describe the interface dynamics more efficiently. This paper presents a wave-based substructuring (WBS) approach, which reduces the interface representation between substructures in an assembly by expressing the interface DOFs in terms of a limited set of basis functions ("waves"). As the number of basis functions can be much lower than the number of interface DOFs, this greatly facilitates the substructure analysis procedure and results in faster design predictions. The waves are calculated once from a full nominal assembly analysis, but these nominal waves can be re-used for the assembly of modified components. The WBS approach thus enables efficient structural modification predictions of the global modes, so that efficient vibro-acoustic design modification, optimization and robust design become possible. The results show that wave-based substructuring offers a clear benefit for vehicle design modifications, by improving both the speed of component reduction processes and the efficiency and accuracy of design iteration predictions, as compared to conventional substructuring approaches.
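
    The core idea of projecting interface DOFs onto a small basis can be sketched as follows. Here the basis is simply taken from an SVD of nominal interface response snapshots; the actual WBS wave computation from the nominal assembly analysis is more involved, so treat this as a conceptual stand-in with made-up sizes.

```python
import numpy as np

def wave_basis(interface_snapshots, n_waves):
    """Build a reduced basis for the interface DOFs from nominal-assembly
    response snapshots (columns = interface displacement vectors)."""
    U, s, _ = np.linalg.svd(interface_snapshots, full_matrices=False)
    return U[:, :n_waves]                      # n_dof x n_waves

def reduce_interface(K_ii, W):
    """Project an interface operator (e.g. a stiffness partition) onto the
    wave basis: size drops from n_dof x n_dof to n_waves x n_waves."""
    return W.T @ K_ii @ W

# Toy numbers: 600 interface DOFs reduced to 12 generalized wave amplitudes.
rng = np.random.default_rng(0)
snapshots = rng.standard_normal((600, 40))     # 40 nominal response vectors
W = wave_basis(snapshots, n_waves=12)
K_ii = np.eye(600) * 1e6                       # placeholder interface stiffness
print(reduce_interface(K_ii, W).shape)         # -> (12, 12)
```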

  9. Rethinking Race and Power in Design-Based Research: Reflections from the Field

    ERIC Educational Resources Information Center

    Vakil, Sepehr; McKinney de Royston, Maxine; Suad Nasir, Na'ilah; Kirshner, Ben

    2016-01-01

    Participatory design-based research continues to expand and challenge the "researcher" and "researched" paradigm by incorporating teachers, administrators, community members, and youth throughout the research process. Yet, greater clarity is needed about the racial and political dimensions of these collaborative research…

  10. Automated design synthesis of robotic/human workcells for improved manufacturing system design in hazardous environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Joshua M.

    Manufacturing tasks that are deemed too hazardous for workers require the use of automation, robotics, and/or other remote handling tools. The associated hazards may be radiological or nonradiological, and based on the characteristics of the environment and processing, a design may necessitate robotic labor, human labor, or both. There are also other factors, such as cost, ergonomics, maintenance, and efficiency, that affect task allocation and other design choices. Handling the tradeoffs of these factors can be complex, and lack of experience can be an issue when trying to determine if and what feasible automation/robotics options exist. To address this problem, we utilize common engineering design approaches adapted more for manufacturing system design in hazardous environments. We limit our scope to the conceptual and embodiment design stages, specifically a computational algorithm for concept generation and early design evaluation. In regard to concept generation, we first develop the functional model or function structure for the process, using the common 'verb-noun' format for describing function. A common language or functional basis for manufacturing was developed and utilized to formalize function descriptions and guide rules for function decomposition. Potential components for embodiment are also grouped in terms of this functional language and are stored in a database. The properties of each component are given as quantitative and qualitative criteria. Operators are also rated for task-relevant criteria which are used to address task compatibility. Through the gathering of process requirements/constraints, construction of the component database, and development of the manufacturing basis and rule set, design knowledge is stored and available for computer use. Thus, once the higher level process functions are defined, the computer can automate the synthesis of new design concepts through alternating steps of embodiment and function structure updates/decomposition. In the process, criteria guide function allocation of components/operators and help ensure compatibility and feasibility. Through multiple function assignment options and varied function structures, multiple design concepts are created. All of the generated designs are then evaluated based on a number of relevant evaluation criteria: cost, dose, ergonomics, hazards, efficiency, etc. These criteria are computed using physical properties/parameters of each system based on the qualities an engineer would use to make evaluations. Nuclear processes such as oxide conversion and electrorefining are utilized to aid algorithm development and provide test cases for the completed program. Through our approach, we capture design knowledge related to manufacturing and other operations in hazardous environments to enable a computational program to automatically generate and evaluate system design concepts.
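
    A much-simplified sketch of the generate-and-evaluate loop described above: a component/operator database keyed by verb-noun functions, concept enumeration by function allocation, and rejection of concepts that violate a hazard constraint. The functions, components and weights are hypothetical, not the program's actual rule set.

```python
from itertools import product

# Hypothetical component database: each entry lists the function it can
# fulfil (verb-noun) plus criteria used for allocation and evaluation.
components = {
    "transport oxide": [
        {"name": "robot_arm", "cost": 120, "dose_mrem": 0, "ergonomics": 5},
        {"name": "operator_with_tongs", "cost": 20, "dose_mrem": 40, "ergonomics": 2},
    ],
    "weigh material": [
        {"name": "automated_scale", "cost": 60, "dose_mrem": 0, "ergonomics": 5},
        {"name": "manual_scale", "cost": 10, "dose_mrem": 15, "ergonomics": 3},
    ],
}

def generate_concepts(function_structure):
    """Enumerate design concepts: one component/operator per function."""
    options = [components[f] for f in function_structure]
    return [dict(zip(function_structure, choice)) for choice in product(*options)]

def evaluate(concept, dose_limit=30):
    """Score a concept; infeasible concepts (dose over the limit) are rejected."""
    dose = sum(c["dose_mrem"] for c in concept.values())
    if dose > dose_limit:
        return None
    cost = sum(c["cost"] for c in concept.values())
    ergo = min(c["ergonomics"] for c in concept.values())
    return cost - 10 * ergo            # lower is better (toy weighting)

concepts = generate_concepts(["transport oxide", "weigh material"])
feasible = [(evaluate(c), c) for c in concepts if evaluate(c) is not None]
print(min(feasible, key=lambda pair: pair[0]))
```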

  11. Structure-Based Virtual Screening for Drug Discovery: Principles, Applications and Recent Advances

    PubMed Central

    Lionta, Evanthia; Spyrou, George; Vassilatis, Demetrios K.; Cournia, Zoe

    2014-01-01

    Structure-based drug discovery (SBDD) is becoming an essential tool in assisting fast and cost-efficient lead discovery and optimization. The application of rational, structure-based drug design has proven to be more efficient than the traditional way of drug discovery, since it aims to understand the molecular basis of a disease and utilizes the knowledge of the three-dimensional structure of the biological target in the process. In this review, we focus on the principles and applications of Virtual Screening (VS) within the context of SBDD and examine different procedures ranging from the initial stages of the process, which include receptor and library pre-processing, to docking, scoring and post-processing of top-scoring hits. Recent improvements in structure-based virtual screening (SBVS) efficiency through ensemble docking, induced fit and consensus docking are also discussed. The review highlights advances in the field within the framework of several success stories that have led to nM inhibition directly from VS and provides recent trends in library design as well as discusses limitations of the method. Applications of SBVS in the design of substrates for engineered proteins that enable the discovery of new metabolic and signal transduction pathways and the design of inhibitors of multifunctional proteins are also reviewed. Finally, we contribute two promising VS protocols recently developed by us that aim to increase inhibitor selectivity. In the first protocol, we describe the discovery of micromolar inhibitors through SBVS designed to inhibit the mutant H1047R PI3Kα kinase. Second, we discuss a strategy for the identification of selective binders for the RXRα nuclear receptor. In this protocol, a set of target structures is constructed for ensemble docking based on binding site shape characterization and clustering, aiming to enhance the hit rate of selective inhibitors for the desired protein target through the SBVS process. PMID:25262799

  12. What can paper-based clinical information systems tell us about the design of computerized clinical information systems (CIS) in the ICU?

    PubMed

    Miller, A; Pilcher, D; Mercaldo, N; Leong, T; Scheinkestel, C; Schildcrout, J

    2010-08-01

    Screen designs in computerized clinical information systems (CIS) have been modeled on their paper predecessors. However, limited understanding about how paper forms support clinical work means that we risk repeating old mistakes and creating new opportunities for error and inefficiency as illustrated by problems associated with computerized provider order entry systems. This study was designed to elucidate principles underlying a successful ICU paper-based CIS. The research was guided by two exploratory hypotheses: (1) paper-based artefacts (charts, notes, equipment, order forms) are used differently by nurses, doctors and other healthcare professionals in different (formal and informal) conversation contexts and (2) different artefacts support different decision processes that are distributed across role-based conversations. All conversations undertaken at the bedsides of five patients were recorded with any supporting artefacts for five days per patient. Data was coded according to conversational role-holders, clinical decision process, conversational context and artefacts. 2133 data points were analyzed using Poisson logistic regression analyses. Results show significant interactions between artefacts used during different professional conversations in different contexts (χ²(df = 16) = 55.8, p < 0.0001). The interaction between artefacts used during different professional conversations for different clinical decision processes was not statistically significant although all two-way interactions were statistically significant. Paper-based CIS have evolved to support complex interdisciplinary decision processes. The translation of two design principles - support interdisciplinary perspectives and integrate decision processes - from paper to computerized CIS may minimize the risks associated with computerization. 2010 Australian College of Critical Care Nurses Ltd. Published by Elsevier Ltd. All rights reserved.

  13. Towards automatic planning for manufacturing generative processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    CALTON,TERRI L.

    2000-05-24

    Generative process planning describes methods process engineers use to modify manufacturing/process plans after designs are complete. A completed design may be the result from the introduction of a new product based on an old design, an assembly upgrade, or modified product designs used for a family of similar products. An engineer designs an assembly and then creates plans capturing manufacturing processes, including assembly sequences, component joining methods, part costs, labor costs, etc. When new products originate as a result of an upgrade, component geometry may change, and/or additional components and subassemblies may be added to or are omitted from the original design. As a result process engineers are forced to create new plans. This is further complicated by the fact that the process engineer is forced to manually generate these plans for each product upgrade. To generate new assembly plans for product upgrades, engineers must manually re-specify the manufacturing plan selection criteria and re-run the planners. To remedy this problem, special-purpose assembly planning algorithms have been developed to automatically recognize design modifications and automatically apply previously defined manufacturing plan selection criteria and constraints.

  14. Aerospace Engineering Systems

    NASA Technical Reports Server (NTRS)

    VanDalsem, William R.; Livingston, Mary E.; Melton, John E.; Torres, Francisco J.; Stremel, Paul M.

    1999-01-01

    Continuous improvement of aerospace product development processes is a driving requirement across much of the aerospace community. As up to 90% of the cost of an aerospace product is committed during the first 10% of the development cycle, there is a strong emphasis on capturing, creating, and communicating better information (both requirements and performance) early in the product development process. The community has responded by pursuing the development of computer-based systems designed to enhance the decision-making capabilities of product development individuals and teams. Recently, the historical foci on sharing the geometrical representation and on configuration management are being augmented: Physics-based analysis tools for filling the design space database; Distributed computational resources to reduce response time and cost; Web-based technologies to relieve machine-dependence; and Artificial intelligence technologies to accelerate processes and reduce process variability. Activities such as the Advanced Design Technologies Testbed (ADTT) project at NASA Ames Research Center study the strengths and weaknesses of the technologies supporting each of these trends, as well as the overall impact of the combination of these trends on a product development event. Lessons learned and recommendations for future activities will be reported.

  15. Mitigation of Engine Inlet Distortion Through Adjoint-Based Design

    NASA Technical Reports Server (NTRS)

    Ordaz, Irian; Rallabhandi, Sriram; Nielsen, Eric J.; Diskin, Boris

    2017-01-01

    The adjoint-based design capability in FUN3D is extended to allow efficient gradient-based optimization and design of concepts with highly integrated aero-propulsive systems. A circumferential distortion calculation, along with the derivatives needed to perform adjoint-based design, has been implemented in FUN3D. This newly implemented distortion calculation can be used not only for design but also to drive the existing mesh adaptation process and reduce the error associated with the fan distortion calculation. The design capability is demonstrated by the shape optimization of an in-house aircraft concept equipped with an aft fuselage propulsor. The optimization objective is the minimization of flow distortion at the aerodynamic interface plane of this aft fuselage propulsor.
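
    For readers unfamiliar with why adjoints make such gradients cheap, the sketch below shows the mechanics on a small linear model problem (a generic discrete-adjoint illustration, not FUN3D's implementation): with residual R(u, p) = A u - b(p) = 0 and objective J(u) = c·u, a single adjoint solve yields dJ/dp for any number of design parameters.

```python
import numpy as np

# Model problem: A u = b(p) with b(p) = B p, objective J(u) = c . u.
rng = np.random.default_rng(0)
n, m = 5, 3                        # state size, number of design parameters
A = np.eye(n) + 0.1 * rng.standard_normal((n, n))
B = rng.standard_normal((n, m))    # db/dp = B
c = rng.standard_normal(n)
p = rng.standard_normal(m)

u = np.linalg.solve(A, B @ p)          # primal ("flow") solve
lam = np.linalg.solve(A.T, c)          # single adjoint solve: A^T lambda = dJ/du
grad_adjoint = B.T @ lam               # dJ/dp = (db/dp)^T lambda

# Finite-difference check of one gradient component
eps = 1e-6
dp = np.zeros(m); dp[0] = eps
J = lambda pp: c @ np.linalg.solve(A, B @ pp)
print(grad_adjoint[0], (J(p + dp) - J(p)) / eps)
```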

  16. Design and fabrication of conventional and unconventional superconductors

    NASA Technical Reports Server (NTRS)

    Collings, E. W.

    1983-01-01

    The design and fabrication of conventionally and unconventionally processed Ti-Nb-base and A15-compound-base composite superconductors, respectively, are discussed in a nine-section review. The first two sections introduce the general properties of alloy and compound superconductors, and the design and processing requirements for the production of long lengths of stable, low-loss conductor. All aspects of flux-jump stability, and the general requirements of cryogenic stabilization, are addressed. Conductor design is then considered from an a.c.-loss standpoint; some basic formulae describing hysteretic and eddy current losses and the influences on a.c. loss of filament diameter, strand (conductor) diameter, twist pitch, and matrix resistivity are discussed. The basic techniques used in the fabrication of conventional multifilamentary conductors are described.

  17. Enhanced teaching and student learning through a simulator-based course in chemical unit operations design

    NASA Astrophysics Data System (ADS)

    Ghasem, Nayef

    2016-07-01

    This paper illustrates a teaching technique used in computer applications in chemical engineering employed for designing various unit operation processes, where the students learn about unit operations by designing them. The aim of the course is not to teach design, but rather to teach the fundamentals and the function of unit operation processes through simulators. A case study presenting the teaching method was evaluated using student surveys and faculty assessments, which were designed to measure the quality and effectiveness of the teaching method. The results of the questionnaire conclusively demonstrate that this method is an extremely efficient way of teaching a simulator-based course. In addition to that, this teaching method can easily be generalised and used in other courses. A student's final mark is determined by a combination of in-class assessments conducted based on cooperative and peer learning, progress tests and a final exam. Results revealed that peer learning can improve the overall quality of student learning and enhance student understanding.

  18. Empowering biomedical engineering undergraduates to help teach design.

    PubMed

    Allen, Robert H; Tam, William; Shoukas, Artin A

    2004-01-01

    We report on our experience empowering upperclassmen and seniors to help teach design courses in biomedical engineering. Initiated in the fall of 1998, these courses are a projects-based set, where teams of students from freshmen level to senior level converge to solve practical problems in biomedical engineering. One goal in these courses is to teach the design process by providing experiences that mimic it. Student teams solve practical projects solicited from faculty, industry and the local community. To hone skills and have a metric for grading, written documentation, posters and oral presentations are required over the two-semester sequence. By requiring a mock design and build exercise in the fall, students appreciate the manufacturing process, the difficulties unforeseen in the design stage and the importance of testing. A Web-based, searchable design repository captures reporting information from each project since its inception. This serves as a resource for future projects, in addition to traditional ones such as library, outside experts and lab facilities. Based on results to date, we conclude that characteristics about our design program help students experience design and learn aspects about teamwork and mentoring useful in their profession or graduate education.

  19. A rule based computer aided design system

    NASA Technical Reports Server (NTRS)

    Premack, T.

    1986-01-01

    A Computer Aided Design (CAD) system is presented which supports the iterative process of design, the dimensional continuity between mating parts, and the hierarchical structure of the parts in their assembled configuration. Prolog, an interactive logic programming language, is used to represent and interpret the database. The solid geometry representing the parts is defined in parameterized form using the swept volume method. The system is demonstrated with the design of a spring piston.

  20. Program Helps Decompose Complicated Design Problems

    NASA Technical Reports Server (NTRS)

    Rogers, James L., Jr.

    1993-01-01

    Time saved by intelligent decomposition into smaller, interrelated problems. DeMAID is knowledge-based software system for ordering sequence of modules and identifying possible multilevel structure for design problem. Displays modules in N x N matrix format. Requires investment of time to generate and refine list of modules for input, but saves considerable amount of money and time in total design process, particularly for new design problems in which ordering of modules has not been defined. Program also implemented to examine assembly-line process or ordering of tasks and milestones.
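
    DeMAID itself is knowledge-based; as a simplified illustration of ordering modules whose couplings form an N x N dependency (design structure) matrix, the sketch below topologically sorts a small, acyclic set of hypothetical modules. Feedback loops, which DeMAID groups for iteration, are not handled here.

```python
from graphlib import TopologicalSorter   # Python 3.9+

# dependencies[i] = set of modules whose output module i needs as input
dependencies = {
    "aerodynamics": set(),
    "weights":      {"aerodynamics"},
    "propulsion":   {"aerodynamics"},
    "performance":  {"weights", "propulsion"},
    "cost":         {"weights", "performance"},
}

# An execution order with no forward coupling.
print(list(TopologicalSorter(dependencies).static_order()))
```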

  1. Inverse problems in the design, modeling and testing of engineering systems

    NASA Technical Reports Server (NTRS)

    Alifanov, Oleg M.

    1991-01-01

    Formulations, classification, areas of application, and approaches to solving different inverse problems are considered for the design of structures, modeling, and experimental data processing. Problems in the practical implementation of theoretical-experimental methods based on solving inverse problems are analyzed in order to identify mathematical models of physical processes, aid in input data preparation for design parameter optimization, help in design parameter optimization itself, and to model experiments, large-scale tests, and real tests of engineering systems.

  2. Life cycle design and design management strategies in fashion apparel manufacturing

    NASA Astrophysics Data System (ADS)

    Tutia, R.; Mendes, FD; Ventura, A.

    2017-10-01

    The generation of solid textile waste in the process of development and clothing production is an error that causes serious damage to the environment and must be minimized. The greatest volume of textile residues is generated by the cutting department, such as textile parings and snips that are not used in the productive process (Milan et al., 2007). One way to conceive new, environmentally conscious products is to adopt a methodology based on Life Cycle Design (LCD) and Design Management.

  3. Design and Optimization of Composite Automotive Hatchback Using Integrated Material-Structure-Process-Performance Method

    NASA Astrophysics Data System (ADS)

    Yang, Xudong; Sun, Lingyu; Zhang, Cheng; Li, Lijun; Dai, Zongmiao; Xiong, Zhenkai

    2018-03-01

    The application of polymer composites as a substitute for metal is an effective approach to reduce vehicle weight. However, the final performance of composite structures is determined not only by the material types, structural designs and manufacturing process, but also by their mutual constraints. Hence, an integrated "material-structure-process-performance" method is proposed for the conceptual and detail design of composite components. The material selection is based on the principles of composite mechanics, such as the rule of mixtures for laminates. The design of component geometry, dimensions and stacking sequence is determined by parametric modeling and size optimization. The selection of process parameters is based on multi-physical field simulation. The stiffness and modal constraint conditions were obtained from the numerical analysis of the metal benchmark under typical load conditions. The optimal design was found by multi-disciplinary optimization. Finally, the proposed method was validated by an application case of an automotive hatchback using carbon fiber reinforced polymer. Compared with the metal benchmark, the weight of the composite hatchback is reduced by 38.8%, while its torsion and bending stiffness increase by 3.75% and 33.23%, respectively, and the first natural frequency increases by 44.78%.
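
    A toy version of the size-optimization step (minimize areal mass subject to a rule-of-mixtures stiffness target) is sketched below with scipy; the ply properties, target value and units are hypothetical, not the paper's data.

```python
import numpy as np
from scipy.optimize import minimize

# Candidate plies: (density, smeared in-plane modulus) -- hypothetical values
rho = np.array([1600.0, 1900.0])      # kg/m^3: carbon/epoxy, glass/epoxy
E   = np.array([  70.0,   25.0])      # GPa
E_target = 5.0                        # required sum(E_i * t_i), GPa*mm (toy)

def areal_mass(t):                    # objective, proportional to mass per area
    return float(rho @ t)

def stiffness_margin(t):              # rule-of-mixtures stiffness constraint
    return float(E @ t - E_target)

result = minimize(
    areal_mass,
    x0=np.array([0.05, 0.05]),                           # ply thicknesses in mm
    bounds=[(0.0, 5.0), (0.0, 5.0)],
    constraints=[{"type": "ineq", "fun": stiffness_margin}],
)
print("thicknesses (mm):", result.x, " areal mass:", areal_mass(result.x))
```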

  4. The Development of Online Tutorial Program Design Using Problem-Based Learning in Open Distance Learning System

    ERIC Educational Resources Information Center

    Said, Asnah; Syarif, Edy

    2016-01-01

    This research aimed to evaluate of online tutorial program design by applying problem-based learning Research Methods currently implemented in the system of Open Distance Learning (ODL). The students must take a Research Methods course to prepare themselves for academic writing projects. Problem-based learning basically emphasizes the process of…

  5. Using innovative instructional technology to meet training needs in public health: a design process.

    PubMed

    Millery, Mari; Hall, Michelle; Eisman, Joanna; Murrman, Marita

    2014-03-01

    Technology and distance learning can potentially enhance the efficient and effective delivery of continuing education to the public health workforce. Public Health Training Centers collaborate with instructional technology designers to develop innovative, competency-based online learning experiences that meet pressing training needs and promote best practices. We describe one Public Health Training Center's online learning module design process, which consists of five steps: (1) identify training needs and priority competencies; (2) define learning objectives and identify educational challenges; (3) pose hypotheses and explore innovative, technology-based solutions; (4) develop and deploy the educational experience; and (5) evaluate feedback and outcomes to inform continued cycles of revision and improvement. Examples illustrate the model's application. These steps are discussed within the context of design practices in the fields of education, engineering, and public health. They incorporate key strategies from across these fields, including principles of programmatic design familiar to public health professionals, such as backward design. The instructional technology design process we describe provides a structure for the creativity, collaboration, and systematic strategies needed to develop online learning products that address critical training needs for the public health workforce.

  6. Innovation and design of a web-based pain education interprofessional resource.

    PubMed

    Lax, Leila; Watt-Watson, Judy; Lui, Michelle; Dubrowski, Adam; McGillion, Michael; Hunter, Judith; Maclennan, Cameron; Knickle, Kerry; Robb, Anja; Lapeyre, Jaime

    2011-01-01

    The present article describes educational innovation processes and design of a web-based pain interprofessional resource for prelicensure health science students in universities across Canada. Operationalization of educational theory in design coupled with formative evaluation of design are discussed, along with strategies that support collaborative innovation. Educational design was driven by content, theory and evaluation. Pain misbeliefs and teaching points along the continuum from acute to persistent pain were identified. Knowledge-building theory, situated learning, reflection and novel designs for cognitive scaffolding were then employed. Design research principles were incorporated to inform iterative and ongoing design. An authentic patient case was constructed, situated in interprofessional complex care to highlight learning objectives related to pre-operative, postoperative and treatment up to one year, for a surgical cancer patient. Pain mechanisms, assessment and management framed content creation. Knowledge building scaffolds were used, which included video simulations, embedded resources, concurrent feedback, practice-based reflective exercises and commentaries. Scaffolds were refined to specifically support knowledge translation. Illustrative commentaries were designed to explicate pain misbeliefs and best practices. Architecture of the resource was mapped; a multimedia, interactive prototype was created. This pain education resource was developed primarily for individual use, with extensions for interprofessional collective discourse. Translation of curricular content scripts into representation maps supported the collaborative design process by establishing a common visual language. The web-based prototype will be formatively and summatively evaluated to assess pedagogic design, knowledge-translation scaffolds, pain knowledge gains, relevance, feasibility and fidelity of this educational innovation.

  7. An Educational Technology Curriculum for Converging Technologies.

    ERIC Educational Resources Information Center

    Allen, Brockenbrough S.; And Others

    1989-01-01

    Outlines curriculum reforms being made in the master's level educational technology program at San Diego State University. Topics discussed include technological changes and the roles of educational product designers; human information processing; knowledge base design; student design of educational adventure games; interactive video design; and…

  8. A Method for Co-Designing Theory-Based Behaviour Change Systems for Health Promotion.

    PubMed

    Janols, Rebecka; Lindgren, Helena

    2017-01-01

    A methodology was defined and developed for designing theory-based behaviour change systems for health promotion that can be tailored to the individual. Theories from two research fields were combined with a participatory action research methodology. Two case studies applying the methodology were conducted. During and between group sessions the participants created material and designs following the behaviour change strategy themes, which were discussed, analysed and transformed into a design of a behaviour change system. Theories in behavioural change and persuasive technology guided the data collection, data analyses, and the design of a behaviour change system. The methodology has strong emphasis on the target group's participation in the design process. The different aspects brought forward related to behaviour change strategies defined in literature on persuasive technology, and the dynamics of these are associated to needs and motivation defined in literature on behaviour change. It was concluded that the methodology aids the integration of theories into a participatory action research design process, and aids the analyses and motivations of design choices.

  9. Design of a high-speed digital processing element for parallel simulation

    NASA Technical Reports Server (NTRS)

    Milner, E. J.; Cwynar, D. S.

    1983-01-01

    A prototype of a custom designed computer to be used as a processing element in a multiprocessor based jet engine simulator is described. The purpose of the custom design was to give the computer the speed and versatility required to simulate a jet engine in real time. Real time simulations are needed for closed loop testing of digital electronic engine controls. The prototype computer has a microcycle time of 133 nanoseconds. This speed was achieved by: prefetching the next instruction while the current one is executing, transporting data using high speed data busses, and using state of the art components such as a very large scale integration (VLSI) multiplier. Included are discussions of processing element requirements, design philosophy, the architecture of the custom designed processing element, the comprehensive instruction set, the diagnostic support software, and the development status of the custom design.

  10. Conceptual design of flapping-wing micro air vehicles.

    PubMed

    Whitney, J P; Wood, R J

    2012-09-01

    Traditional micro air vehicles (MAVs) are miniature versions of full-scale aircraft from which their design principles closely follow. The first step in aircraft design is the development of a conceptual design, where basic specifications and vehicle size are established. Conceptual design methods do not rely on specific knowledge of the propulsion system, vehicle layout and subsystems; these details are addressed later in the design process. Non-traditional MAV designs based on birds or insects are less common and without well-established conceptual design methods. This paper presents a conceptual design process for hovering flapping-wing vehicles. An energy-based accounting of propulsion and aerodynamics is combined with a one degree-of-freedom dynamic flapping model. Important results include simple analytical expressions for flight endurance and range, predictions for maximum feasible wing size and body mass, and critical design space restrictions resulting from finite wing inertia. A new figure-of-merit for wing structural-inertial efficiency is proposed and used to quantify the performance of real and artificial insect wings. The impact of these results on future flapping-wing MAV designs is discussed in detail.
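
    The energy-based accounting described above can be illustrated with a standard momentum-theory hover estimate: ideal hover power is P = W^{3/2} / sqrt(2 rho A), an overall figure of merit converts it to actual power, and endurance follows from the stored energy. This is a generic first-cut calculation, not the paper's model, and all numbers are hypothetical.

```python
import math

# Hypothetical vehicle parameters for a first-cut hover endurance estimate
mass_kg          = 0.005      # 5 g vehicle
wing_disk_area   = 0.0015     # m^2 swept by the flapping wings
rho_air          = 1.225      # kg/m^3
figure_of_merit  = 0.15       # overall propulsive/aerodynamic efficiency (assumed)
battery_energy_J = 20.0       # usable on-board energy (assumed)

weight = mass_kg * 9.81
ideal_hover_power = weight ** 1.5 / math.sqrt(2.0 * rho_air * wing_disk_area)
actual_power = ideal_hover_power / figure_of_merit          # W
endurance_s = battery_energy_J / actual_power

print(f"ideal power {ideal_hover_power*1000:.1f} mW, "
      f"actual {actual_power*1000:.1f} mW, endurance {endurance_s:.0f} s")
```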

  11. Designing for Cost

    NASA Technical Reports Server (NTRS)

    Dean, Edwin B.; Unal, Resit

    1991-01-01

    Designing for cost is a state of mind. Of course, a lot of technical knowledge is required and the use of appropriate tools will improve the process. Unfortunately, the extensive use of weight based cost estimating relationships has generated a perception in the aerospace community that the primary way to reduce cost is to reduce weight. Wrong! Based upon an approximation of an industry accepted formula, the PRICE H (tm) production-production equation, Dean demonstrated theoretically that the optimal trajectory for cost reduction is predominantly in the direction of system complexity reduction, not system weight reduction. Thus the phrase "keep it simple" is a primary state of mind required for reducing cost throughout the design process.

  12. A software development and evolution model based on decision-making

    NASA Technical Reports Server (NTRS)

    Wild, J. Christian; Dong, Jinghuan; Maly, Kurt

    1991-01-01

    Design is a complex activity whose purpose is to construct an artifact which satisfies a set of constraints and requirements. However, the design process is not well understood. The software design and evolution process is the focus of interest, and a three-dimensional software development space organized around a decision-making paradigm is presented. An initial instantiation of this model, called 3DPM_p, which was partly implemented, is presented. Discussion of the use of this model in software reuse and process management is given.

  13. Adaptive design lessons from professional architects

    NASA Astrophysics Data System (ADS)

    Geiger, Ray W.; Snell, J. T.

    1993-09-01

    Psychocybernetic systems engineering design conceptualization is mimicking the evolutionary path of habitable environmental design and the professional practice of building architecture, construction, and facilities management. In pursuing better ways to design cellular automata and qualification classifiers in a design process, we have found surprising success in exploring certain more esoteric approaches, e.g., the vision of interdisciplinary artistic discovery in and around creative problem solving. Our evaluation in research into vision and hybrid sensory systems associated with environmental design and human factors has led us to discover very specific connections between the human spirit and quality design. We would like to share those very qualitative and quantitative parameters of engineering design, particularly as it relates to multi-faceted and future oriented design practice. Discussion covers areas of case- based techniques of cognitive ergonomics, natural modeling sources, and an open architectural process of means/goal satisfaction, qualified by natural repetition, gradation, rhythm, contrast, balance, and integrity of process.

  14. A flexible computer aid for conceptual design based on constraint propagation and component-modeling. [of aircraft in three dimensions

    NASA Technical Reports Server (NTRS)

    Kolb, Mark A.

    1988-01-01

    The Rubber Airplane program, which combines two symbolic processing techniques with a component-based database of design knowledge, is proposed as a computer aid for conceptual design. Using object-oriented programming, programs are organized around the objects and behavior to be simulated, and using constraint propagation, declarative statements designate mathematical relationships among all the equation variables. It is found that the additional level of organizational structure resulting from the arrangement of the design information in terms of design components provides greater flexibility and convenience.
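
    A minimal illustration of constraint propagation in the declarative style described, where a relation among variables fills in whichever value can be deduced; this is a generic sketch, not the Rubber Airplane implementation.

```python
class Cell:
    """A design variable that remembers its value and notifies its constraints."""
    def __init__(self, name):
        self.name, self.value, self.constraints = name, None, []
    def set(self, value):
        if self.value is None:
            self.value = value
            for c in self.constraints:
                c.propagate()

class Product:
    """Declarative relation a = b * c; fills in whichever cell is missing."""
    def __init__(self, a, b, c):
        self.a, self.b, self.c = a, b, c
        for cell in (a, b, c):
            cell.constraints.append(self)
    def propagate(self):
        a, b, c = self.a, self.b, self.c
        if b.value is not None and c.value is not None:
            a.set(b.value * c.value)
        elif a.value is not None and b.value not in (None, 0):
            c.set(a.value / b.value)
        elif a.value is not None and c.value not in (None, 0):
            b.set(a.value / c.value)

# Wing area S = span * mean chord: set any two, the third follows.
S, span, chord = Cell("S"), Cell("span"), Cell("chord")
Product(S, span, chord)
S.set(16.0); span.set(10.0)
print(chord.value)   # -> 1.6
```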

  15. Specification, Design, and Analysis of Advanced HUMS Architectures

    NASA Technical Reports Server (NTRS)

    Mukkamala, Ravi

    2004-01-01

    During the two-year project period, we have worked on several aspects of domain-specific architectures for HUMS. In particular, we looked at using scenario-based approach for the design and designed a language for describing such architectures. The language is now being used in all aspects of our HUMS design. In particular, we have made contributions in the following areas. 1) We have employed scenarios in the development of HUMS in three main areas. They are: (a) To improve reusability by using scenarios as a library indexing tool and as a domain analysis tool; (b) To improve maintainability by recording design rationales from two perspectives - problem domain and solution domain; (c) To evaluate the software architecture. 2) We have defined a new architectural language called HADL or HUMS Architectural Definition Language. It is a customized version of xArch/xADL. It is based on XML and, hence, is easily portable from domain to domain, application to application, and machine to machine. Specifications written in HADL can be easily read and parsed using the currently available XML parsers. Thus, there is no need to develop a plethora of software to support HADL. 3) We have developed an automated design process that involves two main techniques: (a) Selection of solutions from a large space of designs; (b) Synthesis of designs. However, the automation process is not an absolute Artificial Intelligence (AI) approach though it uses a knowledge-based system that epitomizes a specific HUMS domain. The process uses a database of solutions as an aid to solve the problems rather than creating a new design in the literal sense. Since searching is adopted as the main technique, the challenges involved are: (a) To minimize the effort in searching the database where a very large number of possibilities exist; (b) To develop representations that could conveniently allow us to depict design knowledge evolved over many years; (c) To capture the required information that aid the automation process.

  16. Tribology symposium 1995. PD-Volume 72

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Masudi, H.

    After the keynote presentation by Professor Aaron Cohen of Texas A and M University, entitled Processes Used in Design, the program is divided into five major sessions: Research and Development -- Recent research and development of tribological components; Tribology in Manufacturing -- The impact of tribology on modern manufacturing; Design/Design Representation -- Aspects of design related to tribological systems; Tribo-Chemistry/Tribo-Physics -- Discussion of chemical and physical behavior of substances as related to tribology; and Failure Analysis -- An analysis of failure, failure detection, and failure monitoring as related to manufacturing processes. Papers have been processed separately for inclusion in the database.

  17. The CAL: cognitive, apperceptive and representative aspects of fashion design - Side note to neuroaesthetic theory

    NASA Astrophysics Data System (ADS)

    Csanák, Edit

    2017-10-01

    This article deals with the creative and cognitive process of creative work from the perspective of fashion design. It is examined through the Cognitive-Apperceptive-Limn process (The CAL), analysing the stages of design work with reference to prominent literature and discussing theories such as flow, the 'aha' effect, and neuroaesthetic theory. Setting them into a new context, the article offers the fresh approach of a designer rather than a scientific statement based on pragmatic findings, since theories on artistic performance and creativity can never be sufficiently 'empirical' and the process can never be fully understood…

  18. A Cultured Learning Environment: Implementing a Problem- and Service-Based Microbiology Capstone Course to Assess Process- and Skill-Based Learning Objectives

    ERIC Educational Resources Information Center

    Watson, Rachel M.; Willford, John D.; Pfeifer, Mariel A.

    2018-01-01

    In this study, a problem-based capstone course was designed to assess the University of Wyoming Microbiology Program's skill-based and process-based student learning objectives. Students partnered with a local farm, a community garden, and a free downtown clinic in order to conceptualize, propose, perform, and present studies addressing problems…

  19. Reference Model for Project Support Environments Version 1.0

    DTIC Science & Technology

    1993-02-28

    ... relationship with the framework's Process Support services and with the Lifecycle Process Engineering services. ... Design services. Examples: ORCA (Object-based Requirements Capture and Analysis); RETRAC (REquirements TRACeability). 4.3 Life-Cycle Process ... "traditional" computer tools. Operations: examples of audio and video processing operations include: create, modify, and delete sound and video data.

  20. Development of a multisensor-based bio-botanic robot and its implementation using a self-designed embedded board.

    PubMed

    Chang, Chung-Liang; Sie, Ming-Fong; Shie, Jin-Long

    2011-01-01

    This paper presents the design concept of a bio-botanic robot which demonstrates its behavior based on plant growth. It can also reflect the different phases of plant growth depending on the proportional amounts of light, temperature and water. The mechanism design is made up of a processed aluminum base, spring, polydimethylsiloxane (PDMS) and actuator that constitute the plant base and plant body. The control system consists of two micro-controllers and a self-designed embedded development board, where the main controller transmits the values of the environmental sensing module within the embedded board to a sub-controller. The sub-controller determines the growth stage, growth height, and time and transmits its decision value to the main controller. Finally, based on the data transmitted by the sub-controller, the main controller controls the growth phase of the bio-botanic robot using a servo motor and leaf actuator. The result not only helps children understand the stages of plant growth but also has entertainment-educational value, demonstrating the growth process of the bio-botanic robot in a short time.

  1. CMOS based capacitance to digital converter circuit for MEMS sensor

    NASA Astrophysics Data System (ADS)

    Rotake, D. R.; Darji, A. D.

    2018-02-01

    Most MEMS cantilever-based systems require costly instruments for characterization and processing and involve large experimental setups, which leads to non-portable devices. There is therefore a need for a low-cost, highly sensitive, high-speed and portable digital system. The proposed Capacitance to Digital Converter (CDC) interfacing circuit converts capacitance to the digital domain, where it can be easily processed. Recent applications demand microcantilever deflection measurements in the parts-per-trillion range, which change the capacitance in the 1-10 femtofarad (fF) range. The entire CDC circuit is designed using CMOS 250 nm technology. The CDC circuit consists of a D-latch and two oscillators, namely a sensor-controlled oscillator (SCO) and a digitally controlled oscillator (DCO). The D-latch is designed using a transmission-gate-based MUX for power optimization. CDC designs with 7-stage, 9-stage and 11-stage oscillators were tested for 1-18 fF and simulated with parasitics using the Mentor Graphics Eldo tool. Since the proposed design does not use resistive components, the total power dissipation is reduced to 2.3621 mW for the CDC designed using the 9-stage SCO and DCO.
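
    To illustrate the SCO/DCO idea at a behavioral level, the sketch below models the conversion as frequency counting: the sensor capacitance pulls the SCO frequency, and counting SCO edges within a DCO-derived gate window yields a digital code. This is not the transistor-level CMOS design from the paper; all constants (frequencies, gain, gate length) are illustrative assumptions.

```python
# Behavioral sketch of a capacitance-to-digital converter built from two oscillators.
def cdc_code(c_sensor_fF, f_dco_hz=1.0e6, gate_cycles=1000,
             k_hz_per_fF=2.0e4, f0_hz=1.2e6):
    """Digital code proportional to the SCO frequency shift caused by the sensor capacitance."""
    f_sco = f0_hz - k_hz_per_fF * c_sensor_fF   # assumed linear frequency pull by C_sensor
    gate_time = f_dco_hz and gate_cycles / f_dco_hz  # measurement window set by the DCO
    return int(f_sco * gate_time)               # SCO edges counted inside the window

if __name__ == "__main__":
    for c in (1.0, 5.0, 10.0, 18.0):            # femtofarad range quoted in the abstract
        print(f"C = {c:5.1f} fF  ->  code = {cdc_code(c)}")
```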

  2. Nanohole Arrays of Mixed Designs and Microwriting for Simultaneous and Multiple Protein Binding Studies

    PubMed Central

    Ji, Jin; Yang, Jiun-Chan; Larson, Dale N.

    2009-01-01

    We demonstrate using nanohole arrays of mixed designs and a microwriting process based on dip-pen nanolithography to monitor multiple, different protein binding events simultaneously in real time based on the intensity of Extraordinary Optical Transmission of nanohole arrays. The microwriting process and small footprint of the individual nanohole arrays enabled us to observe different binding events located only 16μm apart, achieving high spatial resolution. We also present a novel concept that incorporates nanohole arrays of different designs to improve confidence and accuracy of binding studies. For proof of concept, two types of nanohole arrays, designed to exhibit opposite responses to protein bindings, were fabricated on one transducer. Initial studies indicate that the mixed designs could help to screen out artifacts such as protein intrinsic signals, providing improved accuracy of binding interpretation. PMID:19297143

  3. An ultra low power ECG signal processor design for cardiovascular disease detection.

    PubMed

    Jain, Sanjeev Kumar; Bhaumik, Basabi

    2015-08-01

    This paper presents an ultra low power ASIC design based on a new cardiovascular disease diagnostic algorithm. This new algorithm based on forward search is designed for real time ECG signal processing. The algorithm is evaluated for Physionet PTB database from the point of view of cardiovascular disease diagnosis. The failed detection rate of QRS complex peak detection of our algorithm ranges from 0.07% to 0.26% for multi lead ECG signal. The ASIC is designed using 130-nm CMOS low leakage process technology. The area of ASIC is 1.21 mm(2). This ASIC consumes only 96 nW at an operating frequency of 1 kHz with a supply voltage of 0.9 V. Due to ultra low power consumption, our proposed ASIC design is most suitable for energy efficient wearable ECG monitoring devices.
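
    For readers unfamiliar with QRS detection, the sketch below shows a generic adaptive-threshold R-peak detector operating sample by sample, the kind of per-sample processing such an ASIC performs. It is not the authors' forward-search algorithm, whose details are not given in the abstract; the threshold, refractory period, and synthetic test signal are assumptions.

```python
# Generic R-peak detection sketch (illustrative only, not the paper's algorithm).
import numpy as np

def detect_r_peaks(ecg, fs=1000.0, refractory_s=0.2):
    """Return sample indices of detected R peaks in a filtered ECG trace."""
    threshold = 0.6 * np.max(np.abs(ecg))       # crude global threshold (assumption)
    refractory = int(refractory_s * fs)         # ignore peaks closer than 200 ms
    peaks, last = [], -refractory
    for i in range(1, len(ecg) - 1):
        if (ecg[i] > threshold and ecg[i] >= ecg[i - 1]
                and ecg[i] > ecg[i + 1] and i - last > refractory):
            peaks.append(i)
            last = i
    return peaks

if __name__ == "__main__":
    t = np.arange(0, 5, 1 / 1000.0)
    synthetic = np.sin(2 * np.pi * 1.2 * t) ** 63   # sharp periodic spikes as a stand-in signal
    print(detect_r_peaks(synthetic))
```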

  4. Glass sealing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brow, R.K.; Kovacic, L.; Chambers, R.S.

    1996-04-01

    Hermetic glass sealing technologies developed for weapons component applications can be utilized for the design and manufacture of fuel cells. Design and processing of a seal are optimized through an integrated approach based on glass composition research, finite element analysis, and sealing process definition. Glass sealing procedures are selected to accommodate the limits imposed by glass composition and predictive calculations.

  5. An Academic Development Model for Fostering Innovation and Sharing in Curriculum Design

    ERIC Educational Resources Information Center

    Dempster, Jacqueline A.; Benfield, Greg; Francis, Richard

    2012-01-01

    This paper outlines an academic development process based around a two- or three-day workshop programme called a Course Design Intensive (CDI). The CDI process aims to foster collaboration and peer support in curriculum development and bring about pedagogic innovation and positive experiences for both tutors and learners. Bringing participants…

  6. Idea-Based Learning: A Course Design Process to Promote Conceptual Understanding

    ERIC Educational Resources Information Center

    Hansen, Edmund J.

    2011-01-01

    Synthesizing the best current thinking about learning, course design, and promoting student achievement, this is a guide to developing college instruction that has clear purpose, is well integrated into the curriculum, and improves student learning in predictable and measurable ways. The process involves developing a transparent course blueprint,…

  7. Effects of Thinking Style on Design Strategies: Using Bridge Construction Simulation Programs

    ERIC Educational Resources Information Center

    Sun, Chuen-Tsai; Wang, Dai-Yi; Chang, Yu-Yeh

    2013-01-01

    Computer simulation users can freely control operational factors and simulation results, repeat processes, make changes, and learn from simulation environment feedback. The focus of this paper is on simulation-based design tools and their effects on student learning processes in a group of 101 Taiwanese senior high school students. Participants…

  8. A Process Chart to Design Experiential Learning Projects

    ERIC Educational Resources Information Center

    Zhu, Suning; Wu, Yun; Sankar, Chetan S.

    2016-01-01

    A high-impact practice is to incorporate experiential learning projects when teaching difficulty subject matters so as to enhance students' understanding and interest in the course content. But, there is limited research on how to design and execute such projects. Therefore, we propose a framework based on the processes described by the Project…

  9. Learning Tasks, Peer Interaction, and Cognition Process: An Online Collaborative Design Model

    ERIC Educational Resources Information Center

    Du, Jianxia; Durrington, Vance A.

    2013-01-01

    This paper illustrates a model for Online Group Collaborative Learning. The authors based the foundation of the Online Collaborative Design Model upon Piaget's concepts of assimilation and accommodation, and Vygotsky's theory of social interaction. The four components of online collaborative learning include: individual processes, the task(s)…

  10. Stochastic Analysis and Design of Heterogeneous Microstructural Materials System

    NASA Astrophysics Data System (ADS)

    Xu, Hongyi

    Advanced materials systems are new materials composed of multiple traditional constituents with complex microstructure morphologies, which lead to properties superior to those of conventional materials. To accelerate the development of new advanced materials systems, the objective of this dissertation is to develop a computational design framework and the associated techniques for design automation of microstructural materials systems, with an emphasis on addressing the uncertainties associated with the heterogeneity of microstructural materials. Five key research tasks are identified: design representation, design evaluation, design synthesis, material informatics and uncertainty quantification. Design representation of microstructure includes statistical characterization and stochastic reconstruction. This dissertation develops a new descriptor-based methodology, which characterizes 2D microstructures using descriptors of composition, dispersion and geometry. Statistics of 3D descriptors are predicted based on 2D information to enable 2D-to-3D reconstruction. An efficient sequential reconstruction algorithm is developed to reconstruct statistically equivalent random 3D digital microstructures. In design evaluation, a stochastic decomposition and reassembly strategy is developed to deal with the high computational costs and uncertainties induced by material heterogeneity. The properties of Representative Volume Elements (RVEs) are predicted by stochastically reassembling Statistical Volume Elements (SVEs) with stochastic properties into a coarse representation of the RVE. In design synthesis, a new descriptor-based design framework is developed, which integrates computational methods of microstructure characterization and reconstruction, sensitivity analysis, Design of Experiments (DOE), metamodeling and optimization to enable parametric optimization of the microstructure for achieving the desired material properties. Material informatics is studied to efficiently reduce the dimension of the microstructure design space. This dissertation develops a machine-learning-based methodology to identify the key microstructure descriptors that highly impact the properties of interest. In uncertainty quantification, a comparative study on data-driven random process models is conducted to provide guidance for choosing the most accurate model in statistical uncertainty quantification. Two new goodness-of-fit metrics are developed to provide quantitative measurements of random process models' accuracy. The benefits of the proposed methods are demonstrated by the example of designing the microstructure of polymer nanocomposites. This dissertation provides material-generic, intelligent modeling/design methodologies and techniques to accelerate the process of analyzing and designing new microstructural materials systems.
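
    As a small illustration of descriptor-based characterization of a two-phase 2D microstructure, the sketch below computes composition (volume fraction) plus simple dispersion/geometry descriptors (cluster count, mean cluster area) from a binary image. The random test image stands in for a segmented micrograph; the specific descriptor set is an assumption, not the dissertation's full descriptor library.

```python
# Minimal descriptor-based characterization of a binary 2D microstructure.
import numpy as np
from scipy import ndimage

def microstructure_descriptors(image):
    """image: 2D binary array, 1 = filler phase, 0 = matrix."""
    labels, n_clusters = ndimage.label(image)                       # connected filler clusters
    sizes = ndimage.sum(image, labels, range(1, n_clusters + 1))    # area of each cluster (pixels)
    return {
        "volume_fraction": float(image.mean()),
        "cluster_count": int(n_clusters),
        "mean_cluster_area_px": float(np.mean(sizes)) if n_clusters else 0.0,
    }

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    demo = (rng.random((128, 128)) < 0.2).astype(int)   # synthetic 20% filler microstructure
    print(microstructure_descriptors(demo))
```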

  11. Grief-Processing-Based Psychological Intervention for Children Orphaned by AIDS in Central China: A Pilot Study

    ERIC Educational Resources Information Center

    Lin, Xiuyun; Fang, Xiaoyi; Chi, Peilian; Li, Xiaoming; Chen, Wenrui; Heath, Melissa Allen

    2014-01-01

    A group of 124 children orphaned by AIDS (COA), who resided in two orphanages funded by the Chinese government, participated in a study investigating the efficacy of a grief-processing-based psychological group intervention. This psychological intervention program was designed to specifically help COA process their grief and reduce their…

  12. Coaching Process Based on Transformative Learning Theory for Changing the Instructional Mindset of Elementary School Teachers

    ERIC Educational Resources Information Center

    Kawinkamolroj, Milintra; Triwaranyu, Charinee; Thongthew, Sumlee

    2015-01-01

    This research aimed to develop coaching process based on transformative learning theory for changing the mindset about instruction of elementary school teachers. Tools used in this process include mindset tests and questionnaires designed to assess the instructional mindset of teachers and to allow the teachers to reflect on how they perceive…

  13. Auditory Processing Interventions and Developmental Dyslexia: A Comparison of Phonemic and Rhythmic Approaches

    ERIC Educational Resources Information Center

    Thomson, Jennifer M.; Leong, Victoria; Goswami, Usha

    2013-01-01

    The purpose of this study was to compare the efficacy of two auditory processing interventions for developmental dyslexia, one based on rhythm and one based on phonetic training. Thirty-three children with dyslexia participated and were assigned to one of three groups (a) a novel rhythmic processing intervention designed to highlight auditory…

  14. Developmental engineering: a new paradigm for the design and manufacturing of cell-based products. Part II: from genes to networks: tissue engineering from the viewpoint of systems biology and network science.

    PubMed

    Lenas, Petros; Moos, Malcolm; Luyten, Frank P

    2009-12-01

    The field of tissue engineering is moving toward a new concept of "in vitro biomimetics of in vivo tissue development." In Part I of this series, we proposed a theoretical framework integrating the concepts of developmental biology with those of process design to provide the rules for the design of biomimetic processes. We named this methodology "developmental engineering" to emphasize that it is not the tissue but the process of in vitro tissue development that has to be engineered. To formulate the process design rules in a rigorous way that will allow a computational design, we should refer to mathematical methods to model the biological process taking place in vitro. Tissue functions cannot be attributed to individual molecules but rather to complex interactions between the numerous components of a cell and interactions between cells in a tissue that form a network. For tissue engineering to advance to the level of a technologically driven discipline amenable to well-established principles of process engineering, a scientifically rigorous formulation is needed of the general design rules so that the behavior of networks of genes, proteins, or cells that govern the unfolding of developmental processes could be related to the design parameters. Now that sufficient experimental data exist to construct plausible mathematical models of many biological control circuits, explicit hypotheses can be evaluated using computational approaches to facilitate process design. Recent progress in systems biology has shown that the empirical concepts of developmental biology that we used in Part I to extract the rules of biomimetic process design can be expressed in rigorous mathematical terms. This allows the accurate characterization of manufacturing processes in tissue engineering as well as the properties of the artificial tissues themselves. In addition, network science has recently shown that the behavior of biological networks strongly depends on their topology and has developed the necessary concepts and methods to describe it, allowing therefore a deeper understanding of the behavior of networks during biomimetic processes. These advances thus open the door to a transition for tissue engineering from a substantially empirical endeavor to a technology-based discipline comparable to other branches of engineering.

  15. Generic Tasks for Knowledge-Based Problem Solving: Extension and New Directions

    DTIC Science & Technology

    1991-02-01

    ... [3] D. Brown and B. Chandrasekaran. Design: An information processing level analysis. In Design Problem Solving: Knowledge Structures and ... generic information processing tasks. In Proceedings of the International Joint Conference on Artificial Intelligence. IJCAI, 1987. [18] B. Chandrasekaran. What kind of information processing is intelligence? A perspective on AI paradigms and a proposal. In D. Partridge and Y. Wilks, editors ...

  16. Optimization of critical quality attributes in continuous twin-screw wet granulation via design space validated with pilot scale experimental data.

    PubMed

    Liu, Huolong; Galbraith, S C; Ricart, Brendon; Stanton, Courtney; Smith-Goettler, Brandye; Verdi, Luke; O'Connor, Thomas; Lee, Sau; Yoon, Seongkyu

    2017-06-15

    In this study, the influence of key process variables (screw speed, throughput and liquid-to-solid (L/S) ratio) of a continuous twin screw wet granulation (TSWG) process was investigated using a central composite face-centered (CCF) experimental design method. Regression models were developed to predict the process responses (motor torque, granule residence time), granule properties (size distribution, volume average diameter, yield, relative width, flowability) and tablet properties (tensile strength). The effects of the three key process variables were analyzed via contour and interaction plots. The experimental results demonstrated that all the process responses, granule properties and tablet properties are influenced by changing the screw speed, throughput and L/S ratio. The TSWG process was optimized, based on the developed regression models, to produce granules with a volume average diameter of 150 μm and a yield of 95%. A design space (DS) was built, based on a volume average granule diameter between 90 and 200 μm and a granule yield larger than 75%, with a failure probability analysis using Monte Carlo simulations. Validation experiments confirmed the robustness and accuracy of the DS generated using the CCF experimental design in optimizing a continuous TSWG process. Copyright © 2017 Elsevier B.V. All rights reserved.
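
    The sketch below illustrates the Monte Carlo failure-probability idea behind such a design space: propagate uncertainty in the process settings through regression models for granule diameter and yield, then count how often the quality window (90-200 μm diameter, yield > 75%) is violated. The model coefficients and noise levels are placeholders, not the fitted CCF models from the paper.

```python
# Monte Carlo estimate of failure probability inside an assumed design space.
import numpy as np

def diameter_um(speed_rpm, tput_kgph, ls_ratio):
    return 60 + 0.05 * speed_rpm + 2.0 * tput_kgph + 400 * ls_ratio   # hypothetical regression model

def yield_pct(speed_rpm, tput_kgph, ls_ratio):
    return 95 - 0.01 * speed_rpm - 0.5 * tput_kgph + 30 * ls_ratio    # hypothetical regression model

def failure_probability(speed, tput, ls, n=100_000, seed=1):
    rng = np.random.default_rng(seed)
    # assumed operating noise around the nominal set point
    s = rng.normal(speed, 10.0, n)
    t = rng.normal(tput, 0.5, n)
    r = rng.normal(ls, 0.01, n)
    d, y = diameter_um(s, t, r), yield_pct(s, t, r)
    fail = (d < 90) | (d > 200) | (y < 75)      # quality window from the abstract
    return fail.mean()

if __name__ == "__main__":
    print(f"P(failure) at nominal point: {failure_probability(400, 10, 0.15):.3f}")
```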

  17. Design of an MR image processing module on an FPGA chip.

    PubMed

    Li, Limin; Wyrwicz, Alice M

    2015-06-01

    We describe the design and implementation of an image processing module on a single-chip Field-Programmable Gate Array (FPGA) for real-time image processing. We also demonstrate that through graphical coding the design work can be greatly simplified. The processing module is based on a 2D FFT core. Our design is distinguished from previously reported designs in two respects. No off-chip hardware resources are required, which increases portability of the core. Direct matrix transposition usually required for execution of 2D FFT is completely avoided using our newly-designed address generation unit, which saves considerable on-chip block RAMs and clock cycles. The image processing module was tested by reconstructing multi-slice MR images from both phantom and animal data. The tests on static data show that the processing module is capable of reconstructing 128×128 images at speed of 400 frames/second. The tests on simulated real-time streaming data demonstrate that the module works properly under the timing conditions necessary for MRI experiments. Copyright © 2015 Elsevier Inc. All rights reserved.
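
    A software analogue of the module's core operation is reconstructing an MR image slice from k-space data with a 2D inverse FFT, as sketched below. numpy stands in for the on-chip FFT core, and the square phantom and its synthetic k-space are made up for the round-trip check.

```python
# 2D FFT-based MR slice reconstruction (software stand-in for the FPGA core).
import numpy as np

def reconstruct_slice(kspace):
    """Return the magnitude image from centered k-space data."""
    return np.abs(np.fft.ifft2(np.fft.ifftshift(kspace)))

if __name__ == "__main__":
    # build a synthetic 128x128 "phantom", generate its centered k-space, then reconstruct it
    phantom = np.zeros((128, 128))
    phantom[48:80, 48:80] = 1.0
    kspace = np.fft.fftshift(np.fft.fft2(phantom))
    recon = reconstruct_slice(kspace)
    print(np.allclose(recon, phantom, atol=1e-10))   # True: the round trip recovers the image
```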

  18. Design and Testing of Novel Lethal Ovitrap to Reduce Populations of Aedes Mosquitoes: Community-Based Participatory Research between Industry, Academia and Communities in Peru and Thailand.

    PubMed

    Paz-Soldan, Valerie A; Yukich, Josh; Soonthorndhada, Amara; Giron, Maziel; Apperson, Charles S; Ponnusamy, Loganathan; Schal, Coby; Morrison, Amy C; Keating, Joseph; Wesson, Dawn M

    2016-01-01

    Dengue virus (and Chikungunya and Zika viruses) is transmitted by Aedes aegypti and Aedes albopictus mosquitoes and causes considerable human morbidity and mortality. As there is currently no vaccine or chemoprophylaxis to protect people from dengue virus infection, vector control is the only viable option for disease prevention. The purpose of this paper is to illustrate the design and placement process for an attractive lethal ovitrap to reduce vector populations and to describe lessons learned in the development of the trap. This study was conducted in 2010 in Iquitos, Peru and Lopburi Province, Thailand and used an iterative community-based participatory approach to adjust design specifications of the trap, based on community members' perceptions and feedback, entomological findings in the lab, and design and research team observations. Multiple focus group discussions (FGD) were held over a 6 month period, stratified by age, sex and motherhood status, to inform the design process. Trap testing transitioned from the lab to within households. Through an iterative process of working with specifications from the research team, findings from the laboratory testing, and feedback from FGD, the design team narrowed trap design options from 22 to 6. Comments from the FGD centered on safety for children and pets interacting with traps, durability, maintenance issues, and aesthetics. Testing in the laboratory involved releasing groups of 50 gravid Ae. aegypti in walk-in rooms and assessing what percentage were caught in traps of different colors, with different trap cover sizes, and placed under lighter or darker locations. Two final trap models were mocked up and tested in homes for a week; one model was the top choice in both Iquitos and Lopburi. The community-based participatory process was essential for the development of novel traps that provided effective vector control, but also met the needs and concerns of community members.

  19. Design and Testing of Novel Lethal Ovitrap to Reduce Populations of Aedes Mosquitoes: Community-Based Participatory Research between Industry, Academia and Communities in Peru and Thailand

    PubMed Central

    Yukich, Josh; Soonthorndhada, Amara; Giron, Maziel; Apperson, Charles S.; Ponnusamy, Loganathan; Schal, Coby; Morrison, Amy C.; Keating, Joseph; Wesson, Dawn M.

    2016-01-01

    Background Dengue virus (and Chikungunya and Zika viruses) is transmitted by Aedes aegypti and Aedes albopictus mosquitoes and causes considerable human morbidity and mortality. As there is currently no vaccine or chemoprophylaxis to protect people from dengue virus infection, vector control is the only viable option for disease prevention. The purpose of this paper is to illustrate the design and placement process for an attractive lethal ovitrap to reduce vector populations and to describe lessons learned in the development of the trap. Methods This study was conducted in 2010 in Iquitos, Peru and Lopburi Province, Thailand and used an iterative community-based participatory approach to adjust design specifications of the trap, based on community members’ perceptions and feedback, entomological findings in the lab, and design and research team observations. Multiple focus group discussions (FGD) were held over a 6 month period, stratified by age, sex and motherhood status, to inform the design process. Trap testing transitioned from the lab to within households. Results Through an iterative process of working with specifications from the research team, findings from the laboratory testing, and feedback from FGD, the design team narrowed trap design options from 22 to 6. Comments from the FGD centered on safety for children and pets interacting with traps, durability, maintenance issues, and aesthetics. Testing in the laboratory involved releasing groups of 50 gravid Ae. aegypti in walk-in rooms and assessing what percentage were caught in traps of different colors, with different trap cover sizes, and placed under lighter or darker locations. Two final trap models were mocked up and tested in homes for a week; one model was the top choice in both Iquitos and Lopburi. Discussion The community-based participatory process was essential for the development of novel traps that provided effective vector control, but also met the needs and concerns of community members. PMID:27532497

  20. Usability engineering for augmented reality: employing user-based studies to inform design.

    PubMed

    Gabbard, Joseph L; Swan, J Edward

    2008-01-01

    A major challenge, and thus opportunity, in the field of human-computer interaction and specifically usability engineering is designing effective user interfaces for emerging technologies that have no established design guidelines or interaction metaphors or introduce completely new ways for users to perceive and interact with technology and the world around them. Clearly, augmented reality is one such emerging technology. We propose a usability engineering approach that employs user-based studies to inform design, by iteratively inserting a series of user-based studies into a traditional usability engineering lifecycle to better inform initial user interface designs. We present an exemplar user-based study conducted to gain insight into how users perceive text in outdoor augmented reality settings and to derive implications for design in outdoor augmented reality. We also describe lessons learned from our experiences conducting user-based studies as part of the design process.

  1. Design of a smart ECG garment based on conductive textile electrode and flexible printed circuit board.

    PubMed

    Cai, Zhipeng; Luo, Kan; Liu, Chengyu; Li, Jianqing

    2017-08-09

    A smart electrocardiogram (ECG) garment system was designed for continuous, non-invasive and comfortable ECG monitoring. It mainly consists of four components: a conductive textile electrode, the garment, a flexible printed circuit board (FPCB)-based ECG processing module and an Android application program. The conductive textile electrode and the FPCB-based ECG processing module (6.8 g, 55 mm × 53 mm × 5 mm) are identified as the two key techniques for improving the system's comfort and flexibility. Preliminary experimental results verified that circular textile electrodes 40 mm in diameter with a 5 mm thick sponge are best suited for long-term ECG monitoring. Tests on the whole system confirmed that the designed smart garment can obtain long-term ECG recordings with high signal quality.

  2. Bayesian experimental design for models with intractable likelihoods.

    PubMed

    Drovandi, Christopher C; Pettitt, Anthony N

    2013-12-01

    In this paper we present a methodology for designing experiments for efficiently estimating the parameters of models with computationally intractable likelihoods. The approach combines a commonly used methodology for robust experimental design, based on Markov chain Monte Carlo sampling, with approximate Bayesian computation (ABC) to ensure that no likelihood evaluations are required. The utility function considered for precise parameter estimation is based upon the precision of the ABC posterior distribution, which we form efficiently via the ABC rejection algorithm based on pre-computed model simulations. Our focus is on stochastic models and, in particular, we investigate the methodology for Markov process models of epidemics and macroparasite population evolution. The macroparasite example involves a multivariate process and we assess the loss of information from not observing all variables. © 2013, The International Biometric Society.
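
    The sketch below shows the ABC rejection step the abstract relies on: draw parameters from the prior, simulate the stochastic model, and keep draws whose simulated summary lands within a tolerance of the observed one. A simple stochastic SI epidemic stands in for the Markov process models in the paper; the prior, tolerance, and summary statistic are assumptions.

```python
# ABC rejection sketch for a toy stochastic epidemic model.
import numpy as np

def simulate_epidemic(beta, n_pop=50, i0=1, t_end=10.0, rng=None):
    """Return the final number infected for a stochastic SI model with event rate beta*S*I."""
    if rng is None:
        rng = np.random.default_rng()
    s, i, t = n_pop - i0, i0, 0.0
    while s > 0:
        t += rng.exponential(1.0 / (beta * s * i))   # time to next infection event
        if t > t_end:
            break
        s, i = s - 1, i + 1
    return i

def abc_rejection(observed_final, n_draws=20_000, tol=2, seed=0):
    rng = np.random.default_rng(seed)
    betas = rng.uniform(0.0001, 0.01, n_draws)       # assumed uniform prior on beta
    accepted = [b for b in betas
                if abs(simulate_epidemic(b, rng=rng) - observed_final) <= tol]
    return np.array(accepted)

if __name__ == "__main__":
    post = abc_rejection(observed_final=30)
    print(len(post), "accepted; posterior mean beta =",
          round(float(post.mean()), 5) if len(post) else None)
```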

  3. Introducing a New Guided Design into the Classroom.

    ERIC Educational Resources Information Center

    Allen, Charles W.

    Based on a workshop presented by Charles Wales, a guided design project was developed for a junior mechanical design class at California State University-Chico. This course involves lectures on the design process and an extension of the basic mechanics of materials concepts, particularly as related to design and prevention of failure. The…

  4. Lost in translation: bridging gaps between design and evidence-based design.

    PubMed

    Watkins, Nicholas; Keller, Amy

    2008-01-01

    The healthcare design community is adopting evidence-based design (EBD) at a startling rate. However, the role of research within an architectural practice is unclear. Reasons for the lack of clarity include multiple connotations of EBD, the tension between a research-driven market and market-driven research, and the competing expectations and standards of design practitioners and researchers. Research as part of EBD should be integral with the design process so that research directly contributes to building projects. Characteristics of a comprehensive programming methodology to close the gap between design and EBD are suggested.

  5. Quicker, slicker, and better? An evaluation of a web-based human resource management system

    NASA Astrophysics Data System (ADS)

    Gibb, Stephen; McBride, Andrew

    2001-10-01

    This paper reviews the design and development of a web based Human Resource Management (HRM) system which has as its foundation a 'capability profiler' tool for analysing individual or team roles in organisations. This provides a foundation for managing a set of integrated activities in recruitment and selection, performance and career management, and training and development for individuals, teams, and whole organisations. The challenges of representing and processing information about the human side of organisation encountered in the design and implementation of such systems are evident. There is a combination of legal, practical, technical and philosophical issues to be faced in the processes of defining roles, selecting staff, monitoring and managing the performance of employees in the design and implementation of such systems. The strengths and weaknesses of web based systems in this context are evaluated. This evaluation highlights both the potential, given the evolution of broader Enterprise Resource Planning (ERP) systems and strategies in manufacturing, and concerns about the migration of HRM processes to such systems.

  6. Aerosol Processing of Crumpled Graphene Oxide-based Nanocomposites for Drug Delivery.

    PubMed

    Wang, Wei-Ning; He, Xiang

    2016-01-01

    The flexibility of graphene oxide (GO) nanosheets and their unique properties enable them to be excellent two dimensional (2D) building blocks for designing functional materials. Aerosol routes are proved to be a rational approach to fold the 2D flat GO nanosheets into 3D crumpled spheres to mitigate the restacking issue for large-scale applications, such as for drug delivery. The fundamentals of graphene, GO, and the crumpling process of GO nanosheets are summarized. Various crumpled graphene oxide (CGO)-based nanocomposites have been synthesized by aerosol routes. This mini review focuses on the state-of-the-art in the design and fabrication of these nanocomposites for a specific application in drug delivery. Various techniques are demonstrated and discussed to control the release rates, tailor the morphology, and adjust the components inside the nanocomposites. Potential risks and possible trends are also pointed out. Aerosol processing of CGO-based nanocomposites provides a promising approach to design functional nanomaterials for drug delivery and other related applications.

  7. The design briefing process matters: a case study on telehealthcare device providers in the UK.

    PubMed

    Yang, Fan; Renda, Gianni

    2018-01-23

    The telehealthcare sector has been expanding steadily in the UK. However, confusing, complex and unwieldy designs of telehealthcare devices are at best, less effective than they could be, at worst, they are potentially dangerous to the users. This study investigated the factors within the new product development process that hindered satisfactory product design outcomes, through working collaboratively with a leading provider based in the UK. This study identified that there are too many costly late-stage design changes; a critical and persistent problem area ripe for improvement. The findings from analyzing 30 recent devices, interviewing key stakeholders and observing on-going projects further revealed that one major cause of the issue was poor practice in defining and communicating the product design criteria and requirements. Addressing the characteristics of the telehealthcare industry, such as multiple design commissioners and frequent deployment of design subcontracts, this paper argues that undertaking a robust process of creating the product design brief is the key to improving the outcomes of telehealthcare device design, particularly for the small and medium-sized enterprises dominating the sector. Implications for rehabilitation Product design criteria and requirements are frequently ill-defined and ineffectively communicated to the designers within the processes of developing new telehealthcare devices. The absence of a (robust) process of creating the design brief is the root cause of the identified issues in defining and communicating the design task. Deploying a formal process of creating the product design brief is particularly important for the telehealthcare sector.

  8. Design and application of process control charting methodologies to gamma irradiation practices

    NASA Astrophysics Data System (ADS)

    Saylor, M. C.; Connaghan, J. P.; Yeadon, S. C.; Herring, C. M.; Jordan, T. M.

    2002-12-01

    The relationship between the contract irradiation facility and the customer has historically been based upon a "PASS/FAIL" approach with little or no quality metrics used to gage the control of the irradiation process. Application of process control charts, designed in coordination with mathematical simulation of routine radiation processing, can provide a basis for understanding irradiation events. By using tools that simulate the physical rules associated with the irradiation process, end-users can explore process-related boundaries and the effects of process changes. Consequently, the relationship between contractor and customer can evolve based on the derived knowledge. The resulting level of mutual understanding of the irradiation process and its resultant control benefits both the customer and contract operation, and provides necessary assurances to regulators. In this article we examine the complementary nature of theoretical (point kernel) and experimental (dosimetric) process evaluation, and the resulting by-product of improved understanding, communication and control generated through the implementation of effective process control charting strategies.
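
    As a concrete example of the quality metrics the paper advocates over PASS/FAIL, the sketch below computes individuals (Shewhart) control chart limits from routine dosimetry readings using the standard moving-range estimate of sigma. The dose values are made up for illustration.

```python
# Individuals control chart limits for routine dosimetry readings.
import numpy as np

def individuals_chart_limits(doses_kGy):
    doses = np.asarray(doses_kGy, dtype=float)
    center = doses.mean()
    mr_bar = np.mean(np.abs(np.diff(doses)))   # average moving range of successive readings
    sigma_hat = mr_bar / 1.128                 # d2 constant for subgroups of size 2
    return center, center - 3 * sigma_hat, center + 3 * sigma_hat

if __name__ == "__main__":
    readings = [25.1, 25.4, 24.8, 25.0, 25.6, 25.2, 24.9, 25.3]   # illustrative doses (kGy)
    cl, lcl, ucl = individuals_chart_limits(readings)
    print(f"CL = {cl:.2f} kGy, LCL = {lcl:.2f}, UCL = {ucl:.2f}")
    print("out-of-control points:", [d for d in readings if not lcl <= d <= ucl])
```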

  9. A factory concept for processing and manufacturing with lunar material

    NASA Technical Reports Server (NTRS)

    Driggers, G. W.

    1977-01-01

    A conceptual design for an orbital factory sized to process 1.5 million metric tons per year of raw lunar fines into 0.3 million metric tons of manufacturing materials is presented. A conservative approach involving application of present earth-based technology leads to a design devoid of new inventions. Earth based counterparts to the factory machinery were used to generate subsystem masses and lumped parameters for volume and mass estimates. The results are considered to be conservative since technologies more advanced than those assumed are presently available in many areas. Some attributes of potential space processing technologies applied to material refinement and component manufacture are discussed.

  10. Design of housing file box of fire academy based on RFID

    NASA Astrophysics Data System (ADS)

    Li, Huaiyi

    2018-04-01

    This paper presents a design scheme of intelligent file box based on RFID. The advantages of RFID file box and traditional file box are compared and analyzed, and the feasibility of RFID file box design is analyzed based on the actual situation of our university. After introducing the shape and structure design of the intelligent file box, the paper discusses the working process of the file box, and explains in detail the internal communication principle of the RFID file box and the realization of the control system. The application of the RFID based file box will greatly improve the efficiency of our school's archives management.

  11. Designing visual displays and system models for safe reactor operations based on the user`s perspective of the system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown-VanHoozer, S.A.

    Most designers are not schooled in the area of human-interaction psychology and therefore tend to rely on the traditional ergonomic aspects of human factors when designing complex human-interactive workstations related to reactor operations. They do not take into account the differences in user information processing behavior and how these behaviors may affect individual and team performance when accessing visual displays or utilizing system models in process and control room areas. Unfortunately, by ignoring the importance of the integration of the user interface at the information process level, the result can be sub-optimization and inherently error- and failure-prone systems. Therefore, to minimize or eliminate failures in human-interactive systems, it is essential that the designers understand how each user's processing characteristics affects how the user gathers information, and how the user communicates the information to the designer and other users. A different type of approach in achieving this understanding is Neuro Linguistic Programming (NLP). The material presented in this paper is based on two studies involving the design of visual displays, NLP, and the user's perspective model of a reactor system. The studies involve the methodology known as NLP, and its use in expanding design choices from the user's "model of the world," in the areas of virtual reality, workstation design, team structure, decision and learning style patterns, safety operations, pattern recognition, and much, much more.

  12. Materials, Processes and Manufacturing in Ares 1 Upper Stage: Integration with Systems Design and Development

    NASA Technical Reports Server (NTRS)

    Bhat, Biliyar N.

    2008-01-01

    Ares I Crew Launch Vehicle Upper Stage is designed and developed based on sound systems engineering principles. Systems Engineering starts with the Concept of Operations and mission requirements, which in turn determine the launch system architecture and its performance requirements. The Ares I Upper Stage is designed and developed to meet these requirements. Designers depend on the support from materials, processes and manufacturing during the design, development and verification of subsystems and components. The requirements relative to reliability, safety, operability and availability are also dependent on materials availability, characterization, process maturation and vendor support. This paper discusses the roles and responsibilities of materials and manufacturing engineering during the various phases of Ares I Upper Stage development, including design and analysis, hardware development, test and verification. Emphasis is placed on how materials, processes and manufacturing support is integrated across the Upper Stage Project, both horizontally and vertically. In addition, the paper describes the approach used to ensure compliance with materials, processes, and manufacturing requirements during the project cycle, with focus on hardware systems design and development.

  13. Standing wave design and experimental validation of a tandem simulated moving bed process for insulin purification.

    PubMed

    Xie, Yi; Mun, Sungyong; Kim, Jinhyun; Wang, Nien-Hwa Linda

    2002-01-01

    A tandem simulated moving bed (SMB) process for insulin purification has been proposed and validated experimentally. The mixture to be separated consists of insulin, high molecular weight proteins, and zinc chloride. A systematic approach based on the standing wave design, rate model simulations, and experiments was used to develop this multicomponent separation process. The standing wave design was applied to specify the SMB operating conditions of a lab-scale unit with 10 columns. The design was validated with rate model simulations prior to experiments. The experimental results show 99.9% purity and 99% yield, which closely agree with the model predictions and the standing wave design targets. The agreement proves that the standing wave design can ensure high purity and high yield for the tandem SMB process. Compared to a conventional batch SEC process, the tandem SMB has 10% higher yield, 400% higher throughput, and 72% lower eluant consumption. In contrast, a design that ignores the effects of mass transfer and nonideal flow cannot meet the purity requirement and gives less than 96% yield.

  14. ELISA, a demonstrator environment for information systems architecture design

    NASA Technical Reports Server (NTRS)

    Panem, Chantal

    1994-01-01

    This paper describes an approach to reusing software engineering technology in the area of ground space system design. System engineers have many needs similar to those of software developers: sharing of a common data base, capitalization of knowledge, definition of a common design process, and communication between different technical domains. Moreover, system designers need to simulate their system dynamically as early as possible. Software development environments, methods and tools have now become operational and widely used. Their architecture is based on a unique object base and a set of common management services, and they host a family of tools for each life cycle activity. In late '92, CNES decided to develop a demonstrative software environment supporting some system activities. The design of ground space data processing systems was chosen as the application domain. ELISA (Integrated Software Environment for Architectures Specification) was specified as a 'demonstrator', i.e. a sufficient basis for demonstrations, evaluation and future operational enhancements. A process with three phases was implemented: system requirements definition, design of system architecture models, and selection of physical architectures. Each phase is composed of several activities that can be performed in parallel, with the provision of Commercial Off The Shelf tools. ELISA was delivered to CNES in January 94 and is currently used for demonstrations and evaluations on real projects (e.g. the SPOT4 Satellite Control Center). New evolutions are under way.

  15. ON DEVELOPING TOOLS AND METHODS FOR ENVIRONMENTALLY BENIGN PROCESSES

    EPA Science Inventory

    Two types of tools are generally needed for designing processes and products that are cleaner from environmental impact perspectives. The first kind is called process tools. Process tools are based on information obtained from experimental investigations in chemistry, materials ...

  16. STRUTEX: A prototype knowledge-based system for initially configuring a structure to support point loads in two dimensions

    NASA Technical Reports Server (NTRS)

    Robers, James L.; Sobieszczanski-Sobieski, Jaroslaw

    1989-01-01

    Only recently have engineers begun making use of Artificial Intelligence (AI) tools in the area of conceptual design. To continue filling this void in the design process, a prototype knowledge-based system, called STRUTEX has been developed to initially configure a structure to support point loads in two dimensions. This prototype was developed for testing the application of AI tools to conceptual design as opposed to being a testbed for new methods for improving structural analysis and optimization. This system combines numerical and symbolic processing by the computer with interactive problem solving aided by the vision of the user. How the system is constructed to interact with the user is described. Of special interest is the information flow between the knowledge base and the data base under control of the algorithmic main program. Examples of computed and refined structures are presented during the explanation of the system.

  17. Virtual Collaborative Simulation Environment for Integrated Product and Process Development

    NASA Technical Reports Server (NTRS)

    Gulli, Michael A.

    1997-01-01

    Deneb Robotics is a leader in the development of commercially available, leading-edge three-dimensional simulation software tools for virtual prototyping, simulation-based design, manufacturing process simulation, and factory floor simulation and training applications. Deneb has developed and commercially released a preliminary Virtual Collaborative Engineering (VCE) capability for Integrated Product and Process Development (IPPD). This capability allows distributed, real-time visualization and evaluation of design concepts, manufacturing processes, and the total factory and enterprise in one seamless simulation environment.

  18. Low cost solar array project: Experimental process system development unit for producing semiconductor-grade silicon using silane-to-silicon process

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The design, fabrication, and installation of an experimental process system development unit (EPSDU) were analyzed. Supporting research and development were performed to provide an information data base usable for the EPSDU and for technological design and economical analysis for potential scale-up of the process. Iterative economic analyses were conducted for the estimated product cost for the production of semiconductor grade silicon in a facility capable of producing 1000-MT/Yr.

  19. Evaluation in the Design of Complex Systems

    ERIC Educational Resources Information Center

    Ho, Li-An; Schwen, Thomas M.

    2006-01-01

    We identify literature that argues the process of creating knowledge-based system is often imbalanced. In most knowledge-based systems, development is often technology-driven instead of requirement-driven. Therefore, we argue designers must recognize that evaluation is a critical link in the application of requirement-driven development models…

  20. Design Process Improvement for Electric CAR Harness

    NASA Astrophysics Data System (ADS)

    Sawatdee, Thiwarat; Chutima, Parames

    2017-06-01

    In an automobile parts design company, customer satisfaction is one of the most important factors in product design. The company therefore focuses its product design process on the various requirements of customers, which results in a high number of design changes. The objective of this research is to improve the design process of the electric car harness, which affects production scheduling, by using Fault Tree Analysis (FTA) and Failure Mode and Effect Analysis (FMEA) as the main tools. FTA is employed for root cause analysis, and FMEA is used to rank factors by Risk Priority Number (RPN), which shows the priority of the factors that have a high impact on the design of the electric car harness. After the implementation, the improvement is significant, since the rate of design changes is reduced from 0.26% to 0.08%.
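
    The FMEA ranking step described above reduces to computing RPN = severity × occurrence × detection and sorting, as the minimal sketch below shows. The failure modes and scores are invented placeholders, not data from the study.

```python
# FMEA risk ranking: RPN = severity x occurrence x detection.
from dataclasses import dataclass

@dataclass
class FailureMode:
    name: str
    severity: int    # 1-10
    occurrence: int  # 1-10
    detection: int   # 1-10

    @property
    def rpn(self) -> int:
        return self.severity * self.occurrence * self.detection

if __name__ == "__main__":
    modes = [
        FailureMode("late customer spec change", 7, 8, 6),
        FailureMode("wrong connector pin-out", 9, 3, 4),
        FailureMode("harness length mismatch", 5, 5, 5),
    ]
    # highest-risk design factors surface first
    for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
        print(f"{m.name:30s} RPN = {m.rpn}")
```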

  1. The Design of Akhmat Tower

    NASA Astrophysics Data System (ADS)

    Beardsley, Sara; Stochetti, Alejandro; Cerone, Marc

    2018-03-01

    Akhmat Tower is a 435m supertall building designed by Adrian Smith + Gordon Gill Architecture. It is currently under construction in the city of Grozny, in the Chechen Republic, in the North Caucasus region of Russia. The design of the tower was done during a collaborative process by a multi-disciplinary architectural and engineering team, based primarily in the United States and Russia. During this process, the designers considered many factors including, most primarily, the cultural and historical context, the structural requirements given the high seismicity of the region, and the client's programmatic needs. The resulting crystalline-shaped tower is both an aesthetic statement and a performative architectural solution which will be a new landmark for Chechnya. "The Design of Akhmat Tower" describes in detail the design process including structural considerations, exterior wall design, building program, interior design, the tuned mass damper, and the use of building information modeling.

  2. Study on reservoir time-varying design flood of inflow based on Poisson process with time-dependent parameters

    NASA Astrophysics Data System (ADS)

    Li, Jiqing; Huang, Jing; Li, Jianchang

    2018-06-01

    The time-varying design flood makes full use of the measured data and can provide the reservoir with a basis for both flood control and operation scheduling. This paper adopts the peak-over-threshold method for flood sampling in unit periods and a Poisson process model with time-dependent parameters for simulating a reservoir's time-varying design flood. Considering the relationship between the model parameters and the underlying hypotheses, the paper uses the over-threshold intensity, the goodness of fit of the Poisson distribution and the design flood parameters as the criteria for selecting the unit period and threshold of the time-varying design flood, and derives the Longyangxia reservoir time-varying design flood process at nine design frequencies. The time-varying design flood of inflow is closer to the reservoir's actual inflow conditions and can be used to adjust the operating water level in the flood season and to plan the utilization of flood water resources in the basin.
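
    To illustrate the kind of model the abstract describes, the sketch below simulates flood-peak exceedance occurrences as a Poisson process with a time-dependent rate using the standard thinning (acceptance) method. The seasonal rate function lambda(t) is an illustrative assumption, not the fitted Longyangxia model.

```python
# Nonhomogeneous Poisson process simulation via thinning.
import math
import numpy as np

def simulate_nhpp(rate_fn, t_end, rate_max, rng=None):
    """Return event times of a nonhomogeneous Poisson process on [0, t_end]."""
    rng = np.random.default_rng() if rng is None else rng
    t, events = 0.0, []
    while True:
        t += rng.exponential(1.0 / rate_max)        # candidate event from a homogeneous process
        if t > t_end:
            return events
        if rng.random() < rate_fn(t) / rate_max:    # thin candidates down to the target rate
            events.append(t)

if __name__ == "__main__":
    # exceedances per day, peaking mid flood season (around day 60 of a 120-day season)
    lam = lambda t: 0.05 + 0.25 * math.exp(-((t - 60.0) / 20.0) ** 2)
    times = simulate_nhpp(lam, t_end=120.0, rate_max=0.30)
    print(len(times), "over-threshold events, first few at days:",
          [round(x, 1) for x in times[:5]])
```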

  3. Gallium-arsenide process evaluation based on a RISC microprocessor example

    NASA Astrophysics Data System (ADS)

    Brown, Richard B.; Upton, Michael; Chandna, Ajay; Huff, Thomas R.; Mudge, Trevor N.; Oettel, Richard E.

    1993-10-01

    This work evaluates the features of a gallium-arsenide E/D MESFET process in which a 32-b RISC microprocessor was implemented. The design methodology and architecture of this prototype CPU are described. The performance sensitivity of the microprocessor and other large circuit blocks to different process parameters is analyzed, and recommendations for future process features, circuit approaches, and layout styles are made. These recommendations are reflected in the design of a second microprocessor using a more advanced process that achieves much higher density and performance.

  4. Predicting Production Costs for Advanced Aerospace Vehicles

    NASA Technical Reports Server (NTRS)

    Bao, Han P.; Samareh, J. A.; Weston, R. P.

    2002-01-01

    For early design concepts, the conventional approach to cost is normally some kind of parametric weight-based cost model. There is now ample evidence that this approach can be misleading and inaccurate. By the nature of its development, a parametric cost model requires historical data and is valid only if the new design is analogous to those for which the model was derived. Advanced aerospace vehicles have no historical production data and are nowhere near the vehicles of the past. Using an existing weight-based cost model would only lead to errors and distortions of the true production cost. This paper outlines the development of a process-based cost model in which the physical elements of the vehicle are scored according to a first-order dynamics model. This theoretical cost model, first advocated by early work at MIT, has been expanded to cover the basic structures of an advanced aerospace vehicle. Elemental costs based on the geometry of the design can be summed up to provide an overall estimation of the total production cost for a design configuration. This capability to directly link any design configuration to a realistic cost estimate is a key requirement for high-payoff MDO problems. Another important consideration in this paper is the handling of part or product complexity. Here the concept of cost modulus is introduced to take into account variability due to different materials, sizes, shapes, precision of fabrication, and equipment requirements. The most important implication of the development of the proposed process-based cost model is that different design configurations can now be quickly related to their cost estimates in a seamless calculation process easily implemented on any spreadsheet tool.
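
    The sketch below illustrates the process-based costing idea in its simplest form: each structural element gets a cost driven by its geometry (surface area is used as a proxy here), multiplied by a cost modulus capturing material, precision and tooling complexity, and the elemental costs are summed. The rate, modulus values and elements are illustrative assumptions only, not the paper's calibrated model.

```python
# Geometry-driven elemental costing with a complexity "cost modulus".
from dataclasses import dataclass

BASE_RATE_PER_M2 = 12_000.0   # assumed dollars per m^2 of fabricated structure

@dataclass
class Element:
    name: str
    area_m2: float
    cost_modulus: float   # >1 for exotic materials, tight tolerances, special tooling

    def cost(self) -> float:
        return BASE_RATE_PER_M2 * self.area_m2 * self.cost_modulus

def vehicle_cost(elements):
    return sum(e.cost() for e in elements)

if __name__ == "__main__":
    design = [
        Element("fuselage skin", 85.0, 1.0),
        Element("cryo tank",     40.0, 1.8),   # higher modulus: precision welds
        Element("TPS panels",    60.0, 2.5),   # higher modulus: exotic material
    ]
    print(f"estimated production cost: ${vehicle_cost(design):,.0f}")
```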

  5. Micro-electro-mechanical systems (MEMS) and agile lensing-based modules for communications, sensing and signal processing

    NASA Astrophysics Data System (ADS)

    Reza, Syed Azer

    This dissertation proposes the use of the emerging Micro-Electro-Mechanical Systems (MEMS) and agile lensing optical device technologies to design novel and powerful signal conditioning and sensing modules for advanced applications in optical communications, physical parameter sensing and RF/optical signal processing. For example, these new module designs have experimentally demonstrated exceptional features such as stable loss broadband operations and high > 60 dB optical dynamic range signal filtering capabilities. The first part of the dissertation describes the design and demonstration of digital MEMS-based signal processing modules for communication systems and sensor networks using the TI DLP (Digital Light Processing) technology. Examples of such modules include optical power splitters, narrowband and broadband variable fiber optical attenuators, spectral shapers and filters. Compared to prior works, these all-digital designs have advantages of repeatability, accuracy, and reliability that are essential for advanced communications and sensor applications. The next part of the dissertation proposes, analyzes and demonstrates the use of analog opto-fluidic agile lensing technology for sensor networks and test and measurement systems. Novel optical module designs for distance sensing, liquid level sensing, three-dimensional object shape sensing and variable photonic delay lines are presented and experimentally demonstrated. Compared to prior art module designs, the proposed analog-mode modules have exceptional performances, particularly for extreme environments (e.g., caustic liquids) where the free-space agile beam-based sensor provide remote non-contact access for physical sensing operations. The dissertation also presents novel modules involving hybrid analog-digital photonic designs that make use of the different optical device technologies to deliver the best features of both analog and digital optical device operations and controls. Digital controls are achieved through the use of the digital MEMS technology and analog controls are realized by employing opto-fluidic agile lensing technology and acousto-optic technology. For example, variable fiber-optic attenuators and spectral filters are proposed using the hybrid design. Compared to prior art module designs, these hybrid designs provide a higher module dynamic range and increased resolution that are critical in various advanced system applications. In summary, the dissertation shows the added power of hybrid optical designs using both the digital and analog photonic signal processing versus just all-digital or all-analog module designs.

  6. The Research of Improving the Particleboard Glue Dosing Process Based on TRIZ Analysis

    NASA Astrophysics Data System (ADS)

    Yu, Huiling; Fan, Delin; Zhang, Yizhuo

    This research creates a design methodology by synthesizing the Theory of Inventive Problem Solving (TRIZ) with cascade control based on a Smith predictor. A case study of the particleboard glue supplying and dosing system defines the problem and the solution using the methodology proposed in the paper. Status differences in the glue dosing process of particleboard production usually cause inaccurate glue volumes. To solve this problem, we applied TRIZ technical contradictions and inventive principles to improve this key process of particleboard production. The improvement method mapped the inaccuracy problem to a TRIZ technical contradiction, and the prior-action principle led to the Smith predictor being proposed as the control algorithm in the glue dosing system. This research examines the usefulness of a TRIZ based problem-solving process designed to improve the problem-solving ability of users in addressing difficult or recurring problems and also verifies the practicality and validity of TRIZ. Several suggestions are presented on how to approach this problem.
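
    A minimal discrete-time sketch of the Smith-predictor idea proposed as the control algorithm is shown below; the first-order-plus-dead-time parameters, PI gains, and sample time are illustrative assumptions, not the identified glue dosing plant model.

```python
# Smith predictor sketch for a first-order process with dead time: the
# controller acts on the measured output corrected by an internal model
# run both with and without the dead time.
a, b, d = 0.9, 0.1, 10        # assumed plant/model parameters and delay (samples)
Kp, Ki, dt = 2.0, 0.5, 1.0    # assumed PI gains and sample time
N, sp = 200, 1.0              # simulation length and setpoint

y = ym = integ = 0.0          # plant output, undelayed model output, integrator
u_hist = [0.0] * (d + 1)      # past controls realizing the plant dead time
ym_hist = [0.0] * (d + 1)     # past model outputs (delayed model branch)

for k in range(N):
    y_fb = y + (ym - ym_hist[0])   # Smith predictor feedback signal
    e = sp - y_fb
    integ += Ki * e * dt
    u = Kp * e + integ

    y = a * y + b * u_hist[0]      # plant sees the delayed control action
    ym = a * ym + b * u            # internal model without the dead time
    u_hist = u_hist[1:] + [u]
    ym_hist = ym_hist[1:] + [ym]

print(f"final output ~ {y:.3f} (setpoint {sp})")
```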

  7. A Model-based B2B (Batch to Batch) Control for An Industrial Batch Polymerization Process

    NASA Astrophysics Data System (ADS)

    Ogawa, Morimasa

    This paper gives an overview of a model-based B2B (batch to batch) control for an industrial batch polymerization process. In order to control the reaction temperature precisely, several methods based on the rigorous process dynamics model are employed at all design stages of the B2B control, such as modeling and parameter estimation of the reaction kinetics, which is one of the important parts of the process dynamics model. The designed B2B control consists of the gain scheduled I-PD/II2-PD control (I-PD with double integral control), the feed-forward compensation at the batch start time, and the model adaptation utilizing the results of the last batch operation. Throughout the actual batch operations, the B2B control provides superior control performance compared with that of conventional control methods.

  8. Model based design introduction: modeling game controllers to microprocessor architectures

    NASA Astrophysics Data System (ADS)

    Jungwirth, Patrick; Badawy, Abdel-Hameed

    2017-04-01

    We present an introduction to model based design. Model based design is a visual representation, generally a block diagram, used to model and incrementally develop a complex system. Model based design is a commonly used design methodology for digital signal processing, control systems, and embedded systems. Model based design's philosophy is to solve a problem one step at a time. The approach can be compared to a series of steps that converge to a solution. A block diagram simulation tool allows a design to be simulated with real world measurement data. For example, if an analog control system is being upgraded to a digital control system, the analog sensor input signals can be recorded. The digital control algorithm can be simulated with the real world sensor data. The output from the simulated digital control system can then be compared to the old analog based control system. Model based design can be compared to Agile software development. The Agile software development goal is to develop working software in incremental steps. Progress is measured in completed and tested code units. Progress in model based design is measured in completed and tested blocks. We present a concept for a video game controller and then use model based design to iterate the design towards a working system. We will also describe a model based design effort to develop an OS Friendly Microprocessor Architecture based on RISC-V.
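
    A small sketch of the replay idea described above is given here; the CSV file name, its column layout, and the PID gains are assumptions made for illustration only.

```python
# Replay recorded analog sensor data through a candidate digital controller
# and compare its output with the logged output of the legacy analog loop.
import csv
import math

Kp, Ki, Kd, dt = 1.2, 0.4, 0.05, 0.01    # assumed digital PID gains, sample time
integ, prev_err = 0.0, 0.0
sq_diff, n = 0.0, 0

with open("recorded_run.csv") as f:      # hypothetical log: setpoint, sensor, analog_out
    for row in csv.DictReader(f):
        err = float(row["setpoint"]) - float(row["sensor"])
        integ += err * dt
        deriv = (err - prev_err) / dt
        prev_err = err
        digital_out = Kp * err + Ki * integ + Kd * deriv

        # accumulate the squared difference versus the analog controller output
        sq_diff += (digital_out - float(row["analog_out"])) ** 2
        n += 1

print("RMS difference vs. analog controller:", math.sqrt(sq_diff / n))
```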

  9. A Classification Model and an Open E-Learning System Based on Intuitionistic Fuzzy Sets for Instructional Design Concepts

    ERIC Educational Resources Information Center

    Güyer, Tolga; Aydogdu, Seyhmus

    2016-01-01

    This study suggests a classification model and an e-learning system based on this model for all instructional theories, approaches, models, strategies, methods, and techniques being used in the process of instructional design that constitutes a direct or indirect resource for educational technology based on the theory of intuitionistic fuzzy sets…
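
    As a brief illustration of the underlying formalism, an intuitionistic fuzzy set assigns each element a membership degree mu and a non-membership degree nu with mu + nu <= 1, leaving a hesitation degree pi = 1 - mu - nu. The sketch below uses hypothetical class names and degrees, not data from the study.

```python
# Minimal intuitionistic-fuzzy classification sketch: a concept is rated
# against candidate classes and a simple score function ranks the classes.
from dataclasses import dataclass

@dataclass
class IFSValue:
    mu: float   # membership degree
    nu: float   # non-membership degree (mu + nu <= 1)

    @property
    def pi(self) -> float:        # hesitation degree
        return 1.0 - self.mu - self.nu

    def score(self) -> float:     # common score function: mu - nu
        return self.mu - self.nu

# Hypothetical ratings of one instructional-design concept against three classes
ratings = {"theory": IFSValue(0.7, 0.2),
           "strategy": IFSValue(0.5, 0.3),
           "method": IFSValue(0.4, 0.5)}
best = max(ratings, key=lambda c: ratings[c].score())
print("best class:", best, "hesitation:", round(ratings[best].pi, 2))
```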

  10. Design and implementation of an Internet based effective controlling and monitoring system with wireless fieldbus communications technologies for process automation--an experimental study.

    PubMed

    Cetinceviz, Yucel; Bayindir, Ramazan

    2012-05-01

    The network requirements of control systems in industrial applications increase day by day. Internet based control systems and various fieldbus systems have been designed in order to meet these requirements. This paper describes an Internet based control system with wireless fieldbus communication designed for distributed processes. The system was implemented as an experimental setup in a laboratory. In industrial facilities, the process control layer and the distance connection of the distributed control devices in the lowest levels of the industrial production environment are provided with fieldbus networks. In this paper, an Internet based control system that meets the system requirements with a new-generation communication structure, called a wired/wireless hybrid system, has been designed at the field level and carried out to cover all sectors of distributed automation, from process control to distributed input/output (I/O). The system has been realized with a hardware structure consisting of a programmable logic controller (PLC), a communication processor (CP) module, two industrial wireless modules, a distributed I/O module and a Motor Protection Package (MPP), and a software structure consisting of the WinCC flexible program used for the SCADA (Supervisory Control and Data Acquisition) screens and the SIMATIC MANAGER package program ("STEP7") used for the hardware and network configuration and for downloading the control program to the PLC. Copyright © 2012 ISA. Published by Elsevier Ltd. All rights reserved.

  11. DCL System Using Deep Learning Approaches for Land-based or Ship-based Real-Time Recognition and Localization of Marine Mammals

    DTIC Science & Technology

    2012-09-30

    platform (HPC) was developed, called the HPC-Acoustic Data Accelerator, or HPC-ADA for short. The HPC-ADA was designed based on fielded systems [1-4...software (Detection cLassification for MAchine learning - High Performance Computing). The software package was designed to utilize parallel and...Sedna [7] and is designed using a parallel architecture, allowing existing algorithms to distribute to the various processing nodes with minimal changes

  12. The NMCSSC Quick-Reacting General War Gaming System (QUICK). Analytical Manual. Volume I - Data Base Preparation. Change

    DTIC Science & Technology

    1972-02-21

    is a two-sided strategic nuclear exchange war gaming system. It is designed to assist the military planner in examining various facets of strategic...substantial, the data base preparation process is designed to provide an efficient means of assembling, maintaining, and organizing an input data base to... designed to assist in the study of strategic conflicts involving a large-scale exchange of nuclear weapons. The system is structured into five

  13. Medicinal chemistry inspired fragment-based drug discovery.

    PubMed

    Lanter, James; Zhang, Xuqing; Sui, Zhihua

    2011-01-01

    Lead generation can be a very challenging phase of the drug discovery process. The two principal methods for this stage of research are blind screening and rational design. Among the rational or semirational design approaches, fragment-based drug discovery (FBDD) has emerged as a useful tool for the generation of lead structures. It is particularly powerful as a complement to high-throughput screening approaches when the latter failed to yield viable hits for further development. Engagement of medicinal chemists early in the process can accelerate the progression of FBDD efforts by incorporating drug-friendly properties in the earliest stages of the design process. Medium-chain acyl-CoA synthetase 2b and ketohexokinase are chosen as examples to illustrate the importance of close collaboration of medicinal chemists, crystallography, and modeling. Copyright © 2011 Elsevier Inc. All rights reserved.

  14. DIVA V2.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    CHEN, JOANNA; SIMIRENKO, LISA; TAPASWI, MANJIRI

    The DIVA software interfaces a process in which researchers design their DNA with a web-based graphical user interface, submit their designs to a central queue, and a few weeks later receive their sequence-verified clonal constructs. Each researcher independently designs the DNA to be constructed with a web-based BioCAD tool, and presses a button to submit their designs to a central queue. Researchers have web-based access to their DNA design queues, and can track the progress of their submitted designs as they progress from "evaluation", to "waiting for reagents", to "in progress", to "complete". Researchers access their completed constructs through the central DNA repository. Along the way, all DNA construction success/failure rates are captured in a central database. Once a design has been submitted to the queue, a small number of dedicated staff evaluate the design for feasibility and provide feedback to the responsible researcher if the design is either unreasonable (e.g., encompasses a combinatorial library of a billion constructs) or small design changes could significantly facilitate the downstream implementation process. The dedicated staff then use DNA assembly design automation software to optimize the DNA construction process for the design, leveraging existing parts from the DNA repository where possible and ordering synthetic DNA where necessary. SynTrack software manages the physical locations and availability of the various requisite reagents and process inputs (e.g., DNA templates). Once all requisite process inputs are available, the design progresses from "waiting for reagents" to "in progress" in the design queue. Human-readable and machine-parseable DNA construction protocols output by the DNA assembly design automation software are then executed by the dedicated staff exploiting lab automation devices wherever possible. Since all employed DNA construction methods are sequence-agnostic and standardized (utilizing the same enzymatic master mixes and reaction conditions), completely independent DNA construction tasks can be aggregated into the same multi-well plates and pursued in parallel. The resulting sets of cloned constructs can then be screened by high-throughput next-gen sequencing platforms for sequence correctness. A combination of long read-length (e.g., PacBio) and paired-end read platforms (e.g., Illumina) would be exploited depending on the particular task at hand (e.g., PacBio might be sufficient to screen a set of pooled constructs with significant gene divergence). Post sequence verification, designs for which at least one correct clone was identified will progress to a "complete" status, while designs for which no correct clones were identified will progress to a "failure" status. Depending on the failure mode (e.g., no transformants), and how many prior attempts/variations of assembly protocol have already been made for a given design, subsequent attempts may be made or the design can progress to a "permanent failure" state. All success and failure rate information will be captured during the process, including at which stage a given clonal construction procedure failed (e.g., no PCR product) and what the exact failure was (e.g., assembly piece 2 missing). This success/failure rate data can be leveraged to refine the DNA assembly design process.
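
    A minimal sketch of the design-queue state transitions described above follows; the transition table mirrors the states named in the text, while the data structure and function are hypothetical.

```python
# Allowed transitions for a DNA design moving through the central queue.
ALLOWED = {
    "evaluation": {"waiting for reagents", "failure"},
    "waiting for reagents": {"in progress"},
    "in progress": {"complete", "failure"},
    "failure": {"in progress", "permanent failure"},   # retry or give up
    "complete": set(),
    "permanent failure": set(),
}

def advance(design: dict, new_state: str) -> None:
    current = design["state"]
    if new_state not in ALLOWED[current]:
        raise ValueError(f"illegal transition {current!r} -> {new_state!r}")
    design["state"] = new_state

design = {"id": "D-001", "state": "evaluation"}
for step in ("waiting for reagents", "in progress", "complete"):
    advance(design, step)
print(design)
```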

  15. Design synthesis and optimization of permanent magnet synchronous machines based on computationally-efficient finite element analysis

    NASA Astrophysics Data System (ADS)

    Sizov, Gennadi Y.

    In this dissertation, a model-based multi-objective optimal design of permanent magnet ac machines, supplied by sine-wave current regulated drives, is developed and implemented. The design procedure uses an efficient electromagnetic finite element-based solver to accurately model nonlinear material properties and complex geometric shapes associated with magnetic circuit design. Application of an electromagnetic finite element-based solver allows for accurate computation of intricate performance parameters and characteristics. The first contribution of this dissertation is the development of a rapid computational method that allows accurate and efficient exploration of large multi-dimensional design spaces in search of optimum design(s). The computationally efficient finite element-based approach developed in this work provides a framework of tools that allow rapid analysis of synchronous electric machines operating under steady-state conditions. In the developed modeling approach, major steady-state performance parameters such as, winding flux linkages and voltages, average, cogging and ripple torques, stator core flux densities, core losses, efficiencies and saturated machine winding inductances, are calculated with minimum computational effort. In addition, the method includes means for rapid estimation of distributed stator forces and three-dimensional effects of stator and/or rotor skew on the performance of the machine. The second contribution of this dissertation is the development of the design synthesis and optimization method based on a differential evolution algorithm. The approach relies on the developed finite element-based modeling method for electromagnetic analysis and is able to tackle large-scale multi-objective design problems using modest computational resources. Overall, computational time savings of up to two orders of magnitude are achievable, when compared to current and prevalent state-of-the-art methods. These computational savings allow one to expand the optimization problem to achieve more complex and comprehensive design objectives. The method is used in the design process of several interior permanent magnet industrial motors. The presented case studies demonstrate that the developed finite element-based approach practically eliminates the need for using less accurate analytical and lumped parameter equivalent circuit models for electric machine design optimization. The design process and experimental validation of the case-study machines are detailed in the dissertation.
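
    A compact sketch of the optimization layer alone is shown below, using SciPy's differential evolution on a placeholder objective standing in for the finite element solver; the design variables, bounds, and weighting are assumptions.

```python
# Differential-evolution search over a toy machine-design space; in the actual
# workflow the objective would call the computationally efficient FE solver.
import numpy as np
from scipy.optimize import differential_evolution

def objective(x):
    magnet_thickness, slot_opening, skew = x
    # placeholder surrogate: penalize loss-like and torque-ripple-like terms
    loss = (magnet_thickness - 4.0) ** 2 + 0.5 * (slot_opening - 2.5) ** 2
    ripple = 0.2 * np.sin(3.0 * skew) ** 2
    return loss + ripple                    # weighted single-objective stand-in

bounds = [(2.0, 8.0), (1.0, 4.0), (0.0, np.pi / 6)]   # assumed variable ranges
result = differential_evolution(objective, bounds, seed=1, maxiter=200)
print("best design:", result.x, "objective:", result.fun)
```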

  16. Computer Aided Drug Design: Success and Limitations.

    PubMed

    Baig, Mohammad Hassan; Ahmad, Khurshid; Roy, Sudeep; Ashraf, Jalaluddin Mohammad; Adil, Mohd; Siddiqui, Mohammad Haris; Khan, Saif; Kamal, Mohammad Amjad; Provazník, Ivo; Choi, Inho

    2016-01-01

    Over the last few decades, computer-aided drug design has emerged as a powerful technique playing a crucial role in the development of new drug molecules. Structure-based drug design and ligand-based drug design are two methods commonly used in computer-aided drug design. In this article, we discuss the theory behind both methods, as well as their successful applications and limitations. To accomplish this, we reviewed structure based and ligand based virtual screening processes. Molecular dynamics simulation, which has become one of the most influential tools for prediction of the conformation of small molecules and changes in their conformation within the biological target, has also been taken into account. Finally, we discuss the principles and concepts of molecular docking, pharmacophores and other methods used in computer-aided drug design.

  17. Knowledge-based control of an adaptive interface

    NASA Technical Reports Server (NTRS)

    Lachman, Roy

    1989-01-01

    The analysis, development strategy, and preliminary design for an intelligent, adaptive interface are reported. The design philosophy couples knowledge-based system technology with standard human factors approaches to interface development for computer workstations. An expert system has been designed to drive the interface for application software. The intelligent interface will be linked to application packages, one at a time, that are planned for multiple-application workstations aboard Space Station Freedom. Current requirements call for most Space Station activities to be conducted at the workstation consoles. One set of activities will consist of standard data management services (DMS). DMS software includes text processing, spreadsheets, data base management, etc. Text processing was selected for the first intelligent interface prototype because text-processing software can be developed initially as fully functional but limited to a small set of commands. The program's complexity can then be increased incrementally. The rule base includes the operator's behavior and three types of instructions to the underlying application software. A conventional expert-system inference engine searches the data base for antecedents to rules and sends the consequents of fired rules as commands to the underlying software. Plans for putting the expert system on top of a second application, a database management system, will be carried out following behavioral research on the first application. The intelligent interface design is suitable for use with ground-based workstations now common in government, industrial, and educational organizations.
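
    A minimal sketch of the rule-firing idea is given below; the facts, rule antecedents, and commands are hypothetical, not the actual rule base described in the report.

```python
# Tiny forward-chaining engine: rules whose antecedents are all present in the
# current set of facts fire, and their consequents are emitted as commands to
# the underlying application (here, a text processor).
rules = [
    ({"novice_user", "long_pause"}, "show_command_hints"),
    ({"document_unsaved", "idle_5_min"}, "autosave_document"),
    ({"repeated_error"}, "open_help_topic"),
]

def infer(facts: set[str]) -> list[str]:
    commands = []
    for antecedents, consequent in rules:
        if antecedents <= facts:          # all antecedents satisfied
            commands.append(consequent)
    return commands

print(infer({"novice_user", "long_pause", "document_unsaved"}))
```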

  18. User-Centered Design of Online Learning Communities

    ERIC Educational Resources Information Center

    Lambropoulos, Niki, Ed.; Zaphiris, Panayiotis, Ed.

    2007-01-01

    User-centered design (UCD) is gaining popularity in both the educational and business sectors. This is due to the fact that UCD sheds light on the entire process of analyzing, planning, designing, developing, using, evaluating, and maintaining computer-based learning. "User-Centered Design of Online Learning Communities" explains how…

  19. Teaching Engineering Design Through Paper Rockets

    ERIC Educational Resources Information Center

    Welling, Jonathan; Wright, Geoffrey A.

    2018-01-01

    The paper rocket activity described in this article effectively teaches the engineering design process (EDP) by engaging students in a problem-based learning activity that encourages iterative design. For example, the first rockets the students build typically only fly between 30 and 100 feet. As students test and evaluate their rocket designs,…

  20. A multi-fidelity framework for physics based rotor blade simulation and optimization

    NASA Astrophysics Data System (ADS)

    Collins, Kyle Brian

    New helicopter rotor designs are desired that offer increased efficiency, reduced vibration, and reduced noise. Rotor Designers in industry need methods that allow them to use the most accurate simulation tools available to search for these optimal designs. Computer based rotor analysis and optimization have been advanced by the development of industry standard codes known as "comprehensive" rotorcraft analysis tools. These tools typically use table look-up aerodynamics, simplified inflow models and perform aeroelastic analysis using Computational Structural Dynamics (CSD). Due to the simplified aerodynamics, most design studies are performed varying structural related design variables like sectional mass and stiffness. The optimization of shape related variables in forward flight using these tools is complicated and results are viewed with skepticism because rotor blade loads are not accurately predicted. The most accurate methods of rotor simulation utilize Computational Fluid Dynamics (CFD) but have historically been considered too computationally intensive to be used in computer based optimization, where numerous simulations are required. An approach is needed where high fidelity CFD rotor analysis can be utilized in a shape variable optimization problem with multiple objectives. Any approach should be capable of working in forward flight in addition to hover. An alternative is proposed and founded on the idea that efficient hybrid CFD methods of rotor analysis are ready to be used in preliminary design. In addition, the proposed approach recognizes the usefulness of lower fidelity physics based analysis and surrogate modeling. Together, they are used with high fidelity analysis in an intelligent process of surrogate model building of parameters in the high fidelity domain. Closing the loop between high and low fidelity analysis is a key aspect of the proposed approach. This is done by using information from higher fidelity analysis to improve predictions made with lower fidelity models. This thesis documents the development of automated low and high fidelity physics based rotor simulation frameworks. The low fidelity framework uses a comprehensive code with simplified aerodynamics. The high fidelity model uses a parallel processor capable CFD/CSD methodology. Both low and high fidelity frameworks include an aeroacoustic simulation for prediction of noise. A synergistic process is developed that uses both the low and high fidelity frameworks together to build approximate models of important high fidelity metrics as functions of certain design variables. To test the process, a 4-bladed hingeless rotor model is used as a baseline. The design variables investigated include tip geometry and spanwise twist distribution. Approximation models are built for metrics related to rotor efficiency and vibration using the results from 60+ high fidelity (CFD/CSD) experiments and 400+ low fidelity experiments. Optimization using the approximation models found the Pareto Frontier anchor points, or the design having maximum rotor efficiency and the design having minimum vibration. Various Pareto generation methods are used to find designs on the frontier between these two anchor designs. When tested in the high fidelity framework, the Pareto anchor designs are shown to be very good designs when compared with other designs from the high fidelity database. This provides evidence that the process proposed has merit. 
Ultimately, this process can be utilized by industry rotor designers with their existing tools to bring high fidelity analysis into the preliminary design stage of rotors. In conclusion, the methods developed and documented in this thesis have made several novel contributions. First, an automated high fidelity CFD based forward flight simulation framework has been built for use in preliminary design optimization. The framework was built around an integrated, parallel processor capable CFD/CSD/AA process. Second, a novel method of building approximate models of high fidelity parameters has been developed. The method uses a combination of low and high fidelity results and combines Design of Experiments, statistical effects analysis, and aspects of approximation model management. And third, the determination of rotor blade shape variables through optimization using CFD based analysis in forward flight has been performed. This was done using the high fidelity CFD/CSD/AA framework and method mentioned above. While the low and high fidelity predictions methods used in the work still have inaccuracies that can affect the absolute levels of the results, a framework has been successfully developed and demonstrated that allows for an efficient process to improve rotor blade designs in terms of a selected choice of objective function(s). Using engineering judgment, this methodology could be applied today to investigate opportunities to improve existing designs. With improvements in the low and high fidelity prediction components that will certainly occur, this framework could become a powerful tool for future rotorcraft design work. (Abstract shortened by UMI.)
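
    One common way to close the loop between fidelities is an additive correction surrogate: fit a trend to the plentiful low-fidelity data, then fit the discrepancy observed at the few high-fidelity points. The sketch below uses one-dimensional polynomial stand-ins; the functions and sample counts are illustrative, not the rotor metrics from the thesis.

```python
import numpy as np

# Stand-ins: a cheap low-fidelity model and an expensive high-fidelity "truth".
lofi = lambda x: np.sin(x)
hifi = lambda x: np.sin(x) + 0.3 * x              # systematic offset vs. low fidelity

x_lo = np.linspace(0.0, 3.0, 40)                  # many low-fidelity samples
x_hi = np.array([0.2, 1.5, 2.8])                  # few high-fidelity samples

trend = np.poly1d(np.polyfit(x_lo, lofi(x_lo), 5))                 # low-fidelity trend
delta = np.poly1d(np.polyfit(x_hi, hifi(x_hi) - trend(x_hi), 1))   # discrepancy model

surrogate = lambda x: trend(x) + delta(x)         # corrected multi-fidelity surrogate
x_test = np.linspace(0.0, 3.0, 7)
print("max surrogate error:", np.max(np.abs(surrogate(x_test) - hifi(x_test))))
```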

  1. Development and Validation of the Social Information Processing Application: A Web-Based Measure of Social Information Processing Patterns in Elementary School-Age Boys

    ERIC Educational Resources Information Center

    Kupersmidt, Janis B.; Stelter, Rebecca; Dodge, Kenneth A.

    2011-01-01

    The purpose of this study was to evaluate the psychometric properties of an audio computer-assisted self-interviewing Web-based software application called the Social Information Processing Application (SIP-AP) that was designed to assess social information processing skills in boys in 3rd through 5th grades. This study included a racially and…

  2. Business process architectures: overview, comparison and framework

    NASA Astrophysics Data System (ADS)

    Dijkman, Remco; Vanderfeesten, Irene; Reijers, Hajo A.

    2016-02-01

    With the uptake of business process modelling in practice, the demand grows for guidelines that lead to consistent and integrated collections of process models. The notion of a business process architecture has been explicitly proposed to address this. This paper provides an overview of the prevailing approaches to design a business process architecture. Furthermore, it includes evaluations of the usability and use of the identified approaches. Finally, it presents a framework for business process architecture design that can be used to develop a concrete architecture. The use and usability were evaluated in two ways. First, a survey was conducted among 39 practitioners, in which the opinion of the practitioners on the use and usefulness of the approaches was evaluated. Second, four case studies were conducted, in which process architectures from practice were analysed to determine the approaches or elements of approaches that were used in their design. Both evaluations showed that practitioners have a preference for using approaches that are based on reference models and approaches that are based on the identification of business functions or business objects. At the same time, the evaluations showed that practitioners use these approaches in combination, rather than selecting a single approach.

  3. A Descriptive Study of a Building-Based Team Problem-Solving Process

    ERIC Educational Resources Information Center

    Brewer, Alexander B.

    2010-01-01

    The purpose of this study was to empirically evaluate Building-Based Teams for General Education Intervention or BBT for GEI. BBT for GEI is a team problem-solving process designed to assist schools in conducting research-based interventions in the general education setting. Problem-solving teams are part of general education and provide support…

  4. The Adoption Process of Ricefield-Based Fish Seed Production in Northwest Bangladesh: An Understanding through Quantitative and Qualitative Investigation

    ERIC Educational Resources Information Center

    Haque, Mohammad Mahfujul; Little, David C.; Barman, Benoy K.; Wahab, Md. Abdul

    2010-01-01

    Purpose: The purpose of the study was to understand the adoption process of ricefield based fish seed production (RBFSP) that has been developed, promoted and established in Northwest Bangladesh. Design/Methodology/Approach: Quantitative investigation based on regression analysis and qualitative investigation using semi-structured interview were…

  5. Employing Inquiry-Based Computer Simulations and Embedded Scientist Videos to Teach Challenging Climate Change and Nature of Science Concepts

    ERIC Educational Resources Information Center

    Cohen, Edward Charles

    2013-01-01

    Design based research was utilized to investigate how students use a greenhouse effect simulation in order to derive best learning practices. During this process, students recognized the authentic scientific process involving computer simulations. The simulation used is embedded within an inquiry-based technology-mediated science curriculum known…

  6. Green design assessment of electromechanical products based on group weighted-AHP

    NASA Astrophysics Data System (ADS)

    Guo, Jinwei; Zhou, MengChu; Li, Zhiwu; Xie, Huiguang

    2015-11-01

    Manufacturing industry is the backbone of a country's economy while environmental pollution is a serious problem that human beings must face today. The green design of electromechanical products based on enterprise information systems is an important method to solve the environmental problem. The question on how to design green products must be answered by excellent designers via both advanced design methods and effective assessment methods of electromechanical products. Making an objective and precise assessment of green design is one of the problems that must be solved when green design is conducted. An assessment method of green design on electromechanical products based on Group Weighted-AHP (Analytic Hierarchy Process) is proposed in this paper, together with the characteristics of green products. The assessment steps of green design are also established. The results are illustrated via the assessment of a refrigerator design.
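
    A minimal sketch of the AHP weighting step that underlies such an assessment is shown below; the criteria and the pairwise judgments are hypothetical.

```python
import numpy as np

# Pairwise comparison matrix for three green-design criteria (e.g. energy use,
# recyclability, hazardous materials); the judgments are illustrative only.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                              # priority weights of the criteria

n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)      # consistency index
cr = ci / 0.58                            # random index RI = 0.58 for n = 3
print("weights:", np.round(w, 3), "consistency ratio:", round(cr, 3))
```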

  7. Knowledge-Based Manufacturing and Structural Design for a High Speed Civil Transport

    NASA Technical Reports Server (NTRS)

    Marx, William J.; Mavris, Dimitri N.; Schrage, Daniel P.

    1994-01-01

    The aerospace industry is currently addressing the problem of integrating manufacturing and design. To address the difficulties associated with using many conventional procedural techniques and algorithms, one feasible way to integrate the two concepts is with the development of an appropriate Knowledge-Based System (KBS). The authors present their reasons for selecting a KBS to integrate design and manufacturing. A methodology for an aircraft producibility assessment is proposed, utilizing a KBS for manufacturing process selection, that addresses both procedural and heuristic aspects of designing and manufacturing of a High Speed Civil Transport (HSCT) wing. A cost model is discussed that would allow system level trades utilizing information describing the material characteristics as well as the manufacturing process selections. Statements of future work conclude the paper.

  8. COBRApy: COnstraints-Based Reconstruction and Analysis for Python.

    PubMed

    Ebrahim, Ali; Lerman, Joshua A; Palsson, Bernhard O; Hyduke, Daniel R

    2013-08-08

    COnstraint-Based Reconstruction and Analysis (COBRA) methods are widely used for genome-scale modeling of metabolic networks in both prokaryotes and eukaryotes. Due to the successes with metabolism, there is an increasing effort to apply COBRA methods to reconstruct and analyze integrated models of cellular processes. The COBRA Toolbox for MATLAB is a leading software package for genome-scale analysis of metabolism; however, it was not designed to elegantly capture the complexity inherent in integrated biological networks and lacks an integration framework for the multiomics data used in systems biology. The openCOBRA Project is a community effort to promote constraints-based research through the distribution of freely available software. Here, we describe COBRA for Python (COBRApy), a Python package that provides support for basic COBRA methods. COBRApy is designed in an object-oriented fashion that facilitates the representation of the complex biological processes of metabolism and gene expression. COBRApy does not require MATLAB to function; however, it includes an interface to the COBRA Toolbox for MATLAB to facilitate use of legacy codes. For improved performance, COBRApy includes parallel processing support for computationally intensive processes. COBRApy is an object-oriented framework designed to meet the computational challenges associated with the next generation of stoichiometric constraint-based models and high-density omics data sets. http://opencobra.sourceforge.net/
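
    A minimal usage sketch of the package is shown below; the SBML file name is a placeholder and assumes a standard genome-scale model such as the E. coli core model is available locally.

```python
# Load a genome-scale model and run flux balance analysis (FBA) with COBRApy.
from cobra.io import read_sbml_model

model = read_sbml_model("e_coli_core.xml")   # placeholder path to an SBML model
solution = model.optimize()                  # FBA on the model's default objective

print("objective value:", solution.objective_value)
print(solution.fluxes.sort_values(ascending=False).head())   # largest fluxes
```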

  9. Implementing Target Value Design.

    PubMed

    Alves, Thais da C L; Lichtig, Will; Rybkowski, Zofia K

    2017-04-01

    An alternative to the traditional way of designing projects is the process of target value design (TVD), which takes different departure points to start the design process. The TVD process starts with the client defining an allowable cost that needs to be met by the design and construction teams. An expected cost in the TVD process is defined through multiple interactions between multiple stakeholders who define wishes and others who define ways of achieving these wishes. Finally, a target cost is defined based on the expected profit the design and construction teams are expecting to make. TVD follows a series of continuous improvement efforts aimed at reaching the desired goals for the project and its associated target value cost. The process takes advantage of rapid cycles of suggestions, analyses, and implementation that starts with the definition of value for the client. In the traditional design process, the goal is to identify user preferences and find solutions that meet the needs of the client's expressed preferences. In the lean design process, the goal is to educate users about their values and advocate for a better facility over the long run; this way owners can help contractors and designers to identify better solutions. This article aims to inform the healthcare community about tools and techniques commonly used during the TVD process and how they can be used to educate and support project participants in developing better solutions to meet their needs now as well as in the future.

  10. Process Design and Economics for the Conversion of Lignocellulosic Biomass to Hydrocarbon Fuels. Thermochemical Research Pathways with In Situ and Ex Situ Upgrading of Fast Pyrolysis Vapors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dutta, Abhijit; Sahir, Asad; Tan, Eric

    This report was developed as part of the U.S. Department of Energy’s Bioenergy Technologies Office’s efforts to enable the development of technologies for the production of infrastructure-compatible, cost-competitive liquid hydrocarbon fuels from biomass. Specifically, this report details two conceptual designs based on projected product yields and quality improvements via catalyst development and process integration. It is expected that these research improvements will be made within the 2022 timeframe. The two conversion pathways detailed are (1) in situ and (2) ex situ upgrading of vapors produced from the fast pyrolysis of biomass. While the base case conceptual designs and underlying assumptions outline performance metrics for feasibility, it should be noted that these are only two of many other possibilities in this area of research. Other promising process design options emerging from the research will be considered for future techno-economic analysis.

  11. A quality by design study applied to an industrial pharmaceutical fluid bed granulation.

    PubMed

    Lourenço, Vera; Lochmann, Dirk; Reich, Gabriele; Menezes, José C; Herdling, Thorsten; Schewitz, Jens

    2012-06-01

    The pharmaceutical industry is encouraged within Quality by Design (QbD) to apply science-based manufacturing principles to assure quality not only of new but also of existing processes. This paper presents how QbD principles can be applied to an existing industrial pharmaceutical fluid bed granulation (FBG) process. A three-step approach is presented as follows: (1) implementation of Process Analytical Technology (PAT) monitoring tools at the industrial scale process, combined with multivariate data analysis (MVDA) of process and PAT data to increase the process knowledge; (2) execution of scaled-down designed experiments at a pilot scale, with adequate PAT monitoring tools, to investigate the process response to intended changes in Critical Process Parameters (CPPs); and finally (3) the definition of a process Design Space (DS) linking CPPs to Critical to Quality Attributes (CQAs), within which product quality is ensured by design, and after scale-up enabling its use at the industrial process scale. The proposed approach was developed for an existing industrial process. Through the enhanced process knowledge established, a significant reduction in the variability of product CQAs, already within quality specification ranges, was achieved by a better choice of CPP values. The results of such step-wise development and implementation are described. Copyright © 2012 Elsevier B.V. All rights reserved.

  12. Energy efficiency technologies in cement and steel industry

    NASA Astrophysics Data System (ADS)

    Zanoli, Silvia Maria; Cocchioni, Francesco; Pepe, Crescenzo

    2018-02-01

    In this paper, Advanced Process Control strategies aimed at energy efficiency achievement and improvement in the cement and steel industries are proposed. A flexible and smart control structure constituted by several functional modules and blocks has been developed. The designed control strategy is based on Model Predictive Control techniques, formulated on linear models. Two industrial control solutions have been developed, oriented to energy efficiency and process control improvement in cement industry clinker rotary kilns (clinker production phase) and in steel industry billet reheating furnaces. Tailored customization procedures for the design of ad hoc control systems have been executed, based on the specific needs and specifications of the analysed processes. The installation of the developed controllers on cement and steel plants produced significant benefits in terms of process control, which resulted in working closer to the imposed operating limits. With respect to the previous control systems, based on local controllers and/or manual operation by plant operators, more profitable configurations of the crucial process variables have been provided.

  13. The methodology of database design in organization management systems

    NASA Astrophysics Data System (ADS)

    Chudinov, I. L.; Osipova, V. V.; Bobrova, Y. V.

    2017-01-01

    The paper describes the unified methodology of database design for management information systems. Designing the conceptual information model for the domain area is the most important and labor-intensive stage in database design. Based on the proposed integrated approach to designing the conceptual information model, the main principles of developing relational databases are provided and users' information needs are considered. According to the methodology, the process of designing the conceptual information model includes three basic stages, which are defined in detail. Finally, the article describes the process of applying the results of analyzing users' information needs and the rationale for the use of classifiers.
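
    As a small illustration of the step from a conceptual model to a relational schema, the sketch below maps a toy Department-Employee model with a position classifier onto tables; the entities and attributes are hypothetical.

```python
# Translate a toy conceptual model (Department 1..N Employee, plus a position
# classifier) into relational tables with foreign keys.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE position (              -- classifier table
    position_id INTEGER PRIMARY KEY,
    title       TEXT NOT NULL UNIQUE
);
CREATE TABLE department (
    dept_id INTEGER PRIMARY KEY,
    name    TEXT NOT NULL
);
CREATE TABLE employee (
    emp_id      INTEGER PRIMARY KEY,
    full_name   TEXT NOT NULL,
    dept_id     INTEGER NOT NULL REFERENCES department(dept_id),
    position_id INTEGER NOT NULL REFERENCES position(position_id)
);
""")
print("tables:", [r[0] for r in con.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'")])
```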

  14. Program Helps Decompose Complex Design Systems

    NASA Technical Reports Server (NTRS)

    Rogers, James L., Jr.; Hall, Laura E.

    1995-01-01

    DeMAID (Design Manager's Aid for Intelligent Decomposition) computer program is knowledge-based software system for ordering sequence of modules and identifying possible multilevel structure for design problems such as large platforms in outer space. Groups modular subsystems on basis of interactions among them. Saves considerable amount of money and time in total design process, particularly in new design problem in which order of modules has not been defined. Originally written for design problems, also applicable to problems containing modules (processes) that take inputs and generate outputs. Available in three machine versions: Macintosh written in Symantec's Think C 3.01, Sun, and SGI IRIS in C language.

  15. Designing simulator-based training: an approach integrating cognitive task analysis and four-component instructional design.

    PubMed

    Tjiam, Irene M; Schout, Barbara M A; Hendrikx, Ad J M; Scherpbier, Albert J J M; Witjes, J Alfred; van Merriënboer, Jeroen J G

    2012-01-01

    Most studies of simulator-based surgical skills training have focused on the acquisition of psychomotor skills, but surgical procedures are complex tasks requiring both psychomotor and cognitive skills. As skills training is modelled on expert performance consisting partly of unconscious automatic processes that experts are not always able to explicate, simulator developers should collaborate with educational experts and physicians in developing efficient and effective training programmes. This article presents an approach to designing simulator-based skill training comprising cognitive task analysis integrated with instructional design according to the four-component instructional design model. This theory-driven approach is illustrated by a description of how it was used in the development of simulator-based training for the nephrostomy procedure.

  16. DEVELOPMENT OF A CHEMICAL PROCESS MODELING ENVIRONMENT BASED ON CAPE-OPEN INTERFACE STANDARDS AND THE MICROSOFT .NET FRAMEWORK

    EPA Science Inventory

    Chemical process simulation has long been used as a design tool in the development of chemical plants, and has long been considered a means to evaluate different design options. With the advent of large scale computer networks and interface models for program components, it is po...

  17. Supporting Interoperability and Context-Awareness in E-Learning through Situation-Driven Learning Processes

    ERIC Educational Resources Information Center

    Dietze, Stefan; Gugliotta, Alessio; Domingue, John

    2009-01-01

    Current E-Learning technologies primarily follow a data and metadata-centric paradigm by providing the learner with composite content containing the learning resources and the learning process description, usually based on specific metadata standards such as ADL SCORM or IMS Learning Design. Due to the design-time binding of learning resources,…

  18. Three-Dimensional Nanobiocomputing Architectures With Neuronal Hypercells

    DTIC Science & Technology

    2007-06-01

    Neumann architectures, and CMOS fabrication. Novel solutions of massive parallel distributed computing and processing (pipelined due to systolic... and processing platforms utilizing molecular hardware within an enabling organization and architecture. The design technology is based on utilizing a...Microsystems and Nanotechnologies investigated a novel 3D3 (Hardware Software Nanotechnology) technology to design super-high performance computing

  19. Application of binomial and multinomial probability statistics to the sampling design process of a global grain tracing and recall system

    USDA-ARS?s Scientific Manuscript database

    Small, coded, pill-sized tracers embedded in grain are proposed as a method for grain traceability. A sampling process for a grain traceability system was designed and investigated by applying probability statistics using a science-based sampling approach to collect an adequate number of tracers fo...

  20. UOE Pipe Manufacturing Process Simulation: Equipment Designing and Construction

    NASA Astrophysics Data System (ADS)

    Delistoian, Dmitri; Chirchor, Mihael

    2017-12-01

    The UOE pipe manufacturing process directly influences pipeline resilience and operation capacity. At present, the most widespread pipe manufacturing method is UOE, which is based on cold forming. A certain stress and strain level appears after each technological step. To study the pipe stress and strain state, special equipment that simulates the entire technological process was designed and constructed. The UOE pipe equipment is dedicated to manufacturing longitudinally submerged arc welded DN 400 (16 inch) steel pipe.

  1. SUSTAINABLE PLASTICS: DESIGNING AND DEMONSTRATING RENEWABLE, BIODEGRADABLE PRODUCTS MADE OF SOY PROTEIN-BASED PLASTICS

    EPA Science Inventory

    We have found that soy protein plastics have flow properties that are comparable to fossil fuel-based plastics. Soy plastics are processed at much lower temperatures, however, yielding energy savings over synthetic plastics during processing. These comparable flow properties m...

  2. Parametric Design and Mechanical Analysis of Beams based on SINOVATION

    NASA Astrophysics Data System (ADS)

    Xu, Z. G.; Shen, W. D.; Yang, D. Y.; Liu, W. M.

    2017-07-01

    In engineering practice, engineers need to carry out complicated calculations when the loads on a beam are complex. The analysis and calculation processes take a lot of time and the results are unreliable. Therefore, VS2005 and the ADK are used to develop software for beam design based on the 3D CAD software SINOVATION, using the C++ programming language. The software can realize the mechanical analysis and parameterized design of various types of beams and output the design report in HTML format. The efficiency and reliability of beam design are improved.

  3. Urban Planning by Le Corbusier According to Praxeological Knowledge

    NASA Astrophysics Data System (ADS)

    Dzwierzynska, Jolanta; Prokopska, Aleksandra

    2017-12-01

    The city is formed as a mosaic of various elements which affect its attractiveness. These elements range from location attributes, through economic opportunities, to social aspects. Therefore, urbanity and urban planning should be considered in a multi-dimensional context. In the paper we address the problem of urban planning by Le Corbusier according to praxeological and system knowledge. From a praxeological point of view, an active human being makes a choice between various possibilities by preferring one of them to the others and by manifesting it through his/her actions. The same applies to the design process. Due to this fact, the scientific design process can be treated as a systematic rational reconstruction of the designer's behaviour. Such a reconstruction requires previous reflection on the designer's work, as well as some consideration and design experience, thus know-how knowledge based on methodological knowledge. In the paper several city visions of Le Corbusier, as well as the characteristics and organisation of his design process, are analysed. Le Corbusier's innovative design ideas resulted from the changes of industrialisation and the accelerating progress of motorisation, which gave foundation to a new urban array. This array was based on strict geometric forms, regularity and repetition, which determined a standard. Thanks to his theories, Le Corbusier established principles of modern city construction and planning. Although some doubts were expressed as to the scale of centralisation of the cities designed by him and his class-based conception, he was aware that the overall welfare of the individual living in a city depended on the quality of the built environment. Therefore, his designed creations were not only functional but they also produced emotions. The analysis of his prolific design activities allows us to state that the organisation of his architectural and urban planning process was very efficient and complex. The city concepts proposed by him were the subject of analysis by generations of designers. Also now, they can still be the basis for modelling virtual and navigable cities by modern planners. Le Corbusier's comprehensive approach to modern city planning showed that research activities, that is theoretical thinking, and production activities, that is practice, are linked methodically. Therefore, urban planning should be understood not only as a projection of the possibilities of architecture, but as a multidisciplinary process. Due to this fact, an urban plan, as a result of that process, should be a synthesis of various social, industrial and economic aspects.

  4. Innovation and design of a web-based pain education interprofessional resource

    PubMed Central

    Lax, Leila; Watt-Watson, Judy; Lui, Michelle; Dubrowski, Adam; McGillion, Michael; Hunter, Judith; MacLennan, Cameron; Knickle, Kerry; Robb, Anja; Lapeyre, Jaime

    2011-01-01

    INTRODUCTION: The present article describes educational innovation processes and design of a web-based pain interprofessional resource for prelicensure health science students in universities across Canada. Operationalization of educational theory in design coupled with formative evaluation of design are discussed, along with strategies that support collaborative innovation. METHODS: Educational design was driven by content, theory and evaluation. Pain misbeliefs and teaching points along the continuum from acute to persistent pain were identified. Knowledge-building theory, situated learning, reflection and novel designs for cognitive scaffolding were then employed. Design research principles were incorporated to inform iterative and ongoing design. RESULTS: An authentic patient case was constructed, situated in inter-professional complex care to highlight learning objectives related to pre-operative, postoperative and treatment up to one year, for a surgical cancer patient. Pain mechanisms, assessment and management framed content creation. Knowledge building scaffolds were used, which included video simulations, embedded resources, concurrent feedback, practice-based reflective exercises and commentaries. Scaffolds were refined to specifically support knowledge translation. Illustrative commentaries were designed to explicate pain misbeliefs and best practices. Architecture of the resource was mapped; a multimedia, interactive prototype was created. This pain education resource was developed primarily for individual use, with extensions for interprofessional collective discourse. DISCUSSION: Translation of curricular content scripts into representation maps supported the collaborative design process by establishing a common visual language. The web-based prototype will be formatively and summatively evaluated to assess pedagogic design, knowledge-translation scaffolds, pain knowledge gains, relevance, feasibility and fidelity of this educational innovation. PMID:22184552

  5. Development of a nanosatellite de-orbiting system by reliability based design optimization

    NASA Astrophysics Data System (ADS)

    Nikbay, Melike; Acar, Pınar; Aslan, Alim Rüstem

    2015-12-01

    This paper presents design approaches to develop a reliable and efficient de-orbiting system for the 3USAT nanosatellite to provide a beneficial orbital decay process at the end of a mission. A de-orbiting system is initially designed by employing the aerodynamic drag augmentation principle where the structural constraints of the overall satellite system and the aerodynamic forces are taken into account. Next, an alternative de-orbiting system is designed with new considerations and further optimized using deterministic and reliability based design techniques. For the multi-objective design, the objectives are chosen to maximize the aerodynamic drag force through the maximization of the Kapton surface area while minimizing the de-orbiting system mass. The constraints are related in a deterministic manner to the required deployment force, the height of the solar panel hole and the deployment angle. The length and the number of layers of the deployable Kapton structure are used as optimization variables. In the second stage of this study, uncertainties related to both manufacturing and operating conditions of the deployable structure in space environment are considered. These uncertainties are then incorporated into the design process by using different probabilistic approaches such as Monte Carlo Simulation, the First-Order Reliability Method and the Second-Order Reliability Method. The reliability based design optimization seeks optimal solutions using the former design objectives and constraints with the inclusion of a reliability index. Finally, the de-orbiting system design alternatives generated by different approaches are investigated and the reliability based optimum design is found to yield the best solution since it significantly improves both system reliability and performance requirements.
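
    A minimal Monte Carlo sketch of the reliability step is given below; the limit state, distributions, and parameter values are illustrative, not the deployable-structure model from the paper.

```python
import numpy as np
from scipy.stats import norm

# Limit state g = available deployment force - required deployment force;
# failure corresponds to g < 0.
rng = np.random.default_rng(0)
n = 200_000
capacity = rng.normal(5.0, 0.6, n)       # assumed available force [N]
demand = rng.normal(3.5, 0.4, n)         # assumed required force [N]

pf = np.mean(capacity - demand < 0.0)    # Monte Carlo failure probability
beta = -norm.ppf(pf)                     # equivalent reliability index
print(f"Pf = {pf:.4f}, beta = {beta:.2f}")
```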

  6. Three-Dimensional Computational Fluid Dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haworth, D.C.; O'Rourke, P.J.; Ranganathan, R.

    1998-09-01

    Computational fluid dynamics (CFD) is one discipline falling under the broad heading of computer-aided engineering (CAE). CAE, together with computer-aided design (CAD) and computer-aided manufacturing (CAM), comprises a mathematics-based approach to engineering product and process design, analysis and fabrication. In this overview of CFD for the design engineer, our purposes are three-fold: (1) to define the scope of CFD and motivate its utility for engineering, (2) to provide a basic technical foundation for CFD, and (3) to convey how CFD is incorporated into engineering product and process design.

  7. Spaceborne computer executive routine functional design specification. Volume 1: Functional design of a flight computer executive program for the reusable shuttle

    NASA Technical Reports Server (NTRS)

    Curran, R. T.

    1971-01-01

    A flight computer functional executive design for the reusable shuttle is presented. The design is given in the form of functional flowcharts and prose description. Techniques utilized in the regulation of process flow to accomplish activation, resource allocation, suspension, termination, and error masking based on process primitives are considered. Preliminary estimates of main storage utilization by the Executive are furnished. Conclusions and recommendations for timely, effective software-hardware integration in the reusable shuttle avionics system are proposed.

  8. Evolutionary and biological metaphors for engineering design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jakiela, M.

    1994-12-31

    Since computing became generally available, there has been strong interest in using computers to assist and automate engineering design processes. Specifically, for design optimization and automation, nonlinear programming and artificial intelligence techniques have been extensively studied. New computational techniques, based upon the natural processes of evolution, adaptation, and learning, are showing promise because of their generality and robustness. This presentation will describe the use of two such techniques, genetic algorithms and classifier systems, for a variety of engineering design problems. Structural topology optimization, meshing, and general engineering optimization are shown as example applications.
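
    A compact real-coded genetic algorithm of the kind alluded to here is sketched below; the objective, bounds, and GA settings are illustrative.

```python
import numpy as np

# Minimize a toy design objective with a simple real-coded genetic algorithm:
# tournament selection, blend crossover, Gaussian mutation.
rng = np.random.default_rng(42)
lo, hi, pop_size, n_gen = -5.0, 5.0, 40, 60

def objective(x):                        # placeholder design objective
    return float(np.sum(x ** 2))

pop = rng.uniform(lo, hi, size=(pop_size, 2))
for _ in range(n_gen):
    fitness = np.array([objective(ind) for ind in pop])
    children = []
    while len(children) < pop_size:
        i, j = rng.integers(pop_size, size=2)        # tournament for parent 1
        p1 = pop[i] if fitness[i] < fitness[j] else pop[j]
        i, j = rng.integers(pop_size, size=2)        # tournament for parent 2
        p2 = pop[i] if fitness[i] < fitness[j] else pop[j]
        alpha = rng.uniform(size=2)
        child = alpha * p1 + (1 - alpha) * p2        # blend crossover
        child += rng.normal(0.0, 0.1, size=2)        # Gaussian mutation
        children.append(np.clip(child, lo, hi))
    pop = np.array(children)

best = min(pop, key=objective)
print("best design:", best, "objective:", objective(best))
```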

  9. Tribology symposium -- 1994. PD-Volume 61

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Masudi, H.

    This year marks the first Tribology Symposium within the Energy-Sources Technology Conference, sponsored by the ASME Petroleum Division. The program was divided into five sessions: Tribology in High Technology, a historical discussion of some watershed events in tribology; Research/Development, design, research and development on modern manufacturing; Tribology in Manufacturing, the impact of tribology on modern manufacturing; Design/Design Representation, aspects of design related to tribological systems; and Failure Analysis, an analysis of failure, failure detection, and failure monitoring as relating to manufacturing processes. Eleven papers have been processed separately for inclusion on the data base.

  10. Design and optimization of a fiber optic data link for new generation on-board SAR processing architectures

    NASA Astrophysics Data System (ADS)

    Ciminelli, Caterina; Dell'Olio, Francesco; Armenise, Mario N.; Iacomacci, Francesco; Pasquali, Franca; Formaro, Roberto

    2017-11-01

    A fiber optic digital link for on-board data handling is modeled, designed and optimized in this paper. Design requirements and constraints relevant to the link, which is in the frame of novel on-board processing architectures, are discussed. Two possible link configurations are investigated, showing their advantages and disadvantages. An accurate mathematical model of each link component and the entire system is reported and results of link simulation based on those models are presented. Finally, some details on the optimized design are provided.

  11. A Science and Risk-Based Pragmatic Methodology for Blend and Content Uniformity Assessment.

    PubMed

    Sayeed-Desta, Naheed; Pazhayattil, Ajay Babu; Collins, Jordan; Doshi, Chetan

    2018-04-01

    This paper describes a pragmatic approach that can be applied in assessing powder blend and unit dosage uniformity of solid dose products at Process Design, Process Performance Qualification, and Continued/Ongoing Process Verification stages of the Process Validation lifecycle. The statistically based sampling, testing, and assessment plan was developed due to the withdrawal of the FDA draft guidance for industry "Powder Blends and Finished Dosage Units-Stratified In-Process Dosage Unit Sampling and Assessment." This paper compares the proposed Grouped Area Variance Estimate (GAVE) method with an alternate approach outlining the practicality and statistical rationalization using traditional sampling and analytical methods. The approach is designed to fit solid dose processes assuring high statistical confidence in both powder blend uniformity and dosage unit uniformity during all three stages of the lifecycle complying with ASTM standards as recommended by the US FDA.

  12. Transforming paper-based assessment forms to a digital format: Exemplified by the Housing Enabler prototype app.

    PubMed

    Svarre, Tanja; Lunn, Tine Bieber Kirkegaard; Helle, Tina

    2017-11-01

    The aim of this paper is to give the reader an overall impression of the stepwise user-centred design approach, including the specific methods used and lessons learned, when transforming paper-based assessment forms into a prototype app, taking the Housing Enabler as an example. Four design iterations were performed, building on a domain study, workshops, expert evaluation, and controlled and realistic usability tests. The user-centred design process involved purposefully selected participants with different Housing Enabler knowledge and housing adaptation experience. The design iterations resulted in the development of a Housing Enabler prototype app. The prototype app has several features and options that are new compared with the original paper-based Housing Enabler assessment form. These new features include a user-friendly overview of the assessment form; easy navigation by swiping back and forth between items; onsite data analysis and ranking of the accessibility score; photo documentation; and a data export facility. Based on the presented stepwise approach, a high-fidelity Housing Enabler prototype app was successfully developed. The development process has emphasized the importance of combining design participants' knowledge and experience, and has shown that methods should seem relevant to participants in order to increase their engagement.

  13. A cultural historical theoretical perspective of discourse and design in the science classroom

    NASA Astrophysics Data System (ADS)

    Adams, Megan

    2015-06-01

    Flavio Azevedo, Peggy Martalock and Tugba Keser have initiated an important conversation in science education as they use sociocultural theory to introduce design-based scenarios into the science classroom. This response seeks to expand Azevedo, Martalock and Keser's article "The discourse of design-based science classroom activities" by using a specific perspective within a sociocultural framework. Through a cultural historical (Vygotsky in The history and development of higher mental functions, Plenum Press, New York, 1987) reading of design-based activity and discourse in the science classroom, it is proposed that learning should be an integral part of these processes. Therefore, everyday and scientific concepts are explained and expanded in relation to Inventing Graphing and the discourse presented in Azevedo, Martalock and Keser's article. This response reports on the importance of teachers being explicit in connecting everyday and scientific concepts alongside design-based activity and related science concepts when teaching students. It is argued that explicit teaching of concepts should precede analysis of discourse in the science classroom, as it is only through experience with and understanding of these processes that students have the resources to call upon to argue like practicing scientists.

  14. Stochastic model for fatigue crack size and cost effective design decisions. [for aerospace structures

    NASA Technical Reports Server (NTRS)

    Hanagud, S.; Uppaluri, B.

    1975-01-01

    This paper describes a methodology for making cost-effective fatigue design decisions. The methodology is based on a probabilistic model for the stochastic process of fatigue crack growth with time. The development of a particular model for this stochastic process is also discussed in the paper. The model is based on the assumption of continuous time and a discrete space of crack lengths. Statistical decision theory and the developed probabilistic model are used to develop a procedure for making fatigue design decisions on the basis of a minimum expected cost or risk function and reliability bounds. Selection of the initial flaw size distribution, NDT, repair threshold crack lengths, and inspection intervals is discussed.
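
    The paper assumes continuous time and discrete crack-length states; a minimal Monte Carlo sketch of that kind of discrete-state growth model is shown below. The transition rate, state count, and critical size are illustrative assumptions, not the paper's calibrated model.

      # Minimal discrete-state crack-growth simulation (illustrative rates only).
      import random

      def simulate_crack(n_states=10, rate_per_hour=0.0008, hours=10000, dt=10.0):
          """Crack-length index 0..n_states-1; constant hazard of advancing one state per step."""
          state, t = 0, 0.0
          while t < hours and state < n_states - 1:
              if random.random() < rate_per_hour * dt:   # probability of advancing one state in dt
                  state += 1
              t += dt
          return state

      # Monte Carlo estimate of the probability the crack reaches a critical size.
      critical = 8
      runs = [simulate_crack() for _ in range(2000)]
      print("P(crack >= critical):", sum(s >= critical for s in runs) / len(runs))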

  15. The development of a specialized processor for a space-based multispectral earth imager

    NASA Astrophysics Data System (ADS)

    Khedr, Mostafa E.

    2008-10-01

    This work was done in the Department of Computer Engineering, Lvov Polytechnic National University, Lvov, Ukraine, as a thesis entitled "Space Imager Computer System for Raw Video Data Processing" [1]. This work describes the synthesis and practical implementation of a specialized computer system for raw data control and processing onboard a satellite multispectral earth imager. The computer system is intended for satellites with resolution in the range of one meter and 12-bit precision. The design is based mostly on general off-the-shelf components such as FPGAs, plus custom-designed software for interfacing with a PC and test equipment. The designed system was successfully manufactured and is now fully functional in orbit.

  16. Design architecture of double spiral interdigitated electrode with back gate electrode for biosensor application

    NASA Astrophysics Data System (ADS)

    Fathil, M. F. M.; Arshad, M. K. Md.; Hashim, U.; Ruslinda, A. R.; Gopinath, Subash C. B.; M. Nuzaihan M., N.; Ayub, R. M.; Adzhri, R.; Zaki, M.; Azman, A. H.

    2016-07-01

    This paper presents the preparation of the photolithography chrome mask designs used in the fabrication process of a double spiral interdigitated electrode biosensor with back-gate biasing. Based on the fabrication process flow of the biosensor, the chrome masks are designed by drawing them in AutoCAD software. The overall width and length of the device are optimized at 7.0 mm and 10.0 mm, respectively. Fabrication of the biosensor requires three chrome masks, covering back gate opening, spiral IDE formation, and passivation area formation. The completed chrome mask designs will be sent for chrome mask fabrication and future use in biosensor fabrication.

  17. An Interactive Preliminary Design System of High Speed Forebody and Inlet Flows

    NASA Technical Reports Server (NTRS)

    Liou, May-Fun; Benson, Thomas J.; Trefny, Charles J.

    2010-01-01

    This paper demonstrates a simulation-based aerodynamic design process for a high speed inlet. A genetic algorithm is integrated into the design process to perform single-objective optimization. The objective function is the total pressure recovery, obtained with a PNS solver chosen for its computational efficiency. The system uses existing software for geometry definition, mesh generation, and CFD analysis. The process, which produces increasingly desirable designs over many generations of genetic evolution, is carried out automatically. A generic two-dimensional inlet is created as a showcase to demonstrate the capabilities of this tool. A parametric study of the geometric shape and size of the showcase is also presented.
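
    A sketch of the automated evaluation loop described above (geometry definition, mesh generation, flow solution, objective evaluation) is shown below, with every external tool stubbed by a placeholder function and a simple random search standing in for the genetic algorithm; all function names, parameter ranges, and the fake recovery surface are assumptions for illustration.

      # Stubbed design-evaluation pipeline: geometry -> mesh -> flow -> total pressure recovery.
      import random

      def build_geometry(params):          # placeholder for the geometry-definition tool
          return {"ramp_angles": params}

      def generate_mesh(geom):             # placeholder for the mesh generator
          return {"cells": 10000, "geom": geom}

      def solve_flow(mesh):                # placeholder for the PNS solver
          a1, a2 = mesh["geom"]["ramp_angles"]
          return 0.9 - 0.01 * (a1 - 8.0) ** 2 - 0.005 * (a2 - 12.0) ** 2   # fake recovery surface

      def evaluate(params):
          return solve_flow(generate_mesh(build_geometry(params)))

      best = max(([random.uniform(4, 12), random.uniform(8, 16)] for _ in range(200)), key=evaluate)
      print("best ramp angles:", best, "recovery:", round(evaluate(best), 4))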

  18. Designers as Teachers and Learners: Transferring Workplace Design Practice into Educational Settings

    ERIC Educational Resources Information Center

    Mawson, B.

    2007-01-01

    The nature of the design process and how to develop this skill in novice designers has been of considerable interest to technology educators. The relationship between workplace and school-based design is one area in which a need for further research has been identified by Hill and Anning (2001, "International Journal of Technology and Design…

  19. Rectenna System Design. [energy conversion solar power satellites

    NASA Technical Reports Server (NTRS)

    Woodcock, G. R.; Andryczyk, R. W.

    1980-01-01

    The fundamental processes involved in the operation of the rectenna system designed for the solar power satellite system are described. The basic design choices are presented, based on the desired microwave RF field concentration prior to rectification and on the ground clearance requirements for the rectenna structure. A nonconcentrating inclined planar panel configuration with a 2 meter minimum clearance is selected as representative of the typical rectenna.

  20. Comparing problem-based learning and lecture as methods to teach whole-systems design to engineering students

    NASA Astrophysics Data System (ADS)

    Dukes, Michael Dickey

    The objective of this research is to compare problem-based learning and lecture as methods to teach whole-systems design to engineering students. A case study (Appendix A) exemplifying successful whole-systems design was developed and written by the author in partnership with the Rocky Mountain Institute. Concepts to be tested were then determined, and a questionnaire was developed to test students' preconceptions. A control group of students was taught using traditional lecture methods, and a sample group of students was taught using problem-based learning methods. After several weeks, the students were given the same questionnaire as before the instruction, and the data were analyzed to determine whether the teaching methods were effective in correcting misconceptions. A statistically significant change in the students' preconceptions was observed in both groups on the topic of cost related to the design process. There was no statistically significant change in the students' preconceptions concerning the design process, technical ability within five years, and the possibility of drastic efficiency gains with current technologies. However, the results were inconclusive in determining that problem-based learning is more effective than lecture as a method for teaching the concept of whole-systems design, or vice versa.

  1. CFD-based optimization in plastics extrusion

    NASA Astrophysics Data System (ADS)

    Eusterholz, Sebastian; Elgeti, Stefanie

    2018-05-01

    This paper presents novel ideas in the numerical design of mixing elements in single-screw extruders. The actual design process is reformulated as a shape optimization problem, given some functional, but possibly inefficient, initial design. In this way, automatic optimization can be incorporated and the design process is advanced beyond the simulation-supported, but still experience-based, approach. This paper proposes concepts to extend a method that has been developed and validated for die design to the design of mixing elements. For simplicity, it focuses on single-phase flows only. The developed method conducts forward simulations to predict the quasi-steady melt behavior in the relevant part of the extruder. The result of each simulation is used in a black-box optimization procedure based on an efficient low-order parameterization of the geometry. To minimize user interaction, an objective function is formulated that quantifies the product's quality based on the forward simulation. This paper covers two aspects: (1) it reviews the set-up of the optimization framework as discussed in [1], and (2) it details the necessary extensions for the optimization of mixing elements in single-screw extruders. It concludes with a presentation of first advances in the unsteady flow simulation of a metering and mixing section with the SSMUM [2] using the Carreau material model.
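
    The shape-optimization loop can be sketched as below: a low-order geometry parameterization is fed to a (stubbed) forward flow simulation and searched by a derivative-free optimizer. The stub objective, the parameter bounds, and the use of Nelder-Mead are assumptions standing in for the authors' actual forward simulation and optimizer.

      # Black-box shape-optimization sketch with a stubbed forward simulation.
      import numpy as np
      from scipy.optimize import minimize

      def forward_simulation(shape_params):
          # Placeholder for the quasi-steady melt-flow simulation of the mixing element;
          # the real objective would quantify mixing quality of the product.
          target = np.array([0.3, 0.7, 0.5])            # fictitious "well-mixing" parameter set
          return float(np.sum((shape_params - target) ** 2))

      def objective(shape_params):
          return forward_simulation(np.clip(shape_params, 0.0, 1.0))   # keep geometry feasible

      result = minimize(objective, x0=np.array([0.5, 0.5, 0.5]), method="Nelder-Mead")
      print("optimized shape parameters:", result.x)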

  2. Distributed Computer Networks in Support of Complex Group Practices

    PubMed Central

    Wess, Bernard P.

    1978-01-01

    The economics of medical computer networks are presented in the context of the patient care and administrative goals of medical networks. Design alternatives and network topologies are discussed, with an emphasis on medical network design requirements in distributed database design, telecommunications, satellite systems, and software engineering. The success of medical computer networking technology is predicated on the ability of medical and data processing professionals to design comprehensive, efficient, and virtually impenetrable security systems to protect databases, network access and services, and patient confidentiality.

  3. Design of an embedded inverse-feedforward biomolecular tracking controller for enzymatic reaction processes.

    PubMed

    Foo, Mathias; Kim, Jongrae; Sawlekar, Rucha; Bates, Declan G

    2017-04-06

    Feedback control is widely used in chemical engineering to improve the performance and robustness of chemical processes. Feedback controllers require a 'subtractor' that is able to compute the error between the process output and the reference signal. In the case of embedded biomolecular control circuits, subtractors designed using standard chemical reaction network theory can only realise one-sided subtraction, rendering standard controller design approaches inadequate. Here, we show how a biomolecular controller that allows tracking of required changes in the outputs of enzymatic reaction processes can be designed and implemented within the framework of chemical reaction network theory. The controller architecture employs an inversion-based feedforward controller that compensates for the limitations of the one-sided subtractor that generates the error signals for a feedback controller. The proposed approach requires significantly fewer chemical reactions to implement than alternative designs, and should have wide applicability throughout the fields of synthetic biology and biological engineering.
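
    A toy discrete-time illustration of the control idea is sketched below: the error element can only produce max(0, reference - output) (one-sided subtraction), so an inversion-based feedforward term supplies the remaining control action. The first-order plant, gains, and step size are assumptions for illustration, not the chemical reaction network implementation of the paper.

      # One-sided subtractor plus inverse feedforward acting on a first-order stand-in plant.
      def simulate(reference=1.0, steps=200, dt=0.05, a=1.0, b=1.0, kp=2.0):
          y, history = 0.0, []
          u_ff = a * reference / b                      # feedforward from inverting y' = -a*y + b*u at steady state
          for _ in range(steps):
              e_plus = max(0.0, reference - y)          # one-sided subtractor output
              u = u_ff + kp * e_plus                    # feedforward + proportional feedback
              y += dt * (-a * y + b * u)                # first-order enzymatic-process stand-in
              history.append(y)
          return history

      print("final output:", round(simulate()[-1], 4))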

  4. CHAM: weak signals detection through a new multivariate algorithm for process control

    NASA Astrophysics Data System (ADS)

    Bergeret, François; Soual, Carole; Le Gratiet, B.

    2016-10-01

    Derivative technologies based on core CMOS processes are significantly aggressive in terms of design rules and process control requirements. The process control plan is derived from Process Assumption (PA) calculations, which result in design rules based on known process variability capabilities, with enough margin built in to be safe not only for yield but especially for reliability. Even though process assumptions are calculated with a known 4 sigma process capability margin, efficient and competitive designs challenge the process, especially for derivative technologies at the 40 and 28 nm nodes. For wafer fab process control, PAs are translated into univariate control charts (layer1 CD, layer2 CD, layer2-to-layer1 overlay, layer3 CD, etc.) with appropriate specifications and control limits, which together secure the silicon. This has worked well so far, but such a system is not very sensitive to weak signals arising from interactions between multiple key parameters (for example, a high layer2 CD combined with a high layer3 CD). CHAM is a software package using an advanced statistical algorithm specifically designed to detect small signals, especially when there are many parameters to control and when the parameters can interact to create yield issues. In this presentation we first present the CHAM algorithm, then a case study on critical dimensions with its results, and we conclude with future work. This partnership between Ippon and STM is part of E450LMDAP, a European project dedicated to metrology and lithography development for future technology nodes, especially 10 nm.
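
    The CHAM algorithm itself is not public in this record. As a generic illustration of how joint monitoring can flag parameter combinations that individual control charts would pass, the sketch below computes a Hotelling T^2 statistic for two correlated CD parameters; the reference distribution and sample values are invented.

      # Generic multivariate monitoring sketch (Hotelling T^2), not the CHAM algorithm.
      import numpy as np

      rng = np.random.default_rng(0)
      reference = rng.multivariate_normal([40.0, 40.0], [[1.0, 0.6], [0.6, 1.0]], size=500)  # in-control CDs (nm)
      mean = reference.mean(axis=0)
      cov_inv = np.linalg.inv(np.cov(reference, rowvar=False))

      def t2(sample):
          d = np.asarray(sample) - mean
          return float(d @ cov_inv @ d)

      # Each CD is ~1.8 sigma from target (passes its own chart), but the combination is unusual
      # because the two layers normally move together.
      print("T^2 of (41.8, 38.2):", round(t2([41.8, 38.2]), 1), "(typical UCL for 2 variables is roughly 10-13)")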

  5. Transforming nanomedicine manufacturing toward Quality by Design and microfluidics.

    PubMed

    Colombo, Stefano; Beck-Broichsitter, Moritz; Bøtker, Johan Peter; Malmsten, Martin; Rantanen, Jukka; Bohr, Adam

    2018-04-05

    Nanopharmaceuticals aim at translating the unique features of nano-scale materials into therapeutic products and consequently their development relies critically on the progression in manufacturing technology to allow scalable processes complying with process economy and quality assurance. The relatively high failure rate in translational nanopharmaceutical research and development, with respect to new products on the market, is at least partly due to immature bottom-up manufacturing development and resulting sub-optimal control of quality attributes in nanopharmaceuticals. Recently, quality-oriented manufacturing of pharmaceuticals has undergone an unprecedented change toward process and product development interaction. In this context, Quality by Design (QbD) aims to integrate product and process development resulting in an increased number of product applications to regulatory agencies and stronger proprietary defense strategies of process-based products. Although QbD can be applied to essentially any production approach, microfluidic production offers particular opportunities for QbD-based manufacturing of nanopharmaceuticals. Microfluidics provides unique design flexibility, process control and parameter predictability, and also offers ample opportunities for modular production setups, allowing process feedback for continuously operating production and process control. The present review aims at outlining emerging opportunities in the synergistic implementation of QbD strategies and microfluidic production in contemporary development and manufacturing of nanopharmaceuticals. In doing so, aspects of design and development, but also technology management, are reviewed, as is the strategic role of these tools for aligning nanopharmaceutical innovation, development, and advanced industrialization in the broader pharmaceutical field. Copyright © 2018 Elsevier B.V. All rights reserved.

  6. Harnessing the Big Data Paradigm for ICME: Shifting from Materials Selection to Materials Enabled Design

    NASA Astrophysics Data System (ADS)

    Broderick, Scott R.; Santhanam, Ganesh Ram; Rajan, Krishna

    2016-08-01

    As the size of databases has significantly increased, whether through high throughput computation or through informatics-based modeling, the challenge of selecting the optimal material for specific design requirements has also arisen. Given the multiple, and often conflicting, design requirements, this selection process is not as trivial as sorting the database for a given property value. We suggest that the materials selection process should minimize selector bias, as well as take data uncertainty into account. For this reason, we discuss and apply decision theory for identifying chemical additions to Ni-base alloys. We demonstrate and compare results for both a computational array of chemistries and standard commercial superalloys. We demonstrate how we can use decision theory to select the best chemical additions for enhancing both property and processing, which would not otherwise be easily identifiable. This work is one of the first examples of introducing the mathematical framework of set theory and decision analysis into the domain of the materials selection process.
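
    The paper's decision-theoretic procedure is not reproduced here; as a hedged illustration of selection under conflicting requirements and uncertainty, the sketch below keeps only Pareto-optimal candidates (maximize strength, minimize cost) and then ranks them with an uncertainty-penalized score. All property values and the scoring rule are fictitious.

      # Minimal Pareto-filter + uncertainty-penalized ranking for materials selection.
      candidates = {
          "alloy_A": {"strength": 950, "cost": 42, "strength_sigma": 25},
          "alloy_B": {"strength": 900, "cost": 35, "strength_sigma": 10},
          "alloy_C": {"strength": 870, "cost": 55, "strength_sigma": 15},   # dominated by A and B
      }

      def dominates(p, q):
          return (p["strength"] >= q["strength"] and p["cost"] <= q["cost"]
                  and (p["strength"] > q["strength"] or p["cost"] < q["cost"]))

      pareto = {name: p for name, p in candidates.items()
                if not any(dominates(q, p) for other, q in candidates.items() if other != name)}

      def score(p, k=2.0):                       # penalize uncertain strength estimates
          return (p["strength"] - k * p["strength_sigma"]) / p["cost"]

      print(sorted(pareto, key=lambda n: score(pareto[n]), reverse=True))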

  7. Multiprocessor graphics computation and display using transputers

    NASA Technical Reports Server (NTRS)

    Ellis, Graham K.

    1988-01-01

    A package of two-dimensional graphics routines was developed to run on a transputer-based parallel processing system. These routines were designed to enable applications programmers to easily generate and display results from the transputer network in a graphic format. The graphics procedures were designed to keep network communication overhead as low as possible for increased performance, and to be easy to use, presenting an intuitive approach to generating graphics on the transputer parallel processing system.

  8. Knowledge Assisted Integrated Design of a Component and Its Manufacturing Process

    NASA Astrophysics Data System (ADS)

    Gautham, B. P.; Kulkarni, Nagesh; Khan, Danish; Zagade, Pramod; Reddy, Sreedhar; Uppaluri, Rohith

    Integrated design of a product and its manufacturing processes can significantly reduce the total cost of the product as well as the cost of its development. However, this is only possible with a platform that allows the simulation tools used for product design, performance evaluation, and manufacturing processes to be linked together in a closed loop. In addition, a comprehensive knowledge base that provides systematic, knowledge-guided assistance to product or process designers, who may not possess in-depth design knowledge or in-depth knowledge of the simulation tools, would significantly speed up the end-to-end design process. In this paper, we propose a process, and illustrate a case, for achieving integrated product and manufacturing process design assisted by knowledge support that helps the user make decisions at various stages. We take transmission component design as an example. The example illustrates the design of a gear, covering its geometry, material selection, and manufacturing processes, particularly carburizing-quenching and tempering, and feeds the material properties predicted during heat treatment into performance estimation in a closed loop. It also identifies and illustrates various decision stages in the integrated life cycle and discusses the use of knowledge engineering tools, such as rule-based guidance, to assist the designer in making informed decisions. Simulation tools developed on various commercial and open-source platforms, as well as in-house tools, are linked together with knowledge engineering tools to build a framework with appropriate navigation through user-friendly interfaces. This is illustrated through examples in this paper.
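
    A minimal sketch of the rule-based guidance idea mentioned above is shown below: declarative rules inspect the current design state and return recommendations to the designer. The rules, thresholds, and state fields are invented for illustration, not the knowledge base described in the paper.

      # Minimal rule-based guidance sketch (illustrative rules and thresholds only).
      RULES = [
          (lambda d: d["surface_hardness_hrc"] < 58,
           "Increase carburizing time or carbon potential to reach the target case hardness."),
          (lambda d: d["core_hardness_hrc"] > 45,
           "Lower the quench severity or raise the tempering temperature to avoid a brittle core."),
          (lambda d: d["bending_stress_mpa"] > 0.6 * d["fatigue_limit_mpa"],
           "Increase the gear module or face width to reduce root bending stress."),
      ]

      def advise(design_state):
          return [msg for condition, msg in RULES if condition(design_state)]

      state = {"surface_hardness_hrc": 55, "core_hardness_hrc": 38,
               "bending_stress_mpa": 420, "fatigue_limit_mpa": 600}
      for recommendation in advise(state):
          print("-", recommendation)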

  9. Optimization of Surfactant Mixtures and Their Interfacial Behavior for Advanced Oil Recovery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Somasundaran, Prof. P.

    2002-03-04

    The objective of this project was to develop a knowledge base helpful for the design of improved processes for mobilizing and producing oil left untapped by conventional techniques. The main goal was to develop and evaluate mixtures of new or modified surfactants for improved oil recovery. In this regard, the interfacial properties of novel biodegradable n-alkyl pyrrolidones and sugar-based surfactants have been studied systematically. Emphasis was on designing cost-effective processes compatible with existing conditions and operations, in addition to ensuring minimal reagent loss.

  10. [Discussion on research thinking of traditional Chinese medicine standardization system based on whole process quality control].

    PubMed

    Dong, Ling; Sun, Yu; Pei, Wen-Xuan; Dai, Jun-Dong; Wang, Zi-Yu; Pan, Meng; Chen, Jiang-Peng; Wang, Yun

    2017-12-01

    The concept of "Quality by design" indicates that good design for the whole life cycle of pharmaceutical production enables the drug to meet the expected quality requirements. Aiming at the existing problems of the traditional Chinese medicine (TCM) industry, the TCM standardization system was put forward in this paper from the national strategic level, under the guidance by the idea of quality control in international manufacturing industry and with considerations of TCM industry's own characteristics and development status. The connotation of this strategy was to establish five interrelated systems: multi-indicators system based on tri-indicators system, quality standard and specification system of TCM herbal materials and decoction pieces, quality traceability system, data monitoring system based on whole-process quality control, and whole-process quality management system of TCM, and achieve the whole process systematic and scientific study in TCM industry through "top-level design-implement in steps-system integration" workflow. This article analyzed the correlation between the quality standards of all links, established standard operating procedures of each link and whole process, and constructed a high standard overall quality management system for TCM industry chains, in order to provide a demonstration for the establishment of TCM whole-process quality control system and provide systematic reference and basis for standardization strategy in TCM industry. Copyright© by the Chinese Pharmaceutical Association.

  11. Conceptual design for aerospace vehicles

    NASA Technical Reports Server (NTRS)

    Gratzer, Louis B.

    1989-01-01

    The designers of aircraft and, more recently, aerospace vehicles have always struggled with the problem of evolving their designs to produce a machine that would perform its assigned task(s) in some optimum fashion. Almost invariably this involved dealing with more variables and constraints than could be handled in any computationally feasible way. With the advent of the electronic digital computer, the possibility of introducing more variables and constraints into the initial design process led to greater expectations for improvement in vehicle (system) efficiency. The creation of the large scale systems necessary to achieve optimum designs has, for many reasons, proved to be difficult. From a technical standpoint, significant problems arise in the development of satisfactory algorithms for processing data from the various technical disciplines in a way that is compatible with the complex optimization function. Also, the creation of effective optimization routines for multi-variable, constrained problems that lead to consistent results has lagged. The current capability for carrying out the conceptual design of an aircraft on an interdisciplinary basis was evaluated to determine the need for extending this capability and, if necessary, to recommend means by which this could be carried out. Based on a review of available documentation and individual consultations, it appears that there is extensive interest at Langley Research Center, as well as in the aerospace community, in providing a higher level of capability that meets the technical challenges. By implication, the current design capability is inadequate and does not operate in a way that allows the various technical disciplines to participate and cooperatively interact in the design process. Based on this assessment, it was concluded that substantial effort should be devoted to developing a computer-based conceptual design system that would provide the capability needed for the near term as well as a framework for the development of more advanced methods to serve future needs.

  12. Complete all-optical processing polarization-based binary logic gates and optical processors.

    PubMed

    Zaghloul, Y A; Zaghloul, A R M

    2006-10-16

    We present a complete all-optical-processing polarization-based binary-logic system, by which any logic gate or processor can be implemented. Following the new polarization-based logic presented in [Opt. Express 14, 7253 (2006)], we develop a new parallel processing technique that allows for the creation of all-optical-processing gates that produce a unique output, either logic 1 or 0, only once in a truth table, and those that do not. This representation allows for the implementation of simple unforced OR, AND, XOR, XNOR, inverter, and, more importantly, NAND and NOR gates that can be used independently to represent any Boolean expression or function. In addition, the concept of a generalized gate is presented, which opens the door for reconfigurable optical processors and programmable optical logic gates. Furthermore, the new design is completely compatible with the earlier one presented in [Opt. Express 14, 7253 (2006)] and with current semiconductor-based devices. The gates can be cascaded, where the information is always on the laser beam. The polarization of the beam, and not its intensity, carries the information. The new methodology allows for the creation of multiple-input-multiple-output processors that implement, by themselves, any Boolean function, such as specialized or non-specialized microprocessors. Three all-optical architectures are presented: an orthoparallel optical logic architecture for all known and unknown binary gates, a single-branch architecture for only XOR and XNOR gates, and the railroad (RR) architecture for polarization optical processors (POP). All the control inputs are applied simultaneously, leading to a single time lag and hence a very fast, glitch-immune POP. A simple and easy-to-follow step-by-step algorithm is provided for the POP, and design reduction methodologies are briefly discussed. The algorithm lends itself systematically to software programming and computer-assisted design. As examples, designs of all binary gates, multiple-input gates, and sequential and non-sequential Boolean expressions are presented and discussed. The operation of each design can be understood simply as a bullet train traveling at the speed of light on a railroad system preconditioned by the crossover states predetermined by the control inputs. The presented designs allow for optical processing of the information, eliminating the need to convert it back and forth to an electronic signal for processing purposes. All gates with a truth table, including, for example, Fredkin, Toffoli, testable reversible logic, and threshold logic gates, can be designed and implemented using the railroad architecture. That includes any future gates not known today. Those designs and the quantum gates are not discussed in this paper.
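
    The gate architectures above are not reproduced here; as a small hedged illustration of the encoding idea only, the sketch below represents logic 0 and 1 as horizontal and vertical polarization in Jones-vector form, with a half-wave plate whose fast axis is at 45 degrees swapping them and so acting as an inverter. The decoding rule is an assumption for illustration.

      # Jones-calculus illustration of polarization-encoded bits (not the paper's architectures).
      import numpy as np

      H = np.array([1.0, 0.0])          # horizontal polarization -> logic 0
      V = np.array([0.0, 1.0])          # vertical polarization   -> logic 1
      HWP_45 = np.array([[0.0, 1.0],
                         [1.0, 0.0]])   # half-wave plate, fast axis at 45 deg (global phase dropped)

      def decode(jones):
          return 0 if abs(jones[0]) > abs(jones[1]) else 1

      for bit, state in [(0, H), (1, V)]:
          print(f"NOT {bit} ->", decode(HWP_45 @ state))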

  13. Process Design and Economics for the Conversion of Lignocellulosic Biomass to Hydrocarbon Fuels: Thermochemical Research Pathways with In Situ and Ex Situ Upgrading of Fast Pyrolysis Vapors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dutta, Abhijit; Sahir, A. H.; Tan, Eric

    This report was developed as part of the U.S. Department of Energy's Bioenergy Technologies Office's efforts to enable the development of technologies for the production of infrastructure-compatible, cost-competitive liquid hydrocarbon fuels from biomass. Specifically, this report details two conceptual designs based on projected product yields and quality improvements via catalyst development and process integration. It is expected that these research improvements will be made within the 2022 timeframe. The two conversion pathways detailed are (1) in situ and (2) ex situ upgrading of vapors produced from the fast pyrolysis of biomass. While the base case conceptual designs and underlying assumptions outline performance metrics for feasibility, it should be noted that these are only two of many other possibilities in this area of research. Other promising process design options emerging from the research will be considered for future techno-economic analysis. Both the in situ and ex situ conceptual designs, using the underlying assumptions, project MFSPs of approximately $3.5/gallon gasoline equivalent (GGE). The performance assumptions for the ex situ process were more aggressive, with higher distillate (diesel-range) products. This was based on the assumption that more favorable reaction chemistry (such as coupling) can be made possible in a separate reactor where, unlike in an in situ upgrading reactor, one does not have to deal with catalyst mixing with biomass char and ash, which pose challenges to catalyst performance and maintenance. Natural gas was used for hydrogen production, but only when off-gases from the process were not sufficient to meet the needs; natural gas consumption is insignificant in both the in situ and ex situ base cases. Heat produced from burning char, coke, and off-gases allows for the production of surplus electricity, which is sold to the grid, allowing a reduction of approximately 5¢/GGE in the MFSP.
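
    A full techno-economic analysis like the one above relies on a discounted-cash-flow model; purely as a back-of-envelope illustration of how a minimum fuel selling price scales, the sketch below annualizes capital with a fixed charge factor and subtracts an electricity co-product credit. Every number here is an illustrative assumption, not a value from the report.

      # Back-of-envelope minimum fuel selling price (MFSP) sketch; illustrative inputs only.
      def mfsp_per_gge(total_capital, fixed_charge_factor, annual_opex,
                       electricity_credit, annual_gge):
          annualized_capital = total_capital * fixed_charge_factor
          return (annualized_capital + annual_opex - electricity_credit) / annual_gge

      print(round(mfsp_per_gge(total_capital=700e6, fixed_charge_factor=0.12,
                               annual_opex=90e6, electricity_credit=8e6,
                               annual_gge=47e6), 2), "$/GGE")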

  14. Internet MEMS design tools based on component technology

    NASA Astrophysics Data System (ADS)

    Brueck, Rainer; Schumer, Christian

    1999-03-01

    The micro electromechanical systems (MEMS) industry in Europe is characterized by small and medium sized enterprises specialized in products that solve problems in specific domains like medicine, automotive sensor technology, etc. In this field of business the technology-driven design approach known from microelectronics is not appropriate. Instead, each design problem calls for its own specific technology to be used for the solution. The variety of technologies at hand, like Si-surface, Si-bulk, LIGA, laser, and precision engineering, requires a huge set of different design tools to be available. No single SME can afford to hold licenses for all these tools. This calls for a new and flexible way of designing, implementing and distributing design software. The Internet provides a flexible manner of offering software access along with methodologies for flexible licensing, e.g. on a pay-per-use basis. New communication technologies like ADSL, TV cable or satellites as carriers promise to offer a bandwidth sufficient even for interactive tools with graphical interfaces in the near future. INTERLIDO is an experimental tool suite for process specification and layout verification for lithography-based MEMS technologies, to be accessed via the Internet. The first version provides a Java implementation, including a graphical editor for process specification. Currently, a new version is being brought into operation that is based on JavaBeans component technology. JavaBeans offers the possibility to realize independent interactive design assistants, like a design rule checking assistant, a process consistency checking assistant, a technology definition assistant, a graphical editor assistant, etc., that may reside distributed over the Internet, communicating via Internet protocols. Each potential user is thus able to configure his own dedicated version of a design tool set, tailored to the requirements of the current problem to be solved.

  15. Prediction of Cutting Force in Turning Process-an Experimental Approach

    NASA Astrophysics Data System (ADS)

    Thangarasu, S. K.; Shankar, S.; Thomas, A. Tony; Sridhar, G.

    2018-02-01

    This paper deals with the prediction of cutting forces in a turning process. The turning process with an advanced cutting tool has several advantages over grinding, such as short cycle time, process flexibility, comparable surface roughness, high material removal rate, and fewer environmental problems because no cutting fluid is used. In this work, a full-bridge dynamometer was used to measure the cutting forces on a mild steel workpiece machined with a cemented carbide insert tool for different combinations of cutting speed, feed rate, and depth of cut. The experiments were planned based on a Taguchi design, and the measured cutting forces were compared with the predicted forces in order to validate the feasibility of the proposed design. The percentage contribution of each process parameter was analyzed using analysis of variance (ANOVA). Results from both the lathe tool dynamometer and the designed full-bridge dynamometer were analyzed using the Taguchi design of experiments and ANOVA.
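
    A minimal sketch of the percentage-contribution calculation used in such Taguchi/ANOVA studies is shown below, computed from sums of squares over a toy full-factorial data set. The force values and level effects are invented for illustration; the paper's own data came from the dynamometer experiments.

      # Taguchi-style percentage contribution of each factor from sums of squares.
      from itertools import product

      # Full-factorial toy data: cutting force (N) indexed by (speed, feed, depth) level.
      levels = [0, 1, 2]
      force = {(s, f, d): 200 + 60 * d + 25 * f - 10 * s + 5 * ((s + f + d) % 2)
               for s, f, d in product(levels, repeat=3)}

      grand_mean = sum(force.values()) / len(force)
      ss_total = sum((y - grand_mean) ** 2 for y in force.values())

      def ss_factor(axis):
          ss = 0.0
          for lvl in levels:
              vals = [y for key, y in force.items() if key[axis] == lvl]
              ss += len(vals) * (sum(vals) / len(vals) - grand_mean) ** 2
          return ss

      for name, axis in [("speed", 0), ("feed", 1), ("depth of cut", 2)]:
          print(f"{name:13s} contribution: {100 * ss_factor(axis) / ss_total:5.1f} %")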

  16. A Four-Stage Model for Planning Computer-Based Instruction.

    ERIC Educational Resources Information Center

    Morrison, Gary R.; Ross, Steven M.

    1988-01-01

    Describes a flexible planning process for developing computer-based instruction (CBI) in which the CBI design is implemented on paper between the lesson design and the program production. A four-stage model is explained, including (1) an initial flowchart, (2) storyboards, (3) a detailed flowchart, and (4) an evaluation. (16 references)…

  17. Design Process of a Goal-Based Scenario on Computing Fundamentals

    ERIC Educational Resources Information Center

    Beriswill, Joanne Elizabeth

    2014-01-01

    In this design case, an instructor developed a goal-based scenario (GBS) for undergraduate computer fundamentals students to apply their knowledge of computer equipment and software. The GBS, entitled the MegaTech Project, presented the students with descriptions of the everyday activities of four persons needing to purchase a computer system. The…

  18. Assessing Higher-Order Cognitive Constructs by Using an Information-Processing Framework

    ERIC Educational Resources Information Center

    Dickison, Philip; Luo, Xiao; Kim, Doyoung; Woo, Ada; Muntean, William; Bergstrom, Betty

    2016-01-01

    Designing a theory-based assessment with sound psychometric qualities to measure a higher-order cognitive construct is a highly desired yet challenging task for many practitioners. This paper proposes a framework for designing a theory-based assessment to measure a higher-order cognitive construct. This framework results in a modularized yet…

  19. Bridging Professional Teacher Knowledge for Science and Literary Integration via Design-Based Research

    ERIC Educational Resources Information Center

    Fazio, Xavier; Gallagher, Tiffany L.

    2018-01-01

    We offer insights for using design-based research (DBR) as a model for constructing professional development that supports curriculum and instructional knowledge regarding science and literacy integration. We spotlight experiences in the DBR process from data collected from a sample of four elementary teachers. Findings from interviews, focus…

  20. An Evidence-Based Practice Model across the Academic and Clinical Settings

    ERIC Educational Resources Information Center

    Wolter, Julie A.; Corbin-Lewis, Kim; Self, Trisha; Elsweiler, Anne

    2011-01-01

    This tutorial is designed to provide academic communication sciences and disorders (CSD) programs, at both the undergraduate and graduate levels, with a comprehensive instructional model on evidence-based practice (EBP). The model was designed to help students view EBP as an ongoing process needed in all clinical decision making. The three facets…
