The expected results include an integrated process and mechanical design, including a fabrication plan for the glycerol dehydration reactor, a comprehensive heat and material balance, an environmental impact assessment, and a comprehensive safety review. The resulting process design w...
Process and assembly plans for low cost commercial fuselage structure
NASA Technical Reports Server (NTRS)
Willden, Kurtis; Metschan, Stephen; Starkey, Val
1991-01-01
Cost and weight reduction for a composite structure is a result of selecting design concepts that can be built using efficient low cost manufacturing and assembly processes. Since design and manufacturing are inherently cost dependent, concurrent engineering in the form of a Design-Build Team (DBT) is essential for low cost designs. Detailed cost analysis from DBT designs and hardware verification must be performed to identify the cost drivers and relationships between design and manufacturing processes. Results from the global evaluation are used to quantitatively rank designs, identify cost centers for higher-ranking design concepts, define and prioritize a list of technical/economic issues and barriers, and identify parameters that control concept response. These results are then used for final design optimization.
The aerospace plane design challenge: Credible computational fluid dynamics results
NASA Technical Reports Server (NTRS)
Mehta, Unmeel B.
1990-01-01
Computational fluid dynamics (CFD) is necessary in the design processes of all current aerospace plane programs. Single-stage-to-orbit (SSTO) aerospace planes with air-breathing supersonic combustion are going to be largely designed by means of CFD. The challenge of the aerospace plane design is to provide credible CFD results to work from, to assess the risk associated with the use of those results, and to certify CFD codes that produce credible results. To establish the credibility of CFD results used in design, the following topics are discussed: CFD validation vis-a-vis measurable fluid dynamics (MFD) validation; responsibility for credibility; credibility requirements; and a guide for establishing credibility. Quantification of CFD uncertainties helps to assess success and safety risks, and the development of CFD as a design tool requires code certification. This challenge is managed by designing the designers to use CFD effectively, by ensuring quality control, and by balancing the design process. For designing the designers, the following topics are discussed: how CFD design technology is developed; the reasons Japanese companies, by and large, produce goods of higher quality than their U.S. counterparts; teamwork as a new way of doing business; and how ideas, quality, and teaming can be brought together. Quality control for reducing the loss imparted to society begins with the quality of the CFD results used in the design process, and balancing the design process means using a judicious balance of CFD and MFD.
Towards automatic planning for manufacturing generative processes
DOE Office of Scientific and Technical Information (OSTI.GOV)
CALTON,TERRI L.
2000-05-24
Generative process planning describes methods process engineers use to modify manufacturing/process plans after designs are complete. A completed design may be the result of the introduction of a new product based on an old design, an assembly upgrade, or modified product designs used for a family of similar products. An engineer designs an assembly and then creates plans capturing manufacturing processes, including assembly sequences, component joining methods, part costs, labor costs, etc. When new products originate as a result of an upgrade, component geometry may change, and/or additional components and subassemblies may be added to or omitted from the original design. As a result, process engineers are forced to create new plans. This is further complicated by the fact that the process engineer is forced to manually generate these plans for each product upgrade. To generate new assembly plans for product upgrades, engineers must manually re-specify the manufacturing plan selection criteria and re-run the planners. To remedy this problem, special-purpose assembly planning algorithms have been developed to automatically recognize design modifications and automatically apply previously defined manufacturing plan selection criteria and constraints.
Structural Optimization in automotive design
NASA Technical Reports Server (NTRS)
Bennett, J. A.; Botkin, M. E.
1984-01-01
Although mathematical structural optimization has been an active research area for twenty years, there has been relatively little penetration into the design process. Experience indicates that often this is due to the traditional layout-analysis design process. In many cases, optimization efforts have been outgrowths of analysis groups which are themselves appendages to the traditional design process. As a result, optimization is often introduced into the design process too late to have a significant effect because many potential design variables have already been fixed. A series of examples are given to indicate how structural optimization has been effectively integrated into the design process.
Automatic Layout Design for Power Module
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ning, Puqi; Wang, Fei; Ngo, Khai
The layout of power modules is one of the most important elements in power module design, especially at high power densities, where couplings are increased. In this paper, an automatic design process using a genetic algorithm is presented. Some practical considerations are introduced in the optimization of the layout design of the module. This paper presents a process for automatic layout design for high power density modules. Detailed GA implementations are introduced for both the outer and inner loops. As verified by a design example, the results of the automatic design process presented here are better than those from manual design and also better than the results from popular design software. This automatic design procedure could be a major step toward improving the overall performance of future layout design.
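To make the genetic-algorithm idea above concrete, the following is a minimal sketch of a GA placement loop in Python. It is not the authors' implementation; the module count, placement grid, connectivity, and cost function are hypothetical stand-ins for a power-module layout problem.

```python
# Minimal GA sketch for a hypothetical module-placement problem (illustrative only).
import random

MODULES = 6                      # number of components to place
GRID = 4                         # positions on a 4x4 placement grid
NETS = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (0, 5)]  # hypothetical connectivity

def decode(genome):
    # genome: one grid index per module -> (x, y) coordinates
    return [(g % GRID, g // GRID) for g in genome]

def cost(genome):
    # total Manhattan wire length plus a penalty for modules sharing a position
    pos = decode(genome)
    wire = sum(abs(pos[a][0] - pos[b][0]) + abs(pos[a][1] - pos[b][1]) for a, b in NETS)
    overlap = MODULES - len(set(genome))
    return wire + 10 * overlap

def evolve(pop_size=40, generations=200, p_mut=0.1):
    pop = [[random.randrange(GRID * GRID) for _ in range(MODULES)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)                      # rank by fitness (lower cost is better)
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)    # one-point crossover
            cut = random.randrange(1, MODULES)
            child = a[:cut] + b[cut:]
            if random.random() < p_mut:         # random mutation
                child[random.randrange(MODULES)] = random.randrange(GRID * GRID)
            children.append(child)
        pop = parents + children
    return min(pop, key=cost)

best = evolve()
print("best layout:", decode(best), "cost:", cost(best))
```

In a real flow the cost function would encode parasitic couplings and thermal constraints rather than simple wire length.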
Design or "Design"--Envisioning a Future Design Education
ERIC Educational Resources Information Center
Sless, David
2012-01-01
Challenging the common grand vision of Design, this article considers "design" as a humble re-forming process based on evidence to substantiate its results. The designer is likened to a tinker who respects previous iterations of a design and seeks to retain what is useful while improving its performance. A design process is offered,…
A systems-based approach for integrated design of materials, products and design process chains
NASA Astrophysics Data System (ADS)
Panchal, Jitesh H.; Choi, Hae-Jin; Allen, Janet K.; McDowell, David L.; Mistree, Farrokh
2007-12-01
The concurrent design of materials and products provides designers with flexibility to achieve design objectives that were not previously accessible. However, the improved flexibility comes at a cost of increased complexity of the design process chains and the materials simulation models used for executing the design chains. Efforts to reduce the complexity generally result in increased uncertainty. We contend that a systems based approach is essential for managing both the complexity and the uncertainty in design process chains and simulation models in concurrent material and product design. Our approach is based on simplifying the design process chains systematically such that the resulting uncertainty does not significantly affect the overall system performance. Similarly, instead of striving for accurate models for multiscale systems (that are inherently complex), we rely on making design decisions that are robust to uncertainties in the models. Accordingly, we pursue hierarchical modeling in the context of design of multiscale systems. In this paper our focus is on design process chains. We present a systems based approach, premised on the assumption that complex systems can be designed efficiently by managing the complexity of design process chains. The approach relies on (a) the use of reusable interaction patterns to model design process chains, and (b) consideration of design process decisions using value-of-information based metrics. The approach is illustrated using a Multifunctional Energetic Structural Material (MESM) design example. Energetic materials store considerable energy which can be released through shock-induced detonation; conventionally, they are not engineered for strength properties. The design objectives for the MESM in this paper include both sufficient strength and energy release characteristics. The design is carried out by using models at different length and time scales that simulate different aspects of the system. Finally, by applying the method to the MESM design problem, we show that the integrated design of materials and products can be carried out more efficiently by explicitly accounting for design process decisions with the hierarchy of models.
Nekkanti, Vijaykumar; Marwah, Ashwani; Pillai, Raviraj
2015-01-01
Design of experiments (DOE), a component of Quality by Design (QbD), is the systematic and simultaneous evaluation of process variables to develop a product with predetermined quality attributes. This article presents a case study to understand the effects of process variables in a bead milling process used for the manufacture of drug nanoparticles. Experiments were designed and results were computed according to a 3-factor, 3-level face-centered central composite design (CCD). The factors investigated were motor speed, pump speed and bead volume. Responses analyzed for evaluating these effects and interactions were milling time, particle size and process yield. Process validation batches were executed using the optimum process conditions obtained from the software Design-Expert® to evaluate both the repeatability and reproducibility of the bead milling technique. Milling time was optimized to <5 h to obtain the desired particle size (d90 < 400 nm). A desirability function was used to optimize the response variables, and the predicted responses were in agreement with the experimental values. These results demonstrated the reliability of the selected model for the manufacture of drug nanoparticles with predictable quality attributes. The optimization of bead milling process variables by applying DOE resulted in a considerable decrease in milling time to achieve the desired particle size. The study indicates the applicability of the DOE approach to optimize critical process parameters in the manufacture of drug nanoparticles.
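As an illustration of the experimental design described above, the sketch below builds a 3-factor face-centered central composite design (15 coded runs) and fits a full quadratic response surface by least squares. The factor names follow the abstract, but the coded levels and response values are purely hypothetical, not the study's data.

```python
# Sketch of a 3-factor face-centered CCD (alpha = 1) and a quadratic response-surface fit.
import itertools
import numpy as np

factors = ["motor_speed", "pump_speed", "bead_volume"]

# Coded design points: 8 factorial, 6 axial (on the cube faces), 1 center point.
factorial = list(itertools.product([-1, 1], repeat=3))
axial = [tuple(v if i == j else 0 for j in range(3)) for i in range(3) for v in (-1, 1)]
center = [(0, 0, 0)]
design = np.array(factorial + axial + center, dtype=float)

def quadratic_terms(x):
    # full second-order model: intercept, linear, two-factor interactions, squares
    a, b, c = x
    return [1, a, b, c, a * b, a * c, b * c, a * a, b * b, c * c]

X = np.array([quadratic_terms(row) for row in design])

# Hypothetical particle-size responses (d90, nm) for each run, for illustration only.
y = np.array([520, 480, 450, 410, 500, 460, 430, 390, 470, 420, 455, 415, 440, 400, 395])

coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
print("factors (coded -1..+1):", factors)
print("fitted second-order coefficients:", np.round(coeffs, 2))
```

The fitted surface is what a desirability-function step would then optimize over the coded factor space.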
2007-07-01
been put into place to guide the standards process. 6. If the balloting results in 75% approval then the draft standard is submitted to the IEEE-SA...as functionality and timeliness. Such a design process presumably guided the design for the AMRFC test bed. The multifunction apertures for...Integrated Topside should be guided by the same design process. Engaging in a spiral design process will lead to the most effective selection of research
Study on Product Innovative Design Process Driven by Ideal Solution
NASA Astrophysics Data System (ADS)
Zhang, Fuying; Lu, Ximei; Wang, Ping; Liu, Hui
Product innovative design in companies today relies heavily on individual members' experience and creative ideation, as well as their skill in agilely integrating creativity and innovation tools with design methods. Creative ideation and inventive idea generation are two crucial stages in the product innovative design process. The ideal solution is the desired final idea for a given problem and the target that product design strives to reach. In this paper, a product innovative design process driven by the ideal solution is proposed. This design process encourages designers to overcome their psychological inertia and to foster creativity in a systematic way, acquiring breakthrough creative and innovative solutions within a shrinking solution-seeking space, and rapidly results in effective product innovative design. A case study example is also presented to illustrate the effectiveness of the proposed design process.
[Optimize dropping process of Ginkgo biloba dropping pills by using design space approach].
Shen, Ji-Chen; Wang, Qing-Qing; Chen, An; Pan, Fang-Lai; Gong, Xing-Chu; Qu, Hai-Bin
2017-07-01
In this paper, a design space approach was applied to optimize the dropping process of Ginkgo biloba dropping pills. Firstly, potential critical process parameters and potential process critical quality attributes were determined through literature research and pre-experiments. Secondly, experiments were carried out according to Box-Behnken design. Then the critical process parameters and critical quality attributes were determined based on the experimental results. Thirdly, second-order polynomial models were used to describe the quantitative relationships between critical process parameters and critical quality attributes. Finally, a probability-based design space was calculated and verified. The verification results showed that efficient production of Ginkgo biloba dropping pills can be guaranteed by operating within the design space parameters. The recommended operation ranges for the critical dropping process parameters of Ginkgo biloba dropping pills were as follows: dropping distance of 5.5-6.7 cm, and dropping speed of 59-60 drops per minute, providing a reference for industrial production of Ginkgo biloba dropping pills. Copyright© by the Chinese Pharmaceutical Association.
Conceptual Chemical Process Design for Sustainability. ...
This chapter examines the sustainable design of chemical processes, with a focus on conceptual design, hierarchical and short-cut methods, and analyses of process sustainability for alternatives. The chapter describes a methodology for incorporating process sustainability analyses throughout the conceptual design. Hierarchical and short-cut decision-making methods will be used to approach sustainability. An example showing a sustainability-based evaluation of chlor-alkali production processes is presented with economic analysis and five pollutants described as emissions. These emissions are analyzed according to their human toxicity potential by ingestion using the Waste Reduction Algorithm and a method based on US Environmental Protection Agency reference doses, with the addition of biodegradation for suitable components. Among the emissions, mercury as an element will not biodegrade, and results show the importance of this pollutant to the potential toxicity results and therefore the sustainability of the process design. The dominance of mercury in determining the long-term toxicity results when energy use is included suggests that all process system evaluations should (re)consider the role of mercury and other non-/slow-degrading pollutants in sustainability analyses. The cycling of nondegrading pollutants through the biosphere suggests the need for a complete analysis based on the economic, environmental, and social aspects of sustainability. Chapter reviews
Regression analysis as a design optimization tool
NASA Technical Reports Server (NTRS)
Perley, R.
1984-01-01
The optimization concepts are described in relation to an overall design process as opposed to a detailed, part-design process where the requirements are firmly stated, the optimization criteria are well established, and a design is known to be feasible. The overall design process starts with the stated requirements. Some of the design criteria are derived directly from the requirements, but others are affected by the design concept. It is these design criteria that define the performance index, or objective function, that is to be minimized within some constraints. In general, there will be multiple objectives, some mutually exclusive, with no clear statement of their relative importance. The optimization loop that is given adjusts the design variables and analyzes the resulting design, in an iterative fashion, until the objective function is minimized within the constraints. This provides a solution, but it is only the beginning. In effect, the problem definition evolves as information is derived from the results. It becomes a learning process as we determine what the physics of the system can deliver in relation to the desirable system characteristics. As with any learning process, an interactive capability is a real attribute for investigating the many alternatives that will be suggested as learning progresses.
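A minimal sketch of the regression-as-a-design-tool loop described above: sample a stand-in "expensive" design analysis, fit a quadratic regression surrogate to the samples, and minimize the surrogate within simple bounds. The analysis function, sample count, and bounds are hypothetical.

```python
# Regression surrogate optimization sketch (illustrative; not the paper's procedure).
import numpy as np
from scipy.optimize import minimize

def expensive_analysis(x):
    # stand-in for a full design analysis; its optimum is near x = [0.6, -0.4]
    return (x[0] - 0.6) ** 2 + 2.0 * (x[1] + 0.4) ** 2 + 0.1 * x[0] * x[1]

rng = np.random.default_rng(0)
samples = rng.uniform(-1, 1, size=(30, 2))                 # sampled design points
responses = np.array([expensive_analysis(x) for x in samples])

def basis(x):
    # quadratic regression basis: intercept, linear, interaction, squares
    return [1, x[0], x[1], x[0] * x[1], x[0] ** 2, x[1] ** 2]

A = np.array([basis(x) for x in samples])
coef, *_ = np.linalg.lstsq(A, responses, rcond=None)       # least-squares regression fit

surrogate = lambda x: float(np.dot(basis(x), coef))
result = minimize(surrogate, x0=[0.0, 0.0], bounds=[(-1, 1), (-1, 1)])
print("surrogate optimum:", result.x, "true value there:", expensive_analysis(result.x))
```

In practice the surrogate would be refit as new analysis results arrive, mirroring the iterative "learning" loop the abstract describes.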
NASA Astrophysics Data System (ADS)
Nakano, Masaru; Kubota, Fumiko; Inamori, Yutaka; Mitsuyuki, Keiji
Manufacturing system designers should concentrate on designing and planning manufacturing systems instead of spending their effort on creating the simulation models to verify the design. This paper proposes a method and a supporting tool to guide designers through the engineering process and generate the simulation model automatically from the design results. The design agent also supports collaborative design projects among different companies or divisions with distributed engineering and distributed simulation techniques. The idea was implemented and applied to a factory planning process.
Henk, Henry J; Li, Xiaoyan; Becker, Laura K; Xu, Hairong; Gong, Qi; Deeter, Robert G; Barron, Richard L
2015-01-01
To examine the impact of research design on results in two published comparative effectiveness studies. Guidelines for comparative effectiveness research have recommended incorporating the disease process in study design. Based on these recommendations, we developed a checklist of considerations and applied the checklist in a review of two published studies on the comparative effectiveness of colony-stimulating factors. Both studies used similar administrative claims data but different methods, which resulted in directionally different estimates. Major design differences between the two studies include whether the timing of intervention in the disease process was identified and whether the study cohort and outcome assessment period were defined based on this temporal relationship. Disease process and timing of intervention should be incorporated into the design of comparative effectiveness studies.
Automated CAD design for sculptured airfoil surfaces
NASA Astrophysics Data System (ADS)
Murphy, S. D.; Yeagley, S. R.
1990-11-01
The design of tightly tolerated sculptured surfaces such as those for airfoils requires a significant design effort in order to machine the tools to create these surfaces. Because of the quantity of numerical data required to describe the airfoil surfaces, a CAD approach is required. Although this approach will result in productivity gains, much larger gains can be achieved by automating the design process. This paper discusses an application which resulted in an eightfold improvement in productivity by automating the design process on the CAD system.
NASA Astrophysics Data System (ADS)
Huhn, Stefan; Peeling, Derek; Burkart, Maximilian
2017-10-01
With the availability of die face design tools and incremental solver technologies to provide detailed forming feasibility results in a timely fashion, the use of inverse solver technologies, and the resulting process improvements during the product development of stamped parts, is often underestimated. This paper presents some applications of inverse technologies that are currently used in the automotive industry to streamline the product development process and greatly increase the quality of the developed process and the resulting product. The first focus is on the so-called target strain technology. Application examples show how inverse forming analysis can be applied to support the process engineer during the development of a die face geometry for Class 'A' panels. The drawing process is greatly affected by the die face design, and the process designer has to ensure that the resulting drawn panel will meet specific requirements regarding surface quality and a minimum strain distribution to ensure dent resistance. The target strain technology provides almost immediate feedback to the process engineer during the die face design process as to whether a specific change to the die face design will help achieve these requirements or be counterproductive. The paper further shows how an optimization of the material flow can be achieved through the use of a newly developed technology called Sculptured Die Face (SDF). The die face generation in SDF is better suited for use in optimization loops than any other conventional die face design technology based on cross section design. A second focus in this paper is on the use of inverse solver technologies for secondary forming operations. The paper shows how inverse technology can be applied to accurately and quickly develop trim lines on simple as well as complex support geometries.
Analysis of digester design concepts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ashare, E.; Wilson, E. H.
1979-01-29
Engineering economic analyses were performed on various digester design concepts to determine the relative performance for various biomass feedstocks. A comprehensive literature survey describing the state-of-the-art of the various digestion designs is included. The digester designs included in the analyses are CSTR, plug flow, batch, CSTR in series, multi-stage digestion and biomethanation. Other process options investigated included pretreatment processes such as shredding, degritting, and chemical pretreatment, and post-digestion processes, such as dewatering and gas purification. The biomass sources considered include feedlot manure, rice straw, and bagasse. The results of the analysis indicate that the most economical (on a unit gas cost basis) digester design concept is the plug flow reactor. This conclusion results from this system providing a high gas production rate combined with a low capital hole-in-the-ground digester design concept. The costs determined in this analysis do not include any credits or penalties for feedstock or by-products, but present the costs only for conversion of biomass to methane. The batch land-fill type digester design was shown to have a unit gas cost comparable to that for a conventional stirred tank digester, with the potential of reducing the cost if a land-fill site were available for a lower cost per unit volume. The use of chemical pretreatment resulted in a higher unit gas cost, primarily due to the cost of pretreatment chemical. A sensitivity analysis indicated that the use of chemical pretreatment could improve the economics provided a process could be developed which utilized either less pretreatment chemical or a less costly chemical. The use of other process options resulted in higher unit gas costs. These options should only be used when necessary for proper process performance, or to result in production of a valuable by-product.
The Architectural and Interior Design Planning Process.
ERIC Educational Resources Information Center
Cohen, Elaine
1994-01-01
Explains the planning process in designing effective library facilities and discusses library building requirements that result from electronic information technologies. Highlights include historical structures; Americans with Disabilities Act; resource allocation; electrical power; interior spaces; lighting; design development; the roles of…
Silicon CMOS optical receiver circuits with integrated thin-film compound semiconductor detectors
NASA Astrophysics Data System (ADS)
Brooke, Martin A.; Lee, Myunghee; Jokerst, Nan Marie; Camperi-Ginestet, C.
1995-04-01
While many circuit designers have tackled the problem of CMOS digital communications receiver design, few have considered the problem of circuitry suitable for an all-CMOS digital IC fabrication process. Faced with a high speed receiver design, the circuit designer will soon conclude that a high speed, analog-oriented fabrication process provides superior performance advantages over a digital CMOS process. However, for applications where there are overwhelming reasons to integrate the receivers on the same IC as large amounts of conventional digital circuitry, the low yield and high cost of the exotic analog-oriented fabrication make it no longer an option. The issues that result from a requirement to use a digital CMOS IC process cut across all aspects of receiver design, and result in significant differences in circuit design philosophy and topology. Digital ICs are primarily designed to yield small, fast CMOS devices for digital logic gates; thus no effort is put into providing accurate or high speed resistances or capacitors. This lack of any reliable resistance or capacitance has a significant impact on receiver design. Since resistance optimization is not a prerogative of the digital IC process engineer, the wisest option is thus to not use these elements, opting instead for active circuitry to replace the functions normally ascribed to resistance and capacitance. Depending on the application, receiver noise may be a dominant design constraint. The noise performance of CMOS amplifiers differs from that of bipolar or GaAs MESFET circuits: shot noise is generally insignificant when compared to channel thermal noise. As a result, the optimal input stage topology is significantly different for the different technologies. It is found that, at speeds of operation approaching the limits of the digital CMOS process, open loop designs have a noise-power-gain-bandwidth tradeoff performance superior to feedback designs. Furthermore, the lack of good resistors and capacitors complicates the use of feedback circuits. Thus feedback is generally not used in the front-end of our digital-process CMOS receivers.
Software synthesis using generic architectures
NASA Technical Reports Server (NTRS)
Bhansali, Sanjay
1993-01-01
A framework for synthesizing software systems based on abstracting software system designs and the design process is described. The result of such an abstraction process is a generic architecture and the process knowledge for customizing the architecture. The customization process knowledge is used to assist a designer in customizing the architecture, as opposed to completely automating the design of systems. Our approach is illustrated using an implemented example of a generic tracking architecture which was customized in two different domains. How the designs produced using KASE compare to the original designs of the two systems, and current work and plans for extending KASE to other application areas, are described.
Design, Control and in Situ Visualization of Gas Nitriding Processes
Ratajski, Jerzy; Olik, Roman; Suszko, Tomasz; Dobrodziej, Jerzy; Michalski, Jerzy
2010-01-01
The article presents a complex system for the design, in situ visualization and control of a commonly used surface treatment process: the gas nitriding process. In the computer design conception, analytical mathematical models and artificial intelligence methods were used. As a result, it became possible to perform poly-optimization and poly-parametric simulations of the course of the process, combined with a visualization of the changes in the process parameter values over time, and to predict the properties of nitrided layers. For in situ visualization of the growth of the nitrided layer, computer procedures were developed which make use of correlations between the direct and differential voltage-time runs of the process result sensor (magnetic sensor) and the corresponding layer growth stage. These computer procedures make it possible to combine, during the process, the registered voltage-time runs with the models of the process. PMID:22315536
Accounting for Proof Test Data in a Reliability Based Design Optimization Framework
NASA Technical Reports Server (NTRS)
Venter, Gerhard; Scotti, Stephen J.
2012-01-01
This paper investigates the use of proof (or acceptance) test data during the reliability based design optimization of structural components. It is assumed that every component will be proof tested and that the component will only enter into service if it passes the proof test. The goal is to reduce the component weight, while maintaining high reliability, by exploiting the proof test results during the design process. The proposed procedure results in the simultaneous design of the structural component and the proof test itself and provides the designer with direct control over the probability of failing the proof test. The procedure is illustrated using two analytical example problems and the results indicate that significant weight savings are possible when exploiting the proof test results during the design process.
Engineering design: A cognitive process approach
NASA Astrophysics Data System (ADS)
Strimel, Greg Joseph
The intent of this dissertation was to identify the cognitive processes used by advanced pre-engineering students to solve complex engineering design problems. Students in technology and engineering education classrooms are often taught to use an ideal engineering design process that has been generated mostly by educators and curriculum developers. However, the review of literature showed that it is unclear as to how advanced pre-engineering students cognitively navigate solving a complex and multifaceted problem from beginning to end. Additionally, it was unclear how a student thinks and acts throughout their design process and how this affects the viability of their solution. Therefore, Research Objective 1 was to identify the fundamental cognitive processes students use to design, construct, and evaluate operational solutions to engineering design problems. Research Objective 2 was to determine identifiers within student cognitive processes for monitoring aptitude to successfully design, construct, and evaluate technological solutions. Lastly, Research Objective 3 was to create a conceptual technological and engineering problem-solving model integrating student cognitive processes for the improved development of problem-solving abilities. The methodology of this study included multiple forms of data collection. The participants were first given a survey to determine their prior experience with engineering and to provide a description of the subjects being studied. The participants were then presented an engineering design challenge to solve individually. While they completed the challenge, the participants verbalized their thoughts using an established "think aloud" method. These verbalizations were captured along with participant observational recordings using point-of-view camera technology. Additionally, the participant design journals, design artifacts, solution effectiveness data, and teacher evaluations were collected for analysis to help achieve the research objectives of this study. Two independent coders then coded the video/audio recordings and the additional design data using Halfin's (1973) 17 mental processes for technological problem-solving. The results of this study indicated that the participants employed a wide array of mental processes when solving engineering design challenges. However, the findings provide a general analysis of the number of times participants employed each mental process, as well as the amount of time consumed employing the various mental processes through the different stages of the engineering design process. The results indicated many similarities between the students solving the problem, which may highlight voids in current technology and engineering education curricula. Additionally, the findings showed differences between the processes employed by participants that created the most successful solutions and the participants who developed the least effective solutions. Upon comparing and contrasting these processes, recommendations for instructional strategies to enhance a student's capability for solving engineering design problems were developed. The results also indicated that students, when left without teacher intervention, use a simplified and more natural process to solve design challenges than the 12-step engineering design process reported in much of the literature. Lastly, these data indicated that students followed two different approaches to solving the design problem. 
Some students employed a sequential and logical approach, while others employed a nebulous, solution centered trial-and-error approach to solving the problem. In this study the participants who were more sequential had better performing solutions. Examining these two approaches and the student cognition data enabled the researcher to generate a conceptual engineering design model for the improved teaching and development of engineering design problem solving.
NASA Astrophysics Data System (ADS)
Wang, Jiaoyang; Wang, Lin; Yang, Ying; Gong, Rui; Shao, Xiaopeng; Liang, Chao; Xu, Jun
2016-05-01
In this paper, an integral design that combines the optical system with image processing is introduced to obtain high resolution images, and its performance is evaluated and demonstrated. Traditional imaging methods often separate the two technical procedures of optical system design and image processing, resulting in failures of efficient cooperation between the optical and digital elements. Therefore, an innovative approach is presented that combines the merit function used during optical design with the constraint conditions of the image processing algorithms. Specifically, an optical imaging system with low resolution is designed to collect the image signals that are indispensable for image processing, while the ultimate goal is to obtain high resolution images from the final system. In order to optimize the global performance, the optimization function of the ZEMAX software is utilized and the number of optimization cycles is controlled. Then a Wiener filter algorithm is adopted to process the simulated images, and mean squared error (MSE) is taken as the evaluation criterion. The results show that, although the optical figures of merit for the optical imaging system are not the best, it can provide image signals that are more suitable for image processing. In conclusion, the integral design of the optical system and image processing can find the overall optimal solution that is missed by traditional design methods. Especially when designing complex optical systems, this integral design strategy has obvious advantages in simplifying structure and reducing cost, as well as in obtaining high resolution images, and it has a promising prospect for industrial application.
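The following is an illustrative sketch (not the authors' code) of the evaluation loop described above: a synthetic scene is blurred with an assumed Gaussian point spread function standing in for the low-resolution optics, restored with a frequency-domain Wiener filter, and scored by mean squared error. The PSF width, noise level, and regularization constant are all hypothetical.

```python
# Wiener-filter restoration and MSE scoring sketch (illustrative assumptions throughout).
import numpy as np

rng = np.random.default_rng(1)
truth = np.zeros((64, 64))
truth[24:40, 24:40] = 1.0                       # simple synthetic scene

# Hypothetical Gaussian PSF representing the low-resolution optics
yy, xx = np.mgrid[-32:32, -32:32]
psf = np.exp(-(xx ** 2 + yy ** 2) / (2 * 2.0 ** 2))
psf /= psf.sum()

H = np.fft.fft2(np.fft.ifftshift(psf))          # optical transfer function
blurred = np.real(np.fft.ifft2(np.fft.fft2(truth) * H))
noisy = blurred + 0.01 * rng.standard_normal(truth.shape)

# Wiener deconvolution: G = conj(H) / (|H|^2 + K), K = assumed noise-to-signal ratio
K = 1e-3
G = np.conj(H) / (np.abs(H) ** 2 + K)
restored = np.real(np.fft.ifft2(np.fft.fft2(noisy) * G))

mse = np.mean((restored - truth) ** 2)          # evaluation criterion from the abstract
print("restoration MSE:", mse)
```

In an integral design loop, this MSE (rather than the optical merit function alone) would feed back into the lens optimization.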
Hayes, D.B.; Baylis, J.R.; Carl, L.M.; Dodd, H.R.; Goldstein, J.D.; McLaughlin, R.L.; Noakes, D.L.G.; Porto, L.M.
2003-01-01
Four sampling designs for quantifying the effect of low-head sea lamprey (Petromyzon marinus) barriers on fish communities were evaluated, and the contribution of process-oriented research to the overall confidence of results obtained was discussed. The designs include: (1) sample barrier streams post-construction; (2) sample barrier and reference streams post-construction; (3) sample barrier streams pre- and post-construction; and (4) sample barrier and reference streams pre- and post-construction. In the statistical literature, the principal basis for comparison of sampling designs is generally the precision achieved by each design. In addition to precision, designs should be compared based on the interpretability of results and on the scale to which the results apply. Using data collected in a broad survey of streams with and without sea lamprey barriers, some of the tradeoffs that occur among precision, scale, and interpretability are illustrated. Although circumstances such as funding and availability of pre-construction data may limit which design can be implemented, a pre/post-construction design including barrier and reference streams provides the most meaningful information for use in barrier management decisions. Where it is not feasible to obtain pre-construction data, a design including reference streams is important to maintain the interpretability of results. Regardless of the design used, process-oriented research provides a framework for interpreting results obtained in broad surveys. As such, information from both extensive surveys and intensive process-oriented research provides the best basis for fishery management actions, and gives researchers and managers the most confidence in the conclusions reached regarding the effects of sea lamprey barriers.
The amount of ergonomics and user involvement in 151 design processes.
Kok, Barbara N E; Slegers, Karin; Vink, Peter
2012-01-01
Ergonomics, usability and user-centered design are terms that are well known among designers. Yet, products often seem to fail to meet the users' needs, resulting in a gap between expected and experienced usability. To understand the possible causes of this gap, the actions taken by the designer during the design process are studied in this paper. This can show whether and how certain actions influence the user-friendliness of the designed products. The aim of this research was to understand whether ergonomic principles and methods are included in the design process, whether users are involved in this process, and whether the designer's experience (in ergonomics/user involvement) has an effect on the usability of the end product. In this study, the design processes of 151 tangible products created by design students were analyzed. It showed that in 75% of the cases some ergonomic principles were applied. User involvement occurred in only a third of the design cases. Hardly any correlation was found between the designers' experience with ergonomic principles and the way they applied them, and no correlation was found between the designers' experience with user involvement and the users' involvement in the design process.
Discrete State Change Model of Manufacturing Quality to Aid Assembly Process Design
NASA Astrophysics Data System (ADS)
Koga, Tsuyoshi; Aoyama, Kazuhiro
This paper proposes a representation model of the quality state change in an assembly process that can be used in a computer-aided process design system. In order to formalize the state change of the manufacturing quality in the assembly process, the functions, operations, and quality changes in the assembly process are represented as a network model that can simulate discrete events. This paper also develops a design method for the assembly process. The design method calculates the space of quality state change and outputs a better assembly process (better operations and better sequences) that can be used to obtain the intended quality state of the final product. A computational redesigning algorithm of the assembly process that considers the manufacturing quality is developed. The proposed method can be used to design an improved manufacturing process by simulating the quality state change. A prototype system for planning an assembly process is implemented and applied to the design of an auto-breaker assembly process. The result of the design example indicates that the proposed assembly process planning method outputs a better manufacturing scenario based on the simulation of the quality state change.
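A toy sketch of the kind of discrete quality-state-change network described above: each operation has preconditions and effects on a set of quality attributes, and a planner enumerates operation sequences that reach the intended final quality state. The operation names, attributes, and goal state are hypothetical, not the paper's auto-breaker data.

```python
# Discrete state-change / assembly-sequence enumeration sketch (hypothetical data).
from itertools import permutations

# each operation: (name, preconditions, effects) over simple quality attributes
OPERATIONS = [
    ("fasten_contacts", {"contacts_placed"}, {"contacts_fastened"}),
    ("place_contacts", set(), {"contacts_placed"}),
    ("calibrate_spring", {"contacts_fastened"}, {"spring_calibrated"}),
    ("inspect", {"spring_calibrated"}, {"inspected"}),
]
GOAL = {"contacts_placed", "contacts_fastened", "spring_calibrated", "inspected"}

def simulate(sequence):
    # walk the quality state through the sequence; None means an infeasible ordering
    state = set()
    for name, pre, eff in sequence:
        if not pre <= state:
            return None
        state |= eff
    return state

feasible = [seq for seq in permutations(OPERATIONS) if simulate(seq) == GOAL]
print(len(feasible), "feasible sequence(s), e.g.:", [op[0] for op in feasible[0]])
```

A fuller model would attach probabilities or defect states to each transition so that better sequences can be ranked, as the paper's simulation of quality state change suggests.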
Reducing the complexity of the software design process with object-oriented design
NASA Technical Reports Server (NTRS)
Schuler, M. P.
1991-01-01
Designing software is a complex process. How object-oriented design (OOD), coupled with formalized documentation and tailored object diagraming techniques, can reduce the complexity of the software design process is described and illustrated. The described OOD methodology uses a hierarchical decomposition approach in which parent objects are decomposed into layers of lower level child objects. A method of tracking the assignment of requirements to design components is also included. Increases in the reusability, portability, and maintainability of the resulting products are also discussed. This method was built on a combination of existing technology, teaching experience, consulting experience, and feedback from design method users. The discussed concepts are applicable to hierarchical OOD processes in general. Emphasis is placed on improving the design process by documenting the details of the procedures involved and incorporating improvements into those procedures as they are developed.
CANISTER HANDLING FACILITY DESCRIPTION DOCUMENT
DOE Office of Scientific and Technical Information (OSTI.GOV)
J.F. Beesley
The purpose of this facility description document (FDD) is to establish requirements and associated bases that drive the design of the Canister Handling Facility (CHF), which will allow the design effort to proceed to license application. This FDD will be revised at strategic points as the design matures. This FDD identifies the requirements and describes the facility design, as it currently exists, with emphasis on attributes of the design provided to meet the requirements. This FDD is an engineering tool for design control; accordingly, the primary audience and users are design engineers. This FDD is part of an iterative design process. It leads the design process with regard to the flowdown of upper tier requirements onto the facility. Knowledge of these requirements is essential in performing the design process. The FDD follows the design with regard to the description of the facility. The description provided in this FDD reflects the current results of the design process.
NASA Technical Reports Server (NTRS)
Biess, J. J.; Yu, Y.; Middlebrook, R. D.; Schoenfeld, A. D.
1974-01-01
A review is given of future power processing systems planned for the next 20 years, and the state-of-the-art of power processing design modeling and analysis techniques used to optimize power processing systems. A methodology of modeling and analysis of power processing equipment and systems has been formulated to fulfill future tradeoff studies and optimization requirements. Computer techniques were applied to simulate power processor performance and to optimize the design of power processing equipment. A program plan to systematically develop and apply the tools for power processing systems modeling and analysis is presented so that meaningful results can be obtained each year to aid the power processing system engineer and power processing equipment circuit designers in their conceptual and detail design and analysis tasks.
Design Steps for Physic STEM Education Learning in Secondary School
NASA Astrophysics Data System (ADS)
Teevasuthonsakul, C.; Yuvanatheeme, V.; Sriput, V.; Suwandecha, S.
2017-09-01
This study aimed to develop a process for designing STEM Education activities used in Physics subjects in Thai secondary schools. The researchers conducted the study by reviewing the literature and related works, interviewing Physics experts, designing and revising the process accordingly, and testing the designed process in actual classrooms. This produced a five-step process of STEM Education activity design, which Physics teachers applied in their actual teaching contexts. The results from the after-class evaluation revealed that the students' satisfaction with the Physics subject and their critical thinking skills were significantly higher (p < .05). Moreover, teachers were advised to integrate the principles of science, mathematics, technology, and the engineering design process as the foundation when creating case studies of problems and solutions.
Simplify to survive: prescriptive layouts ensure profitable scaling to 32nm and beyond
NASA Astrophysics Data System (ADS)
Liebmann, Lars; Pileggi, Larry; Hibbeler, Jason; Rovner, Vyacheslav; Jhaveri, Tejas; Northrop, Greg
2009-03-01
The time-to-market driven need to maintain concurrent process-design co-development, even in spite of discontinuous patterning, process, and device innovation is reiterated. The escalating design rule complexity resulting from increasing layout sensitivities in physical and electrical yield and the resulting risk to profitable technology scaling is reviewed. Shortcomings in traditional Design for Manufacturability (DfM) solutions are identified and contrasted to the highly successful integrated design-technology co-optimization used for SRAM and other memory arrays. The feasibility of extending memory-style design-technology co-optimization, based on a highly simplified layout environment, to logic chips is demonstrated. Layout density benefits, modeled patterning and electrical yield improvements, as well as substantially improved layout simplicity are quantified in a conventional versus template-based design comparison on a 65nm IBM PowerPC 405 microprocessor core. The adaptability of this highly regularized template-based design solution to different yield concerns and design styles is shown in the extension of this work to 32nm with an increased focus on interconnect redundancy. In closing, the work not covered in this paper, focused on the process side of the integrated process-design co-optimization, is introduced.
Piezoresistive Cantilever Performance—Part I: Analytical Model for Sensitivity
Park, Sung-Jin; Doll, Joseph C.; Pruitt, Beth L.
2010-01-01
An accurate analytical model for the change in resistance of a piezoresistor is necessary for the design of silicon piezoresistive transducers. Ion implantation requires a high-temperature oxidation or annealing process to activate the dopant atoms, and this treatment results in a distorted dopant profile due to diffusion. Existing analytical models do not account for the concentration dependence of piezoresistance and are not accurate for nonuniform dopant profiles. We extend previous analytical work by introducing two nondimensional factors, namely, the efficiency and geometry factors. A practical benefit of this efficiency factor is that it separates the process parameters from the design parameters; thus, designers may address requirements for cantilever geometry and fabrication process independently. To facilitate the design process, we provide a lookup table for the efficiency factor over an extensive range of process conditions. The model was validated by comparing simulation results with the experimentally determined sensitivities of piezoresistive cantilevers. We performed 9200 TSUPREM4 simulations and fabricated 50 devices from six unique process flows; we systematically explored the design space relating process parameters and cantilever sensitivity. Our treatment focuses on piezoresistive cantilevers, but the analytical sensitivity model is extensible to other piezoresistive transducers such as membrane pressure sensors. PMID:20336183
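For orientation, a commonly used first-order form of the cantilever force sensitivity is shown below, with the efficiency factor β* separating process-dependent terms from geometry; the exact expression in the paper may differ, so treat this as an assumed illustrative form. Here F is a force applied at the cantilever tip, π_max the peak piezoresistive coefficient, l_c, w_c, t_c the cantilever length, width, and thickness, and l_pr the piezoresistor length.

$$\frac{\Delta R}{R} \;\approx\; \beta^{*}\,\pi_{\max}\,\frac{6\left(l_c - l_{pr}/2\right)}{w_c\,t_c^{2}}\,F$$

Because β* captures the dopant profile and activation conditions, the geometric factor in front of F can be optimized independently of the fabrication process, which is the separation the abstract emphasizes.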
NASA Astrophysics Data System (ADS)
Boski, Marcin; Paszke, Wojciech
2015-11-01
This paper deals with the problem of designing an iterative learning control (ILC) algorithm for discrete linear systems using repetitive process stability theory. The resulting design produces a stabilizing output feedback controller in the time domain and a feedforward controller that guarantees monotonic convergence in the trial-to-trial domain. The results are also extended to a limited frequency range design specification. A new design procedure is introduced in terms of linear matrix inequality (LMI) representations, which guarantee the prescribed performance of the ILC scheme. A simulation example is given to illustrate the theoretical developments.
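To give a flavor of the LMI machinery mentioned above (not the paper's actual ILC design conditions), the sketch below checks discrete-time stability of a hypothetical closed-loop matrix by searching for a Lyapunov matrix P ≻ 0 satisfying AᵀPA − P ≺ 0 with cvxpy.

```python
# Minimal LMI feasibility sketch (illustrative only; not the paper's ILC LMIs).
import cvxpy as cp
import numpy as np

A = np.array([[0.9, 0.2],
              [0.0, 0.7]])                      # hypothetical closed-loop system matrix
n = A.shape[0]
eps = 1e-6

P = cp.Variable((n, n), symmetric=True)
constraints = [
    P >> eps * np.eye(n),                       # P positive definite
    A.T @ P @ A - P << -eps * np.eye(n),        # discrete-time Lyapunov LMI
]
problem = cp.Problem(cp.Minimize(0), constraints)
problem.solve()
print("LMI status:", problem.status)            # 'optimal' means a feasible P was found
```

The actual ILC conditions in the paper couple time-domain and trial-to-trial dynamics, so the LMIs are larger, but they are solved with the same kind of semidefinite feasibility machinery.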
NASA Astrophysics Data System (ADS)
Pershin, I. M.; Pervukhin, D. A.; Ilyushin, Y. V.; Afanaseva, O. V.
2017-10-01
The paper considers the important problem of designing distributed management systems for hydrolithosphere processes. The control actions on the hydrolithosphere processes under consideration are implemented by a set of extraction wells. The article shows a method for defining approximating links to describe the dynamic characteristics of hydrolithosphere processes. The structure of the distributed regulators used in the management systems for the considered processes is presented. The paper analyses the results of synthesizing the distributed management system and of modelling the closed-loop control of the hydrolithosphere process parameters.
Manufacturing process design for multi commodities in agriculture
NASA Astrophysics Data System (ADS)
Prasetyawan, Yudha; Santosa, Andrian Henry
2017-06-01
High-potential commodities within particular agricultural sectors should be accompanied by the maximum benefit value that can be attained by both local farmers and business players. In several cases, the business players are small-medium enterprises (SMEs) which have limited resources to perform the value-adding processing of local commodities into potential products. The weaknesses of SMEs include manual production processes with low productivity, limited capacity to maintain prices, and unattractive packaging resulting from conventional production. Agricultural commodities are commonly processed into products such as flour, chips, crackers, oil, juice, and other products. This research was initiated by collecting data through interviews, particularly to obtain the perspectives of SMEs as the business players. Subsequently, the information was processed using Quality Function Deployment (QFD) to determine the House of Quality from the first to the fourth level. A proposed design resulting from the QFD was produced, evaluated with the Technology Assessment Model (TAM), and followed by a revised design. Finally, the revised design was analyzed from a financial perspective to obtain the cost structure of investment, operation, maintenance, and workers. The machine that performs the manufacturing process, as the result of the revised design, was prototyped and tested to determine the initial production process. The designed manufacturing process offers a Net Present Value (NPV) of IDR 337,897,651, compared with IDR 9,491,522 for the existing process.
Developing Elementary Math and Science Process Skills Through Engineering Design Instruction
NASA Astrophysics Data System (ADS)
Strong, Matthew G.
This paper examines how elementary students can develop math and science process skills through an engineering design approach to instruction. The performance and development of individual process skills overall and by gender were also examined. The study, preceded by a pilot, took place in a grade four extracurricular engineering design program in a public, suburban school district. Students worked in pairs and small groups to design and construct airplane models from styrofoam, paper clips, and toothpicks. The development and performance of process skills were assessed through a student survey of learning gains, an engineering design packet rubric (student work), observation field notes, and focus group notes. The results indicate that students can significantly develop process skills, that female students may develop process skills through engineering design better than male students, and that engineering design is most helpful for developing the measuring, suggesting improvements, and observing process skills. The study suggests that a more regular engineering design program or curriculum could be beneficial for students' math and science abilities both in this school and for the elementary field as a whole.
Role of Process Control in Improving Space Vehicle Safety A Space Shuttle External Tank Example
NASA Technical Reports Server (NTRS)
Safie, Fayssal M.; Nguyen, Son C.; Burleson, Keith W.
2006-01-01
Developing a safe and reliable space vehicle requires good design and good manufacturing, or in other words "design it right and build it right". A great design can be hard to build or manufacture mainly due to difficulties related to quality. Specifically, process control can be a challenge. As a result, the system suffers from low quality which leads to low reliability and high system risk. The Space Shuttle has experienced some of those cases, but has overcome these difficulties through extensive redesign efforts and process enhancements. One example is the design of the hot gas temperature sensor on the Space Shuttle Main Engine (SSME), which resulted in failure of the sensor in flight and led to a redesign of the sensor. The most recent example is the Space Shuttle External Tank (ET) Thermal Protection System (TPS) reliability issues that contributed to the Columbia accident. As a result, extensive redesign and process enhancement activities have been performed over the last two years to minimize the sensitivities and difficulties of the manual TPS application process.
Shape design of an optimal comfortable pillow based on the analytic hierarchy process method
Liu, Shuo-Fang; Lee, Yann-Long; Liang, Jung-Chin
2011-01-01
Objective Few studies have analyzed the shapes of pillows. The purpose of this study was to investigate the relationship between pillow shape design and subjective comfort level for asymptomatic subjects. Methods Four basic pillow design factors were selected on the basis of a literature review and recombined into 8 configurations for ranking degrees of comfort. The data were analyzed by the analytic hierarchy process method to determine the most comfortable pillow. Results Pillow number 4 was the most comfortable pillow in terms of head, neck, shoulder, height, and overall comfort. Pillow number 4 combined the design features of standard, cervical, and shoulder pillows. A prototype of this pillow was developed on the basis of the study results for designing future pillow shapes. Conclusions This study investigated the comfort level of particular users and the redesign features of a pillow. A deconstruction analysis can simplify the process of determining the most comfortable pillow design and aid designers in designing pillows for groups. PMID:22654680
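A sketch of the analytic hierarchy process step implied above: derive priority weights for candidate pillow configurations from a pairwise comparison matrix via the principal eigenvector, and check consistency. The comparison values below are hypothetical, not the study's data.

```python
# AHP priority-weight sketch for 4 hypothetical pillow configurations (Saaty 1-9 scale).
import numpy as np

A = np.array([
    [1.0,   3.0,   5.0,   2.0],
    [1 / 3, 1.0,   3.0,   1 / 2],
    [1 / 5, 1 / 3, 1.0,   1 / 4],
    [1 / 2, 2.0,   4.0,   1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                     # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
weights = w / w.sum()                           # normalized priority weights

# consistency ratio: CI / RI, with RI = 0.90 for a 4x4 matrix (Saaty's table)
ci = (eigvals[k].real - len(A)) / (len(A) - 1)
cr = ci / 0.90
print("priority weights:", np.round(weights, 3), "consistency ratio:", round(cr, 3))
```

A consistency ratio below about 0.1 is conventionally taken to mean the pairwise judgments are acceptably consistent.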
Dynamic Modeling of Process Technologies for Closed-Loop Water Recovery Systems
NASA Technical Reports Server (NTRS)
Allada, Rama Kumar; Lange, Kevin; Anderson, Molly
2011-01-01
Detailed chemical process simulations are a useful tool in designing and optimizing complex systems and architectures for human life support. Dynamic and steady-state models of these systems help contrast the interactions of various operating parameters and hardware designs, which becomes extremely useful in trade-study analyses. NASA's Exploration Life Support technology development project recently made use of such models to complement a series of tests on different waste water distillation systems. This paper presents dynamic chemical process simulations for primary processor technologies including the Cascade Distillation System (CDS), the Vapor Compression Distillation (VCD) system, and the Wiped-Film Rotating Disk (WFRD), as well as post-distillation water polishing processes such as the Volatiles Removal Assembly (VRA), developed using the Aspen Custom Modeler and Aspen Plus process simulation tools. The results expand upon previous work on water recovery technology models and emphasize dynamic process modeling and results. The paper discusses system design, modeling details, and model results for each technology and presents some comparisons between the model results and available test data. Following these initial comparisons, some general conclusions and forward work are discussed.
Granularity as a Cognitive Factor in the Effectiveness of Business Process Model Reuse
NASA Astrophysics Data System (ADS)
Holschke, Oliver; Rake, Jannis; Levina, Olga
Reusing design models is an attractive approach in business process modeling, as modeling efficiency and the quality of design outcomes may be significantly improved. However, reusing conceptual models is not a cost-free effort but has to be carefully designed. While factors such as psychological anchoring and task-adequacy in reuse-based modeling tasks have been investigated, information granularity as a cognitive concept has not yet been at the center of empirical research. We hypothesize that business process granularity as a factor in design tasks under reuse has a significant impact on the effectiveness of the resulting business process models. We test our hypothesis in a comparative study employing high and low granularities. The reusable processes provided were taken from widely accessible reference models for the telecommunication industry (enhanced Telecom Operations Map). First experimental results show that Recall in tasks involving coarser granularity is lower than in cases of finer granularity. These findings suggest that decision makers in business process management should carefully consider the implementation of reuse mechanisms of different granularities. We realize that due to our small sample size the results are not statistically significant, but this preliminary run shows that the experiment is ready to be conducted on a larger scale.
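The Recall measure reported above can be scored as the fraction of reference-model elements that reappear in a participant's model. The sketch below shows one such scoring, assuming a simple set-based comparison; the activity labels are hypothetical and are not drawn from the eTOM reference processes.

```python
# Sketch of the Recall measure used to score reuse-based modeling outcomes:
# fraction of reference-model elements that also appear in the produced model.
# Activity labels are hypothetical.
def recall(reference: set[str], produced: set[str]) -> float:
    return len(reference & produced) / len(reference) if reference else 0.0

reference = {"receive order", "check credit", "provision service", "issue invoice"}
produced  = {"receive order", "check credit", "issue invoice", "close ticket"}
print(f"recall = {recall(reference, produced):.2f}")   # 0.75
```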
Aviation System Analysis Capability Executive Assistant Design
NASA Technical Reports Server (NTRS)
Roberts, Eileen; Villani, James A.; Osman, Mohammed; Godso, David; King, Brent; Ricciardi, Michael
1998-01-01
In this technical document, we describe the design developed for the Aviation System Analysis Capability (ASAC) Executive Assistant (EA) Proof of Concept (POC). We describe the genesis and role of the ASAC system, discuss the objectives of the ASAC system and provide an overview of components and models within the ASAC system, and describe the design process and the results of the ASAC EA POC system design. We also describe the evaluation process and results for applicable COTS software. The document has six chapters, a bibliography, three appendices and one attachment.
NASA Astrophysics Data System (ADS)
Kobayashi, Takashi; Komoda, Norihisa
Traditional business process design methods, of which the use case is the most typical, provide no useful framework for designing the activity sequence. Therefore, design efficiency and quality vary widely according to the designer's experience and skill. In this paper, to solve this problem, we propose a model of business events and their state transitions (a basic business event model) based on the language/action perspective, which draws on results from the cognitive science domain. In business process design, this model is used to decide event occurrence conditions so that the events synchronize with each other. We also propose a design pattern for deciding the event occurrence conditions (a business event improvement strategy). Lastly, we apply the business process design method based on the business event model and the business event improvement strategy to a credit card issuing process and estimate its effect.
A Web Centric Architecture for Deploying Multi-Disciplinary Engineering Design Processes
NASA Technical Reports Server (NTRS)
Woyak, Scott; Kim, Hongman; Mullins, James; Sobieszczanski-Sobieski, Jaroslaw
2004-01-01
There are continuous needs for engineering organizations to improve their design processes. Current state-of-the-art techniques use computational simulations to predict design performance and optimize it through advanced design methods. These tools have been used mostly by individual engineers. This paper presents an architecture for achieving results at the organizational level, beyond the individual level. The next set of gains in process improvement will come from improving the effective use of computers and software within a whole organization, not just for an individual. The architecture takes advantage of state-of-the-art capabilities to produce a Web-based system to carry engineering design into the future. To illustrate deployment of the architecture, a case study for implementing advanced multidisciplinary design optimization processes such as Bi-Level Integrated System Synthesis is discussed. Another example, rolling out a design process for Design for Six Sigma, is also described. Each example explains how an organization can effectively infuse engineering practice with new design methods and retain the knowledge over time.
Optimization in the systems engineering process
NASA Technical Reports Server (NTRS)
Lemmerman, Loren A.
1993-01-01
The essential elements of the design process consist of the mission definition phase that provides the system requirements, the conceptual design, the preliminary design and finally the detailed design. Mission definition is performed largely by operations analysts in conjunction with the customer. The result of their study is handed off to the systems engineers for documentation as the systems requirements. The document that provides these requirements is the basis for the further design work of the design engineers at the Lockheed-Georgia Company. The design phase actually begins with conceptual design, which is generally conducted by a small group of engineers using multidisciplinary design programs. Because of the complexity of the design problem, the analyses are relatively simple and generally dependent on parametric analyses of the configuration. The result of this phase is a baseline configuration from which preliminary design may be initiated.
Learning Design Rashomon I--Supporting the Design of One Lesson through Different Approaches
ERIC Educational Resources Information Center
Persico, Donatella; Pozzi, Francesca; Anastopoulou, Stamatina; Conole, Grainne; Craft, Brock; Dimitriadis, Yannis; Hernandez-Leo, Davinia; Kali, Yael; Mor, Yishay; Perez-Sanagustin, Mar; Walmsley, Helen
2013-01-01
This paper presents and compares a variety of approaches that have been developed to guide the decision-making process in learning design. Together with the companion Learning Design Rashomon II (Prieto "et al.," 2013), devoted to existing tools to support the same process, it aims to provide a view on relevant research results in this…
NASA Astrophysics Data System (ADS)
Sirirojvisuth, Apinut
In complex aerospace system design, making an effective design decision requires multidisciplinary knowledge from both product and process perspectives. Integrating manufacturing considerations into the design process is most valuable during the early design stages, since designers have more freedom to integrate new ideas when changes are relatively inexpensive in terms of time and effort. Several metrics related to manufacturability are cost, time, and manufacturing readiness level (MRL). Yet, there is a lack of structured methodology that quantifies how changes in the design decisions impact these metrics. As a result, a new set of integrated cost analysis tools is proposed in this study to quantify the impacts. Equally important is the capability to integrate this new cost tool into the existing design methodologies without sacrificing the agility and flexibility required during the early design phases. To demonstrate the applicability of this concept, a ModelCenter environment is used to develop a software architecture that represents the Integrated Product and Process Development (IPPD) methodology used in several aerospace systems designs. The environment seamlessly integrates product and process analysis tools and makes an effective transition from one design phase to the next while retaining knowledge gained a priori. Then, an advanced cost estimating tool called the Hybrid Lifecycle Cost Estimating Tool (HLCET), a hybrid combination of weight-, process-, and activity-based estimating techniques, is integrated with the design framework. A new weight-based lifecycle cost model is created based on Tailored Cost Model (TCM) equations [3]. This lifecycle cost tool estimates the program cost based on vehicle component weights and programmatic assumptions. Additional high-fidelity cost tools such as process-based and activity-based cost analysis methods can be used to modify the baseline TCM result as more knowledge is accumulated over design iterations. Therefore, with this concept, the additional manufacturing knowledge can be used to identify a more accurate lifecycle cost and facilitate higher fidelity tradeoffs during conceptual and preliminary design. The Advanced Composite Cost Estimating Model (ACCEM) is employed as a process-based cost component to replace the original TCM result for the composite part production cost. The reason for the replacement is that TCM estimates production costs from part weights as a result of subtractive manufacturing of metallic origin such as casting, forging, and machining processes. A complexity factor can sometimes be adjusted to reflect different types of metal and machine settings. The TCM assumption, however, gives erroneous results when applied to additive processes like those of composite manufacturing. Another innovative aspect of this research is the introduction of a work measurement technique called the Maynard Operation Sequence Technique (MOST) to be used, similarly to the Activity-Based Costing (ABC) approach, to estimate the manufacturing time of a part by breaking down the operations that occur during its production. ABC allows a realistic determination of the cost incurred in each activity, as opposed to using a traditional method of time estimation by analogy or using response surface equations from historical process data. The MOST concept provides a tailored study of an individual process typically required for a new, innovative design. Nevertheless, the MOST idea has some challenges, one of which is its requirement to build a new process from the ground up.
The process development requires a Subject Matter Expert (SME) in the manufacturing method of the particular design. The SME must also have a comprehensive understanding of the MOST system so that the correct parameters are chosen. In practice, these knowledge requirements may demand people from outside of the design discipline and a priori training in MOST. To relieve this constraint, this study includes an entirely new sub-system architecture that comprises 1) a knowledge-based system to provide the required knowledge during the process selection; and 2) a new user interface to guide the parameter selection when building the process using MOST. Also included in this study is a demonstration of how the HLCET and its constituents can be integrated with Georgia Tech's Integrated Product and Process Development (IPPD) methodology. The applicability of this work will be shown through a complex aerospace design example to gain insights into how manufacturing knowledge helps make better design decisions during the early stages. The setup process is explained, with an example of its utility demonstrated in a hypothetical fighter aircraft wing redesign. An evaluation of the system's effectiveness against existing methodologies concludes the thesis.
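To make the hybrid costing idea concrete, the sketch below starts from a weight-based estimate for each component and overrides composite parts with a process-based (hours times labor rate) estimate, mirroring how ACCEM results replace the TCM baseline. The cost-estimating relationships, rates, and component data are hypothetical stand-ins, not the actual TCM or ACCEM equations.

```python
# Illustrative hybrid costing in the spirit of HLCET: start from a weight-based
# estimate per component, then override composite parts with a process-based
# (time x rate) estimate. The CERs and numbers below are hypothetical stand-ins,
# not the actual TCM or ACCEM equations.
LABOR_RATE = 120.0          # $/hr, assumed

def weight_based_cost(weight_lb: float) -> float:
    # hypothetical CER: cost grows sub-linearly with weight
    return 4200.0 * weight_lb ** 0.85

def process_based_cost(layup_hr: float, cure_hr: float, trim_hr: float) -> float:
    return (layup_hr + cure_hr + trim_hr) * LABOR_RATE

components = {
    "wing skin (composite)":  {"weight": 310.0, "process_hr": (420.0, 36.0, 60.0)},
    "fuselage frame (metal)": {"weight": 180.0, "process_hr": None},
}

total = 0.0
for name, c in components.items():
    cost = weight_based_cost(c["weight"])
    if c["process_hr"] is not None:            # composite part: replace the baseline
        cost = process_based_cost(*c["process_hr"])
    total += cost
    print(f"{name}: ${cost:,.0f}")
print(f"total: ${total:,.0f}")
```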
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stepinski, Dominique C.; Youker, Amanda J.; Krahn, Elizabeth O.
2017-03-01
Molybdenum-99 is the parent of the most widely used medical isotope, technetium-99m. Proliferation concerns have prompted development of alternative Mo production methods utilizing low enriched uranium. Alumina and titania sorbents were evaluated for separation of Mo from concentrated uranyl nitrate solutions. System, mass transfer, and isotherm parameters were determined to enable design of Mo separation processes under a wide range of conditions. A model-based approach was utilized to design representative commercial-scale column processes. The designs and parameters were verified with bench-scale experiments. The results are essential for design of Mo separation processes from irradiated uranium solutions, selection of support material, and process optimization. Mo uptake studies show that adsorption decreases with increasing concentration of uranyl nitrate; however, examination of Mo adsorption as a function of nitrate ion concentration shows no dependency, indicating that uranium competes with Mo for adsorption sites. These results are consistent with reports indicating that Mo forms inner-sphere complexes with titania and alumina surface groups.
Streefland, M; Van Herpen, P F G; Van de Waterbeemd, B; Van der Pol, L A; Beuvery, E C; Tramper, J; Martens, D E; Toft, M
2009-10-15
A licensed pharmaceutical process is required to be executed within the validated ranges throughout the lifetime of product manufacturing. Changes to the process, especially for processes involving biological products, usually require the manufacturer to demonstrate that the safety and efficacy of the product remain unchanged by new or additional clinical testing. Recent changes in the regulations for pharmaceutical processing allow broader ranges of process settings to be submitted for regulatory approval, the so-called process design space, which means that a manufacturer can optimize the process within the submitted ranges after the product has entered the market, allowing more flexible processes. In this article, the applicability of this concept of the process design space is investigated for the cultivation process step of a vaccine against whooping cough disease. An experimental design (DoE) is applied to investigate the ranges of critical process parameters that still result in a product that meets specifications. The on-line process data, including near-infrared spectroscopy, are used to build a descriptive model of the processes used in the experimental design. Finally, the data from all processes are integrated in a multivariate batch monitoring model that represents the investigated process design space. This article demonstrates how the general principles of PAT and process design space can be applied to an undefined biological product such as a whole cell vaccine. The approach to model development described here allows on-line monitoring and control of cultivation batches in order to assure in real time that a process is running within the process design space.
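The DoE step described above can be pictured as enumerating a two-level full factorial over the critical cultivation parameters and flagging which settings still meet the product specification. The sketch below assumes hypothetical factors, levels, and a placeholder response model; in practice the model would be fitted from the experimental runs and the on-line data.

```python
# Sketch of the DoE step: enumerate a two-level full factorial over cultivation
# parameters and flag which corners of the explored region still meet a product
# specification. Factor names, levels, and the response model are hypothetical.
from itertools import product

factors = {
    "temperature_C": (33.0, 37.0),
    "pH":            (6.8, 7.4),
    "stir_rpm":      (200, 400),
}

def predicted_yield(temperature_C, pH, stir_rpm):
    # placeholder response surface; in practice fitted from the DoE runs
    return 0.6 + 0.05 * (37 - temperature_C) + 0.4 * (pH - 6.8) + 0.0005 * (400 - stir_rpm)

SPEC = 0.85
for levels in product(*factors.values()):
    run = dict(zip(factors, levels))
    ok = predicted_yield(**run) >= SPEC
    print(run, "within design space" if ok else "out of spec")
```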
Jaton, Florian
2017-01-01
This article documents the practical efforts of a group of scientists designing an image-processing algorithm for saliency detection. By following the actors of this computer science project, the article shows that the problems often considered to be the starting points of computational models are in fact provisional results of time-consuming, collective and highly material processes that engage habits, desires, skills and values. In the project being studied, problematization processes lead to the constitution of referential databases called ‘ground truths’ that enable both the effective shaping of algorithms and the evaluation of their performances. Working as important common touchstones for research communities in image processing, the ground truths are inherited from prior problematization processes and may be imparted to subsequent ones. The ethnographic results of this study suggest two complementary analytical perspectives on algorithms: (1) an ‘axiomatic’ perspective that understands algorithms as sets of instructions designed to solve given problems computationally in the best possible way, and (2) a ‘problem-oriented’ perspective that understands algorithms as sets of instructions designed to computationally retrieve outputs designed and designated during specific problematization processes. If the axiomatic perspective on algorithms puts the emphasis on the numerical transformations of inputs into outputs, the problem-oriented perspective puts the emphasis on the definition of both inputs and outputs. PMID:28950802
Galí, A; García-Montoya, E; Ascaso, M; Pérez-Lozano, P; Ticó, J R; Miñarro, M; Suñé-Negre, J M
2016-09-01
Although tablet coating processes are widely used in the pharmaceutical industry, they often lack adequate robustness. Up-scaling can be challenging, as minor changes in parameters can lead to varying quality results. The aim was to select critical process parameters (CPPs) using retrospective data for a commercial product and to establish a design of experiments (DoE) that would improve the robustness of the coating process. A retrospective analysis of data from 36 commercial batches was performed. Batches were selected based on the quality results generated during batch release, some of which revealed quality deviations concerning the appearance of the coated tablets. The product is already marketed and belongs to the portfolio of a multinational pharmaceutical company. The Statgraphics 5.1 software was used for data processing to determine critical process parameters in order to propose new working ranges. This study confirms that it is possible to determine the critical process parameters and create design spaces based on retrospective data from commercial batches. This type of analysis thus becomes a tool to optimize the robustness of existing processes. Our results show that a design space can be established with minimum investment in experiments, since current commercial batch data are processed statistically.
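A minimal sketch of the retrospective screening idea: regress a quality attribute on the recorded process parameters across historical batches and rank parameters by standardized effect size. The batch data, parameter names, and effect sizes below are synthetic and purely illustrative; the study itself used Statgraphics rather than a script of this kind.

```python
# Sketch of retrospective CPP screening: regress a coating quality attribute on
# recorded process parameters across historical batches and rank parameters by
# standardized effect size. Data and column names are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n_batches = 36
spray_rate = rng.normal(60, 5, n_batches)       # g/min
inlet_temp = rng.normal(55, 3, n_batches)       # deg C
pan_speed  = rng.normal(12, 1, n_batches)       # rpm
defects    = 0.8 * spray_rate - 1.5 * inlet_temp + rng.normal(0, 2, n_batches)

X = np.column_stack([spray_rate, inlet_temp, pan_speed])
Xz = (X - X.mean(axis=0)) / X.std(axis=0)       # standardize to compare effects
yz = (defects - defects.mean()) / defects.std()
beta, *_ = np.linalg.lstsq(np.column_stack([np.ones(n_batches), Xz]), yz, rcond=None)

for name, b in zip(["spray_rate", "inlet_temp", "pan_speed"], beta[1:]):
    print(f"{name}: standardized effect {b:+.2f}")
```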
Comparing Freshman and doctoral engineering students in design: mapping with a descriptive framework
NASA Astrophysics Data System (ADS)
Carmona Marques, P.
2017-11-01
This paper reports the results of a study of engineering students' approaches to an open-ended design problem. To carry this out, sketches and interviews were collected from 9 freshmen (first-year) and 10 doctoral engineering students as they designed solutions for orange squeezers. Sketches and interviews were analysed and mapped with a descriptive 'ideation framework' (IF) of the design process, to document and compare their design creativity (Carmona Marques, P., A. Silva, E. Henriques, and C. Magee. 2014. "A Descriptive Framework of the Design Process from a Dual Cognitive Engineering Perspective." International Journal of Design Creativity and Innovation 2 (3): 142-164). The results show that the designers worked in a manner largely consistent with the IF for generalisation and specialisation loops. Also, the doctoral students produced more alternative solutions during the ideation process. In addition, compared with the freshmen, the doctoral students used the generalisation loop of the IF, working at higher levels of abstraction. The iterative nature of design is highlighted during this study - a potential contribution to decreasing the gap between both groups in engineering education.
Cervera-Padrell, Albert E; Skovby, Tommy; Kiil, Søren; Gani, Rafiqul; Gernaey, Krist V
2012-10-01
A systematic framework is proposed for the design of continuous pharmaceutical manufacturing processes. Specifically, the design framework focuses on organic chemistry based, active pharmaceutical ingredient (API) synthetic processes, but could potentially be extended to biocatalytic and fermentation-based products. The method exploits the synergistic combination of continuous flow technologies (e.g., microfluidic techniques) and process systems engineering (PSE) methods and tools for faster process design and increased process understanding throughout the whole drug product and process development cycle. The design framework structures the many different and challenging design problems (e.g., solvent selection, reactor design, and design of separation and purification operations), driving the user from the initial drug discovery steps--where process knowledge is very limited--toward the detailed design and analysis. Examples from the literature of PSE methods and tools applied to pharmaceutical process design and novel pharmaceutical production technologies are provided throughout the text, assisting in the accumulation and interpretation of process knowledge. Different criteria are suggested for the selection of batch and continuous processes so that the whole design results in low capital and operational costs as well as a low environmental footprint. The design framework has been applied to the retrofit of an existing batch-wise process used by H. Lundbeck A/S to produce an API: zuclopenthixol. Some of its batch operations were successfully converted into continuous mode, obtaining higher yields that allowed a significant simplification of the whole process. The material and environmental footprint of the process--evaluated through the process mass intensity index, that is, kg of material used per kg of product--was reduced to half of its initial value, with potential for further reduction. The case study includes reaction steps typically used by the pharmaceutical industry featuring different characteristic reaction times, as well as L-L separation and distillation-based solvent exchange steps, and thus constitutes a good example of how the design framework can be useful for efficiently designing novel or already existing API manufacturing processes that take advantage of continuous processing. Copyright © 2012 Elsevier B.V. All rights reserved.
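The process mass intensity (PMI) index cited above is a simple ratio, kg of material used per kg of product. A tiny worked example follows, with masses chosen only to mirror the reported halving of the footprint; they are not taken from the zuclopenthixol case.

```python
# Process mass intensity (PMI): kg of material used per kg of product.
# The masses below are hypothetical, chosen only to illustrate a halving of the
# footprint after conversion from batch to continuous operation.
def pmi(mass_inputs_kg: float, mass_product_kg: float) -> float:
    return mass_inputs_kg / mass_product_kg

batch_pmi      = pmi(mass_inputs_kg=1200.0, mass_product_kg=10.0)   # 120
continuous_pmi = pmi(mass_inputs_kg=600.0,  mass_product_kg=10.0)   #  60
print(batch_pmi, continuous_pmi, continuous_pmi / batch_pmi)
```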
Uncertainty in eddy covariance flux estimates resulting from spectral attenuation [Chapter 4]
W. J. Massman; R. Clement
2004-01-01
Surface exchange fluxes measured by eddy covariance tend to be underestimated as a result of limitations in sensor design, signal processing methods, and finite flux-averaging periods. But, careful system design, modern instrumentation, and appropriate data processing algorithms can minimize these losses, which, if not too large, can be estimated and corrected using...
Boston-Fleischhauer, Carol
2008-02-01
The demand to redesign healthcare processes that achieve efficient, effective, and safe results is never-ending. Part 1 of this 2-part series introduced human factors engineering and reliability science as important knowledge to enhance existing operational and clinical process design methods in healthcare organizations. In part 2, the author applies this knowledge to one of the most common operational processes in healthcare: clinical documentation. Specific implementation strategies and anticipated results are discussed, along with organizational challenges and recommended executive responses.
NASA Technical Reports Server (NTRS)
Welstead, Jason; Crouse, Gilbert L., Jr.
2014-01-01
Empirical sizing guidelines such as tail volume coefficients have long been used in the early aircraft design phases for sizing stabilizers, resulting in conservatively stable aircraft. While successful, this results in increased empty weight, reduced performance, and greater procurement and operational cost relative to an aircraft with optimally sized surfaces. Including flight dynamics in the conceptual design process allows the design to move away from empirical methods while implementing modern control techniques. A challenge of flight dynamics and control is the number of design variables, which change fluidly throughout the conceptual design process, required to evaluate the system response to some disturbance. This research focuses on addressing that challenge not by implementing higher order tools, such as computational fluid dynamics, but instead by linking the lower order tools typically used within the conceptual design process so that each discipline feeds into the others. In this research, flight dynamics and control was incorporated into the conceptual design process along with the traditional disciplines of vehicle sizing, weight estimation, aerodynamics, and performance. For the controller, a linear quadratic regulator structure with constant gains has been specified to reduce the user input. Coupling all the disciplines in the conceptual design phase allows the aircraft designer to explore larger design spaces where stabilizers are sized according to dynamic response constraints rather than historical static margin and volume coefficient guidelines.
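The controller structure named above, a linear quadratic regulator with constant gains, can be computed from a linearized model at conceptual-design fidelity. The sketch below solves the continuous-time algebraic Riccati equation for a hypothetical two-state short-period model; the matrices are illustrative, not the paper's aircraft data.

```python
# Minimal LQR sketch matching the controller structure named in the abstract:
# constant gains from the continuous-time algebraic Riccati equation. The
# two-state short-period model below is hypothetical, not the paper's aircraft.
import numpy as np
from scipy.linalg import solve_continuous_are

A = np.array([[-0.7,  1.0],      # d(alpha)/dt, d(q)/dt (per-second derivatives)
              [-5.0, -1.2]])
B = np.array([[0.0],
              [-8.0]])           # elevator effectiveness (assumed)
Q = np.diag([4.0, 1.0])          # state weights
R = np.array([[1.0]])            # control weight

P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)  # control law u = -K x
print("LQR gains:", K)
print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K))
```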
Empirical studies of software design: Implications for SSEs
NASA Technical Reports Server (NTRS)
Krasner, Herb
1988-01-01
Implications for Software Engineering Environments (SEEs) are presented in viewgraph format for characteristics of projects studied; significant problems and crucial problem areas in software design for large systems; layered behavioral model of software processes; implications of field study results; software project as an ecological system; results of the LIFT study; information model of design exploration; software design strategies; results of the team design study; and a list of publications.
Evaluating two process scale chromatography column header designs using CFD.
Johnson, Chris; Natarajan, Venkatesh; Antoniou, Chris
2014-01-01
Chromatography is an indispensable unit operation in the downstream processing of biomolecules. Scaling of chromatographic operations typically involves a significant increase in the column diameter. At this scale, the flow distribution within a packed bed could be severely affected by the distributor design in process scale columns. Different vendors offer process scale columns with varying design features. The effect of these design features on the flow distribution in packed beds and the resultant effect on column efficiency and cleanability need to be properly understood in order to prevent unpleasant surprises on scale-up. Computational Fluid Dynamics (CFD) provides a cost-effective means to explore the effect of various distributor designs on process scale performance. In this work, we present a CFD tool that was developed and validated against experimental dye traces and tracer injections. Subsequently, the tool was employed to compare and contrast two commercially available header designs. © 2014 American Institute of Chemical Engineers.
Human Engineering of Space Vehicle Displays and Controls
NASA Technical Reports Server (NTRS)
Whitmore, Mihriban; Holden, Kritina L.; Boyer, Jennifer; Stephens, John-Paul; Ezer, Neta; Sandor, Aniko
2010-01-01
Proper attention to the integration of the human needs in the vehicle displays and controls design process creates a safe and productive environment for crew. Although this integration is critical for all phases of flight, for crew interfaces that are used during dynamic phases (e.g., ascent and entry), the integration is particularly important because of demanding environmental conditions. This panel addresses the process of how human engineering involvement ensures that human-system integration occurs early in the design and development process and continues throughout the lifecycle of a vehicle. This process includes the development of requirements and quantitative metrics to measure design success, research on fundamental design questions, human-in-the-loop evaluations, and iterative design. Processes and results from research on displays and controls; the creation and validation of usability, workload, and consistency metrics; and the design and evaluation of crew interfaces for NASA's Crew Exploration Vehicle are used as case studies.
NASA Technical Reports Server (NTRS)
Consoli, Robert David; Sobieszczanski-Sobieski, Jaroslaw
1990-01-01
Advanced multidisciplinary analysis and optimization methods, namely system sensitivity analysis and non-hierarchical system decomposition, are applied to reduce the cost and improve the visibility of an automated vehicle design synthesis process. This process is inherently complex due to the large number of functional disciplines and associated interdisciplinary couplings. Recent developments in system sensitivity analysis as applied to complex non-hierarchic multidisciplinary design optimization problems enable the decomposition of these complex interactions into sub-processes that can be evaluated in parallel. The application of these techniques results in significant cost, accuracy, and visibility benefits for the entire design synthesis process.
Launch Vehicle Design Process Description and Training Formulation
NASA Technical Reports Server (NTRS)
Atherton, James; Morris, Charles; Settle, Gray; Teal, Marion; Schuerer, Paul; Blair, James; Ryan, Robert; Schutzenhofer, Luke
1999-01-01
A primary NASA priority is to reduce the cost and improve the effectiveness of launching payloads into space. As a consequence, significant improvements are being sought in the effectiveness, cost, and schedule of the launch vehicle design process. In order to provide a basis for understanding and improving the current design process, a model has been developed for this complex, interactive process, as reported in the references. This model requires further expansion in some specific design functions. Also, a training course for less-experienced engineers is needed to provide understanding of the process, to provide guidance for its effective implementation, and to provide a basis for major improvements in launch vehicle design process technology. The objective of this activity is to expand the description of the design process to include all pertinent design functions, and to develop a detailed outline of a training course on the design process for launch vehicles for use in educating engineers whose experience with the process has been minimal. Building on a previously-developed partial design process description, parallel sections have been written for the Avionics Design Function, the Materials Design Function, and the Manufacturing Design Function. Upon inclusion of these results, the total process description will be released as a NASA TP. The design function sections herein include descriptions of the design function responsibilities, interfaces, interactive processes, decisions (gates), and tasks. Associated figures include design function planes, gates, and tasks, along with other pertinent graphics. Also included is an expanded discussion of how the design process is divided, or compartmentalized, into manageable parts to achieve efficient and effective design. A detailed outline for an intensive two-day course on the launch vehicle design process has been developed herein, and is available for further expansion. The course is in an interactive lecture/workshop format to engage the participants in active learning. The course addresses the breadth and depth of the process, requirements, phases, participants, multidisciplinary aspects, tasks, and critical elements, as well as providing guidance from previous lessons learned. The participants are led to develop their own understanding of the current process and how it can be improved. Included are course objectives and a session-by-session outline of course content. Also included is an initial identification of visual aid requirements.
Hussein, Husnah; Williams, David J; Liu, Yang
2015-07-01
A systematic design of experiments (DOE) approach was used to optimize the perfusion process of a tri-axial bioreactor designed for translational tissue engineering exploiting mechanical stimuli and mechanotransduction. Four controllable design parameters affecting the perfusion process were identified in a cause-effect diagram as potential improvement opportunities. A screening process was used to separate out the factors that have the largest impact from the insignificant ones. DOE was employed to find the settings of the platen design, return tubing configuration and the elevation difference that minimise the load on the pump and variation in the perfusion process and improve the controllability of the perfusion pressures within the prescribed limits. DOE was very effective for gaining increased knowledge of the perfusion process and optimizing the process for improved functionality. It is hypothesized that the optimized perfusion system will result in improved biological performance and consistency.
NASA Technical Reports Server (NTRS)
Anderson, Frederick; Biezad, Daniel J.
1994-01-01
This paper describes the Rapid Aircraft DynamIcs AssessmeNt (RADIAN) project - an integration of the Aircraft SYNThesis (ACSYNT) design code with the USAF DATCOM code that estimates stability derivatives. Both of these codes are available to universities. These programs are then linked to flight simulation and flight controller synthesis tools, and the resulting design is evaluated on a graphics workstation. The entire process reduces the preliminary design time by an order of magnitude and provides an initial handling qualities evaluation of the design coupled to a control law. The integrated design process is applicable both to conventional aircraft taken from current textbooks and to unconventional designs emphasizing agility and propulsive control of attitude. The interactive and concurrent nature of the design process has been well received by industry and by design engineers at NASA. The process is being implemented into the design curriculum and is being used by students who view it as a significant advance over prior methods.
Double jeopardy in inferring cognitive processes
Fific, Mario
2014-01-01
Inferences we make about underlying cognitive processes can be jeopardized in two ways due to problematic forms of aggregation. First, averaging across individuals is typically considered a very useful tool for removing random variability. The threat is that averaging across subjects leads to averaging across different cognitive strategies, thus harming our inferences. The second threat comes from the construction of inadequate research designs possessing a low diagnostic accuracy of cognitive processes. For that reason we introduced the systems factorial technology (SFT), which has primarily been designed to make inferences about underlying processing order (serial, parallel, coactive), stopping rule (terminating, exhaustive), and process dependency. SFT proposes that the minimal research design complexity needed to learn about n cognitive processes should be equal to 2^n. In addition, SFT proposes that (a) each cognitive process should be controlled by a separate experimental factor, and (b) the saliency levels of all factors should be combined in a full factorial design. In the current study, the author cross-combined the levels of the jeopardies in a 2 × 2 analysis, leading to four different analysis conditions. The results indicate a decline in the diagnostic accuracy of inferences made about cognitive processes due to the presence of each jeopardy in isolation and when combined. The results warrant the development of more individual-subject analyses and the utilization of full-factorial (SFT) experimental designs. PMID:25374545
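The 2^n design requirement quoted above can be enumerated directly: one two-level saliency factor per process, fully crossed. The sketch below lists the resulting cells for three hypothetical factors.

```python
# The SFT requirement quoted in the abstract: to diagnose n cognitive processes,
# cross the saliency level (high/low) of one factor per process, giving 2**n cells.
# Factor names are illustrative.
from itertools import product

factors = ["target contrast", "distractor similarity", "memory load"]  # n = 3
cells = list(product(("high", "low"), repeat=len(factors)))
print(f"{len(cells)} conditions (2^{len(factors)}):")
for cell in cells:
    print(dict(zip(factors, cell)))
```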
Conceptual design of flapping-wing micro air vehicles.
Whitney, J P; Wood, R J
2012-09-01
Traditional micro air vehicles (MAVs) are miniature versions of full-scale aircraft from which their design principles closely follow. The first step in aircraft design is the development of a conceptual design, where basic specifications and vehicle size are established. Conceptual design methods do not rely on specific knowledge of the propulsion system, vehicle layout and subsystems; these details are addressed later in the design process. Non-traditional MAV designs based on birds or insects are less common and without well-established conceptual design methods. This paper presents a conceptual design process for hovering flapping-wing vehicles. An energy-based accounting of propulsion and aerodynamics is combined with a one degree-of-freedom dynamic flapping model. Important results include simple analytical expressions for flight endurance and range, predictions for maximum feasible wing size and body mass, and critical design space restrictions resulting from finite wing inertia. A new figure-of-merit for wing structural-inertial efficiency is proposed and used to quantify the performance of real and artificial insect wings. The impact of these results on future flapping-wing MAV designs is discussed in detail.
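For a rough sense of the energy-based accounting mentioned above, the sketch below estimates hover power from ideal actuator-disk theory and divides battery energy by it to get endurance. This is a generic back-of-the-envelope calculation with assumed mass, wing area, efficiency, and battery capacity; it is not the paper's analytical expressions.

```python
# Generic hover-endurance arithmetic in the spirit of an energy-based accounting;
# NOT the paper's derivation, just ideal actuator-disk hover power with assumed
# efficiencies and a hypothetical 5 g vehicle.
import math

mass_kg   = 0.005                       # vehicle mass, assumed
rho       = 1.225                       # air density, kg/m^3
disk_area = 2 * math.pi * 0.025 ** 2    # swept area of two 25 mm wings treated as disks, assumed
eta       = 0.15                        # overall aero + drivetrain efficiency, assumed
e_batt_J  = 0.010 * 3.7 * 3600          # 10 mAh cell at 3.7 V, assumed

thrust  = mass_kg * 9.81
p_ideal = thrust ** 1.5 / math.sqrt(2 * rho * disk_area)   # ideal induced hover power
p_elec  = p_ideal / eta
print(f"hover power ~{p_elec:.2f} W, endurance ~{e_batt_J / p_elec / 60:.1f} min")
```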
Silicon production process evaluations
NASA Technical Reports Server (NTRS)
1981-01-01
Chemical engineering analysis of the HSC process (Hemlock Semiconductor Corporation) for producing silicon from dichlorosilane in a 1,000 MT/yr plant was continued. Progress and status for the chemical engineering analysis of the HSC process are reported for the primary process design engineering activities: base case conditions (85%), reaction chemistry (85%), process flow diagram (60%), material balance (60%), energy balance (30%), property data (30%), equipment design (20%) and major equipment list (10%). Engineering design of the initial distillation column (D-01, stripper column) in the process was initiated. The function of the distillation column is to remove volatile gases (such as hydrogen and nitrogen) which are dissolved in liquid chlorosilanes. Initial specifications and results for the distillation column design are reported including the variation of tray requirements (equilibrium stages) with reflux ratio for the distillation.
Practicing universal design to actual hand tool design process.
Lin, Kai-Chieh; Wu, Chih-Fu
2015-09-01
UD evaluation principles are difficult to implement in product design. This study proposes a methodology for implementing UD in the design process through user participation. The original UD principles and user experience are used to develop the evaluation items. Differences between product types were considered. Factor analysis and Quantification Theory Type I were used to eliminate evaluation items considered inappropriate and to examine the relationship between evaluation items and product design factors. Product design specifications were established for verification. The results showed that converting user evaluation into crucial design verification factors through a generalized evaluation scale based on product attributes, as well as applying the design factors in product design, can improve users' UD evaluation. The design process of this study is expected to contribute to user-centered UD application. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Dynamic Modeling of Process Technologies for Closed-Loop Water Recovery Systems
NASA Technical Reports Server (NTRS)
Allada, Rama Kumar; Lange, Kevin E.; Anderson, Molly S.
2012-01-01
Detailed chemical process simulations are a useful tool in designing and optimizing complex systems and architectures for human life support. Dynamic and steady-state models of these systems help contrast the interactions of various operating parameters and hardware designs, which become extremely useful in trade-study analyses. NASA's Exploration Life Support technology development project recently made use of such models to complement a series of tests on different waste water distillation systems. This paper presents dynamic simulations of chemical processes for primary processor technologies including: the Cascade Distillation System (CDS), the Vapor Compression Distillation (VCD) system, the Wiped-Film Rotating Disk (WFRD), and post-distillation water polishing processes such as the Volatiles Removal Assembly (VRA). These dynamic models were developed using the Aspen Custom Modeler (Registered TradeMark) and Aspen Plus (Registered TradeMark) process simulation tools. The results expand upon previous work for water recovery technology models and emphasize dynamic process modeling and results. The paper discusses system design, modeling details, and model results for each technology and presents some comparisons between the model results and available test data. Following these initial comparisons, some general conclusions and forward work are discussed.
Analysis of the influence of manufacturing and alignment related errors on an optical tweezer system
NASA Astrophysics Data System (ADS)
Kampmann, R.; Sinzinger, S.
2014-12-01
In this work we present the design process as well as experimental results of an optical system for trapping particles in air. For positioning applications of micro-sized objects onto a glass wafer, we developed a highly efficient optical tweezer. The focus of this paper is the iterative design process, in which we combine classical optics design software with a ray-optics-based force simulation tool. Thus we can find the best compromise that matches the optical system's restrictions with stable trapping conditions. Furthermore, we analyze the influence of manufacturing-related tolerances and of errors in the alignment process of the optical elements on the optical forces. We present the design procedure for the necessary optical elements as well as experimental results for the aligned system.
A Phenomenological Research Study on Writer's Block: Causes, Processes, and Results
ERIC Educational Resources Information Center
Bastug, Muhammet; Ertem, Ihsan Seyit; Keskin, Hasan Kagan
2017-01-01
Purpose: The purpose of this paper is to investigate the causes, processes of writer's block experienced by a group of classroom teacher candidates and its impact on them. Design/methodology/approach: The phenomenological design, which is a qualitative research design, was preferred in the research since it was aimed to investigate the causes,…
ERIC Educational Resources Information Center
Colpaert, Jozef
2006-01-01
The term "design" is being understood more and more as a methodological process, together with its acceptance as the result of such a process. As a process, it is a stage in the courseware engineering life cycle which primarily focuses on rendering the development process more effective and on enhancing the qualities of the finished system,…
Mechanoluminescence assisting agile optimization of processing design on surgical epiphysis plates
NASA Astrophysics Data System (ADS)
Terasaki, Nao; Toyomasu, Takashi; Sonohata, Motoki
2018-04-01
We propose a novel method for agile optimization of processing design through the visualization of mechanoluminescence. To demonstrate the effect of the new method, epiphysis plates were processed to form dots (diameters: 1 and 1.5 mm) and the mechanical information was evaluated. As a result, the appearance of new strain concentrations was successfully visualized on the basis of mechanoluminescence, and the complex mechanical information was intuitively understood by the surgeons acting as the designers. In addition, mechanoluminescence analysis clarified that small dots do not have serious mechanical effects such as strength reduction. Such detailed mechanical information evaluated on the basis of mechanoluminescence was successfully applied to judging the validity of the processing design. This clearly demonstrates the effectiveness of the new methodology using mechanoluminescence for assisting agile optimization of processing design.
Mitchell, Peter D; Ratcliffe, Elizabeth; Hourd, Paul; Williams, David J; Thomas, Robert J
2014-12-01
It is well documented that cryopreservation and resuscitation of human embryonic stem cells (hESCs) is complex and ill-defined, and often suffers poor cell recovery and increased levels of undesirable cell differentiation. In this study we have applied Quality-by-Design (QbD) concepts to the critical processes of slow-freeze cryopreservation and resuscitation of hESC colony cultures. Optimized subprocesses were linked together to deliver a controlled complete process. We have demonstrated a rapid, high-throughput, and stable system for measurement of cell adherence and viability as robust markers of in-process and postrecovery cell state. We observed that measurement of adherence and viability of adhered cells at 1 h postseeding was predictive of cell proliferative ability up to 96 h in this system. Application of factorial design defined the operating spaces for cryopreservation and resuscitation, critically linking the performance of these two processes. Optimization of both processes resulted in enhanced reattachment and post-thaw viability, resulting in substantially greater recovery of cryopreserved, pluripotent cell colonies. This study demonstrates the importance of QbD concepts and tools for rapid, robust, and low-risk process design that can inform manufacturing controls and logistics.
Analysis of Work Design in Rubber Processing Plant
NASA Astrophysics Data System (ADS)
Wahyuni, Dini; Nasution, Harmein; Budiman, Irwan; Wijaya, Khairini
2018-02-01
The work design illustrates how structured jobs, tasks, and roles are defined and modified, and their impact on individuals, groups, and organizations. If the work is not designed well, the company must pay greater costs for workers' health, longer production processes, or even penalties for not being able to meet the delivery schedule. This is evident in the conditions at a rubber processing factory in North Sumatra. Work design aspects such as layout, machinery and equipment, the workers' physical working environment, work methods, and organizational policies have not been well organized. Machines that grind coagulum into sheets are often damaged, resulting in four delayed product deliveries in 2016; this, together with workers' complaints about heat exposure and workstations that have not been properly arranged, indicates the need for work design. The research data will be collected through field observation and the distribution of questionnaires related to aspects of work design. The analysis will be based on the respondents' answers to the distributed questionnaire regarding the six aspects studied.
Selection process for trade study: Reusable Hydrogen Composite Tank System (RHCTS)
NASA Astrophysics Data System (ADS)
Greenberg, H. S.
1994-09-01
This document describes the selection process that will be used to identify the most suitable structural configuration option for an SSTO winged vehicle capable of delivering 25,000 lbs to a 220 nm circular orbit at 51.6 degree inclination. The most suitable RHCTS is within this configuration and will be the prototype design for subsequent design and analysis and the basis for the design and fabrication of a scale test article to be subjected to life cycle testing. The selection process for this TA 1 trade study is the same as that for the TA 2 trade study. As the trade study progresses, additional insight may result in modifications to the selection criteria within this process. Such modifications will result in an update of this document as appropriate.
Using system dynamics for collaborative design: a case study
Elf, Marie; Putilova, Mariya; von Koch, Lena; Öhrn, Kerstin
2007-01-01
Background In order to facilitate collaborative design, system dynamics (SD) with a group modelling approach was used in the early stages of planning a new stroke unit. During six workshops an SD model was created in a multiprofessional group. Aim To explore to what extent and how the use of system dynamics contributed to the collaborative design process. Method A case study was conducted using several data sources. Results SD supported a collaborative design by facilitating an explicit description of the stroke care process, a dialogue, and a joint understanding. The construction of the model obliged the group to conceptualise the stroke care, and experimentation with the model gave the opportunity to reflect on care. Conclusion SD facilitated the collaborative design process and should be integrated in the early stages of the design process as a quality improvement tool. PMID:17683519
Toward a More Flexible Web-Based Framework for Multidisciplinary Design
NASA Technical Reports Server (NTRS)
Rogers, J. L.; Salas, A. O.
1999-01-01
In today's competitive environment, both industry and government agencies are under pressure to reduce the time and cost of multidisciplinary design projects. New tools have been introduced to assist in this process by facilitating the integration of and communication among diverse disciplinary codes. One such tool, a framework for multidisciplinary design, is defined as a hardware-software architecture that enables integration, execution, and communication among diverse disciplinary processes. An examination of current frameworks reveals weaknesses in various areas, such as sequencing, monitoring, controlling, and displaying the design process. The objective of this research is to explore how Web technology can improve these areas of weakness and lead toward a more flexible framework. This article describes a Web-based system that optimizes and controls the execution sequence of design processes in addition to monitoring the project status and displaying the design results.
ERIC Educational Resources Information Center
Clarke, Katie C.
2010-01-01
A new Science, Engineering and Technology (SET) approach was designed for youth who participated in the Minnesota State Fair Livestock interview process. The project and evaluation were designed to determine if the new SET approach increased content knowledge and science process skills in participants. Results revealed that youth participants not…
Improving product introduction through effective design reviews.
Pelnik, Tammy M
2003-01-01
The design review process is a part of the manufacturer's due diligence in developing a safe and effective product. Design review provides early and on-going independent feedback to developers. By adopting a proactive review process, design improvements can be pursued at an optimum time in the product development effort, i.e., when it will cost less to implement changes and when these changes may have the greatest impact. Effective implementation of the design review requirement will lead to better medical products and improved product introduction results.
Chanona, J; Ribes, J; Seco, A; Ferrer, J
2006-01-01
This paper presents a model-knowledge based algorithm for optimising the design and operation of the primary sludge fermentation process. This is a recently used method to obtain the volatile fatty acids (VFA), needed to improve biological nutrient removal processes, directly from the raw wastewater. The proposed algorithm is a heuristic reasoning algorithm based on expert knowledge of the process. Only the effluent VFA and the sludge blanket height (SBH) have to be set as design criteria, and the optimisation algorithm obtains the minimum return sludge and waste sludge flow rates which fulfil those design criteria. A pilot plant fed with municipal raw wastewater was operated in order to obtain experimental results supporting the groundwork of the developed algorithm. The experimental results indicate that when the SBH was increased, a higher solids retention time was obtained in the settler and VFA production increased. Higher recirculation flow rates also resulted in higher VFA production. Finally, the developed algorithm has been tested by simulating different design conditions, with very good results. It has been able to find the optimal operating conditions in all cases in which the preset design conditions could be achieved. Furthermore, this is a general algorithm that can be applied to any fermentation-elutriation scheme with or without a fermentation reactor.
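As a toy illustration of a heuristic design loop of this kind, the sketch below adjusts return- and waste-sludge flow rates until placeholder VFA and sludge blanket height (SBH) targets are met. The process relations are crude linear stand-ins invented for the example and do not represent the authors' expert rules or the pilot-plant behaviour.

```python
# Toy heuristic loop in the spirit of the design algorithm described above: adjust
# return- and waste-sludge flow rates until hypothetical VFA and SBH targets are
# met. The process model is a crude placeholder, not the authors' expert rules.
def simulate(q_return, q_waste):
    # placeholder steady-state relations (arbitrary shapes and units)
    sbh = 3.0 + 0.01 * q_return - 0.10 * q_waste   # m
    vfa = 30.0 + 2.5 * q_return - 1.0 * q_waste    # mg/L
    return vfa, sbh

q_return, q_waste = 10.0, 5.0          # initial flow rates, m3/h (assumed)
VFA_MIN, SBH_MAX = 55.0, 1.5
for _ in range(100):
    vfa, sbh = simulate(q_return, q_waste)
    if vfa >= VFA_MIN and sbh <= SBH_MAX:
        break
    if sbh > SBH_MAX:
        q_waste += 1.0                 # bring the blanket down first
    elif vfa < VFA_MIN:
        q_return += 1.0                # elutriate more VFA
print(f"q_return={q_return}, q_waste={q_waste}, VFA={vfa:.1f}, SBH={sbh:.2f}")
```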
Developing Engineering and Science Process Skills Using Design Software in an Elementary Education
NASA Astrophysics Data System (ADS)
Fusco, Christopher
This paper examines the development of process skills through an engineering design approach to instruction in an elementary lesson that combines Science, Technology, Engineering, and Math (STEM). The study took place with 25 fifth graders in a public, suburban school district. Students worked in groups of five to design and construct model bridges based on research involving bridge-building design software. The assessment was framed around individual student success as well as overall group processing skills. These skills were assessed through an engineering design packet rubric (student work), student surveys of learning gains, observation field notes, and pre- and post-assessment data. The results indicate that students can successfully utilize design software to inform the construction of model bridges, develop science process skills through problem-based learning, and understand academic concepts through a design project. The final result of this study shows that design engineering is effective for developing cooperative learning skills. The study suggests that an engineering program offered as an elective or as part of the mandatory curriculum could be beneficial for developing students' critical thinking and inter- and intra-personal skills, along with increasing their understanding and awareness of scientific phenomena. In conclusion, combining a design approach to instruction with STEM can increase efficiency in these areas, generate meaningful learning, and influence student attitudes throughout their education.
Design and application of process control charting methodologies to gamma irradiation practices
NASA Astrophysics Data System (ADS)
Saylor, M. C.; Connaghan, J. P.; Yeadon, S. C.; Herring, C. M.; Jordan, T. M.
2002-12-01
The relationship between the contract irradiation facility and the customer has historically been based upon a "PASS/FAIL" approach, with little or no quality metrics used to gauge the control of the irradiation process. Application of process control charts, designed in coordination with mathematical simulation of routine radiation processing, can provide a basis for understanding irradiation events. By using tools that simulate the physical rules associated with the irradiation process, end-users can explore process-related boundaries and the effects of process changes. Consequently, the relationship between contractor and customer can evolve based on the derived knowledge. The resulting level of mutual understanding of the irradiation process and its control benefits both the customer and the contract operation, and provides necessary assurances to regulators. In this article we examine the complementary nature of theoretical (point kernel) and experimental (dosimetric) process evaluation, and the resulting by-product of improved understanding, communication, and control generated through the implementation of effective process control charting strategies.
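One of the simplest charting strategies the article points toward is an individuals/moving-range control chart on routine dose readings at a fixed monitoring position. The sketch below computes the usual three-sigma limits from the average moving range; the dose values are hypothetical.

```python
# Sketch of an individuals/moving-range control chart for routine dose readings
# at a monitoring position. Readings (kGy) are hypothetical.
import numpy as np

doses = np.array([25.1, 24.8, 25.4, 25.0, 25.6, 24.9, 25.2, 25.3, 24.7, 25.1])

center = doses.mean()
mr_bar = np.abs(np.diff(doses)).mean()        # average moving range
sigma_hat = mr_bar / 1.128                    # d2 constant for subgroups of 2
ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

print(f"CL={center:.2f}  UCL={ucl:.2f}  LCL={lcl:.2f} kGy")
out = doses[(doses > ucl) | (doses < lcl)]
print("out-of-control points:", out if out.size else "none")
```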
Structural design/margin assessment
NASA Technical Reports Server (NTRS)
Ryan, R. S.
1993-01-01
Determining structural design inputs and the structural margins following design completion is one of the major activities in space exploration. The end result is a statement of these margins as stability, safety factors on ultimate and yield stresses, fracture limits (fracture control), fatigue lifetime, reuse criteria, operational criteria and procedures, stability factors, deflections, clearance, handling criteria, etc. The process is normally called a load cycle and is time consuming, very complex, and involves much more than structures. The key to successful structural design is the proper implementation of the process. It depends on many factors: leadership and management of the process, adequate analysis and testing tools, data basing, communications, people skills, and training. This process and the various factors involved are discussed.
Development of the Upgraded DC Brush Gear Motor for Spacebus Platforms
NASA Technical Reports Server (NTRS)
Berning, Robert H.; Viout, Olivier
2010-01-01
The obsolescence of materials and processes used in the manufacture of traditional DC brush gear motors has necessitated the development of an upgraded DC brush gear motor (UBGM). The current traditional DC brush gear motor (BGM) design was evaluated using a Six Sigma process to identify potential design and production process improvements. The development effort resulted in a qualified UBGM design with improved manufacturability and reduced production costs. Using Six Sigma processes and incorporating lessons learned during development also improved motor performance, making the UBGM a more viable option for future use as a deployment mechanism in space flight applications.
From Intent to Action: An Iterative Engineering Process
ERIC Educational Resources Information Center
Mouton, Patrice; Rodet, Jacques; Vacaresse, Sylvain
2015-01-01
Quite by chance, and over the course of a few haphazard meetings, a Master's degree in "E-learning Design" gradually developed in a Faculty of Economics. Its original and evolving design was the result of an iterative process carried out, not by a single Instructional Designer (ID), but by a full ID team. Over the last 10 years it has…
Designing the accident and emergency system: lessons from manufacturing
Walley, P
2003-01-01
Objectives: To review the literature on manufacturing process design and demonstrate applicability in health care. Methods: Literature review and application of theory using two years activity data from two healthcare communities and extensive observation of activities over a six week period by seven researchers. Results: It was possible to identify patient flows that could be used to design treatment processes around the needs of the patient. Some queues are built into existing treatment processes and can be removed by better process design. Capacity imbalance, not capacity shortage, causes some unnecessary waiting in accident and emergency departments. Conclusions: Clinicians would find that modern manufacturing theories produce more acceptable designs of systems. In particular, good quality is seen as a necessary pre-requisite of fast, efficient services. PMID:12642523
ALARA radiation considerations for the AP600 reactor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lau, F.L.
1995-03-01
The radiation design of the AP600 reactor plant is based on an average annual occupational radiation exposure (ORE) of 100 man-rem. As a design goal we have established a lower value of 70 man-rem per year. And, with our current design process, we expect to achieve annual exposures which are well below this goal. To accomplish our goal we have established a process that provides criteria, guidelines and customer involvement to achieve the desired result. The criteria and guidelines provide the shield designer, as well as the systems and plant layout designers, with information that will lead to an integrated plant design that minimizes personnel exposure and yet is not burdened with complicated shielding or unnecessary component access limitations. Customer involvement is provided in the form of utility input, design reviews and information exchange. Cooperative programs with utilities in the development of specific systems or processes also provide for an ALARA design. The results are features which include ALARA radiation considerations as an integral part of the plant design and a lower plant ORE. It is anticipated that a further reduction in plant personnel exposures will result through good radiological practices by the plant operators. The information in place to support and direct the plant designers includes the Utility Requirements Document (URD), Federal Regulations, ALARA guidelines, radiation design information and radiation and shielding design criteria. This information, along with the utility input, design reviews and information feedback, will contribute to the reduction of plant radiation exposure levels such that they will be less than the stated goals.
NASA Astrophysics Data System (ADS)
Korshunov, G. I.; Petrushevskaya, A. A.; Lipatnikov, V. A.; Smirnova, M. S.
2018-03-01
The strategy of quality assurance for electronics is regarded as most important. To provide quality, the sequence of processes is considered and modeled as a Markov chain. The improvement is supported by simple database tools for design for manufacturing, intended for future step-by-step development. Phased automation of electronics design and digital manufacturing is proposed. MATLAB modelling results showed an increase in effectiveness; new tools and software should be more effective still. A primary digital model is proposed to represent the product across the process sequence, from several individual processes up to the whole life cycle.
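As an illustration of the Markov-chain view of the process sequence mentioned above, the sketch below builds a small absorbing chain for a hypothetical design-assembly-test-rework sequence and computes the probability of ending in the accepted state; the states and transition probabilities are assumptions, not taken from the paper.

```python
# Illustrative absorbing Markov chain for a design -> manufacturing process sequence.
# States and transition probabilities are hypothetical, not taken from the paper.
import numpy as np

states = ["design", "assembly", "test", "rework", "accepted", "scrapped"]
P = np.array([
    [0.00, 1.00, 0.00, 0.00, 0.00, 0.00],   # design   -> assembly
    [0.00, 0.00, 1.00, 0.00, 0.00, 0.00],   # assembly -> test
    [0.00, 0.00, 0.00, 0.15, 0.80, 0.05],   # test     -> rework / accepted / scrapped
    [0.00, 0.30, 0.65, 0.00, 0.00, 0.05],   # rework   -> assembly / test / scrapped
    [0.00, 0.00, 0.00, 0.00, 1.00, 0.00],   # accepted (absorbing)
    [0.00, 0.00, 0.00, 0.00, 0.00, 1.00],   # scrapped (absorbing)
])

Q, R = P[:4, :4], P[:4, 4:]                  # transient-to-transient / transient-to-absorbing
N = np.linalg.inv(np.eye(4) - Q)             # fundamental matrix
B = N @ R                                    # absorption probabilities
print("P(accepted | start in design) =", round(B[0, 0], 3))
```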
Design applications for supercomputers
NASA Technical Reports Server (NTRS)
Studerus, C. J.
1987-01-01
The complexity of codes for solutions of real aerodynamic problems has progressed from simple two-dimensional models to three-dimensional inviscid and viscous models. As the algorithms used in the codes increased in accuracy, speed and robustness, the codes were steadily incorporated into standard design processes. The highly sophisticated codes, which provide solutions to the truly complex flows, require computers with large memory and high computational speed. The advent of high-speed supercomputers, which makes the solution of these complex flows more practical, permits the introduction of the codes into the design system at an earlier stage. Results are presented for several codes which either have already been introduced into the design process or are rapidly becoming part of it. The codes fall into the areas of turbomachinery aerodynamics and hypersonic propulsion. In the former category, results are presented for three-dimensional inviscid and viscous flows through nozzle and unducted fan bladerows. In the latter category, results are presented for two-dimensional inviscid and viscous flows for hypersonic vehicle forebodies and engine inlets.
Flow Chemistry for Designing Sustainable Chemical Synthesis (journal article)
An efficiently designed continuous flow chemical process can lead to significant advantages in developing a sustainable chemical synthesis or process. These advantages are the direct result of being able to impart a higher degree of control on several key reactor and reaction par...
Martin, Cathrin; H. Opava, Christina; Brusewitz, Maria; Keller, Christina; Åsenlöf, Pernilla
2015-01-01
Background User involvement in the development of health care services is important for the viability, usability, and effectiveness of services. This study reports on the second step of the co-design process. Objective The aim was to explore the significant challenges in advancing the co-design process during the requirements specification phase of a mobile Internet service for the self-management of physical activity (PA) in rheumatoid arthritis (RA). Methods A participatory action research design was used to involve lead users and stakeholders as co-designers. Lead users (n=5), a clinical physiotherapist (n=1), researchers (n=2) with knowledge in PA in RA and behavioral learning theories, an eHealth strategist (n=1), and an officer from the patient organization (n=1) collaborated in 4 workshops. Data-collection methods included video recordings and naturalistic observations. Results The inductive qualitative video-based analysis resulted in 1 overarching theme, merging perspectives, and 2 subthemes reflecting different aspects of merging: (1) finding a common starting point and (2) deciding on design solutions. Seven categories illustrated the specific challenges: reaching shared understanding of goals, clarifying and handling the complexity of participants’ roles, clarifying terminology related to system development, establishing the rationale for features, negotiating features, transforming ideas into concrete features, and participants’ alignment with the agreed goal and task. Conclusions Co-designing the system requirements of a mobile Internet service including multiple stakeholders was a complex and extensive collaborative decision-making process. Considering, valuing, counterbalancing, and integrating different perspectives into agreements and solutions (ie, the merging of participants’ perspectives) were crucial for moving the process forward and were considered the core challenges of co-design. Further research is needed to replicate the results and to increase knowledge on key factors for a successful co-design of health care services. PMID:26381221
Revere, Debra; Dixon, Brian E; Hills, Rebecca; Williams, Jennifer L; Grannis, Shaun J
2014-01-01
Surveillance, or the systematic monitoring of disease within a population, is a cornerstone function of public health. Despite significant investment in information technologies (IT) to improve the public's health, health care providers continue to rely on manual, spontaneous reporting processes that can result in incomplete and delayed surveillance activities. Participatory design principles advocate including real users and stakeholders when designing an information system to ensure high ecological validity of the product, incorporate relevance and context into the design, reduce misconceptions designers can make due to insufficient domain expertise, and ultimately reduce barriers to adoption of the system. This paper focuses on the collaborative and informal participatory design process used to develop enhanced, IT-enabled reporting processes that leverage available electronic health records in a health information exchange to prepopulate notifiable-conditions report forms used by public health authorities. Over nine months, public health stakeholders, technical staff, and informatics researchers were engaged in a multiphase participatory design process that included public health stakeholder focus groups, investigator-engineering team meetings, public health survey and census regarding high-priority data elements, and codesign of exploratory prototypes and final form mock-ups. A number of state-mandated report fields that are not highly used or desirable for disease investigation were eliminated, which allowed engineers to repurpose form space for desired and high-priority data elements and improve the usability of the forms. Our participatory design process ensured that IT development was driven by end user expertise and needs, resulting in significant improvements to the layout and functionality of the reporting forms. In addition to informing report form development, engaging with public health end users and stakeholders through the participatory design process provided new insights into public health workflow and allowed the team to quickly triage user requests while managing user expectations within the realm of engineering possibilities. Engaging public health, engineering staff, and investigators in a shared codesigning process ensured that the new forms will not only meet real-life needs but will also support development of a product that will be adopted and, ultimately, improve communicable and infectious disease reporting by clinicians to public health.
NASA Astrophysics Data System (ADS)
Rautenbach, V.; Coetzee, S.; Çöltekin, A.
2016-06-01
Informal settlements are a common occurrence in South Africa, and to improve in-situ circumstances of communities living in informal settlements, upgrades and urban design processes are necessary. Spatial data and maps are essential throughout these processes to understand the current environment, plan new developments, and communicate the planned developments. All stakeholders need to understand maps to actively participate in the process. However, previous research demonstrated that map literacy was relatively low for many planning professionals in South Africa, which might hinder effective planning. Because 3D visualizations resemble the real environment more than traditional maps, many researchers posited that they would be easier to interpret. Thus, our goal is to investigate the effectiveness of 3D geovisualizations for urban design in informal settlement upgrading in South Africa. We consider all involved processes: 3D modelling, visualization design, and cognitive processes during map reading. We found that procedural modelling is a feasible alternative to time-consuming manual modelling, and can produce high quality models. When investigating the visualization design, the visual characteristics of 3D models and relevance of a subset of visual variables for urban design activities of informal settlement upgrades were qualitatively assessed. The results of three qualitative user experiments contributed to understanding the impact of various levels of complexity in 3D city models and map literacy of future geoinformatics and planning professionals when using 2D maps and 3D models. The research results can assist planners in designing suitable 3D models that can be used throughout all phases of the process.
Jensen, M D; Ingildsen, P; Rasmussen, M R; Laursen, J
2006-01-01
Aeration tank settling is a control method allowing settling in the process tank during high hydraulic load. The control method is patented. Aeration tank settling has been applied in several waste water treatment plants using the present design of the process tanks. Some process tank designs have proven more effective than others. To improve the design of less effective plants, computational fluid dynamics (CFD) modelling of hydraulics and sedimentation has been applied. This paper discusses the results at one particular plant experiencing problems with partial short-circuiting between the inlet and outlet, which disrupts the sludge blanket at the outlet and thereby reduces the retention of sludge in the process tank. The model has allowed us to establish a clear picture of the problems arising at the plant during aeration tank settling. Secondly, several process tank design changes have been suggested and tested by means of computational fluid dynamics modelling. The most promising design changes have been found and reported.
2013-04-01
The project was to provide the Royal Canadian Navy (RCN) with a set of guidelines on analysis, design, and verification processes that should be used in the development of effective room layouts for RCN ships. The primary … designed CSC; however, the guidelines could be applied to the design of any multiple-operator space in any RCN vessel. Results: The development of …
Overview of CMOS process and design options for image sensor dedicated to space applications
NASA Astrophysics Data System (ADS)
Martin-Gonthier, P.; Magnan, P.; Corbiere, F.
2005-10-01
With the growth of huge volume markets (mobile phones, digital cameras, etc.), CMOS technologies for image sensors have improved significantly. New process flows appear in order to optimize parameters such as quantum efficiency, dark current, and conversion gain. Space applications can of course benefit from these improvements. To illustrate this evolution, this paper reports results from three technologies that have been evaluated with test vehicles composed of several sub-arrays designed with space applications as the target. These three technologies are standard, improved, and sensor-optimized CMOS processes in the 0.35 μm generation. Measurements are focused on quantum efficiency, dark current, conversion gain and noise. Other measurements such as Modulation Transfer Function (MTF) and crosstalk are depicted in [1]. A comparison between the results has been made and three categories of CMOS process for image sensors have been identified. Radiation tolerance has also been studied for the improved CMOS process, with a view to hardening the imager by design. Results at 4, 15, 25 and 50 krad demonstrate good ionizing dose radiation tolerance when specific design techniques are applied.
Xie, Yi; Mun, Sungyong; Kim, Jinhyun; Wang, Nien-Hwa Linda
2002-01-01
A tandem simulated moving bed (SMB) process for insulin purification has been proposed and validated experimentally. The mixture to be separated consists of insulin, high molecular weight proteins, and zinc chloride. A systematic approach based on the standing wave design, rate model simulations, and experiments was used to develop this multicomponent separation process. The standing wave design was applied to specify the SMB operating conditions of a lab-scale unit with 10 columns. The design was validated with rate model simulations prior to experiments. The experimental results show 99.9% purity and 99% yield, which closely agree with the model predictions and the standing wave design targets. The agreement proves that the standing wave design can ensure high purity and high yield for the tandem SMB process. Compared to a conventional batch SEC process, the tandem SMB has 10% higher yield, 400% higher throughput, and 72% lower eluant consumption. In contrast, a design that ignores the effects of mass transfer and nonideal flow cannot meet the purity requirement and gives less than 96% yield.
The Waste Reduction Decision Support System (WAR DSS) is a Java-based software product providing comprehensive modeling of potential adverse environmental impacts (PEI) predicted to result from newly designed or redesigned chemical manufacturing processes. The purpose of this so...
A Meta-Analysis and Review of Holistic Face Processing
Richler, Jennifer J.; Gauthier, Isabel
2014-01-01
The concept of holistic processing is a cornerstone of face recognition research, yet central questions related to holistic processing remain unanswered, and debates have thus far failed to reach a resolution despite accumulating empirical evidence. We argue that a considerable source of confusion in this literature stems from a methodological problem. Specifically, two different measures of holistic processing based on the composite paradigm (complete design and partial design) are used in the literature, but they often lead to qualitatively different results. First, we present a comprehensive review of the work that directly compares the two designs, and which clearly favors the complete design over the partial design. Second, we report a meta-analysis of holistic face processing according to both designs, and use this as further evidence for one design over the other. The meta-analysis effect size of holistic processing in the complete design is nearly three times that of the partial design. Effect sizes were not correlated between measures, consistent with the suggestion that they do not measure the same thing. Our meta-analysis also examines the correlation between conditions in the complete design of the composite task, and suggests that in an individual differences context, little is gained by including a misaligned baseline. Finally, we offer a comprehensive review of the state of knowledge about holistic processing based on evidence gathered from the measure we favor based on the first sections of our review—the complete design—and outline outstanding research questions in that new context. PMID:24956123
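As a reminder of how the effect sizes being compared are computed, the sketch below evaluates a within-subject standardized mean difference (Cohen's d_z) for each composite-design measure from hypothetical summary statistics; the numbers are illustrative only and are not values from the meta-analysis.

```python
# Illustrative computation of the holistic-processing effect size under each composite
# design: a within-subject standardized mean difference (Cohen's d_z). The summary
# numbers below are hypothetical and are not values from the meta-analysis.
def cohens_dz(mean_diff, sd_diff):
    """Standardized mean difference for a paired (within-subject) contrast."""
    return mean_diff / sd_diff

complete_dz = cohens_dz(mean_diff=0.12, sd_diff=0.10)   # congruency-by-alignment contrast
partial_dz  = cohens_dz(mean_diff=0.04, sd_diff=0.10)   # alignment contrast, same trials only
print(f"complete design d_z = {complete_dz:.2f}, partial design d_z = {partial_dz:.2f} "
      f"(ratio = {complete_dz / partial_dz:.1f}x)")
```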
Functional Fault Model Development Process to Support Design Analysis and Operational Assessment
NASA Technical Reports Server (NTRS)
Melcher, Kevin J.; Maul, William A.; Hemminger, Joseph A.
2016-01-01
A functional fault model (FFM) is an abstract representation of the failure space of a given system. As such, it simulates the propagation of failure effects along paths between the origin of the system failure modes and points within the system capable of observing the failure effects. As a result, FFMs may be used to diagnose the presence of failures in the modeled system. FFMs necessarily contain a significant amount of information about the design, operations, and failure modes and effects. One of the important benefits of FFMs is that they may be qualitative, rather than quantitative and, as a result, may be implemented early in the design process when there is more potential to positively impact the system design. FFMs may therefore be developed and matured throughout the monitored system's design process and may subsequently be used to provide real-time diagnostic assessments that support system operations. This paper provides an overview of a generalized NASA process that is being used to develop and apply FFMs. FFM technology has been evolving for more than 25 years. The FFM development process presented in this paper was refined during NASA's Ares I, Space Launch System, and Ground Systems Development and Operations programs (i.e., from about 2007 to the present). Process refinement took place as new modeling, analysis, and verification tools were created to enhance FFM capabilities. In this paper, standard elements of a model development process (i.e., knowledge acquisition, conceptual design, implementation & verification, and application) are described within the context of FFMs. Further, newer tools and analytical capabilities that may benefit the broader systems engineering process are identified and briefly described. The discussion is intended as a high-level guide for future FFM modelers.
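A minimal sketch of the qualitative propagation idea, assuming hypothetical component and test-point names: the FFM is treated as a directed graph and the observation points reachable from a failure mode are found by breadth-first search.

```python
# Minimal sketch of a qualitative functional fault model: failure effects propagate
# along directed edges from failure-mode origins to observable test points.
# Component and test-point names are hypothetical.
from collections import deque

edges = {
    "valve_stuck":          ["low_fuel_flow"],
    "low_fuel_flow":        ["low_chamber_pressure"],
    "sensor_bias":          ["low_chamber_pressure_reading"],
    "low_chamber_pressure": ["low_chamber_pressure_reading", "low_thrust"],
}
observers = {"low_chamber_pressure_reading", "low_thrust"}

def observable_effects(failure_mode):
    """Return the observation points reachable from a given failure mode."""
    seen, queue, hits = set(), deque([failure_mode]), set()
    while queue:
        node = queue.popleft()
        if node in seen:
            continue
        seen.add(node)
        if node in observers:
            hits.add(node)
        queue.extend(edges.get(node, []))
    return hits

print(observable_effects("valve_stuck"))   # expected: both observation points
```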
Ji, Yu; Tian, Yang; Ahnfelt, Mattias; Sui, Lili
2014-06-27
Multivalent pneumococcal vaccines were used worldwide to protect human beings from pneumococcal diseases. In order to eliminate the toxic organic solutions used in the traditional vaccine purification process, an alternative chromatographic process for Streptococcus pneumoniae serotype 23F capsular polysaccharide (CPS) was proposed in this study. The strategy of Design of Experiments (DoE) was introduced into the process development to solve the complicated design procedure. An initial process analysis was given to review the whole flowchart, identify the critical factors of the chromatography through FMEA, and choose the flow-through mode based on the properties of the feed. A resin screening study then followed to select candidate resins. DoE was utilized to generate a resolution IV fractional factorial design to further compare candidates and narrow down the design space. After Capto Adhere was selected, the Box-Behnken DoE was executed to model the process and characterize all effects of factors on the responses. Finally, Monte Carlo simulation was used to optimize the process, test the chosen optimal conditions and define the control limit. The results of three scale-up runs at set points verified the DoE and simulation predictions. The final results were well in accordance with the EU pharmacopeia requirements: Protein/CPS (w/w) 1.08%; DNA/CPS (w/w) 0.61%; the phosphorus content 3.1%; the nitrogen content 0.315%; and the methyl-pentose percentage 47.9%. Other tests of final pure CPS also met the pharmacopeia specifications. This alternative chromatographic purification process for pneumococcal vaccine without toxic organic solvents was successfully developed by the DoE approach and proved its scalability, robustness and suitability for large scale manufacturing. Copyright © 2014 Elsevier B.V. All rights reserved.
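To illustrate the final Monte Carlo step in a generic way, the sketch below samples operating variability around a set point and evaluates a fitted quadratic response-surface model against a specification limit; the model coefficients, variability, and limit are hypothetical and not taken from the study.

```python
# Illustrative sketch: Monte Carlo evaluation of a fitted quadratic (response-surface)
# model to estimate the probability that an impurity response stays within specification
# when factors vary around a chosen set point. Coefficients and limits are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

def protein_to_cps_pct(load, ph):          # fitted response-surface model (hypothetical)
    return 1.2 - 0.25 * load + 0.10 * ph + 0.05 * load**2 - 0.02 * load * ph

set_point = {"load": 1.0, "ph": 0.0}       # coded factor levels at the candidate operating point
n = 20_000
load = rng.normal(set_point["load"], 0.10, n)   # assumed operating variability in each factor
ph   = rng.normal(set_point["ph"],   0.15, n)
response = protein_to_cps_pct(load, ph)
prob_in_spec = np.mean(response <= 1.5)    # illustrative pharmacopoeia-style limit
print(f"P(Protein/CPS <= 1.5 %) ~= {prob_in_spec:.3f}")
```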
A quality by design study applied to an industrial pharmaceutical fluid bed granulation.
Lourenço, Vera; Lochmann, Dirk; Reich, Gabriele; Menezes, José C; Herdling, Thorsten; Schewitz, Jens
2012-06-01
The pharmaceutical industry is encouraged within Quality by Design (QbD) to apply science-based manufacturing principles to assure quality not only of new but also of existing processes. This paper presents how QbD principles can be applied to an existing industrial pharmaceutical fluid bed granulation (FBG) process. A three-step approach is presented as follows: (1) implementation of Process Analytical Technology (PAT) monitoring tools at the industrial scale process, combined with multivariate data analysis (MVDA) of process and PAT data to increase the process knowledge; (2) execution of scaled-down designed experiments at a pilot scale, with adequate PAT monitoring tools, to investigate the process response to intended changes in Critical Process Parameters (CPPs); and finally (3) the definition of a process Design Space (DS) linking CPPs to Critical to Quality Attributes (CQAs), within which product quality is ensured by design, and after scale-up enabling its use at the industrial process scale. The proposed approach was developed for an existing industrial process. Through the enhanced process knowledge thus established, a significant reduction in product CQA variability, already within quality specification ranges, was achieved by a better choice of CPP values. The results of such step-wise development and implementation are described. Copyright © 2012 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Yang, Thomas; Shen, Yang; Zhang, Yifan; Sweis, Jason; Lai, Ya-Chieh
2017-03-01
Silicon testing results are regularly collected for a particular lot of wafers to study yield loss from test result diagnostics. Product engineers analyze the diagnostic results and perform a number of physical failure analyses to detect systematic defects which cause yield loss for these sets of wafers, in order to feed the information back to process engineers for process improvements. Most of the time, the systematic defects that are detected are major issues or just one of the causes of the overall yield loss. This paper presents a working flow for using design analysis techniques combined with diagnostic methods to systematically transform silicon testing information into physical layout information. A new set of testing results is received from a new lot of wafers for the same product. We can then correlate all the diagnostic results from different periods of time to check which blocks or nets have been highlighted or have stopped occurring in the failure reports, in order to monitor process changes which impact the yield. The design characteristic analysis flow is also implemented to find 1) the block connections on a design that have failed electrical test or 2) frequently used cells that have been highlighted multiple times.
Interactions of double patterning technology with wafer processing, OPC and design flows
NASA Astrophysics Data System (ADS)
Lucas, Kevin; Cork, Chris; Miloslavsky, Alex; Luk-Pat, Gerry; Barnes, Levi; Hapli, John; Lewellen, John; Rollins, Greg; Wiaux, Vincent; Verhaegen, Staf
2008-03-01
Double patterning technology (DPT) is one of the main options for printing logic devices with half-pitch less than 45nm, and flash and DRAM memory devices with half-pitch less than 40nm. DPT methods decompose the original design intent into two individual masking layers which are each patterned using single exposures and existing 193nm lithography tools. The results of the individual patterning layers combine to re-create the design intent pattern on the wafer. In this paper we study interactions of DPT with lithography, mask synthesis and physical design flows. Double exposure and etch patterning steps create complexity for both process and design flows. DPT decomposition is a critical software step which will be performed in physical design and also in mask synthesis. Decomposition includes cutting (splitting) of original design intent polygons into multiple polygons where required, and coloring of the resulting polygons. We evaluate the ability to meet key physical design goals such as: reduce circuit area; minimize rework; ensure DPT compliance; guarantee patterning robustness on individual layer targets; ensure symmetric wafer results; and create uniform wafer density for the individual patterning layers.
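A minimal sketch of the coloring step: features spaced closer than the single-exposure limit must be assigned to different masks, so decomposition without polygon cutting succeeds exactly when the conflict graph is two-colorable. The feature coordinates and spacing threshold below are hypothetical.

```python
# Minimal sketch of DPT coloring: features whose spacing is below the single-exposure
# limit must go on different masks, so decomposition without stitching is possible
# exactly when the resulting conflict graph is bipartite (two-colorable).
from collections import deque

features = {"A": 0, "B": 40, "C": 80, "D": 200}       # 1-D positions in nm (illustrative)
min_same_mask_space = 60                               # single-exposure spacing limit (nm)

# Build the conflict graph: an edge means the two features are too close for one mask.
names = list(features)
adj = {n: set() for n in names}
for i, u in enumerate(names):
    for v in names[i + 1:]:
        if abs(features[u] - features[v]) < min_same_mask_space:
            adj[u].add(v)
            adj[v].add(u)

def two_color(adj):
    """BFS two-coloring; returns a mask assignment or None if an odd-cycle conflict exists."""
    color = {}
    for start in adj:
        if start in color:
            continue
        color[start] = 0
        queue = deque([start])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in color:
                    color[v] = 1 - color[u]
                    queue.append(v)
                elif color[v] == color[u]:
                    return None
        # continue with the next connected component
    return color

assignment = two_color(adj)
print(assignment or "Coloring conflict: polygons must be cut (split) or the layout modified.")
```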
Challenges of Engaging Local Stakeholders for Statewide Program Development Process
ERIC Educational Resources Information Center
Martin, Michael J.; Leuci, Mary; Stewart, Mark
2014-01-01
The University of Missouri Extension needed to develop an annual program review process that collaboratively engaged county-level stakeholders. The results from the first 2 years highlight the results, challenges, and implications of the design process. The annual review process needs to be adaptive, responsive, and reflective from year to year…
Lefkoff, L.J.; Gorelick, S.M.
1986-01-01
Detailed two-dimensional flow simulation of a complex ground-water system is combined with quadratic and linear programming to evaluate design alternatives for rapid aquifer restoration. Results show how treatment and pumping costs depend dynamically on the type of treatment process, the capacity of pumping and injection wells, and the number of wells. The design for an inexpensive treatment process minimizes pumping costs, while an expensive process results in the minimization of treatment costs. Substantial reductions in pumping costs occur with increases in injection capacity or in the number of wells. Treatment costs are reduced by expansions in pumping capacity or injection capacity. The analysis identifies maximum pumping and injection capacities. -from Authors
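A toy version of this cost trade-off, assuming hypothetical unit costs, well capacities, and an extraction target (not the authors' formulation), can be posed as a small linear program:

```python
# Toy linear program in the spirit of the cost trade-off described above: choose pumping
# rates at two extraction wells to meet a required total extraction at minimum
# pumping-plus-treatment cost. Costs, capacities and the target are hypothetical.
from scipy.optimize import linprog

pump_cost = [3.0, 5.0]        # $ per m3 pumped at wells 1 and 2
treat_cost = 4.0              # $ per m3 treated (same stream for both wells)
c = [pump_cost[0] + treat_cost, pump_cost[1] + treat_cost]

# Constraint: q1 + q2 >= 100 m3/day  ->  -q1 - q2 <= -100
A_ub, b_ub = [[-1.0, -1.0]], [-100.0]
bounds = [(0, 80), (0, 60)]   # individual well capacities (m3/day)

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
print("pumping rates:", res.x, "minimum cost:", res.fun)
```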
NASA Technical Reports Server (NTRS)
Sauerwein, Timothy
1989-01-01
The human factors design process in developing a shuttle orbiter aft flight deck workstation testbed is described. In developing an operator workstation to control various laboratory telerobots, strong elements of human factors engineering and ergonomics are integrated into the design process. The integration of human factors is performed by incorporating user feedback at key stages in the project life-cycle. An operator-centered design approach helps ensure the system users are working with the system designer in the design and operation of the system. The design methodology is presented along with the results of the design and the solutions regarding human factors design principles.
Creative user-centered visualization design for energy analysts and modelers.
Goodwin, Sarah; Dykes, Jason; Jones, Sara; Dillingham, Iain; Dove, Graham; Duffy, Alison; Kachkaev, Alexander; Slingsby, Aidan; Wood, Jo
2013-12-01
We enhance a user-centered design process with techniques that deliberately promote creativity to identify opportunities for the visualization of data generated by a major energy supplier. Visualization prototypes developed in this way prove effective in a situation whereby data sets are largely unknown and requirements open - enabling successful exploration of possibilities for visualization in Smart Home data analysis. The process gives rise to novel designs and design metaphors including data sculpting. It suggests: that the deliberate use of creativity techniques with data stakeholders is likely to contribute to successful, novel and effective solutions; that being explicit about creativity may contribute to designers developing creative solutions; that using creativity techniques early in the design process may result in a creative approach persisting throughout the process. The work constitutes the first systematic visualization design for a data rich source that will be increasingly important to energy suppliers and consumers as Smart Meter technology is widely deployed. It is novel in explicitly employing creativity techniques at the requirements stage of visualization design and development, paving the way for further use and study of creativity methods in visualization design.
Research in software allocation for advanced manned mission communications and tracking systems
NASA Technical Reports Server (NTRS)
Warnagiris, Tom; Wolff, Bill; Kusmanoff, Antone
1990-01-01
An assessment of the planned processing hardware and software/firmware for the Communications and Tracking System of the Space Station Freedom (SSF) was performed. The intent of the assessment was to determine the optimum distribution of software/firmware in the processing hardware for maximum throughput with minimum required memory. As a product of the assessment process an assessment methodology was to be developed that could be used for similar assessments of future manned spacecraft system designs. The assessment process was hampered by changing requirements for the Space Station. As a result, the initial objective of determining the optimum software/firmware allocation was not fulfilled, but several useful conclusions and recommendations resulted from the assessment. It was concluded that the assessment process would not be completely successful for a system with changing requirements. It was also concluded that memory requirements and hardware requirements were being modified to fit as a consequence of the change process, and although throughput could not be quantified, potential problem areas could be identified. Finally, inherent flexibility of the system design was essential for the success of a system design with changing requirements. Recommendations resulting from the assessment included development of common software for some embedded controller functions, reduction of embedded processor requirements by hardwiring some Orbital Replacement Units (ORUs) to make better use of processor capabilities, and improvement in communications between software development personnel to enhance the integration process. Lastly, a critical observation was made that the software integration tasks did not appear to be addressed in the design process to the degree necessary for successful satisfaction of the system requirements.
Lessons learned: design, start-up, and operation of cryogenic systems
NASA Astrophysics Data System (ADS)
Bell, W. M.; Bagley, R. E.; Motew, S.; Young, P.-W.
2014-11-01
Cryogenic systems involving a pumped cryogenic fluid, such as liquid nitrogen (LN2), require careful design since the cryogen is cold and close to its boiling point. At 1 atmosphere, LN2 boils at 77.4 K (-320.4 F). These systems, typically, are designed to transport the cryogen, use it for process heat removal, or for generation of gas (GN2) for process use. As the design progresses, it is important to consider all aspects of the design including cryogen storage, pressure control and safety relief systems, thermodynamic conditions, equipment and instrument selection, materials, insulation, cooldown, pump start-up, maximum design and minimum flow rates, two-phase flow conditions, heat flow, process control to meet and maintain operating conditions, piping integrity, piping loads on served equipment, warm-up, venting, and shut-down. "Cutting corners" in the design process can result in stalled start-ups, field rework, schedule hits, or operational restrictions. Some of these "lessons learned" are described in this paper.
NASA Astrophysics Data System (ADS)
Baroroh, D. K.; Alfiah, D.
2018-05-01
The electric vehicle is one of the innovations aimed at reducing vehicle pollution. Nevertheless, it still presents problems, especially at the disposal stage. To support the product design and development strategy, i.e. sustainable design and the resolution of disposal-stage problems, the modularity of the electric vehicle architecture with respect to the recovery process needs to be assessed. This research used the Design Structure Matrix (DSM) approach to determine the interactions between components and assessed architectural modularity by calculating the values of three variables, namely Module Independence (MI), Module Similarity (MS), and Modularity for the End-of-Life Stage (MEOL). The results show that the existing electric vehicle design has an architecture with a high modularity value for the recovery process at the disposal stage. Accordingly, it can be reused and recycled at the component or module level without a full disassembly process, supporting an environmentally friendly product (sustainable design) and reducing disassembly cost.
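The sketch below illustrates the DSM representation with a simple within-module interaction ratio as a stand-in for module independence; the components, interactions, and metric are illustrative assumptions and not the MI/MS/MEOL formulas used in the paper.

```python
# Illustrative Design Structure Matrix (DSM) sketch. The metric below is a simple
# within-module vs. cross-module interaction ratio used as a stand-in for module
# independence; it is not the exact MI/MS/MEOL formulation used in the paper.
import numpy as np

components = ["battery", "BMS", "motor", "controller", "frame"]
dsm = np.array([            # 1 = interaction between row and column components
    [0, 1, 0, 0, 1],
    [1, 0, 0, 1, 0],
    [0, 0, 0, 1, 1],
    [0, 1, 1, 0, 0],
    [1, 0, 1, 0, 0],
])
modules = {"energy": [0, 1], "drive": [2, 3], "structure": [4]}

def independence_ratio(dsm, module_indices):
    """Share of a module's interactions that stay inside the module (higher = more modular)."""
    inside = dsm[np.ix_(module_indices, module_indices)].sum()
    total = dsm[module_indices, :].sum()
    return inside / total if total else 1.0

for name, idx in modules.items():
    print(name, round(independence_ratio(dsm, idx), 2))
```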
Optimal cost design of water distribution networks using a decomposition approach
NASA Astrophysics Data System (ADS)
Lee, Ho Min; Yoo, Do Guen; Sadollah, Ali; Kim, Joong Hoon
2016-12-01
Water distribution network decomposition, which is an engineering approach, is adopted to increase the efficiency of obtaining the optimal cost design of a water distribution network using an optimization algorithm. This study applied the source tracing tool in EPANET, which is a hydraulic and water quality analysis model, to the decomposition of a network to improve the efficiency of the optimal design process. The proposed approach was tested by carrying out the optimal cost design of two water distribution networks, and the results were compared with other optimal cost designs derived from previously proposed optimization algorithms. The proposed decomposition approach using the source tracing technique enables the efficient decomposition of an actual large-scale network, and the results can be combined with the optimal cost design process using an optimization algorithm. This proves that the final design in this study is better than those obtained with other previously proposed optimization algorithms.
Investigation of Low-Reynolds-Number Rocket Nozzle Design Using PNS-Based Optimization Procedure
NASA Technical Reports Server (NTRS)
Hussaini, M. Moin; Korte, John J.
1996-01-01
An optimization approach to rocket nozzle design, based on computational fluid dynamics (CFD) methodology, is investigated for low-Reynolds-number cases. This study is undertaken to determine the benefits of this approach over those of classical design processes such as Rao's method. A CFD-based optimization procedure, using the parabolized Navier-Stokes (PNS) equations, is used to design conical and contoured axisymmetric nozzles. The advantage of this procedure is that it accounts for viscosity during the design process; other processes make an approximated boundary-layer correction after an inviscid design is created. Results showed significant improvement in the nozzle thrust coefficient over that of the baseline case; however, the unusual nozzle design necessitates further investigation of the accuracy of the PNS equations for modeling expanding flows with thick laminar boundary layers.
A framework of knowledge creation processes in participatory simulation of hospital work systems.
Andersen, Simone Nyholm; Broberg, Ole
2017-04-01
Participatory simulation (PS) is a method to involve workers in simulating and designing their own future work system. Existing PS studies have focused on analysing the outcome, and minimal attention has been devoted to the process of creating this outcome. In order to study this process, we suggest applying a knowledge creation perspective. The aim of this study was to develop a framework describing the process of how ergonomics knowledge is created in PS. Video recordings from three projects applying PS of hospital work systems constituted the foundation of process mining analysis. The analysis resulted in a framework revealing the sources of ergonomics knowledge creation as sequential relationships between the activities of simulation participants sharing work experiences; experimenting with scenarios; and reflecting on ergonomics consequences. We argue that this framework reveals the hidden steps of PS that are essential when planning and facilitating PS that aims at designing work systems. Practitioner Summary: When facilitating participatory simulation (PS) in work system design, achieving an understanding of the PS process is essential. By applying a knowledge creation perspective and process mining, we investigated the knowledge-creating activities constituting the PS process. The analysis resulted in a framework of the knowledge-creating process in PS.
Design Process Improvement for Electric CAR Harness
NASA Astrophysics Data System (ADS)
Sawatdee, Thiwarat; Chutima, Parames
2017-06-01
In an automobile parts design company, customer satisfaction is one of the most important factors in product design. Therefore, the company employs all means to focus its product design process on the various requirements of customers, resulting in a high number of design changes. The objective of this research is to improve the design process of the electric car harness, which affects production scheduling, by using Fault Tree Analysis (FTA) and Failure Mode and Effect Analysis (FMEA) as the main tools. FTA is employed for root cause analysis and FMEA is used to rank failure modes by Risk Priority Number (RPN), which shows the priority of factors in the electric car harness that have a high impact on its design. After the implementation, the improvements are realized significantly since the design change rate is reduced from 0.26% to 0.08%.
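As a reminder of how FMEA prioritization works, the sketch below ranks hypothetical harness-related failure modes by RPN = severity × occurrence × detection; the failure modes and ratings are illustrative, not the company's data.

```python
# Minimal sketch of FMEA prioritisation for harness design changes:
# RPN = severity x occurrence x detection; a higher RPN means higher priority.
# Failure modes and ratings are hypothetical illustration values.
failure_modes = [
    {"mode": "late customer spec change",  "sev": 8, "occ": 6, "det": 5},
    {"mode": "wrong connector pinout",     "sev": 9, "occ": 3, "det": 4},
    {"mode": "routing clash with chassis", "sev": 6, "occ": 5, "det": 3},
]
for fm in failure_modes:
    fm["rpn"] = fm["sev"] * fm["occ"] * fm["det"]

for fm in sorted(failure_modes, key=lambda f: f["rpn"], reverse=True):
    print(f'{fm["mode"]:<28} RPN={fm["rpn"]}')
```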
NASA Astrophysics Data System (ADS)
Beardsley, Sara; Stochetti, Alejandro; Cerone, Marc
2018-03-01
Akhmat Tower is a 435m supertall building designed by Adrian Smith + Gordon Gill Architecture. It is currently under construction in the city of Grozny, in the Chechen Republic, in the North Caucasus region of Russia. The design of the tower was done during a collaborative process by a multi-disciplinary architectural and engineering team, based primarily in the United States and Russia. During this process, the designers considered many factors including, most primarily, the cultural and historical context, the structural requirements given the high seismicity of the region, and the client's programmatic needs. The resulting crystalline-shaped tower is both an aesthetic statement and a performative architectural solution which will be a new landmark for Chechnya. "The Design of Akhmat Tower" describes in detail the design process including structural considerations, exterior wall design, building program, interior design, the tuned mass damper, and the use of building information modeling.
Dellal, George; Peterson, Laura E; Provost, Lloyd; Gloor, Peter A; Fore, David Livingstone; Margolis, Peter A
2018-01-01
Background Our health care system fails to deliver necessary results, and incremental system improvements will not deliver needed change. Learning health systems (LHSs) are seen as a means to accelerate outcomes, improve care delivery, and further clinical research; yet, few such systems exist. We describe the process of codesigning, with all relevant stakeholders, an approach for creating a collaborative chronic care network (C3N), a peer-produced networked LHS. Objective The objective of this study was to report the methods used, with a diverse group of stakeholders, to translate the idea of a C3N to a set of actionable next steps. Methods The setting was ImproveCareNow, an improvement network for pediatric inflammatory bowel disease. In collaboration with patients and families, clinicians, researchers, social scientists, technologists, and designers, C3N leaders used a modified idealized design process to develop a design for a C3N. Results Over 100 people participated in the design process that resulted in (1) an overall concept design for the ImproveCareNow C3N, (2) a logic model for bringing about this system, and (3) 13 potential innovations likely to increase awareness and agency, make it easier to collect and share information, and to enhance collaboration that could be tested collectively to bring about the C3N. Conclusions We demonstrate methods that resulted in a design that has the potential to transform the chronic care system into an LHS. PMID:29472173
Information theoretical assessment of visual communication with subband coding
NASA Astrophysics Data System (ADS)
Rahman, Zia-ur; Fales, Carl L.; Huck, Friedrich O.
1994-09-01
A well-designed visual communication channel is one which transmits the most information about a radiance field with the fewest artifacts. The role of image processing, encoding and restoration is to improve the quality of visual communication channels by minimizing the error in the transmitted data. Conventionally this role has been analyzed strictly in the digital domain, neglecting the effects of image-gathering and image-display devices on the quality of the image. This results in the design of a visual communication channel which is 'suboptimal.' We propose an end-to-end assessment of the imaging process which incorporates the influences of these devices in the design of the encoder and the restoration process. This assessment combines Shannon's communication theory with Wiener's restoration filter and with the critical design factors of the image-gathering and display devices, thus providing the metrics needed to quantify and optimize the end-to-end performance of the visual communication channel. Results show that the design of the image-gathering device plays a significant role in determining the quality of the visual communication channel and in designing the analysis filters for subband encoding.
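A minimal sketch of the restoration element of this assessment, assuming hypothetical transfer-function and power-spectrum models: a one-dimensional Wiener filter is formed from the image-gathering response and the assumed signal and noise spectra.

```python
# Illustrative 1-D Wiener restoration filter: W(f) = H*(f) S(f) / (|H(f)|^2 S(f) + N(f)),
# combining the image-gathering transfer function H with assumed signal (S) and noise (N)
# power spectra. All spectra here are hypothetical stand-ins.
import numpy as np

f = np.linspace(0, 0.5, 256)                 # spatial frequency (cycles/sample)
H = np.exp(-(f / 0.3) ** 2)                  # assumed Gaussian optical/detector response
S = 1.0 / (1.0 + (f / 0.1) ** 2)             # assumed radiance-field power spectrum
N = np.full_like(f, 1e-2)                    # assumed flat photosensor-noise spectrum

W = np.conj(H) * S / (np.abs(H) ** 2 * S + N)
restored_response = W * H                    # end-to-end response after restoration
print("peak end-to-end response:", restored_response.max().round(3))
```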
Methodological Reflections: Designing and Understanding Computer-Supported Collaborative Learning
ERIC Educational Resources Information Center
Hamalainen, Raija
2012-01-01
Learning involves more than just a small group of participants, which makes designing and managing collaborative learning processes in higher education a challenging task. As a result, emerging concerns in current research have pointed increasingly to teacher orchestrated learning processes in naturalistic learning settings. In line with this…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lah, J; Shin, D; Kim, G
Purpose: To show how tolerance design and tolerancing approaches can be used to predict and improve the site-specific range in the patient QA process when implementing Six Sigma. Methods: In this study, patient QA plans were selected according to 6 site-treatment groups: head & neck (94 cases), spine (76 cases), lung (89 cases), liver (53 cases), pancreas (55 cases), and prostate (121 cases), treated between 2007 and 2013. We evaluated a Six Sigma model that determines allowable deviations in design parameters and process variables in patient-specific QA; where possible, tolerance may be loosened and then customized if necessary to meet the functional requirements. The Six Sigma problem-solving methodology is known as DMAIC, whose phases stand for: Define a problem or improvement opportunity, Measure process performance, Analyze the process to determine the root causes of poor performance, Improve the process by fixing root causes, and Control the improved process to hold the gains. Results: The process capability for patient-specific range QA is 0.65 with only ±1 mm of tolerance criteria. Our results suggested a tolerance level of ±2–3 mm for prostate and liver cases and ±5 mm for lung cases. We found that customized tolerances between calculated and measured range reduce patient QA plan failures, and almost all sites had failure rates less than 1%. The average QA time also improved from 2 hr to less than 1 hr for all steps, including the planning and conversion process, depth-dose measurement, and evaluation. Conclusion: The objective of tolerance design is to achieve optimization beyond that obtained through QA process improvement and statistical analysis function detailing to implement a Six Sigma capable design.
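For reference, the capability indices quoted above follow the standard definitions; the sketch below computes Cp and Cpk for hypothetical measured-minus-calculated range deviations against a symmetric ±2 mm tolerance (the data are illustrative, not the clinic's).

```python
# Illustrative process-capability calculation for range QA: deviations are
# measured-minus-calculated range (mm); the tolerance is symmetric about zero.
# The sample data are hypothetical, not the clinic's measurements.
import numpy as np

deviation_mm = np.array([0.4, -0.2, 0.9, -0.6, 0.1, 0.5, -0.3, 0.8, -0.1, 0.2])
tol = 2.0                                     # e.g. a +/-2 mm site-specific tolerance

usl, lsl = +tol, -tol
mu, sigma = deviation_mm.mean(), deviation_mm.std(ddof=1)
cp  = (usl - lsl) / (6 * sigma)               # potential capability
cpk = min(usl - mu, mu - lsl) / (3 * sigma)   # capability accounting for centring
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")
```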
Optimization of MLS receivers for multipath environments
NASA Technical Reports Server (NTRS)
Mcalpine, G. A.; Highfill, J. H., III
1976-01-01
The design of a microwave landing system (MLS) aircraft receiver, capable of optimal performance in multipath environments found in air terminal areas, is reported. Special attention was given to the angle tracking problem of the receiver and includes tracking system design considerations, study and application of locally optimum estimation involving multipath adaptive reception and then envelope processing, and microcomputer system design. Results show that this processing is competitive, performance-wise, with i-f signal processing in this application, and is much simpler and cheaper. A summary of the signal model is given.
A meta-analysis and review of holistic face processing.
Richler, Jennifer J; Gauthier, Isabel
2014-09-01
The concept of holistic processing is a cornerstone of face recognition research, yet central questions related to holistic processing remain unanswered, and debates have thus far failed to reach a resolution despite accumulating empirical evidence. We argue that a considerable source of confusion in this literature stems from a methodological problem. Specifically, 2 measures of holistic processing based on the composite paradigm (complete design and partial design) are used in the literature, but they often lead to qualitatively different results. First, we present a comprehensive review of the work that directly compares the 2 designs, and which clearly favors the complete design over the partial design. Second, we report a meta-analysis of holistic face processing according to both designs and use this as further evidence for one design over the other. The meta-analysis effect size of holistic processing in the complete design is nearly 3 times that of the partial design. Effect sizes were not correlated between measures, consistent with the suggestion that they do not measure the same thing. Our meta-analysis also examines the correlation between conditions in the complete design of the composite task, and suggests that in an individual differences context, little is gained by including a misaligned baseline. Finally, we offer a comprehensive review of the state of knowledge about holistic processing based on evidence gathered from the measure we favor based on the 1st sections of our review-the complete design-and outline outstanding research questions in that new context. PsycINFO Database Record (c) 2014 APA, all rights reserved.
Categorisation of visualisation methods to support the design of Human-Computer Interaction Systems.
Li, Katie; Tiwari, Ashutosh; Alcock, Jeffrey; Bermell-Garcia, Pablo
2016-07-01
During the design of Human-Computer Interaction (HCI) systems, the creation of visual artefacts forms an important part of design. On one hand producing a visual artefact has a number of advantages: it helps designers to externalise their thought and acts as a common language between different stakeholders. On the other hand, if an inappropriate visualisation method is employed it could hinder the design process. To support the design of HCI systems, this paper reviews the categorisation of visualisation methods used in HCI. A keyword search is conducted to identify a) current HCI design methods, b) approaches of selecting these methods. The resulting design methods are filtered to create a list of just visualisation methods. These are then categorised using the approaches identified in (b). As a result 23 HCI visualisation methods are identified and categorised in 5 selection approaches (The Recipient, Primary Purpose, Visual Archetype, Interaction Type, and The Design Process). Copyright © 2016 The Authors. Published by Elsevier Ltd.. All rights reserved.
NASA Astrophysics Data System (ADS)
Kim, Shin-Hyung; Ruy, Won-Sun; Jang, Beom Seon
2013-09-01
An automatic pipe routing system is proposed and implemented. Generally, the pipe routing design as a part of the shipbuilding process requires a considerable number of man hours due to the complexity which comes from physical and operational constraints and the crucial influence on outfitting construction productivity. Therefore, the automation of pipe routing design operations and processes has always been one of the most important goals for improvements in shipbuilding design. The proposed system is applied to a pipe routing design in the engine room space of a commercial ship. The effectiveness of this system is verified as a reasonable form of support for pipe routing design jobs. The automatic routing result of this system can serve as a good basis model in the initial stages of pipe routing design, allowing the designer to reduce their design lead time significantly. As a result, the design productivity overall can be improved with this automatic pipe routing system
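The core search step of automatic routing can be illustrated, in a highly simplified form, as a shortest-path problem on a discretised deck grid with blocked cells; the grid and end points below are hypothetical and this is not the algorithm used in the paper.

```python
# Toy sketch of the search step behind automatic pipe routing: shortest rectilinear
# path on a discretised deck grid with blocked cells (equipment/structure). This is a
# generic BFS illustration, not the routing algorithm used in the paper.
from collections import deque

grid = [                           # 0 = free, 1 = blocked
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 1, 0],
    [1, 1, 0, 1, 0],
    [0, 0, 0, 0, 0],
]

def route(start, goal):
    """Breadth-first search returning one shortest free-cell path, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    queue, prev = deque([start]), {start: None}
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 and (nr, nc) not in prev:
                prev[(nr, nc)] = cell
                queue.append((nr, nc))
    return None

print(route((0, 0), (4, 4)))
```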
Discovery informatics in biological and biomedical sciences: research challenges and opportunities.
Honavar, Vasant
2015-01-01
New discoveries in biological, biomedical and health sciences are increasingly being driven by our ability to acquire, share, integrate and analyze, and construct and simulate predictive models of biological systems. While much attention has focused on automating routine aspects of management and analysis of "big data", realizing the full potential of "big data" to accelerate discovery calls for automating many other aspects of the scientific process that have so far largely resisted automation: identifying gaps in the current state of knowledge; generating and prioritizing questions; designing studies; designing, prioritizing, planning, and executing experiments; interpreting results; forming hypotheses; drawing conclusions; replicating studies; validating claims; documenting studies; communicating results; reviewing results; and integrating results into the larger body of knowledge in a discipline. Against this background, the PSB workshop on Discovery Informatics in Biological and Biomedical Sciences explores the opportunities and challenges of automating discovery or assisting humans in discovery through advances (i) Understanding, formalization, and information processing accounts of, the entire scientific process; (ii) Design, development, and evaluation of the computational artifacts (representations, processes) that embody such understanding; and (iii) Application of the resulting artifacts and systems to advance science (by augmenting individual or collective human efforts, or by fully automating science).
ERIC Educational Resources Information Center
Ertmer, Peggy A.; York, Cindy S.; Gedik, Nuray
2009-01-01
Understanding how experienced designers approach complex design problems provides new perspectives on how they translate instructional design (ID) models and processes into practice. In this article, the authors describe the results of a study in which 16 "seasoned" designers shared compelling stories from practice that offered insights into their…
Vu, Lien T; Chen, Chao-Chang A; Lee, Chia-Cheng; Yu, Chia-Wei
2018-04-20
This study aims to develop a compensating method to minimize the shrinkage error of the shell mold (SM) in the injection molding (IM) process to obtain uniform optical power in the central optical zone of soft axially symmetric multifocal contact lenses (CL). The Z-shrinkage error along the Z axis or axial axis of the anterior SM, corresponding to the anterior surface of a dry contact lens in the IM process, can be minimized by optimizing IM process parameters and then by compensating for additional (Add) powers in the central zone of the original lens design. First, the shrinkage error is minimized by optimizing three levels of four IM parameters, including mold temperature, injection velocity, packing pressure, and cooling time, in 18 IM simulations based on an orthogonal array L18(2¹×3⁴). Then, based on the Z-shrinkage error from the IM simulation, three new contact lens designs are obtained by increasing the Add power in the central zone of the original multifocal CL design to compensate for the optical power errors. Results obtained from the IM process simulations and the optical simulations show that the new CL design with a 0.1 D increase in Add power has the closest shrinkage profile to the original anterior SM profile, with a 55% reduction in absolute Z-shrinkage error, and more uniform power in the central zone than in the other two cases. Moreover, actual experiments of IM of SM for casting soft multifocal CLs have been performed. The final product of wet CLs has been completed for the original design and the new design. Results of the optical performance have verified the improvement of the compensated design of CLs. The feasibility of this compensating method has been proven based on the measurement results of the produced soft multifocal CLs of the new design. Results of this study can be further applied to predict or compensate for the total optical power errors of the soft multifocal CLs.
Mortier, Séverine Thérèse F C; Van Bockstal, Pieter-Jan; Corver, Jos; Nopens, Ingmar; Gernaey, Krist V; De Beer, Thomas
2016-06-01
Large molecules, such as biopharmaceuticals, are considered the key driver of growth for the pharmaceutical industry. Freeze-drying is the preferred way to stabilise these products when needed. However, it is an expensive, inefficient, time- and energy-consuming process. During freeze-drying, there are only two main process variables to be set, i.e. the shelf temperature and the chamber pressure, though preferably in a dynamic way. This manuscript focuses on the essential use of uncertainty analysis for the determination and experimental verification of the dynamic primary drying Design Space for pharmaceutical freeze-drying. Traditionally, the chamber pressure and shelf temperature are kept constant during primary drying, leading to less optimal process conditions. In this paper it is demonstrated how a mechanistic model of the primary drying step gives the opportunity to determine the optimal dynamic values for both process variables during processing, resulting in a dynamic Design Space with a well-known risk of failure. This allows running the primary drying process step as time-efficiently as possible, thereby guaranteeing that the temperature at the sublimation front does not exceed the collapse temperature. The Design Space is the multidimensional combination and interaction of input variables and process parameters leading to the expected product specifications with a controlled (i.e., high) probability. Therefore, inclusion of parameter uncertainty is an essential part of the definition of the Design Space, although it is often neglected. To quantitatively assess the inherent uncertainty in the parameters of the mechanistic model, an uncertainty analysis was performed to establish the borders of the dynamic Design Space, i.e. a time-varying shelf temperature and chamber pressure, associated with a specific risk of failure. A risk of failure acceptance level of 0.01%, i.e. a 'zero-failure' situation, results in an increased primary drying process time compared to the deterministic dynamic Design Space; however, the risk of failure is under control. Experimental verification revealed that only a risk of failure acceptance level of 0.01% yielded a guaranteed zero-defect quality end-product. The computed process settings with a risk of failure acceptance level of 0.01% resulted in a decrease of more than half of the primary drying time in comparison with a regular, conservative cycle with fixed settings. Copyright © 2016. Published by Elsevier B.V.
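A highly simplified sketch of the uncertainty-analysis idea, assuming a crude surrogate in place of the mechanistic model: uncertain heat-transfer and dried-layer-resistance parameters are sampled and the risk that the sublimation-front temperature exceeds an assumed collapse temperature is estimated for one candidate setting.

```python
# Highly simplified sketch of the uncertainty analysis: sample uncertain parameters of a
# surrogate product-temperature model and estimate the risk that the sublimation-front
# temperature exceeds the collapse temperature at a candidate (T_shelf, P_chamber) setting.
# The surrogate model and all parameter values are hypothetical, not the paper's model.
import numpy as np

rng = np.random.default_rng(1)
T_collapse = -40.0                            # degC, assumed critical (collapse) temperature

def front_temperature(T_shelf, P_chamber_Pa, Kv, Rp):
    # crude linear surrogate: a warmer shelf, higher pressure, better heat transfer (Kv)
    # and higher dried-layer resistance (Rp) all raise the sublimation-front temperature
    return (-45.0 + 0.30 * (T_shelf + 20.0) + 0.05 * (P_chamber_Pa - 10.0)
            + 2.0 * (Kv - 1.0) + 1.5 * (Rp - 1.0))

n = 50_000
Kv = rng.normal(1.0, 0.10, n)                 # uncertain vial heat-transfer coefficient (scaled)
Rp = rng.normal(1.0, 0.15, n)                 # uncertain dried-product resistance (scaled)
T_front = front_temperature(T_shelf=-5.0, P_chamber_Pa=10.0, Kv=Kv, Rp=Rp)
risk = np.mean(T_front > T_collapse)
print(f"estimated risk of exceeding the collapse temperature: {risk:.4%}")
```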
DOE Office of Scientific and Technical Information (OSTI.GOV)
Comtois, J.H.; Michalicek, A.; Barron, C.C.
1997-11-01
This paper presents the results of tests performed on a variety of electrochemical microactuators and arrays of these actuators fabricated in the SUMMiT process at the U.S. Department of Energy's Sandia National Laboratories. These results are intended to aid designers of thermally actuated mechanisms, and they apply to similar actuators made in other polysilicon MEMS processes such as the MUMPS process. Measurements include force and deflection versus input power, maximum operating frequency, effects of long term operation, and ideal actuator and array geometries for different applications' force requirements. Also, different methods of arraying these actuators together are compared. It is found that a method using rotary joints, enabled by the advanced features of the SUMMiT fabrication process, is the most efficient array design. The design and operation of a thermally actuated stepper motor is explained to illustrate a useful application of these arrays.
Image-plane processing of visual information
NASA Technical Reports Server (NTRS)
Huck, F. O.; Fales, C. L.; Park, S. K.; Samms, R. W.
1984-01-01
Shannon's theory of information is used to optimize the optical design of sensor-array imaging systems which use neighborhood image-plane signal processing for enhancing edges and compressing dynamic range during image formation. The resultant edge-enhancement, or band-pass-filter, response is found to be very similar to that of human vision. Comparisons of traits in human vision with results from information theory suggest that: (1) Image-plane processing, like preprocessing in human vision, can improve visual information acquisition for pattern recognition when resolving power, sensitivity, and dynamic range are constrained. Improvements include reduced sensitivity to changes in light levels, reduced signal dynamic range, reduced data transmission and processing, and reduced aliasing and photosensor noise degradation. (2) Information content can be an appropriate figure of merit for optimizing the optical design of imaging systems when visual information is acquired for pattern recognition. The design trade-offs involve spatial response, sensitivity, and sampling interval.
NASA Astrophysics Data System (ADS)
Kehoe, S.; Stokes, J.
2011-03-01
Physicochemical properties of hydroxyapatite (HAp) synthesized by the chemical precipitation method are heavily dependent on the chosen process parameters. A Box-Behnken three-level experimental design was, therefore, chosen to determine the optimum set of process parameters and their effect on various HAp characteristics. These effects were quantified using design of experiments (DoE) to develop mathematical models, in terms of the chemical precipitation process parameters, based on the Box-Behnken design. Findings from this research show that HAp possessing optimum powder characteristics for orthopedic application via a thermal spray technique can be prepared using the following chemical precipitation process parameters: reaction temperature 60 °C, ripening time 48 h, and stirring speed 1500 rpm using high reagent concentrations. Ripening time and stirring speed significantly affected the final phase purity for the experimental conditions of the Box-Behnken design. An increase in both the ripening time (36-48 h) and stirring speed (1200-1500 rpm) was found to result in an increase of phase purity from 47(±2)% to 85(±2)%. Crystallinity, crystallite size, lattice parameters, and mean particle size were also optimized within the research to find the settings that achieve results compliant with FDA regulations.
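For readers unfamiliar with the Box-Behnken layout used above, the short sketch below generates the coded design matrix for three factors with standard library tools; the factor names and level values are illustrative stand-ins for the precipitation parameters, not the exact levels of the study.

```python
from itertools import combinations, product

def box_behnken(n_factors, n_center=3):
    """Generate a coded (-1, 0, +1) Box-Behnken design matrix as a list of rows."""
    runs = []
    for i, j in combinations(range(n_factors), 2):
        for a, b in product((-1, 1), repeat=2):
            row = [0] * n_factors
            row[i], row[j] = a, b
            runs.append(row)
    runs.extend([[0] * n_factors for _ in range(n_center)])
    return runs

# Hypothetical coded levels for the three precipitation factors
levels = {
    "reaction_temperature_C": (30, 45, 60),
    "ripening_time_h":        (24, 36, 48),
    "stirring_speed_rpm":     (900, 1200, 1500),
}
names = list(levels)
for run in box_behnken(3):
    print({n: levels[n][c + 1] for n, c in zip(names, run)})
```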
Cui, Xiang-Long; Xu, Bing; Sun, Fei; Dai, Sheng-Yun; Shi, Xin-Yuan; Qiao, Yan-Jiang
2017-03-01
In this paper, under the guidance of the quality by design (QbD) concept, a control strategy for the high shear wet granulation process of the ginkgo leaf tablet based on the design space was established to improve process controllability and product quality consistency. The median granule size (D50) and bulk density (Da) of granules were identified as critical quality attributes (CQAs), and potential critical process parameters (pCPPs) were determined by failure modes and effects analysis (FMEA). The Plackett-Burman experimental design was used to screen the pCPPs, and the results demonstrated that the binder amount, the wet massing time and the wet mixing impeller speed were critical process parameters (CPPs). The design space of the high shear wet granulation process was developed within the pCPP ranges based on the Box-Behnken design and quadratic polynomial regression models. ANOVA showed that the P-values of the models were less than 0.05 and the P-values of the lack-of-fit tests were more than 0.1, indicating that the relationship between CQAs and CPPs could be well described by the mathematical models. D50 could be controlled within 170 to 500 μm, and the bulk density could be controlled within 0.30 to 0.44 g·cm⁻³ by using any CPP combination within the scope of the design space. Besides, granules produced by process parameters within the design space region could also meet the requirement of tensile strength of the ginkgo leaf tablet. Copyright© by the Chinese Pharmaceutical Association.
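A hedged sketch of the modelling step described above: fitting a quadratic polynomial response-surface model to coded DoE data with ordinary least squares and scanning the factor space for settings whose predicted D50 stays in the 170-500 μm window. The response values are invented for illustration; they are not the study's data.

```python
import numpy as np

def quadratic_features(X):
    """Full quadratic model terms: 1, x_i, x_i*x_j (i<j), x_i**2."""
    n, k = X.shape
    cols = [np.ones(n)] + [X[:, i] for i in range(k)]
    cols += [X[:, i] * X[:, j] for i in range(k) for j in range(i + 1, k)]
    cols += [X[:, i] ** 2 for i in range(k)]
    return np.column_stack(cols)

# Hypothetical coded DoE runs: binder amount, wet massing time, impeller speed
X = np.array([[-1, -1, 0], [1, -1, 0], [-1, 1, 0], [1, 1, 0],
              [-1, 0, -1], [1, 0, -1], [-1, 0, 1], [1, 0, 1],
              [0, -1, -1], [0, 1, -1], [0, -1, 1], [0, 1, 1],
              [0, 0, 0], [0, 0, 0], [0, 0, 0]], dtype=float)
d50 = np.array([180, 320, 240, 480, 200, 350, 260, 500,
                220, 300, 280, 460, 330, 340, 335], dtype=float)  # invented responses, um

beta, *_ = np.linalg.lstsq(quadratic_features(X), d50, rcond=None)

# Scan the coded factor space and keep settings predicted inside the 170-500 um window
grid = np.linspace(-1, 1, 21)
ok = [(a, b, c) for a in grid for b in grid for c in grid
      if 170 <= quadratic_features(np.array([[a, b, c]]))[0] @ beta <= 500]
print(f"{len(ok)} of {21**3} grid points fall inside the (illustrative) design space")
```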
Improving mixing efficiency of a polymer micromixer by use of a plastic shim divider
NASA Astrophysics Data System (ADS)
Li, Lei; Lee, L. James; Castro, Jose M.; Yi, Allen Y.
2010-03-01
In this paper, a critical modification to a polymer-based, affordable split-and-recombination static micromixer is described. To evaluate the improvement, both the original and the modified design were carefully investigated using an experimental setup and a numerical modeling approach. The structure of the micromixer was designed to take advantage of the process capabilities of both ultraprecision micromachining and the microinjection molding process. Specifically, the original and the modified design were numerically simulated using the commercial finite element method software ANSYS CFX to assist the redesign of the micromixers. The simulation results have shown that both designs are capable of performing mixing, while the modified design has a much improved performance. Mixing experiments with two different fluids carried out using the original and the modified mixers again showed significantly improved mixing uniformity for the latter. The measured mixing coefficient for the original design was 0.11, and for the improved design it was 0.065. The developed manufacturing process based on ultraprecision machining and microinjection molding for device fabrication has the advantages of high dimensional precision, low cost and manufacturing flexibility.
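The mixing coefficient reported above is typically computed from the spatial variation of a tracer concentration at the mixer outlet; the paper does not spell out its exact formula, so the sketch below uses one common choice (coefficient of variation) with invented sample values.

```python
import numpy as np

def mixing_coefficient(concentration):
    """Coefficient of variation of the sampled concentration field.

    One common mixing index: 0 means perfectly mixed, larger values mean
    stronger segregation. The paper does not give its exact definition, so
    this is only an illustrative choice.
    """
    c = np.asarray(concentration, dtype=float)
    return c.std() / c.mean()

# Hypothetical pixel intensities across the outlet cross-section of each mixer
original = np.array([0.32, 0.61, 0.45, 0.58, 0.38, 0.66, 0.41, 0.55])
modified = np.array([0.50, 0.53, 0.49, 0.52, 0.48, 0.51, 0.50, 0.47])
print(mixing_coefficient(original), mixing_coefficient(modified))
```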
Surface laser marking optimization using an experimental design approach
NASA Astrophysics Data System (ADS)
Brihmat-Hamadi, F.; Amara, E. H.; Lavisse, L.; Jouvard, J. M.; Cicala, E.; Kellou, H.
2017-04-01
Laser surface marking is performed on a titanium substrate using a pulsed frequency-doubled Nd:YAG laser (λ = 532 nm, τpulse = 5 ns) to process the substrate surface under normal atmospheric conditions. The aim of the work is to investigate, following experimental and statistical approaches, the correlation between the process parameters and the response variables (output), using Design of Experiments (DOE) methods: the Taguchi methodology and a response surface methodology (RSM). A design is first created using the MINITAB program, and then the laser marking process is performed according to the planned design. The response variables, surface roughness and surface reflectance, were measured for each sample and incorporated into the design matrix. The results are then analyzed and the RSM model is developed and verified for predicting the process output for the given set of process parameter values. The analysis shows that the laser beam scanning speed is the most influential operating factor, followed by the laser pumping intensity during marking, while the other factors show complex influences on the objective functions.
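As an illustration of the Taguchi side of the analysis, the sketch below computes the "smaller is better" signal-to-noise ratio typically used for a roughness response; the replicate values are invented and the formula is the standard textbook one, not necessarily the exact treatment used in the paper.

```python
import numpy as np

def sn_smaller_the_better(y):
    """Taguchi signal-to-noise ratio for a 'smaller is better' response (e.g. roughness)."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(y ** 2))

# Hypothetical roughness replicates (um) for three scanning-speed levels
roughness = {"low speed": [1.8, 1.9], "mid speed": [1.2, 1.3], "high speed": [0.9, 1.0]}
for level, reps in roughness.items():
    print(level, round(sn_smaller_the_better(reps), 2), "dB")
```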
NASA Astrophysics Data System (ADS)
Nuh, M. Z.; Nasir, N. F.
2017-08-01
Biodiesel is a fuel comprised of mono-alkyl esters of long chain fatty acids derived from renewable lipid feedstocks, such as vegetable oil and animal fat. Biodiesel production is a complex process which needs systematic design and optimization. However, there has been no case study using the process system engineering (PSE) elements of superstructure optimization of the batch process, which involves complex problems and uses mixed-integer nonlinear programming (MINLP). PSE offers a solution to complex engineering systems by enabling the use of viable tools and techniques to better manage and comprehend the complexity of the system. This study aims to apply PSE tools for the simulation and optimization of the biodiesel process and to develop mathematical models for the components of the plant for cases A, B and C by using published kinetic data. Secondly, it determines the economic analysis for biodiesel production, focusing on a heterogeneous catalyst. Finally, the objective of this study is to develop the superstructure for biodiesel production using a heterogeneous catalyst. The mathematical models are developed from the superstructure, and the resulting mixed-integer nonlinear model is solved and the economic analysis estimated using MATLAB software. The optimization with the objective function of minimizing the annual production cost of the batch process gives, for case C, a cost of 23.2587 million USD. Overall, the implementation of process system engineering (PSE) has optimized the modelling, design and cost estimation. Optimizing the process resolves the complexity of batch production and processing of biodiesel.
The Specific Features of design and process engineering in branch of industrial enterprise
NASA Astrophysics Data System (ADS)
Sosedko, V. V.; Yanishevskaya, A. G.
2017-06-01
Production output at an industrial enterprise is organized through well-established working mechanisms at each stage of the product's life cycle, from initial design documentation through the product itself to its disposal. The topic of the article is a mathematical model of the design and process engineering system in a branch of an industrial enterprise, statistical processing of the estimated results of implementing the developed mathematical model in the branch, and a demonstration of the advantages of its application at the enterprise. During the creation of the model, the flows of information, orders, parts, and modules among groups of divisions in the enterprise branch were classified. Proceeding from the analysis of division activity, data flows, parts, and documents, a state graph of design and process engineering was constructed, its transitions were described, and coefficients were assigned. For each state of the constructed state graph, the corresponding limiting state probabilities were defined, and Kolmogorov's equations were derived. By integrating the system of Kolmogorov equations, the probability that the specified divisions and production are active is obtained as a function of time at each instant. On the basis of the developed mathematical model of a unified system of design, process engineering, and manufacture, and of the state graph, the authors carried out statistical processing of the results of applying the model, and the advantage of its application at the enterprise is shown. Studies of the loading probability of the branch's services and of third-party contractors (orders received from the branch within a month) were conducted. The developed mathematical model can be applied to determine the probability that divisions and manufacturing are in an active state as a function of time, which allows workload to be tracked across the branches of the enterprise.
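The Kolmogorov-equation step described above can be illustrated with a small continuous-time Markov chain sketch: assuming a hypothetical three-state workflow and an invented intensity matrix, the forward equations are integrated numerically to obtain the state probabilities as functions of time.

```python
import numpy as np

# Hypothetical generator (intensity) matrix Q for a 3-state design/engineering
# workflow: 0 = design, 1 = process engineering, 2 = manufacture.
# Off-diagonal entries are transition rates; each row sums to zero.
Q = np.array([[-0.6,  0.4,  0.2],
              [ 0.3, -0.5,  0.2],
              [ 0.1,  0.3, -0.4]])

p = np.array([1.0, 0.0, 0.0])   # the order starts in the design state
dt, t_end = 0.01, 30.0

# Explicit Euler integration of the Kolmogorov forward equations dp/dt = p Q
for _ in range(int(t_end / dt)):
    p = p + dt * (p @ Q)

print("state probabilities near steady state:", np.round(p, 3))
```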
Application of a neural network to simulate analysis in an optimization process
NASA Technical Reports Server (NTRS)
Rogers, James L.; Lamarsh, William J., II
1992-01-01
A new experimental software package called NETS/PROSSS aimed at reducing the computing time required to solve a complex design problem is described. The software combines a neural network for simulating the analysis program with an optimization program. The neural network is applied to approximate results of a finite element analysis program to quickly obtain a near-optimal solution. Results of the NETS/PROSSS optimization process can also be used as an initial design in a normal optimization process and make it possible to converge to an optimum solution with significantly fewer iterations.
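A minimal sketch of the surrogate-assisted idea behind NETS/PROSSS (using a radial-basis-function surrogate in place of the neural network, and a toy function in place of the finite element analysis): sample the expensive analysis, fit a cheap approximation, and search the approximation for a near-optimal starting design.

```python
import numpy as np

def expensive_analysis(x):
    """Stand-in for a finite element analysis call (assumed, for illustration)."""
    return (x[0] - 1.2) ** 2 + (x[1] + 0.4) ** 2 + 0.1 * np.sin(5 * x[0])

rng = np.random.default_rng(1)
X = rng.uniform(-2, 2, size=(40, 2))              # training designs
y = np.array([expensive_analysis(x) for x in X])  # "expensive" evaluations

# Radial-basis-function surrogate (a simple stand-in for the neural network)
def rbf(A, B, eps=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-eps * d2)

w = np.linalg.solve(rbf(X, X) + 1e-8 * np.eye(len(X)), y)

# Cheap grid search on the surrogate gives a near-optimal starting design
g = np.linspace(-2, 2, 81)
cand = np.array([[a, b] for a in g for b in g])
pred = rbf(cand, X) @ w
x_star = cand[np.argmin(pred)]
print("surrogate optimum:", x_star, "true value there:", expensive_analysis(x_star))
```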
Optimized and Automated design of Plasma Diagnostics for Additive Manufacture
NASA Astrophysics Data System (ADS)
Stuber, James; Quinley, Morgan; Melnik, Paul; Sieck, Paul; Smith, Trevor; Chun, Katherine; Woodruff, Simon
2016-10-01
Despite having mature designs, diagnostics are usually custom designed for each experiment. Most of the design can now be automated to reduce costs (engineering labor and capital cost). We present results from scripted physics modeling and parametric engineering design for common optical and mechanical components found in many plasma diagnostics, and outline the process for automated design optimization that employs scripts to communicate data from online forms through proprietary and open-source CAD and FE codes to provide a design that can be sent directly to a printer. As a demonstration of design automation, an optical beam dump, baffle and optical components are designed via an automated process and printed. Supported by DOE SBIR Grant DE-SC0011858.
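A toy example of the scripted, parametric design step: a small function turns form-style inputs into component dimensions and emits a parameter file that downstream CAD/FE scripts could consume. The sizing formula and input names are assumptions for illustration, not the project's actual rules.

```python
import json

def baffle_aperture(beam_diameter_mm, full_divergence_mrad, distance_mm, margin=1.5):
    """Aperture diameter needed at a given distance along the beam (illustrative formula)."""
    growth = distance_mm * full_divergence_mrad * 1e-3
    return margin * (beam_diameter_mm + growth)

# Hypothetical inputs as they might arrive from an online form
form = {"beam_diameter_mm": 3.0, "full_divergence_mrad": 2.0, "baffle_positions_mm": [50, 150, 300]}

params = {f"baffle_{i}_aperture_mm": round(baffle_aperture(form["beam_diameter_mm"],
                                                           form["full_divergence_mrad"], z), 2)
          for i, z in enumerate(form["baffle_positions_mm"])}
print(json.dumps(params, indent=2))   # handed to the parametric CAD/FE scripts downstream
```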
Multi-Criteria Approach in Multifunctional Building Design Process
NASA Astrophysics Data System (ADS)
Gerigk, Mateusz
2017-10-01
The paper presents a new approach to the multifunctional building design process. The publication defines problems related to the design of complex multifunctional buildings. Currently, contemporary urban areas are characterized by very intensive use of space. Today, buildings are being built bigger and contain more diverse functions to meet the needs of a large number of users within a single facility. The trends show the need for recognition of design objects within an organized structure, which must meet current design criteria. The design process in terms of the complex system is a theoretical model, which is the basis for optimizing solutions over the entire life cycle of the building. From the concept phase through the exploitation phase to the disposal phase, multipurpose spaces should guarantee aesthetics, functionality, system efficiency, system safety and environmental protection in the best possible way. The result of the analysis of the design process is presented as a theoretical model of the multifunctional structure. Recognition of the multi-criteria model in the form of a Cartesian product allows a holistic representation of the designed building to be created in the form of a graph model. The proposed network is the theoretical base that can be used in the design process of complex engineering systems. The systematic multi-criteria approach makes it possible to maintain control over the entire design process and to provide the best possible performance. With respect to current design requirements, there are no established design rules for multifunctional buildings in relation to their operating phase. Enrichment of the basic criteria with a functional flexibility criterion makes it possible to extend the exploitation phase, which brings advantages on many levels.
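The Cartesian-product view mentioned above can be sketched very simply: taking the product of design criteria and life-cycle phases yields the nodes of a small graph model, with one of many possible edge rules shown for illustration.

```python
from itertools import product

# Design criteria and life-cycle phases mentioned for multifunctional buildings
criteria = ["aesthetics", "functionality", "system efficiency", "system safety",
            "environmental protection", "functional flexibility"]
phases = ["concept", "exploitation", "disposal"]

# The Cartesian product criteria x phases gives the nodes of a simple holistic model;
# here an edge links criterion-phase pairs that share a phase (one of many possible rules).
nodes = list(product(criteria, phases))
edges = [(a, b) for a in nodes for b in nodes if a < b and a[1] == b[1]]
print(len(nodes), "nodes,", len(edges), "edges in the illustrative graph model")
```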
NASA Technical Reports Server (NTRS)
Hale, Mark A.; Craig, James I.; Mistree, Farrokh; Schrage, Daniel P.
1995-01-01
Integrated Product and Process Development (IPPD) embodies the simultaneous application of both system and quality engineering methods throughout an iterative design process. The use of IPPD results in the time-conscious, cost-saving development of engineering systems. Georgia Tech has proposed the development of an Integrated Design Engineering Simulator that will merge Integrated Product and Process Development with interdisciplinary analysis techniques and state-of-the-art computational technologies. To implement IPPD, a Decision-Based Design perspective is encapsulated in an approach that focuses on the role of the human designer in product development. The approach has two parts and is outlined in this paper. First, an architecture, called DREAMS, is being developed that facilitates design from a decision-based perspective. Second, a supporting computing infrastructure, called IMAGE, is being designed. The current status of development is given and future directions are outlined.
Optimization of a Lunar Pallet Lander Reinforcement Structure Using a Genetic Algorithm
NASA Technical Reports Server (NTRS)
Burt, Adam O.; Hull, Patrick V.
2014-01-01
This paper presents a design automation process using optimization via a genetic algorithm to design the conceptual structure of a Lunar Pallet Lander. The goal is to determine a design that will have the primary natural frequencies at or above a target value as well as minimize the total mass. Several iterations of the process are presented. First, a concept optimization is performed to determine what class of structure would produce suitable candidate designs. From this a stiffened sheet metal approach was selected leading to optimization of beam placement through generating a two-dimensional mesh and varying the physical location of reinforcing beams. Finally, the design space is reformulated as a binary problem using 1-dimensional beam elements to truncate the design space to allow faster convergence and additional mechanical failure criteria to be included in the optimization responses. Results are presented for each design space configuration. The final flight design was derived from these results.
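A compact sketch of the genetic-algorithm idea applied to a binary stiffener-placement problem; the frequency and mass expressions are toy surrogates standing in for the finite element responses, so the numbers are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(2)
n_sites, pop_size, n_gen = 20, 60, 80

def fitness(x, f_target=50.0):
    """Toy objective: penalized mass for a binary stiffener layout (illustration only).

    Frequency rises with the number and spread of stiffeners; mass rises with count.
    A quadratic penalty is applied when the frequency falls below the target.
    """
    freq = 20.0 + 2.5 * x.sum() + 5.0 * x.std()
    mass = 100.0 + 3.0 * x.sum()
    return mass + 50.0 * max(0.0, f_target - freq) ** 2

pop = rng.integers(0, 2, size=(pop_size, n_sites))
for _ in range(n_gen):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[: pop_size // 2]]          # truncation selection
    cut = rng.integers(1, n_sites, size=pop_size // 2)
    children = np.array([np.concatenate((parents[i % len(parents)][:c],
                                         parents[(i + 1) % len(parents)][c:]))
                         for i, c in enumerate(cut)])            # one-point crossover
    flip = rng.random(children.shape) < 0.02                     # mutation
    children = np.where(flip, 1 - children, children)
    pop = np.vstack((parents, children))

best = pop[np.argmin([fitness(ind) for ind in pop])]
print("best layout:", best, "fitness:", round(fitness(best), 1))
```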
NASA Technical Reports Server (NTRS)
1979-01-01
The functions performed by the systems management (SM) application software are described along with the design employed to accomplish these functions. The operational sequences (OPS) control segments and the cyclic processes they control are defined. The SM specialist function control (SPEC) segments and the display controlled 'on-demand' processes that are invoked by either an OPS or SPEC control segment as a direct result of an item entry to a display are included. Each processing element in the SM application is described including an input/output table and a structured control flow diagram. The flow through the module and other information pertinent to that process and its interfaces to other processes are included.
New Design Heaters Using Tubes Finned by Deforming Cutting Method
NASA Astrophysics Data System (ADS)
Zubkov, N. N.; Nikitenko, S. M.; Nikitenko, M. S.
2017-10-01
The article describes the results of research aimed at selecting and assigning technological processing parameters for obtaining outer fins of heat-exchange tubes by the deformational cutting method, for use in a new design of industrial water-air heaters. The thermohydraulic results of comparative engineering tests of new and standard design air-heaters are presented.
Multi-Disciplinary, Multi-Fidelity Discrete Data Transfer Using Degenerate Geometry Forms
NASA Technical Reports Server (NTRS)
Olson, Erik D.
2016-01-01
In a typical multi-fidelity design process, different levels of geometric abstraction are used for different analysis methods, and transitioning from one phase of design to the next often requires a complete re-creation of the geometry. To maintain consistency between lower-order and higher-order analysis results, Vehicle Sketch Pad (OpenVSP) recently introduced the ability to generate and export several degenerate forms of the geometry, representing the type of abstraction required to perform low- to medium-order analysis for a range of aeronautical disciplines. In this research, the functionality of these degenerate models was extended, so that in addition to serving as repositories for the geometric information that is required as input to an analysis, the degenerate models can also store the results of that analysis mapped back onto the geometric nodes. At the same time, the results are also mapped indirectly onto the nodes of lower-order degenerate models using a process called aggregation, and onto higher-order models using a process called disaggregation. The mapped analysis results are available for use by any subsequent analysis in an integrated design and analysis process. A simple multi-fidelity analysis process for a single-aisle subsonic transport aircraft is used as an example case to demonstrate the value of the approach.
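The aggregation/disaggregation mapping described above can be illustrated on a one-dimensional example: panel-level results are averaged onto coarser "stick model" segments (aggregation) and interpolated back onto the fine nodes (disaggregation). The pressure distribution is invented and the mapping rules are simple stand-ins for OpenVSP's actual scheme.

```python
import numpy as np

# Hypothetical higher-order result: pressure coefficient at 12 chordwise panel nodes
x_fine = np.linspace(0.0, 1.0, 12)
cp_fine = -1.2 * np.exp(-4 * x_fine) + 0.3 * x_fine        # invented distribution

# Aggregation: collapse panel-level data onto 4 coarse "stick model" segments by averaging
edges = np.linspace(0.0, 1.0, 5)
seg = np.digitize(x_fine, edges[1:-1])
cp_coarse = np.array([cp_fine[seg == k].mean() for k in range(4)])

# Disaggregation: map the coarse values back onto fine nodes by interpolation
centers = 0.5 * (edges[:-1] + edges[1:])
cp_back = np.interp(x_fine, centers, cp_coarse)
print(np.round(cp_coarse, 3), np.round(cp_back, 3))
```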
The methodology of database design in organization management systems
NASA Astrophysics Data System (ADS)
Chudinov, I. L.; Osipova, V. V.; Bobrova, Y. V.
2017-01-01
The paper describes a unified methodology of database design for management information systems. Designing the conceptual information model for the domain area is the most important and labor-intensive stage in database design. Based on the proposed integrated approach to designing the conceptual information model, the main principles of developing relational databases are provided and users' information needs are considered. According to the methodology, the process of designing the conceptual information model includes three basic stages, which are defined in detail. Finally, the article describes the process of applying the results of the analysis of users' information needs and the rationale for the use of classifiers.
Prediction of Cutting Force in Turning Process-an Experimental Approach
NASA Astrophysics Data System (ADS)
Thangarasu, S. K.; Shankar, S.; Thomas, A. Tony; Sridhar, G.
2018-02-01
This paper deals with the prediction of cutting forces in a turning process. The turning process with an advanced cutting tool has several advantages over grinding, such as short cycle time, process flexibility, comparable surface roughness, high material removal rate and fewer environmental problems without the use of cutting fluid. In this work, a full-bridge dynamometer has been used to measure the cutting forces on a mild steel workpiece with a cemented carbide insert tool for different combinations of cutting speed, feed rate and depth of cut. The experiments are planned based on a Taguchi design, and the measured cutting forces were compared with the predicted forces in order to validate the feasibility of the proposed design. The percentage contribution of each process parameter was analyzed using Analysis of Variance (ANOVA). Both the experimental results taken from the lathe tool dynamometer and those from the designed full-bridge dynamometer were analyzed using the Taguchi design of experiments and Analysis of Variance.
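A sketch of the ANOVA percentage-contribution calculation for a Taguchi-style orthogonal array; the L9-style run matrix and force values are invented for illustration, not the paper's measurements.

```python
import numpy as np

# Hypothetical L9-style results: first three columns are factor levels (0,1,2),
# last column is the measured cutting force (N)
# factors: cutting speed, feed rate, depth of cut
runs = np.array([[0, 0, 0, 210], [0, 1, 1, 340], [0, 2, 2, 470],
                 [1, 0, 1, 300], [1, 1, 2, 430], [1, 2, 0, 320],
                 [2, 0, 2, 390], [2, 1, 0, 280], [2, 2, 1, 400]], dtype=float)
y = runs[:, -1]
grand = y.mean()
ss_total = ((y - grand) ** 2).sum()

for f, name in enumerate(["cutting speed", "feed rate", "depth of cut"]):
    # sum of squares for one factor: 3 runs per level in an L9 array
    ss = sum(3 * (y[runs[:, f] == lvl].mean() - grand) ** 2 for lvl in (0, 1, 2))
    print(f"{name}: {100 * ss / ss_total:.1f}% contribution")
```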
School Climate of Educational Institutions: Design and Validation of a Diagnostic Scale
ERIC Educational Resources Information Center
Becerra, Sandra
2016-01-01
School climate is recognized as a relevant factor for the improvement of educative processes, favoring the administrative processes and optimum school performance. The present article is the result of a quantitative research model which had the objective of psychometrically designing and validating a scale to diagnose the organizational climate of…
ERIC Educational Resources Information Center
Bazler, Judith A.; Van Sickle, Meta; Simonis, Doris; Graybill, Letty; Sorenson, Nancy; Brounstein, Erica
2014-01-01
This paper reflects upon the development, design, and results of a questionnaire distributed to professors of science education concerning the processes involved in a national accreditation of teacher education programs in science. After a pilot study, five professors/administrators from public and private institutions designed a questionnaire and…
Effects of Thinking Style on Design Strategies: Using Bridge Construction Simulation Programs
ERIC Educational Resources Information Center
Sun, Chuen-Tsai; Wang, Dai-Yi; Chang, Yu-Yeh
2013-01-01
Computer simulation users can freely control operational factors and simulation results, repeat processes, make changes, and learn from simulation environment feedback. The focus of this paper is on simulation-based design tools and their effects on student learning processes in a group of 101 Taiwanese senior high school students. Participants…
The Study Review Process. WWC Process Brief
ERIC Educational Resources Information Center
What Works Clearinghouse, 2017
2017-01-01
Many studies of education interventions make claims about impacts on students' outcomes. Some studies have designs that enable readers to make causal inferences about the effects of an intervention but others have designs that do not permit these types of conclusions. To help policymakers, practitioners, and others make sense of study results, the…
Landscape Design Process of Lakewood Nava Park BSD City Based on Smart Growth Concept
NASA Astrophysics Data System (ADS)
Islami, M. Z.; Kaswanto, R. L.
2017-10-01
A comfortable and green housing area is a must for the people who live in a city. The rapid development of a city causes a greater need for land. This problem occurs alongside global environmental problems such as a growing population, pollution, excessive exploitation of resources, and a decline in the ethics of land use. The design of Lakewood Nava Park BSD City prioritizes a pedestrian-friendly and walkable environment to address those problems. Lakewood Nava Park is a landscape design project conducted by the landscape consultant company Sheils Flynn Asia. The concept of Smart Growth is used as a recommendation for the Lakewood Nava Park design. Smart Growth is a city planning and transportation theory which develops a city into a walkable city. The method used in this research is a comparison between the landscape design process and Booth's theory, together with an analysis of the ten principles of the Smart Growth concept in the project. Generally, the comparison between the design process and Booth's theory resulted in only slight differences in terminology and in the separation of phases. The analysis shows that around 70% of the Smart Growth concept has been applied, and the remaining 30% was applied after the design was built. By using the Smart Growth principles, the purpose of the Lakewood Nava Park design can be fulfilled well.
Fundamental Fractal Antenna Design Process
NASA Astrophysics Data System (ADS)
Zhu, L. P.; Kim, T. C.; Kakas, G. D.
2017-12-01
Antenna designers are always looking for new ideas to push the envelope for new antennas, using a smaller volume while striving for wider bandwidth and higher antenna gain. One proposed method of increasing bandwidth or shrinking antenna size is the use of fractal geometry, which gives rise to fractal antennas. Fractals are shapes whose structure looks the same whether one zooms in or zooms out. The design of a new type of antenna based on fractal geometry, utilizing Design of Experiments (DOE), is shown within the fractal antenna design process. Conformal fractal antenna designs are investigated for antenna patterns, dimensions, and size while maintaining or improving antenna performance. The research shows an antenna designer how to establish the basic requirements of the fractal antenna through a step-by-step process, and how to optimize the antenna design with model prediction, lab measurements, and actual results from compact-range measurements of the antenna patterns.
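A small sketch of the fractal-geometry step: successive Koch iterations applied to a straight dipole arm pack a longer wire length into the same footprint, which is the intuition behind fractal miniaturization. Dimensions are arbitrary.

```python
import numpy as np

def koch_iteration(points):
    """Replace every segment of a polyline with the 4-segment Koch generator."""
    out = [points[0]]
    for p, q in zip(points[:-1], points[1:]):
        d = (q - p) / 3.0
        a, b = p + d, p + 2 * d
        rot = np.array([[0.5, -np.sqrt(3) / 2], [np.sqrt(3) / 2, 0.5]])  # +60 degrees
        peak = a + rot @ d
        out.extend([a, peak, b, q])
    return np.array(out)

# Start from a straight dipole arm and apply three fractal iterations
arm = np.array([[0.0, 0.0], [30.0, 0.0]])   # length in mm (illustrative)
for _ in range(3):
    arm = koch_iteration(arm)

length = np.sum(np.linalg.norm(np.diff(arm, axis=0), axis=1))
print(f"{len(arm)} points, wire length {length:.1f} mm in a 30 mm footprint")
```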
1989-11-01
other design tools. RESULTS OF TEST/DEMONSTRATION: Training for the Design 4D Program was conducted at USACERL. Although nearly half of the test subjects had difficulty with the prompts, their understanding of the program improved after experimenting with the commands. After training, most felt...
The initial design of LAPAN's IR micro bolometer using mission analysis process
NASA Astrophysics Data System (ADS)
Bustanul, A.; Irwan, P.; M. T., Andi; Firman, B.
2016-11-01
As a new player in the infrared (IR) sector, an uncooled, small, and lightweight IR micro bolometer has been chosen as one of the payloads for LAPAN's next micro satellite project. Driven by the desire to create our own IR micro bolometer, a mission analysis design procedure has been applied. After tracing all possible missions, Planck's and Wien's laws for a black body, the Temperature Responsivity (TR), and the sub-pixel response were utilized in order to determine the appropriate spectral radiance. The 3.8-4 μm wavelength band was selected to detect wild fires (forest fires) and active volcanoes, two major problems faced by Indonesia. In order to strengthen and broaden the result, an iteration process was used throughout. The analysis then continued by calculating the ground pixel size, pixel IFOV, swath width, and focal length. Regarding resolution, it is at least 400 m. The further procedure covered the integration of the optical design, wherein we combined the optical design software Zemax with mechanical analysis software (structure and thermal analysis), such as Nastran and Thermal Desktop / Sinda Fluint. The integration process was intended to produce a high performance optical system for our IR micro bolometer that can be used under extreme environments. The results of all those analyses, either in graphs or in measurements, show that the initial design of LAPAN's IR micro bolometer meets the determined requirements. However, it needs further evaluation (iteration). This paper describes the initial design of LAPAN's IR micro bolometer using the mission analysis process.
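Two of the sizing calculations mentioned above, sketched with textbook formulas and assumed numbers (not LAPAN's actual detector or orbit parameters): Planck's law for the 3.8-4.0 μm band radiance contrast between a hot fire pixel and ambient background, and simple geometric estimates of ground pixel size and swath.

```python
import numpy as np

H = 6.626e-34   # Planck constant, J s
C = 3.0e8       # speed of light, m/s
KB = 1.381e-23  # Boltzmann constant, J/K

def spectral_radiance(wavelength_m, temperature_k):
    """Planck's law: spectral radiance in W m^-2 sr^-1 m^-1."""
    a = 2 * H * C ** 2 / wavelength_m ** 5
    return a / (np.exp(H * C / (wavelength_m * KB * temperature_k)) - 1.0)

# Mid-wave IR band 3.8-4.0 um: a fire pixel (~600 K) vs. ambient background (~300 K)
lam = np.linspace(3.8e-6, 4.0e-6, 50)
print("fire/background radiance ratio:",
      round(np.trapz(spectral_radiance(lam, 600), lam) /
            np.trapz(spectral_radiance(lam, 300), lam), 1))

# Simple geometric sizing (assumed orbit and detector values, not LAPAN's actual figures)
altitude_m, pixel_pitch_m, focal_length_m, n_pixels = 500e3, 17e-6, 0.085, 640
gsd = altitude_m * pixel_pitch_m / focal_length_m     # ground pixel size
print(f"GSD = {gsd:.0f} m, swath = {gsd * n_pixels / 1e3:.0f} km")
```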
The effect of requirements prioritization on avionics system conceptual design
NASA Astrophysics Data System (ADS)
Lorentz, John
This dissertation will provide a detailed approach and analysis of a new collaborative requirements prioritization methodology that has been used successfully on four Coast Guard avionics acquisition and development programs valued at $400M+. A statistical representation of participant study results will be discussed and analyzed in detail. Many technically compliant projects fail to deliver levels of performance and capability that the customer desires. Some of these systems completely meet "threshold" levels of performance; however, the distribution of resources in the process devoted to the development and management of the requirements does not always represent the voice of the customer. This is especially true for technically complex projects such as modern avionics systems. A simplified facilitated process for prioritization of system requirements will be described. The collaborative prioritization process, and resulting artifacts, aids the systems engineer during early conceptual design. All requirements are not the same in terms of customer priority. While there is a tendency to have many thresholds inside of a system design, there is usually a subset of requirements and system performance that is of the utmost importance to the design. These critical capabilities and critical levels of performance typically represent the reason the system is being built. The systems engineer needs processes to identify these critical capabilities, the associated desired levels of performance, and the risks associated with the specific requirements that define the critical capability. The facilitated prioritization exercise is designed to collaboratively draw out these critical capabilities and levels of performance so they can be emphasized in system design. Developing the purpose, scheduling and process for prioritization events are key elements of systems engineering and modern project management. The benefits of early collaborative prioritization flow throughout the project schedule, resulting in greater success during system deployment and operational testing. This dissertation will discuss the data and findings from participant studies, present a literature review of systems engineering and design processes, and test the hypothesis that the prioritization process had no effect on stakeholder sentiment related to the conceptual design. In addition, the "Requirements Rationalization" process will be discussed in detail. Avionics, like many other systems, has transitioned from a discrete electronics engineering, hard engineering discipline to incorporate software engineering as a core process of the technology development cycle. As with other software-based systems, avionics now has significant soft system attributes that must be considered in the design process. The boundless opportunities that exist in software design demand prioritization to focus effort onto the critical functions that the software must provide. This has been a well documented and understood phenomenon in the software development community for many years. This dissertation will attempt to link the effect of software integrated avionics to the benefits of prioritization of requirements in the problem space and demonstrate the sociological and technical benefits of early prioritization practices.
Mechanical Design of a Performance Test Rig for the Turbine Air-Flow Task (TAFT)
NASA Technical Reports Server (NTRS)
Xenofos, George; Forbes, John; Farrow, John; Williams, Robert; Tyler, Tom; Sargent, Scott; Moharos, Jozsef
2003-01-01
To support development of the Boeing-Rocketdyne RS-84 rocket engine, a full-flow reaction turbine geometry was integrated into the NASA-MSFC turbine air-flow test facility. A mechanical design was generated which minimized the amount of new hardware while incorporating all test and instrumentation requirements. This paper provides details of the mechanical design for this Turbine Air-Flow Task (TAFT) test rig. The mechanical design process utilized for this task included the following basic stages: conceptual design, preliminary design, detailed design, design baseline (including configuration control and drawing revision), fabrication, and assembly. During the design process, many lessons were learned that should benefit future test rig design projects. Of primary importance are well-defined requirements early in the design process, a thorough detailed design package, and effective communication with both the customer and the fabrication contractors. The test rig provided steady and unsteady pressure data necessary to validate the computational fluid dynamics (CFD) code. The rig also helped characterize the turbine blade loading conditions. Test and CFD analysis results are to be presented in another JANNAF paper.
An open-loop system design for deep space signal processing applications
NASA Astrophysics Data System (ADS)
Tang, Jifei; Xia, Lanhua; Mahapatra, Rabi
2018-06-01
A novel open-loop system design with high performance is proposed for space positioning and navigation signal processing. Divided by function, the system has four modules: a bandwidth-selectable data recorder, a narrowband signal analyzer, a time-difference-of-arrival (TDOA) estimator, and an ANFIS supplement processor. A hardware-software co-design approach is taken to accelerate computing capability and improve system efficiency. Embedded with the proposed signal processing algorithms, the designed system is capable of handling tasks with high accuracy over long periods of continuous measurement. The experimental results show that the Doppler frequency tracking root mean square error during 3 h of observation is 0.0128 Hz, while the TDOA residue analysis in the correlation power spectrum is 0.1166 rad.
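The TDOA module described above is commonly built around a cross-correlation estimator; the sketch below demonstrates that idea on synthetic data with a known delay. It is a generic illustration, not the paper's algorithm.

```python
import numpy as np

fs = 1_000_000.0                       # sample rate, Hz
n = 4000
rng = np.random.default_rng(3)

# A broadband source recorded at two stations, the second delayed by 25 samples
source = rng.standard_normal(n)
true_delay = 25
rx1 = source + 0.2 * rng.standard_normal(n)
rx2 = np.concatenate((np.zeros(true_delay), source[:-true_delay])) + 0.2 * rng.standard_normal(n)

# TDOA estimate: the lag maximizing the cross-correlation of the two records
corr = np.correlate(rx2, rx1, mode="full")
lag = np.argmax(corr) - (n - 1)
print(f"estimated delay: {lag} samples = {lag / fs * 1e6:.1f} us (true {true_delay})")
```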
NASA Technical Reports Server (NTRS)
1981-01-01
Several major modifications were made to the design presented at the PDR. The frame was deleted in favor of a "frameless" design which will provide a substantially improved cell packing factor. Potential shaded cell damage resulting from operation into a short circuit can be eliminated by a change in the cell series/parallel electrical interconnect configuration. The baseline process sequence defined for the MEPSON was refined and equipment design and specification work was completed. SAMICS cost analysis work accelerated, format A's were prepared and computer simulations completed. Design work on the automated cell interconnect station was focused on bond technique selection experiments.
Nagashima, Hiroaki; Watari, Akiko; Shinoda, Yasuharu; Okamoto, Hiroshi; Takuma, Shinya
2013-12-01
This case study describes the application of Quality by Design elements to the process of culturing Chinese hamster ovary cells in the production of a monoclonal antibody. All steps in the cell culture process and all process parameters in each step were identified by using a cause-and-effect diagram. Prospective risk assessment using failure mode and effects analysis identified the following four potential critical process parameters in the production culture step: initial viable cell density, culture duration, pH, and temperature. These parameters and lot-to-lot variability in raw material were then evaluated by process characterization utilizing a design of experiments approach consisting of a face-centered central composite design integrated with a full factorial design. Process characterization was conducted using a scaled down model that had been qualified by comparison with large-scale production data. Multivariate regression analysis was used to establish statistical prediction models for performance indicators and quality attributes; with these, we constructed contour plots and conducted Monte Carlo simulation to clarify the design space. The statistical analyses, especially for raw materials, identified set point values, which were most robust with respect to the lot-to-lot variability of raw materials while keeping the product quality within the acceptance criteria. © 2013 Wiley Periodicals, Inc. and the American Pharmacists Association.
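The prospective FMEA step can be illustrated with a minimal risk-priority-number ranking; the failure modes echo the parameters named above, but all severity/occurrence/detectability scores are invented for illustration.

```python
# Prospective risk assessment in the FMEA style: rank process parameters by
# risk priority number RPN = severity x occurrence x detectability.
# Scores below are invented for illustration, not taken from the study.
failure_modes = {
    "initial viable cell density too low": (7, 5, 4),
    "culture duration drift":              (6, 4, 3),
    "pH excursion":                        (8, 3, 4),
    "temperature excursion":               (8, 3, 3),
    "raw-material lot variability":        (7, 6, 6),
}

ranked = sorted(failure_modes.items(),
                key=lambda kv: kv[1][0] * kv[1][1] * kv[1][2], reverse=True)
for mode, (s, o, d) in ranked:
    print(f"RPN {s * o * d:3d}  {mode}")
```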
Robust Design of Sheet Metal Forming Process Based on Kriging Metamodel
NASA Astrophysics Data System (ADS)
Xie, Yanmin
2011-08-01
Nowadays, sheet metal forming process design is not a trivial task due to the complex issues to be taken into account (conflicting design goals, forming of complex shapes and so on). Optimization methods have also been widely applied in sheet metal forming. Therefore, proper design methods to reduce time and costs have to be developed, mostly based on computer-aided procedures. At the same time, the existence of variations during manufacturing processes may significantly influence final product quality, rendering optimal solutions non-robust. In this paper, a small design of experiments is conducted to investigate how the stochastic behavior of noise factors affects drawing quality. The finite element software LS-DYNA is used to simulate the complex sheet metal stamping processes. The Kriging metamodel is adopted to map the relation between input process parameters and part quality. Robust design models for the sheet metal forming process integrate adaptive importance sampling with the Kriging model, in order to minimize the impact of the variations and achieve reliable process parameters. In the adaptive sampling, an improved criterion is used to provide directions in which additional training samples can be added to improve the Kriging model. Nonlinear test functions and a square stamping example (NUMISHEET'93) are employed to verify the proposed method. Final results indicate the feasibility of the proposed method for multi-response robust design.
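A bare-bones ordinary-Kriging sketch of the metamodelling step: a Gaussian correlation model is fitted to samples of a toy forming response and used to predict new points. This omits hyperparameter estimation and the adaptive importance sampling, so it is only an illustration of the Kriging predictor itself.

```python
import numpy as np

def gauss_kernel(A, B, theta=1.0):
    """Gaussian correlation used by many Kriging implementations."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-theta * d2)

def kriging_fit(X, y, theta=1.0, nugget=1e-8):
    """Simple ordinary Kriging: constant mean plus correlated residuals (sketch)."""
    R = gauss_kernel(X, X, theta) + nugget * np.eye(len(X))
    Rinv = np.linalg.inv(R)
    ones = np.ones(len(X))
    mu = (ones @ Rinv @ y) / (ones @ Rinv @ ones)
    return mu, Rinv @ (y - mu)

def kriging_predict(Xnew, X, mu, w, theta=1.0):
    return mu + gauss_kernel(Xnew, X, theta) @ w

# Toy "springback" response of two forming parameters (blank-holder force, friction)
rng = np.random.default_rng(4)
X = rng.uniform(0, 1, size=(30, 2))
y = np.sin(3 * X[:, 0]) + 0.5 * X[:, 1] ** 2

mu, w = kriging_fit(X, y)
Xtest = rng.uniform(0, 1, size=(5, 2))
print(np.round(kriging_predict(Xtest, X, mu, w), 3))                 # metamodel prediction
print(np.round(np.sin(3 * Xtest[:, 0]) + 0.5 * Xtest[:, 1] ** 2, 3))  # true toy response
```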
NASA Technical Reports Server (NTRS)
Hall, Laverne
1995-01-01
Modeling of the Multi-mission Image Processing System (MIPS) will be described as an example of the use of a modeling tool to design a distributed system that supports multiple application scenarios. This paper examines: (a) modeling tool selection, capabilities, and operation (namely NETWORK 2.5 by CACI), (b) pointers for building or constructing a model and how the MIPS model was developed, (c) the importance of benchmarking or testing the performance of equipment/subsystems being considered for incorporation into the design/architecture, (d) the essential step of model validation and/or calibration using the benchmark results, (e) sample simulation results from the MIPS model, and (f) how modeling and simulation analysis affected the MIPS design process by having a supportive and informative impact.
Revere, Debra; Dixon, Brian E.; Hills, Rebecca; Williams, Jennifer L.; Grannis, Shaun J.
2014-01-01
Introduction: Surveillance, or the systematic monitoring of disease within a population, is a cornerstone function of public health. Despite significant investment in information technologies (IT) to improve the public’s health, health care providers continue to rely on manual, spontaneous reporting processes that can result in incomplete and delayed surveillance activities. Background: Participatory design principles advocate including real users and stakeholders when designing an information system to ensure high ecological validity of the product, incorporate relevance and context into the design, reduce misconceptions designers can make due to insufficient domain expertise, and ultimately reduce barriers to adoption of the system. This paper focuses on the collaborative and informal participatory design process used to develop enhanced, IT-enabled reporting processes that leverage available electronic health records in a health information exchange to prepopulate notifiable-conditions report forms used by public health authorities. Methods: Over nine months, public health stakeholders, technical staff, and informatics researchers were engaged in a multiphase participatory design process that included public health stakeholder focus groups, investigator-engineering team meetings, public health survey and census regarding high-priority data elements, and codesign of exploratory prototypes and final form mock-ups. Findings: A number of state-mandated report fields that are not highly used or desirable for disease investigation were eliminated, which allowed engineers to repurpose form space for desired and high-priority data elements and improve the usability of the forms. Our participatory design process ensured that IT development was driven by end user expertise and needs, resulting in significant improvements to the layout and functionality of the reporting forms. Discussion: In addition to informing report form development, engaging with public health end users and stakeholders through the participatory design process provided new insights into public health workflow and allowed the team to quickly triage user requests while managing user expectations within the realm of engineering possibilities. Conclusion: Engaging public health, engineering staff, and investigators in a shared codesigning process ensured that the new forms will not only meet real-life needs but will also support development of a product that will be adopted and, ultimately, improve communicable and infectious disease reporting by clinicians to public health. PMID:25848615
Streamlining the Design-to-Build Transition with Build-Optimization Software Tools.
Oberortner, Ernst; Cheng, Jan-Fang; Hillson, Nathan J; Deutsch, Samuel
2017-03-17
Scaling-up capabilities for the design, build, and test of synthetic biology constructs holds great promise for the development of new applications in fuels, chemical production, or cellular-behavior engineering. Construct design is an essential component in this process; however, not every designed DNA sequence can be readily manufactured, even using state-of-the-art DNA synthesis methods. Current biological computer-aided design and manufacture tools (bioCAD/CAM) do not adequately consider the limitations of DNA synthesis technologies when generating their outputs. Designed sequences that violate DNA synthesis constraints may require substantial sequence redesign or lead to price-premiums and temporal delays, which adversely impact the efficiency of the DNA manufacturing process. We have developed a suite of build-optimization software tools (BOOST) to streamline the design-build transition in synthetic biology engineering workflows. BOOST incorporates knowledge of DNA synthesis success determinants into the design process to output ready-to-build sequences, preempting the need for sequence redesign. The BOOST web application is available at https://boost.jgi.doe.gov and its Application Program Interfaces (API) enable integration into automated, customized DNA design processes. The herein presented results highlight the effectiveness of BOOST in reducing DNA synthesis costs and timelines.
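To illustrate the kind of synthesis constraints such tools check (with generic thresholds, not BOOST's actual rule set or API), the sketch below screens a sequence for global GC content and long homopolymer runs.

```python
import re

def synthesis_issues(seq, gc_lo=0.25, gc_hi=0.65, max_homopolymer=8):
    """Flag simple, commonly cited synthesis constraints (illustrative thresholds only;
    real vendors and tools such as BOOST apply richer, vendor-specific rule sets)."""
    seq = seq.upper()
    issues = []
    gc = (seq.count("G") + seq.count("C")) / len(seq)
    if not gc_lo <= gc <= gc_hi:
        issues.append(f"global GC content {gc:.2f} outside [{gc_lo}, {gc_hi}]")
    run = re.search(r"(A{%d,}|C{%d,}|G{%d,}|T{%d,})" % ((max_homopolymer,) * 4), seq)
    if run:
        issues.append(f"homopolymer run {run.group(0)[:12]}... at position {run.start()}")
    return issues or ["no issues found by this simple screen"]

print(synthesis_issues("ATGC" * 50 + "A" * 12 + "GGCGGC" * 10))
```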
The role of optimization in the next generation of computer-based design tools
NASA Technical Reports Server (NTRS)
Rogan, J. Edward
1989-01-01
There is a close relationship between design optimization and the emerging new generation of computer-based tools for engineering design. With some notable exceptions, the development of these new tools has not taken full advantage of recent advances in numerical design optimization theory and practice. Recent work in the field of design process architecture has included an assessment of the impact of next-generation computer-based design tools on the design process. These results are summarized, and insights into the role of optimization in a design process based on these next-generation tools are presented. An example problem has been worked out to illustrate the application of this technique. The example problem - layout of an aircraft main landing gear - is one that is simple enough to be solved by many other techniques. Although the mathematical relationships describing the objective function and constraints for the landing gear layout problem can be written explicitly and are quite straightforward, an approximation technique has been used in the solution of this problem that can just as easily be applied to integrate supportability or producibility assessments using theory of measurement techniques into the design decision-making process.
Research on animation design of growing plant based on 3D MAX technology
NASA Astrophysics Data System (ADS)
Chen, Yineng; Fang, Kui; Bu, Weiqiong; Zhang, Xiaoling; Lei, Menglong
In view of the practical demands on the quality, imagery and degree of realism of animations of the plant growing process, this paper designs the animation based on the mechanism and regularity of plant growth, and proposes a design method based on 3D MAX technology. After repeated analysis and testing, it is concluded that modeling, rendering, animation fabrication and other key technologies are involved in the animation design process. Based on this, designers can subdivide the animation into a seed germination animation, an early plant growth animation, a catagen animation, a later growth animation and a blossom animation. This paper composites the animation of these five stages in the VP window to realize the complete 3D animation. Experimental results show that the animation can achieve rapid, visual and realistic simulation of the plant growth process.
Process wastewater treatability study for Westinghouse fluidized-bed coal gasification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Winton, S.L.; Buvinger, B.J.; Evans, J.M.
1983-11-01
In the development of a synthetic fuels facility, water usage and wastewater treatment are major areas of concern. Coal gasification processes generally produce relatively large volumes of gas condensates. These wastewaters are typically composed of a variety of suspended and dissolved organic and inorganic solids and dissolved gaseous contaminants. Fluidized-bed coal gasification (FBG) processes are no exception to this rule. The Department of Energy's Morgantown Energy Technology Center (METC), the Gas Research Institute (GRI), and the Environmental Protection Agency (EPA/IERLRTP) recognized the need for a FBG treatment program to provide process design data for FBG wastewaters during the environmental, health, and safety characterization of the Westinghouse Process Development Unit (PDU). In response to this need, METC developed conceptual designs and a program plan to obtain process design and performance data for treating wastewater from commercial-scale Westinghouse-based synfuels plants. As a result of this plan, METC, GRI, and EPA entered into a joint program to develop performance data, design parameters, conceptual designs, and cost estimates for treating wastewaters from a FBG plant. Wastewater from the Westinghouse PDU consists of process quench and gas cooling condensates which are similar to those produced by other FBG processes such as U-Gas, and entrained-bed gasification processes such as Texaco. Therefore, wastewater from this facility was selected as the basis for this study. This paper outlines the current program for developing process design and cost data for the treatment of these wastewaters.
NASA Astrophysics Data System (ADS)
Ciminelli, Caterina; Dell'Olio, Francesco; Armenise, Mario N.; Iacomacci, Francesco; Pasquali, Franca; Formaro, Roberto
2017-11-01
A fiber optic digital link for on-board data handling is modeled, designed and optimized in this paper. Design requirements and constraints relevant to the link, which is in the frame of novel on-board processing architectures, are discussed. Two possible link configurations are investigated, showing their advantages and disadvantages. An accurate mathematical model of each link component and the entire system is reported and results of link simulation based on those models are presented. Finally, some details on the optimized design are provided.
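A generic optical power-budget sketch of the kind of link-level bookkeeping such a design requires; all powers and losses are textbook-style placeholders, not the paper's model parameters.

```python
# Illustrative optical power budget for a short on-board digital fiber link
# (generic textbook-style numbers, not the paper's actual model parameters)
tx_power_dbm = -3.0          # laser/VCSEL launch power
rx_sensitivity_dbm = -18.0   # receiver sensitivity at the target bit-error rate
losses_db = {
    "connectors (2 x 0.5 dB)": 1.0,
    "splices": 0.3,
    "fiber attenuation (10 m)": 0.1,
    "ageing/temperature allowance": 1.5,
}

margin_db = tx_power_dbm - rx_sensitivity_dbm - sum(losses_db.values())
print(f"link margin = {margin_db:.1f} dB")   # positive margin means the link closes
```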
Design optimization of aircraft landing gear assembly under dynamic loading
NASA Astrophysics Data System (ADS)
Wong, Jonathan Y. B.
As development cycles and prototyping iterations begin to decrease in the aerospace industry, it is important to develop and improve practical methodologies to meet all design metrics. This research presents an efficient methodology that applies high-fidelity multi-disciplinary design optimization techniques to commercial landing gear assemblies for weight reduction, cost savings, and structural performance under dynamic loading. Specifically, a slave link subassembly was selected as the candidate to explore the feasibility of this methodology. The design optimization process utilized in this research was sectioned into three main stages: setup, optimization, and redesign. The first stage involved the creation and characterization of the models used throughout this research. The slave link assembly was modelled with a simplified landing gear test, replicating the behavior of the physical system. Through extensive review of the literature and collaboration with Safran Landing Systems, the dynamic and structural behavior of the system was characterized and defined mathematically. Once defined, the characterized behaviors for the slave link assembly were then used to conduct a Multi-Body Dynamic (MBD) analysis to determine the dynamic and structural response of the system. These responses were then utilized in a topology optimization through the use of the Equivalent Static Load Method (ESLM). The results of the optimization were interpreted and later used, in stage three, to generate improved designs in terms of weight, cost, and structural performance under dynamic loading. The optimized designs were then validated using the model created for the MBD analysis of the baseline design. The design generation process employed two different approaches for post-processing the topology results produced. The first approach implemented a close replication of the topology results, resulting in a design with an overall peak stress increase of 74%, weight savings of 67%, and no apparent cost savings due to complex features present in the design. The second design approach focused on realizing reciprocal benefits in cost and weight savings. As a result, this design was able to achieve an overall peak stress increase of 6%, and weight and cost savings of 36% and 60%, respectively.
Chang, Joonho; Moon, Seung Ki; Jung, Kihyo; Kim, Wonmo; Parkinson, Matthew; Freivalds, Andris; Simpson, Timothy W; Baik, Seon Pill
2018-05-01
This study presents usability considerations and solutions for the design of glasses-type wearable computer displays and examines their effectiveness in a case study. Design countermeasures were investigated through a four-step design process: (1) preliminary design analysis; (2) design idea generation; (3) final design selection; and (4) virtual fitting trial. Three design interventions were devised from the design process: (1) weight balancing to reduce the pressure concentrated on the nose, (2) compliant temples to accommodate diverse head sizes and (3) a hanger mechanism to help spectacle users hang the wearable display on their eyeglasses. To investigate their effectiveness, in the case study, the novel 3D glasses adopting the three interventions were compared with two existing 3D glasses in terms of neck muscle fatigue and subjective discomfort rating. While neck muscle fatigue was not significantly different among the three glasses (p = 0.467), the novel glasses had significantly smaller discomfort ratings (p = 0.009). Relevance to Industry: A four-step design process identified usability considerations and solutions for the design of glasses-type wearable computer displays. A novel pair of 3D glasses was proposed through the process and its effectiveness was validated. The results identify design considerations and opportunities relevant to the emerging wearable display industry.
CHAM: weak signals detection through a new multivariate algorithm for process control
NASA Astrophysics Data System (ADS)
Bergeret, François; Soual, Carole; Le Gratiet, B.
2016-10-01
Derivative technologies based on core CMOS processes are significantly aggressive in terms of design rules and process control requirements. The process control plan is derived from Process Assumption (PA) calculations, which result in design rules based on known process variability capabilities, taking into account enough margin to be safe not only for yield but especially for reliability. Even though process assumptions are calculated with a 4-sigma margin on known process capability, efficient and competitive designs are challenging the process, especially for derivative technologies at the 40 and 28 nm nodes. For wafer fab process control, PAs are translated into monovariate control charts (layer1 CD, layer2 CD, layer2-to-layer1 overlay, layer3 CD, etc.) with appropriate specifications and control limits, which all together secure the silicon. This has so far worked well, but such a system is not really sensitive to weak signals coming from interactions of multiple key parameters (high layer2 CD combined with high layer3 CD, as an example). CHAM is a software tool using an advanced statistical algorithm specifically designed to detect small signals, especially when there are many parameters to control and when the parameters can interact to create yield issues. In this presentation we first present the CHAM algorithm, then a case study on critical dimensions with its results, and we conclude on future work. This partnership between Ippon and STM is part of E450LMDAP, a European project dedicated to metrology and lithography development for future technology nodes, especially the 10 nm node.
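The multivariate weak-signal idea can be illustrated with a classical Hotelling T² statistic (a generic choice, not CHAM's proprietary algorithm): a lot whose parameters are individually within their univariate limits can still be flagged when their joint behaviour is unusual.

```python
import numpy as np

rng = np.random.default_rng(5)

# Phase I: in-control history of three correlated parameters (CD1, CD2, overlay), nm
cov = np.array([[1.0, 0.6, 0.2], [0.6, 1.0, 0.3], [0.2, 0.3, 1.0]])
history = rng.multivariate_normal([40.0, 38.0, 0.0], cov, size=300)
mean, S_inv = history.mean(axis=0), np.linalg.inv(np.cov(history, rowvar=False))

def hotelling_t2(x):
    """Multivariate distance of one lot from the in-control centroid."""
    d = x - mean
    return d @ S_inv @ d

# A lot where each parameter is individually near its 2-sigma limit but the combination
# is jointly unusual: univariate charts would pass it, the multivariate statistic flags it.
lot = np.array([42.0, 36.2, 0.0])
ucl = 14.2   # illustrative upper control limit (would come from an F/chi-square quantile)
t2 = hotelling_t2(lot)
print(f"T2 = {t2:.1f} -> {'out of control' if t2 > ucl else 'in control'}")
```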
Modular and Adaptive Control of Sound Processing
NASA Astrophysics Data System (ADS)
van Nort, Douglas
This dissertation presents research into the creation of systems for the control of sound synthesis and processing. The focus differs from much of the work related to digital musical instrument design, which has rightly concentrated on the physicality of the instrument and interface: sensor design, choice of controller, feedback to performer and so on. Often times a particular choice of sound processing is made, and the resultant parameters from the physical interface are conditioned and mapped to the available sound parameters in an exploratory fashion. The main goal of the work presented here is to demonstrate the importance of the space that lies between physical interface design and the choice of sound manipulation algorithm, and to present a new framework for instrument design that strongly considers this essential part of the design process. In particular, this research takes the viewpoint that instrument designs should be considered in a musical control context, and that both control and sound dynamics must be considered in tandem. In order to achieve this holistic approach, the work presented in this dissertation assumes complementary points of view. Instrument design is first seen as a function of musical context, focusing on electroacoustic music and leading to a view on gesture that relates perceived musical intent to the dynamics of an instrumental system. The important design concept of mapping is then discussed from a theoretical and conceptual point of view, relating perceptual, systems and mathematically-oriented ways of examining the subject. This theoretical framework gives rise to a mapping design space, functional analysis of pertinent existing literature, implementations of mapping tools, instrumental control designs and several perceptual studies that explore the influence of mapping structure. Each of these reflect a high-level approach in which control structures are imposed on top of a high-dimensional space of control and sound synthesis parameters. In this view, desired gestural dynamics and sonic response are achieved through modular construction of mapping layers that are themselves subject to parametric control. Complementing this view of the design process, the work concludes with an approach in which the creation of gestural control/sound dynamics are considered in the low-level of the underlying sound model. The result is an adaptive system that is specialized to noise-based transformations that are particularly relevant in an electroacoustic music context. Taken together, these different approaches to design and evaluation result in a unified framework for creation of an instrumental system. The key point is that this framework addresses the influence that mapping structure and control dynamics have on the perceived feel of the instrument. Each of the results illustrate this using either top-down or bottom-up approaches that consider musical control context, thereby pointing to the greater potential for refined sonic articulation that can be had by combining them in the design process.
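A minimal sketch of one common mapping-layer technique consistent with the discussion above: a low-dimensional gesture drives many synthesis parameters by distance-weighted interpolation between presets. Parameter names and preset values are invented for illustration.

```python
import numpy as np

# A minimal "mapping layer" sketch: a low-dimensional gesture (x, y on a control surface)
# drives several synthesis parameters by distance-weighted interpolation between presets.
presets = {
    (0.0, 0.0): np.array([200.0, 0.1, 0.2]),   # [filter cutoff, grain size, feedback]
    (1.0, 0.0): np.array([2000.0, 0.5, 0.1]),
    (0.0, 1.0): np.array([800.0, 0.05, 0.7]),
    (1.0, 1.0): np.array([5000.0, 0.9, 0.4]),
}

def map_gesture(x, y, power=2.0):
    pts = np.array(list(presets))
    vals = np.array(list(presets.values()))
    d = np.linalg.norm(pts - np.array([x, y]), axis=1)
    if np.any(d < 1e-9):                       # exactly on a preset: return it directly
        return vals[np.argmin(d)]
    w = 1.0 / d ** power                       # inverse-distance weights
    return (w[:, None] * vals).sum(0) / w.sum()

print(np.round(map_gesture(0.25, 0.7), 3))
```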
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aden, A.; Ruth, M.; Ibsen, K.
This report is an update of NREL's ongoing process design and economic analyses of processes related to developing ethanol from lignocellulosic feedstocks. The U.S. Department of Energy (DOE) is promoting the development of ethanol from lignocellulosic feedstocks as an alternative to conventional petroleum-based transportation fuels. DOE funds both fundamental and applied research in this area and needs a method for predicting cost benefits of many research proposals. To that end, the National Renewable Energy Laboratory (NREL) has modeled many potential process designs and estimated the economics of each process during the last 20 years. This report is an update of the ongoing process design and economic analyses at NREL. We envision updating this process design report at regular intervals; the purpose being to ensure that the process design incorporates all new data from NREL research, DOE funded research and other sources, and that the equipment costs are reasonable and consistent with good engineering practice for plants of this type. For the non-research areas this means using equipment and process approaches as they are currently used in industrial applications. For the last report, published in 1999, NREL performed a complete review and update of the process design and economic model for the biomass-to-ethanol process utilizing co-current dilute acid prehydrolysis with simultaneous saccharification (enzymatic) and co-fermentation. The process design included the core technologies being researched by the DOE: prehydrolysis, simultaneous saccharification and co-fermentation, and cellulase enzyme production. In addition, all ancillary areas--feed handling, product recovery and purification, wastewater treatment (WWT), lignin combustor and boiler-turbogenerator, and utilities--were included. NREL engaged Delta-T Corporation (Delta-T) to assist in the process design evaluation, the process equipment costing, and overall plant integration. The process design and costing for the lignin combustor and boiler turbogenerator was reviewed by Reaction Engineering Inc. (REI) and Merrick & Company reviewed the wastewater treatment. Since then, NREL has engaged Harris Group (Harris) to perform vendor testing, process design, and costing of critical equipment identified during earlier work. This included solid/liquid separation and pretreatment reactor design and costing. Corn stover handling was also investigated to support DOE's decision to focus on corn stover as a feedstock for lignocellulosic ethanol. Working with Harris, process design and costing for these areas were improved through vendor designs, costing, and vendor testing in some cases. In addition to this work, enzyme costs were adjusted to reflect collaborative work between NREL and enzyme manufacturers (Genencor International and Novozymes Biotech) to provide a delivered enzyme for lignocellulosic feedstocks. This report is the culmination of our work and represents an updated process design and cost basis for the process using a corn stover feedstock. The process design and economic model are useful for predicting the cost benefits of proposed research. Proposed research results can be translated into modifications of the process design, and the economic impact can be assessed. This allows DOE, NREL, and other researchers to set priorities on future research with an understanding of potential reductions to the ethanol production cost. To be economically viable, ethanol production costs must be below market values for ethanol.
DOE has chosen a target ethanol selling price of $1.07 per gallon as a goal for 2010. The conceptual design and costs presented here are based on a 2010 plant start-up date. The key research targets required to achieve this design and the $1.07 value are discussed in the report.
NASA Astrophysics Data System (ADS)
Williams, Christopher Bryant
Low-density cellular materials, metallic bodies with gaseous voids, are a unique class of materials that are characterized by their high strength, low mass, good energy absorption characteristics, and good thermal and acoustic insulation properties. In an effort to take advantage of this entire suite of positive mechanical traits, designers are tailoring the cellular mesostructure for multiple design objectives. Unfortunately, existing cellular material manufacturing technologies restrict the design space, as they are limited to certain part mesostructures, material types, and macrostructures. The opportunity to improve the design of existing products, and the ability to reap the benefits of cellular materials in new applications, are the driving forces behind this research. As such, the primary research goal of this work is to design, embody, and analyze a manufacturing process that provides a designer the ability to specify the material type, material composition, void morphology, and mesostructure topology for any conceivable part geometry. The accomplishment of this goal is achieved in three phases of research: (1) Design---Following a systematic design process and a rigorous selection exercise, a layer-based additive manufacturing process is designed that is capable of meeting the unique requirements of fabricating cellular material geometry. Specifically, metal parts of designed mesostructure are fabricated via three-dimensional printing of metal oxide ceramic powder followed by post-processing in a reducing atmosphere. (2) Embodiment---The primary research hypothesis is verified through the use of the designed manufacturing process chain to successfully realize metal parts of designed mesostructure. (3) Modeling & Evaluation---The designed manufacturing process is modeled in this final research phase so as to increase understanding of experimental results and to establish a foundation for future analytical modeling research. In addition to an analysis of the physics of primitive creation and an investigation of failure modes during the layered fabrication of thin trusses, build time and cost models are presented in order to verify claims of the process's economic benefits. The main contribution of this research is the embodiment of a novel manner for realizing metal parts of designed mesostructure.
Comparison of photo-Fenton, O3/H2O2/UV and photocatalytic processes for the treatment of gray water.
Hassanshahi, Nahid; Karimi-Jashni, Ayoub
2018-06-21
This research was carried out to compare and optimize the gray water treatment performance of the photo-Fenton, photocatalysis and ozone/H2O2/UV processes. Experimental design and optimization were carried out using the Central Composite Design of Response Surface Methodology. The experiments showed that the most influential factor in the photo-Fenton process was the H2O2/Fe2+ ratio; in the ozone/H2O2/UV experiments the influential factors were O3 concentration, H2O2 concentration, reaction time and pH; and in the photocatalytic process they were TiO2 concentration, pH and reaction time. The highest COD removals in the photo-Fenton, ozone/H2O2/UV and photocatalytic processes were 90%, 92% and 55%, respectively. The results were analyzed with Design-Expert software, and second-order models were proposed for all three processes to simulate the COD removal efficiency. In conclusion, the ozone/H2O2/UV process is recommended for the treatment of gray water, since it was able to remove both COD and turbidity by 92% and 93%, respectively.
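As a hedged illustration of the response-surface step, the sketch below fits a full second-order model to invented central-composite data for three coded factors; the study itself used Design-Expert and its own measured responses.

```python
# Hedged sketch: fitting a second-order response-surface model of COD removal to
# central-composite-design data. Factor levels and responses are invented.
import numpy as np

# Hypothetical coded factors for a photocatalytic run: TiO2 dose, pH, reaction time
X = np.array([
    [-1, -1, -1], [1, -1, -1], [-1, 1, -1], [1, 1, -1],
    [-1, -1, 1],  [1, -1, 1],  [-1, 1, 1],  [1, 1, 1],
    [0, 0, 0],    [1.68, 0, 0], [-1.68, 0, 0],
    [0, 1.68, 0], [0, -1.68, 0], [0, 0, 1.68], [0, 0, -1.68],
])
y = np.array([30, 38, 35, 44, 36, 45, 41, 52, 48, 50, 28, 42, 33, 47, 31], float)

def quadratic_terms(x):
    """Full second-order model: intercept, linear, two-way interactions, squares."""
    a, b, c = x
    return [1, a, b, c, a * b, a * c, b * c, a * a, b * b, c * c]

A = np.array([quadratic_terms(x) for x in X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Predicted removal at the centre point and at a candidate operating point
print(A[8] @ coef, np.array(quadratic_terms([0.5, 0.3, 0.8])) @ coef)
```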
Transforming nanomedicine manufacturing toward Quality by Design and microfluidics.
Colombo, Stefano; Beck-Broichsitter, Moritz; Bøtker, Johan Peter; Malmsten, Martin; Rantanen, Jukka; Bohr, Adam
2018-04-05
Nanopharmaceuticals aim at translating the unique features of nano-scale materials into therapeutic products and consequently their development relies critically on the progression in manufacturing technology to allow scalable processes complying with process economy and quality assurance. The relatively high failure rate in translational nanopharmaceutical research and development, with respect to new products on the market, is at least partly due to immature bottom-up manufacturing development and resulting sub-optimal control of quality attributes in nanopharmaceuticals. Recently, quality-oriented manufacturing of pharmaceuticals has undergone an unprecedented change toward process and product development interaction. In this context, Quality by Design (QbD) aims to integrate product and process development resulting in an increased number of product applications to regulatory agencies and stronger proprietary defense strategies of process-based products. Although QbD can be applied to essentially any production approach, microfluidic production offers particular opportunities for QbD-based manufacturing of nanopharmaceuticals. Microfluidics provides unique design flexibility, process control and parameter predictability, and also offers ample opportunities for modular production setups, allowing process feedback for continuously operating production and process control. The present review aims at outlining emerging opportunities in the synergistic implementation of QbD strategies and microfluidic production in contemporary development and manufacturing of nanopharmaceuticals. In doing so, aspects of design and development, but also technology management, are reviewed, as is the strategic role of these tools for aligning nanopharmaceutical innovation, development, and advanced industrialization in the broader pharmaceutical field.
NASA Astrophysics Data System (ADS)
Holzmann, Hubert; Massmann, Carolina
2015-04-01
Many hydrological model types have been developed during the past decades. Most of them use a fixed design to describe the variable hydrological processes, assumed to be representative for the whole range of spatial and temporal scales. This assumption is questionable, as it is evident that runoff formation is driven by dominant processes which can vary among basins. Furthermore, model application and the interpretation of results are limited by the data available to identify the particular sub-processes, since most models are calibrated and validated only with discharge data. It can therefore be hypothesized that simpler model designs, focusing only on the dominant processes, can achieve comparable results with the benefit of fewer parameters. In the current contribution a modular model concept is introduced, which allows hydrological sub-processes to be included or omitted depending on catchment characteristics and data availability. Key elements of the process modules are (1) storage effects (interception, soil), (2) transfer processes (routing), (3) threshold processes (percolation, saturation overland flow) and (4) split processes (rainfall excess). Based on hydro-meteorological observations in an experimental catchment in the Slovak region of the Carpathian mountains, a comparison of several model realizations with different degrees of complexity is discussed. A special focus is given to model parameter sensitivity estimated with a Markov Chain Monte Carlo approach. Furthermore, the identification of dominant processes by means of Sobol's method is introduced. It is shown that a flexible model design - and even the simple concept - can reach performance comparable to the standard model type (HBV-type). The main benefit of the modular concept is the individual adaptation of the model structure to data and process availability and the option of a parsimonious model design.
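A minimal sketch of the modular idea, with invented parameters and structure, might look like the following: simple storage, threshold and routing modules that can be switched on or off per catchment, so that structures of different complexity can be compared against observed discharge.

```python
# Minimal sketch of the modular concept: simple process modules (storage, threshold
# percolation, linear routing, split of rainfall excess) that can be enabled or
# disabled per catchment. Parameter values and structure are illustrative only.
def run_bucket_model(precip, pet, capacity=80.0, perc_threshold=60.0,
                     perc_rate=0.2, routing_k=0.3, use_percolation=True):
    """Daily soil-moisture accounting returning simulated discharge (same units as input)."""
    soil, ground, q_sim = 0.0, 0.0, []
    for p, e in zip(precip, pet):
        soil = max(soil + p - e, 0.0)
        excess = max(soil - capacity, 0.0)           # saturation overland flow (threshold)
        soil -= excess
        perc = perc_rate * max(soil - perc_threshold, 0.0) if use_percolation else 0.0
        soil -= perc
        ground += perc
        baseflow = routing_k * ground                 # linear-reservoir routing (transfer)
        ground -= baseflow
        q_sim.append(excess + baseflow)               # split: fast + slow components
    return q_sim

# Dropping the percolation module yields a simpler structure whose performance can
# be compared against the full one, e.g. with a Nash-Sutcliffe score on observed flow.
print(run_bucket_model([10, 0, 25, 5, 0, 40, 0], [2, 2, 2, 2, 2, 2, 2])[:3])
```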
Piezoresistive Cantilever Performance—Part II: Optimization
Park, Sung-Jin; Doll, Joseph C.; Rastegar, Ali J.; Pruitt, Beth L.
2010-01-01
Piezoresistive silicon cantilevers fabricated by ion implantation are frequently used for force, displacement, and chemical sensors due to their low cost and electronic readout. However, the design of piezoresistive cantilevers is not a straightforward problem due to coupling between the design parameters, constraints, process conditions, and performance. We systematically analyzed the effect of design and process parameters on force resolution and then developed an optimization approach to improve force resolution while satisfying various design constraints using simulation results. The combined simulation and optimization approach is extensible to other doping methods beyond ion implantation in principle. The optimization results were validated by fabricating cantilevers with the optimized conditions and characterizing their performance. The measurement results demonstrate that the analytical model accurately predicts force and displacement resolution and captures the sensitivity-noise tradeoff in optimal cantilever performance. We also compared our optimization technique with existing models and demonstrated an eightfold improvement in force resolution over simplified models.
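The paper's analytical noise and sensitivity model is not reproduced here; the toy sketch below only illustrates the general pattern of minimizing a force-resolution proxy over geometry subject to a stiffness constraint, with assumed material and readout values.

```python
# Heavily simplified sketch of coupled-constraint optimisation: minimise a toy
# force-resolution proxy over cantilever length and thickness subject to a
# stiffness constraint. The noise/sensitivity model is NOT the paper's.
import numpy as np
from scipy.optimize import minimize

E = 170e9            # Si Young's modulus, Pa (assumed)
w = 30e-6            # fixed width, m (assumed)
kB_T = 1.38e-23 * 300
R, bandwidth, gauge = 5e3, 1e3, 50.0   # resistance, Hz, gauge-factor proxy (assumed)

def force_resolution(x):
    L, t = x
    stiffness = E * w * t**3 / (4 * L**3)
    sensitivity = gauge / (stiffness * L)          # toy displacement-to-signal chain
    noise = np.sqrt(4 * kB_T * R * bandwidth)      # Johnson noise only
    return noise / sensitivity

res = minimize(
    force_resolution,
    x0=[300e-6, 2e-6],
    bounds=[(100e-6, 1e-3), (0.5e-6, 10e-6)],
    constraints=[{"type": "ineq",
                  "fun": lambda x: 5.0 - E * w * x[1]**3 / (4 * x[0]**3)}],  # k <= 5 N/m
    method="SLSQP",
)
print(res.x, force_resolution(res.x))
```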
Methodology for CFD Design Analysis of National Launch System Nozzle Manifold
NASA Technical Reports Server (NTRS)
Haire, Scot L.
1993-01-01
The current design environment dictates that high technology CFD (Computational Fluid Dynamics) analysis produce quality results in a timely manner if it is to be integrated into the design process. The design methodology outlined describes the CFD analysis of an NLS (National Launch System) nozzle film cooling manifold. The objective of the analysis was to obtain a qualitative estimate for the flow distribution within the manifold. A complex, 3D, multiple zone, structured grid was generated from a 3D CAD file of the geometry. A Euler solution was computed with a fully implicit compressible flow solver. Post processing consisted of full 3D color graphics and mass averaged performance. The result was a qualitative CFD solution that provided the design team with relevant information concerning the flow distribution in and performance characteristics of the film cooling manifold within an effective time frame. Also, this design methodology was the foundation for a quick turnaround CFD analysis of the next iteration in the manifold design.
Designing a freeform optic for oblique illumination
NASA Astrophysics Data System (ADS)
Uthoff, Ross D.; Ulanch, Rachel N.; Williams, Kaitlyn E.; Ruiz Diaz, Liliana; King, Page; Koshel, R. John
2017-11-01
The Functional Freeform Fitting (F4) method is utilized to design a freeform optic for oblique illumination of Mark Rothko's Green on Blue (1956). Shown are preliminary results from an iterative freeform design process; from problem definition and specification development to surface fit, ray tracing results, and optimization. This method is applicable to both point and extended sources of various geometries.
NASA Technical Reports Server (NTRS)
Hale, Mark A.
1996-01-01
Computer applications for design have evolved rapidly over the past several decades, and significant payoffs are being achieved by organizations through reductions in design cycle times. These applications are overwhelmed by the requirements imposed during complex, open engineering systems design. Organizations are faced with a number of different methodologies, numerous legacy disciplinary tools, and a very large amount of data. Yet they are also faced with few interdisciplinary tools for design collaboration or methods for achieving the revolutionary product designs required to maintain a competitive advantage in the future. These organizations are looking for a software infrastructure that integrates current corporate design practices with newer simulation and solution techniques. Such an infrastructure must be robust to changes in both corporate needs and enabling technologies. In addition, this infrastructure must be user-friendly, modular and scalable. This need is the motivation for the research described in this dissertation. The research is focused on the development of an open computing infrastructure that facilitates product and process design. In addition, this research explicitly deals with human interactions during design through a model that focuses on the role of a designer as that of decision-maker. The research perspective here is taken from that of design as a discipline with a focus on Decision-Based Design, Theory of Languages, Information Science, and Integration Technology. Given this background, a Model of IPPD is developed and implemented along the lines of a traditional experimental procedure: with the steps of establishing context, formalizing a theory, building an apparatus, conducting an experiment, reviewing results, and providing recommendations. Based on this Model, Design Processes and Specification can be explored in a structured and implementable architecture. An architecture for exploring design called DREAMS (Developing Robust Engineering Analysis Models and Specifications) has been developed which supports the activities of both meta-design and actual design execution. This is accomplished through a systematic process which comprises the stages of Formulation, Translation, and Evaluation. During this process, elements from a Design Specification are integrated into Design Processes. In addition, a software infrastructure was developed and is called IMAGE (Intelligent Multidisciplinary Aircraft Generation Environment). This represents a virtual apparatus in the Design Experiment conducted in this research. IMAGE is an innovative architecture because it explicitly supports design-related activities. This is accomplished through a GUI-driven and Agent-based implementation of DREAMS. An HSCT design has been adopted from the Framework for Interdisciplinary Design Optimization (FIDO) and is implemented in IMAGE. This problem shows how Design Processes and Specification interact in a design system. In addition, the problem utilizes two different solution models concurrently: optimal and satisfying. The satisfying model allows for more design flexibility and allows a designer to maintain design freedom. As a result of following this experimental procedure, this infrastructure is an open system that is robust to changes in both corporate needs and computer technologies.
The development of this infrastructure leads to a number of significant intellectual contributions: 1) A new approach to implementing IPPD with the aid of a computer; 2) A formal Design Experiment; 3) A combined Process and Specification architecture that is language-based; 4) An infrastructure for exploring design; 5) An integration strategy for implementing computer resources; and 6) A seamless modeling language. The need for these contributions is emphasized by the demand by industry and government agencies for the development of these technologies.
2017-05-25
the planning process. Current US Army doctrine links conceptual planning to the Army Design Methodology and detailed planning to the Military Decision Making Process. By associating conceptual and detailed planning with doctrinal methodologies, it is easy to regard the transition as a set period... plans into detailed directives resulting in changes to the operational environment. Subject terms: Design; Army Design Methodology; Conceptual...
Multiprocessor graphics computation and display using transputers
NASA Technical Reports Server (NTRS)
Ellis, Graham K.
1988-01-01
A package of two-dimensional graphics routines was developed to run on a transputer-based parallel processing system. These routines were designed to enable applications programmers to easily generate and display results from the transputer network in a graphic format. The graphics procedures were designed for the lowest possible network communication overhead for increased performance. The routines were designed for ease of use and to present an intuitive approach to generating graphics on the transputer parallel processing system.
Micro Autonomous Systems Research: Systems Engineering Processes for Micro Autonomous Systems
2016-11-01
product family design and reconfigurable system design with recent developments in the fields of automated manufacturing and micro-autonomous... mapped to design parameters. These mappings are the mechanism by which physical product designs are formulated. Finally, manufacture of the product... design tools and manufacturing and testing the resulting design. The final products were inspected and flight tested so that their...
Emergent Aerospace Designs Using Negotiating Autonomous Agents
NASA Technical Reports Server (NTRS)
Deshmukh, Abhijit; Middelkoop, Timothy; Krothapalli, Anjaneyulu; Smith, Charles
2000-01-01
This paper presents a distributed design methodology where designs emerge as a result of the negotiations between different stakeholders in the process, such as cost, performance, reliability, etc. The proposed methodology uses autonomous agents to represent design decision makers. Each agent influences specific design parameters in order to maximize its utility. Since the design parameters depend on the aggregate demand of all the agents in the system, design agents need to negotiate with others in the market economy in order to reach an acceptable utility value. This paper addresses several interesting research issues related to distributed design architectures. First, we present a flexible framework which facilitates decomposition of the design problem. Second, we present an overview of a market mechanism for generating acceptable design configurations. Finally, we integrate learning mechanisms into the design process to reduce the computational overhead.
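As an illustration of the market mechanism (not the paper's algorithm), the sketch below lets two agents with hypothetical utilities bid for a shared, normalized design parameter while a simple price update drives their aggregate demand toward agreement.

```python
# Illustrative sketch (not the paper's algorithm): design agents for cost and
# performance demand values of a shared design parameter; a simple price update
# (tatonnement) drives the aggregate demand toward a normalised target.
def negotiate(agents, price=1.0, step=0.05, iters=200):
    """Each agent returns its preferred parameter value given the current price."""
    for _ in range(iters):
        demands = [agent(price) for agent in agents]
        imbalance = sum(demands) / len(demands) - 1.0   # deviation from normalised target
        price += step * imbalance                        # raise the price if demand is high
    return price, demands

# Hypothetical utilities: the performance agent wants a large value unless it is
# expensive; the cost agent wants a small one. Values are normalised, not physical.
performance_agent = lambda p: max(2.0 - 0.8 * p, 0.0)
cost_agent = lambda p: max(0.6 - 0.1 * p, 0.0)

price, demands = negotiate([performance_agent, cost_agent])
print(f"clearing price {price:.2f}, agent demands {[round(d, 2) for d in demands]}")
```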
John F. Hunt
1998-01-01
The following results are preliminary, but show some basic information that will be used in an attempt to model pulp molded structures so that by measuring several basic fundamental properties of a fiber furnish and specifying process conditions, a molded structure could be designed for a particular performance need.
ERIC Educational Resources Information Center
Shibakawa, Mayumi
2012-01-01
The study documented the dynamic process of designing and implementing instructional interventions in an online course of Japanese language and culture at a two-year college. The results have impact in three distinct areas: pedagogical, theoretical, and methodological. First, the interventions that encouraged student agency with rich…
ROMPS critical design review data package
NASA Technical Reports Server (NTRS)
Dobbs, M. E.
1992-01-01
The design elements of the Robot-Operated Material Processing in Space (ROMPS) system are described in outline and graphical form. The following subsystems/topics are addressed: servo system, testbed and simulation results, System V Controller, robot module, furnace module, SCL experiment supervisor and script sample processing control, battery system, watchdog timers, mechanical/thermal considerations, and fault conditions and recovery.
PSK Shift Timing Information Detection Using Image Processing and a Matched Filter
2009-09-01
phase shifts are enhanced. Develop, design, and test the resulting phase shift identification scheme... and the resulting phase shift identification algorithm is investigated for SNR levels in the range -2 dB to 12 dB. Detection performances are derived... Develop, design, and test an optional analysis window overlapping technique to improve phase...
Design of integration-ready metasurface-based infrared absorbers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ogando, Karim, E-mail: karim@cab.cnea.gov.ar; Pastoriza, Hernán
2015-07-28
We introduce an integration-ready design of a metamaterial infrared absorber, highly compatible with many kinds of fabrication processes. We present the results of an exhaustive experimental characterization, including an analysis of the effects of single meta-atom geometrical parameters and collective arrangement. We compare the results with the theoretical interpretations proposed in the literature. Based on the results, we develop a set of practical design rules for metamaterial absorbers in the infrared region.
DOE Office of Scientific and Technical Information (OSTI.GOV)
LAH, J; Shin, D; Manger, R
Purpose: To show how the Six Sigma DMAIC (Define-Measure-Analyze-Improve-Control) approach can be used to improve and optimize the efficiency of the patient-specific QA process by designing site-specific range tolerances. Methods: The Six Sigma tools (process flow diagram, cause and effect, capability analysis, Pareto chart, and control chart) were utilized to determine the steps that need focus for improving the patient-specific QA process. The patient-specific range QA plans were selected according to 7 treatment site groups, a total of 1437 cases. The process capability index Cpm was used to guide the design of patient site-specific range tolerances. We also analyzed the financial impact of this project. Results: Our results suggested that the patient range measurements were non-capable at the current tolerance level of ±1 mm in clinical proton plans. Optimized tolerances were calculated for each treatment site. Control charts for the patient QA time were constructed to compare QA time before and after the new tolerances were implemented. It was found that overall processing time decreased by 24.3% after establishing the new site-specific range tolerances. QA failure for the whole process in proton therapy would lead to up to a 46% increase in total cost. This result can also predict how costs are affected by changes in the adopted tolerance design. Conclusion: We often believe that the quality and performance of proton therapy can easily be improved by merely tightening some or all of its tolerance requirements. This can become costly, however, and it is not necessarily a guarantee of better performance. Tolerance design is not a task to be undertaken without careful thought. The Six Sigma DMAIC can be used to improve the QA process by setting optimized tolerances. When tolerance design is optimized, quality is reasonably balanced with time and cost demands.
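A small sketch of the capability calculation, with invented range-error data, shows how Cpm relates a tolerance to the measured spread and offset, and how a site-specific tolerance could be backed out from a target Cpm; the paper's actual site data and targets differ.

```python
# Hedged sketch: Taguchi capability index Cpm for patient-specific range errors and
# the symmetric tolerance needed to reach a target Cpm. Data are invented.
import numpy as np

def cpm(samples, target, tol):
    """Cpm = (USL - LSL) / (6 * sqrt(sigma^2 + (mu - target)^2)), with USL/LSL = target +/- tol."""
    mu, sigma = np.mean(samples), np.std(samples, ddof=1)
    return (2 * tol) / (6 * np.sqrt(sigma**2 + (mu - target)**2))

def tolerance_for(samples, target, cpm_goal=1.33):
    """Smallest symmetric tolerance achieving the requested Cpm."""
    mu, sigma = np.mean(samples), np.std(samples, ddof=1)
    return 3 * cpm_goal * np.sqrt(sigma**2 + (mu - target)**2)

range_errors_mm = np.random.default_rng(1).normal(0.1, 0.5, size=60)  # hypothetical site
print(f"Cpm at +/-1 mm: {cpm(range_errors_mm, 0.0, 1.0):.2f}")
print(f"Tolerance for Cpm 1.33: +/-{tolerance_for(range_errors_mm, 0.0):.2f} mm")
```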
Antúnez, Lucía; Vidal, Leticia; Sapolinski, Alejandra; Giménez, Ana; Maiche, Alejandro; Ares, Gastón
2013-08-01
The aim of this work was to evaluate consumer visual processing of food labels when evaluating the salt content of pan bread labels and to study the influence of label design and nutritional labelling format on consumer attention. A total of 16 pan bread labels, designed according to a full factorial design, were presented to 52 participants, who were asked to decide whether the sodium content of each label was medium or low, while their eye movements were recorded using an eye tracker. Results showed that most participants looked at nutrition labels and the traffic light system to conclude on the salt content of the labels. However, the average percentage of participants who looked at the actual sodium content was much lower. Nutrition information format affected participants' processing of nutrition information. Among other effects, the inclusion of the traffic light system increased participants' attention towards some kind of nutrition information and facilitated its processing, but not its understanding.
Developing a new industrial engineering curriculum using a systems engineering approach
NASA Astrophysics Data System (ADS)
Buyurgan, Nebil; Kiassat, Corey
2017-11-01
This paper reports on the development of an engineering curriculum for a new industrial engineering programme at a medium-sized private university in the northeast United States. A systems engineering process has been followed to design and develop the new curriculum. Considering the programme curriculum as a system, first the stakeholders have been identified, and some preliminary analysis on their needs and requirements has been conducted. Following that, the phases of conceptual design, preliminary design, and detailed design have been pursued during which different levels of validation, assessment, and evaluation processes have been utilised. In addition, a curriculum assessment and continuous improvement process have been developed to assess the curriculum and the courses frequently. The resulting curriculum is flexible, allowing the pursuit of accelerated graduate programmes, a second major, various minor options, and study-abroad; relevant, tailored to the needs of industry partners in the vicinity; and practical, providing hands-on education, resulting in employment-ready graduates.
Incorporating Handling Qualities Analysis into Rotorcraft Conceptual Design
NASA Technical Reports Server (NTRS)
Lawrence, Ben
2014-01-01
This paper describes the initial development of a framework to incorporate handling qualities analyses into a rotorcraft conceptual design process. In particular, the paper describes how rotorcraft conceptual design level data can be used to generate flight dynamics models for handling qualities analyses. Also, methods are described that couple a basic stability augmentation system to the rotorcraft flight dynamics model to extend the analysis beyond that of the bare airframe. A methodology for calculating the handling qualities characteristics of the flight dynamics models and for comparing the results to ADS-33E criteria is described. Preliminary results from the application of the handling qualities analysis for variations in key rotorcraft design parameters of main rotor radius, blade chord, hub stiffness and flap moment of inertia are shown. Varying relationships, with counteracting trends for different handling qualities criteria and different flight speeds, are exhibited, with the action of the control system playing a complex part in the outcomes. Overall, the paper demonstrates how a broad array of technical issues across flight dynamics stability and control, simulation and modeling, control law design and handling qualities testing and evaluation had to be confronted to implement even a moderately comprehensive handling qualities analysis of relatively low fidelity models. A key outstanding issue is how to 'close the loop' with an overall design process, and options for exploring how to feed back handling qualities results into a conceptual design process are proposed for future work.
Deployment Process, Mechanization, and Testing for the Mars Exploration Rovers
NASA Technical Reports Server (NTRS)
Iskenderian, Ted
2004-01-01
NASA's Mars Exploration Rover (MER) robotic prospectors were produced in an environment of unusually challenging schedule, volume, and mass restrictions. The technical challenges pushed the system's design towards extensive integration of function, which resulted in complex system engineering issues. One example of the system's integrated complexity can be found in the deployment process for the rover. Part of this process, rover "standup", is outlined in this paper. Particular attention is given to the Rover Lift Mechanism's (RLM) role and its design. Analysis methods are presented and compared to test results. It is shown that because prudent design principles were followed, a robust mechanism was created that minimized the duration of integration and test, and enabled recovery without perturbing related systems when reasonably foreseeable problems did occur. Examples of avoidable, unnecessary difficulty are also presented.
NASA Astrophysics Data System (ADS)
Millet, Charlyne; Oget, David; Cavallucci, Denis
2017-11-01
Innovation is a key component of the success and longevity of companies. Our research opens the 'black box' of creativity and innovation in R&D teams. We argue that understanding the nature of R&D projects in terms of creativity/innovation and efficiency/inefficiency is important for designing education policies and improving engineering curricula. Our research addresses the inventive design process, a lesser-known aspect of the innovation process, in 197 R&D departments of industrial sector companies in France. One fundamental issue facing companies is how to evaluate the processes and results of innovation. Results show that the evaluation of innovation is constrained by a lack of methodology for inventive projects. We establish the foundations of a formal ontology for inventive design projects and conclude with some recommendations for engineering education.
Fashion sketch design by interactive genetic algorithms
NASA Astrophysics Data System (ADS)
Mok, P. Y.; Wang, X. X.; Xu, J.; Kwok, Y. L.
2012-11-01
Computer-aided design is vitally important for modern industry, particularly for the creative industries. The fashion industry faces intense pressure to shorten the product development process. In this paper, a methodology is proposed for sketch design based on interactive genetic algorithms. The sketch design system consists of a sketch design model, a database and a multi-stage sketch design engine. First, a sketch design model is developed based on the knowledge of fashion design to describe fashion product characteristics by using parameters. Second, a database is built based on the proposed sketch design model to define general style elements. Third, a multi-stage sketch design engine is used to construct the design. Moreover, an interactive genetic algorithm (IGA) is used to accelerate the sketch design process. The experimental results demonstrate that the proposed method is effective in helping laypersons achieve satisfying fashion design sketches.
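A minimal sketch of the IGA loop is given below; designs are parameter vectors from the sketch design model, and fitness would come from the user's rating of each rendered sketch. The encoding, the stand-in rating function and the GA settings are placeholders.

```python
# Minimal sketch of an interactive GA loop: designs are parameter vectors and
# fitness comes from a user rating rather than an analytic objective.
import random

GENES = 6          # e.g. collar, sleeve, silhouette, length, waist, pattern indices
POP, GENERATIONS = 8, 5

def rate(design):
    """Stand-in for the interactive step: in practice the user scores each sketch 0-10."""
    return 10 - sum(abs(g - 0.5) for g in design)      # pretend the user likes mid values

def crossover(a, b):
    cut = random.randrange(1, GENES)
    return a[:cut] + b[cut:]

def mutate(design, prob=0.2):
    return [min(max(g + random.uniform(-0.2, 0.2), 0), 1) if random.random() < prob else g
            for g in design]

population = [[random.random() for _ in range(GENES)] for _ in range(POP)]
for _ in range(GENERATIONS):
    scored = sorted(population, key=rate, reverse=True)
    parents = scored[:POP // 2]                         # keep the user's favourites
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP - len(parents))]
    population = parents + children

print([round(g, 2) for g in max(population, key=rate)])
```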
A method of network topology optimization design considering application process characteristic
NASA Astrophysics Data System (ADS)
Wang, Chunlin; Huang, Ning; Bai, Yanan; Zhang, Shuo
2018-03-01
Communication networks are designed to meet the usage requirements of users for various network applications. Current studies of network topology optimization design mainly consider network traffic, which is the result of network application operation rather than a design element of communication networks. A network application is a procedure by which users use services, with certain demanded performance requirements, and it has an obvious process characteristic. In this paper, we propose a method to optimize the design of communication network topology that considers this application process characteristic. Taking minimum network delay as the objective, and network design cost and network connectivity reliability as constraints, an optimization model of network topology design is formulated, and the optimal topology is searched for with a Genetic Algorithm (GA). Furthermore, we investigate the influence of network topology parameters on network delay under the background of multiple process-oriented applications, which can guide the generation of the initial population and thus improve the efficiency of the GA. Numerical simulations show the effectiveness and validity of the proposed method. Network topology optimization design that considers applications can improve the reliability of applications and provide guidance for network builders in the early stage of network design, which is of great significance in engineering practice.
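The sketch below illustrates only the evaluation step such a GA needs: scoring a candidate topology by an average-delay proxy under a budget and a connectivity check. Link costs and the delay measure are invented; the paper's model additionally weights application process characteristics per flow.

```python
# Hedged sketch of a topology fitness function: average shortest-path hop count
# (a delay proxy) subject to a link budget and a connectivity requirement.
from itertools import combinations
from collections import deque

NODES = 5
LINK_COST, BUDGET = 1.0, 7.0     # invented cost model

def neighbors(edges):
    adj = {n: set() for n in range(NODES)}
    for a, b in edges:
        adj[a].add(b); adj[b].add(a)
    return adj

def hop_delay(edges):
    """Average shortest-path hop count over all node pairs (infinite if disconnected)."""
    adj, total = neighbors(edges), 0
    for src, dst in combinations(range(NODES), 2):
        dist, frontier = {src: 0}, deque([src])
        while frontier:
            u = frontier.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    frontier.append(v)
        if dst not in dist:
            return float("inf")
        total += dist[dst]
    return total / (NODES * (NODES - 1) / 2)

def fitness(edges):
    cost = LINK_COST * len(edges)
    delay = hop_delay(edges)
    return delay if cost <= BUDGET and delay < float("inf") else float("inf")

ring = [(i, (i + 1) % NODES) for i in range(NODES)]
ring_plus_chord = ring + [(0, 2)]
print(fitness(ring), fitness(ring_plus_chord))   # the extra chord lowers average delay
```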
Survey: Computer Usage in Design Courses.
ERIC Educational Resources Information Center
Henley, Ernest J.
1983-01-01
Presents results of a survey of chemical engineering departments regarding computer usage in senior design courses. Results are categorized according to: computer usage (use of process simulators, student-written programs, faculty-written or "canned" programs; costs (hard and soft money); and available software. Programs offered are…
Patziger, Miklos; Günthert, Frank Wolfgang; Jardin, Norbert; Kainz, Harald; Londong, Jörg
2016-11-01
In state-of-the-art wastewater treatment, primary settling tanks (PSTs) are considered an integral part of the biological wastewater and sludge treatment process, as well as of biogas and electric energy production. Consequently they strongly influence the efficiency of the entire wastewater treatment plant. However, in the last decades the inner physical processes of PSTs, which largely determine their efficiency, have been poorly addressed. In common practice PSTs are still designed and operated solely on the basis of the surface overflow rate and the hydraulic retention time (HRT), treated as a black box. The paper shows the results of a comprehensive investigation programme covering 16 PSTs. Their removal efficiency and inner physical processes (such as the settling of primary sludge), internal flow structures within PSTs and their impact on performance were investigated. The results show that: (1) the removal rates of PSTs are generally underestimated in current design guidelines, (2) the removal rate of different PSTs shows a strongly fluctuating pattern even in the same range of HRT, and (3) the inlet design of PSTs becomes highly relevant to removal efficiency at rather high surface overflow rates, above 5 m/h, which is the upper design limit of PSTs for dry weather load.
Optimal cure cycle design of a resin-fiber composite laminate
NASA Technical Reports Server (NTRS)
Hou, Jean W.; Sheen, Jeenson
1987-01-01
A unified computer-aided design method was studied for cure cycle design that incorporates an optimal design technique with an analytical model of the composite cure process. The preliminary results of using this proposed method for optimal cure cycle design are reported and discussed. The cure process of interest is the compression molding of a polyester, which is described by a diffusion-reaction system. The finite element method is employed to convert the initial boundary value problem into a set of first-order differential equations, which are solved simultaneously by the DE program. The equations for thermal design sensitivities are derived using the direct differentiation method and are solved by the DE program. A recursive quadratic programming algorithm with an active set strategy, called a linearization method, is used to optimally design the cure cycle subject to the given design performance requirements. The difficulty of casting the cure cycle design process into a proper mathematical form is recognized. Various optimal design problems are formulated to address these aspects. The optimal solutions of these formulations are compared and discussed.
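The paper's diffusion-reaction model and DE solver are not reproduced here; as a heavily simplified stand-in, the sketch below tunes the hold temperature of a one-step cure cycle with a toy first-order kinetics model so that a target degree of cure is reached.

```python
# Hedged sketch of the overall loop: a toy first-order cure kinetics model stands in
# for the paper's diffusion-reaction system, and the hold temperature of a one-step
# cure cycle is tuned so the part reaches a target degree of cure. All constants assumed.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import minimize_scalar

A_pre, Ea, R_gas = 1.0e5, 60e3, 8.314       # assumed Arrhenius constants
cycle_time, target_cure = 1800.0, 0.95       # s, final degree of cure

def final_cure(hold_temp_K):
    rate = lambda t, a: [A_pre * np.exp(-Ea / (R_gas * hold_temp_K)) * (1.0 - a[0])]
    sol = solve_ivp(rate, (0.0, cycle_time), [0.0], rtol=1e-8)
    return sol.y[0, -1]

# Pick the lowest hold temperature that still reaches the target cure in the cycle time.
res = minimize_scalar(lambda T: (final_cure(T) - target_cure) ** 2,
                      bounds=(380, 480), method="bounded")
print(f"hold temperature ~{res.x:.1f} K, final cure {final_cure(res.x):.3f}")
```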
Creating Learning Environment Connecting Engineering Design and 3D Printing
NASA Astrophysics Data System (ADS)
Pikkarainen, Ari; Salminen, Antti; Piili, Heidi
Engineering education today requires continuous development in didactics, pedagogy and the practical methods used. 3D printing provides an excellent opportunity to connect different engineering areas to practice and to produce learning-by-doing applications. The 3D-printing technology used in this study is FDM (Fused deposition modeling). FDM is currently the most widely used 3D-printing technology by commercial numbers, and its qualities make it popular especially in academic environments. To achieve the best possible result, students incorporate the principles of DFAM (Design for additive manufacturing) into their engineering design studies together with 3D printing. This paper presents a plan for creating a learning environment for mechanical engineering students that combines the aspects of engineering design, 3D-CAD learning and AM (additive manufacturing). As a result, process charts for carrying out the 3D printing process from a technological point of view and the design process for AM from an engineering design point of view were created. These charts are used in engineering design education. The learning environment is also developed to work as a platform for Bachelor theses, a work-training environment for students, a prototyping service centre for cooperation partners and a source of information for mechanical engineering education at Lapland University of Applied Sciences.
Metric integration architecture for product development
NASA Astrophysics Data System (ADS)
Sieger, David B.
1997-06-01
Present-day product development endeavors utilize the concurrent engineering philosophy as a logical means for incorporating a variety of viewpoints into the design of products. Since this approach provides no explicit procedural provisions, it is necessary to establish at least a mental coupling with a known design process model. The central feature of all such models is the management and transformation of information. While these models assist in structuring the design process, characterizing the basic flow of operations that are involved, they provide no guidance facilities. The significance of this feature, and the role it plays in the time required to develop products, is increasing in importance due to the inherent process dynamics, system/component complexities, and competitive forces. The methodology presented in this paper involves the use of a hierarchical system structure, discrete event system specification (DEVS), and multidimensional state variable based metrics. This approach is unique in its capability to quantify designer's actions throughout product development, provide recommendations about subsequent activity selection, and coordinate distributed activities of designers and/or design teams across all design stages. Conceptual design tool implementation results are used to demonstrate the utility of this technique in improving the incremental decision making process.
A concept ideation framework for medical device design.
Hagedorn, Thomas J; Grosse, Ian R; Krishnamurty, Sundar
2015-06-01
Medical device design is a challenging process, often requiring collaboration between medical and engineering domain experts. This collaboration can be best institutionalized through systematic knowledge transfer between the two domains coupled with effective knowledge management throughout the design innovation process. Toward this goal, we present the development of a semantic framework for medical device design that unifies a large medical ontology with detailed engineering functional models along with the repository of design innovation information contained in the US Patent Database. As part of our development, existing medical, engineering, and patent document ontologies were modified and interlinked to create a comprehensive medical device innovation and design tool with appropriate properties and semantic relations to facilitate knowledge capture, enrich existing knowledge, and enable effective knowledge reuse for different scenarios. The result is a Concept Ideation Framework for Medical Device Design (CIFMeDD). Key features of the resulting framework include function-based searching and automated inter-domain reasoning to uniquely enable identification of functionally similar procedures, tools, and inventions from multiple domains based on simple semantic searches. The significance and usefulness of the resulting framework for aiding in conceptual design and innovation in the medical realm are explored via two case studies examining medical device design problems.
Software design as a problem in learning theory (a research overview)
NASA Technical Reports Server (NTRS)
Fass, Leona F.
1992-01-01
Our interest in automating software design has come out of our research in automated reasoning, inductive inference, learnability, and algebraic machine theory. We have investigated these areas extensively, in connection with specific problems of language representation, acquisition, processing, and design. In the case of formal context-free (CF) languages we established existence of finite learnable models ('behavioral realizations') and procedures for constructing them effectively. We also determined techniques for automatic construction of the models, inductively inferring them from finite examples of how they should 'behave'. These results were obtainable due to appropriate representation of domain knowledge, and constraints on the domain that the representation defined. It was when we sought to generalize our results, and adapt or apply them, that we began investigating the possibility of determining similar procedures for constructing correct software. Discussions with other researchers led us to examine testing and verification processes, as they are related to inference, and due to their considerable importance in correct software design. Motivating papers by other researchers, led us to examine these processes in some depth. Here we present our approach to those software design issues raised by other researchers, within our own theoretical context. We describe our results, relative to those of the other researchers, and conclude that they do not compare unfavorably.
DEFINITIVE SOX CONTROL PROCESS EVALUATIONS: LIMESTONE, DOUBLE ALKALI, AND CITRATE FGD PROCESSES
The report gives results of a detailed comparative technical and economic evaluation of limestone slurry, generic double alkali, and citrate flue gas desulfurization (FGD) processes, assuming proven technology and using representative power plant, process design, and economic pre...
New reflective symmetry design capability in the JPL-IDEAS Structure Optimization Program
NASA Technical Reports Server (NTRS)
Strain, D.; Levy, R.
1986-01-01
The JPL-IDEAS antenna structure analysis and design optimization computer program was modified to process half structure models of symmetric structures subjected to arbitrary external static loads, synthesize the performance, and optimize the design of the full structure. Significant savings in computation time and cost (more than 50%) were achieved compared to the cost of full model computer runs. The addition of the new reflective symmetry analysis design capabilities to the IDEAS program allows processing of structure models whose size would otherwise prevent automated design optimization. The new program produced synthesized full model iterative design results identical to those of actual full model program executions at substantially reduced cost, time, and computer storage.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, B.E.
1995-04-01
A cross-functional team of process, product, quality, material, and design lab engineers was assembled to develop an environmentally friendly cleaning process for leadless chip carrier assemblies (LCCAs). Using flush and filter testing, Auger surface analysis, GC-Mass spectrophotometry, production yield results, and electrical testing results over an extended testing period, the team developed an aqueous cleaning process for LCCAs. The aqueous process replaced the Freon vapor degreasing/ultrasonic rinse process.
Simulative design and process optimization of the two-stage stretch-blow molding process
NASA Astrophysics Data System (ADS)
Hopmann, Ch.; Rasche, S.; Windeck, C.
2015-05-01
The total production costs of PET bottles are significantly affected by the cost of raw material. Approximately 70 % of the total costs are spent on the raw material. Therefore, the stretch-blow molding industry intends to reduce total production costs through optimized material efficiency. However, there is often a trade-off between optimized material efficiency and the required product properties. Due to a multitude of complex boundary conditions, the design process for new stretch-blow molded products is still a challenging task and is often based on empirical knowledge. Application of current CAE tools supports the design process by reducing development time and costs. This paper describes an approach to determine optimized preform geometry and corresponding process parameters iteratively. The wall thickness distribution and the local stretch ratios of the blown bottle are calculated in a three-dimensional process simulation. The wall thickness distribution is correlated with an objective function, and the preform geometry as well as the process parameters are varied by an optimization algorithm. Taking into account the correlation between material usage, process history and resulting product properties, integrative coupled simulation steps, e.g. structural analyses or barrier simulations, are performed. The approach is applied to a 0.5 liter PET bottle of Krones AG, Neutraubling, Germany. The investigations point out that the design process can be supported by applying this simulative optimization approach. In an optimization study the total bottle weight is reduced from 18.5 g to 15.5 g. The validation of the computed results is in progress.
Standardized Radiation Shield Design Methods: 2005 HZETRN
NASA Technical Reports Server (NTRS)
Wilson, John W.; Tripathi, Ram K.; Badavi, Francis F.; Cucinotta, Francis A.
2006-01-01
Research conducted by the Langley Research Center through 1995, resulting in the HZETRN code, provides the current basis for shield design methods according to NASA STD-3000 (2005). With this new prominence, the database, basic numerical procedures, and algorithms are being re-examined, with new methods of verification and validation being implemented to capture a well-defined algorithm for engineering design processes to be used in this early development phase of the Bush initiative. This process provides the methodology to transform the 1995 HZETRN research code into the 2005 HZETRN engineering code to be available for these early design processes. In this paper, we review the basic derivations, including new corrections to the codes to ensure improved numerical stability, and provide benchmarks for code verification.
Information Flow in the Launch Vehicle Design/Analysis Process
NASA Technical Reports Server (NTRS)
Humphries, W. R., Sr.; Holland, W.; Bishop, R.
1999-01-01
This paper describes the results of a team effort aimed at defining the information flow between disciplines at the Marshall Space Flight Center (MSFC) engaged in the design of space launch vehicles. The information flow is modeled at a first level and is described using three types of templates: an N x N diagram, discipline flow diagrams, and discipline task descriptions. It is intended to provide engineers with an understanding of the connections between what they do and where it fits in the overall design process of the project. It is also intended to provide design managers with a better understanding of information flow in the launch vehicle design cycle.
Mars Science Laboratory CHIMRA: A Device for Processing Powdered Martian Samples
NASA Technical Reports Server (NTRS)
Sunshine, Daniel
2010-01-01
The CHIMRA is an extraterrestrial sample acquisition and processing device for the Mars Science Laboratory that emphasizes robustness and adaptability through design configuration. This work reviews the guidelines utilized to invent the initial CHIMRA and the strategy employed in advancing the design; these principles will be discussed in relation to both the final CHIMRA design and similar future devices. The computational synthesis necessary to mature a boxed-in impact-generating mechanism will be presented alongside a detailed mechanism description. Results from the development testing required to advance the design for a highly-loaded, long-life and high-speed bearing application will be presented. Lessons learned during the assembly and testing of this subsystem as well as results and lessons from the sample-handling development test program will be reviewed.
Narasimhan, S; Chiel, H J; Bhunia, S
2011-04-01
Implantable microsystems for monitoring or manipulating brain activity typically require on-chip real-time processing of multichannel neural data using ultra low-power, miniaturized electronics. In this paper, we propose an integrated-circuit/architecture-level hardware design framework for neural signal processing that exploits the nature of the signal-processing algorithm. First, we consider different power reduction techniques and compare the energy efficiency between the ultra-low frequency subthreshold and conventional superthreshold design. We show that the superthreshold design operating at a much higher frequency can achieve comparable energy dissipation by taking advantage of extensive power gating. It also provides significantly higher robustness of operation and yield under large process variations. Next, we propose an architecture level preferential design approach for further energy reduction by isolating the critical computation blocks (with respect to the quality of the output signal) and assigning them higher delay margins compared to the noncritical ones. Possible delay failures under parameter variations are confined to the noncritical components, allowing graceful degradation in quality under voltage scaling. Simulation results using prerecorded neural data from the sea-slug (Aplysia californica) show that the application of the proposed design approach can lead to significant improvement in total energy, without compromising the output signal quality under process variations, compared to conventional design approaches.
Guetterman, Timothy C; Fetters, Michael D; Mawocha, Samkeliso; Legocki, Laurie J; Barsan, William G; Lewis, Roger J; Berry, Donald A; Meurer, William J
2017-01-01
Objectives: Clinical trials are complicated, expensive, time-consuming, and frequently do not lead to discoveries that improve the health of patients with disease. Adaptive clinical trials have emerged as a methodology to provide more flexibility in design elements to better answer scientific questions regarding whether new treatments are efficacious. Limited observational data exist that describe the complex process of designing adaptive clinical trials. To address these issues, the Adaptive Designs Accelerating Promising Treatments Into Trials project developed six tailored, flexible, adaptive phase-III clinical trials for neurological emergencies, and investigators prospectively monitored and observed the processes. The objective of this work is to describe the adaptive design development process, the final design, and the current status of the adaptive trial designs that were developed. Methods: To observe and reflect upon the trial development process, we employed a rich, mixed methods evaluation that combined quantitative data from a visual analog scale to assess attitudes about adaptive trials, along with in-depth qualitative data about the development process gathered from observations. Results: The Adaptive Designs Accelerating Promising Treatments Into Trials team developed six adaptive clinical trial designs. Across the six designs, 53 attitude surveys were completed at baseline and after the trial planning process was completed. Compared to baseline, the participants believed significantly more strongly that the adaptive designs would be accepted by National Institutes of Health review panels and non-researcher clinicians. In addition, after the trial planning process, the participants more strongly believed that the adaptive design would meet the scientific and medical goals of the studies. Conclusion: Introducing the adaptive design at early conceptualization proved critical to successful adoption and implementation of that trial. Involving key stakeholders from several scientific domains early in the process appears to be associated with improved attitudes towards adaptive designs over the life cycle of clinical trial development.
NASA Astrophysics Data System (ADS)
Au, How Meng
The aircraft design process traditionally starts with a given set of top-level requirements. These requirements can be aircraft performance related such as the fuel consumption, cruise speed, or takeoff field length, etc., or aircraft geometry related such as the cabin height or cabin volume, etc. This thesis proposes a new aircraft design process in which some of the top-level requirements are not explicitly specified. Instead, these previously specified parameters are now determined through the use of the Price-Per-Value-Factor (PPVF) index. This design process is well suited for design projects where general consensus of the top-level requirements does not exist. One example is the design of small commuter airliners. The above mentioned value factor is comprised of productivity, cabin volume, cabin height, cabin pressurization, mission fuel consumption, and field length, each weighted to a different exponent. The relative magnitude and positive/negative signs of these exponents are in agreement with general experience. The value factors of the commuter aircraft are shown to have improved over a period of four decades. In addition, the purchase price is shown to vary linearly with the value factor. The initial aircraft sizing process can be manpower intensive if the calculations are done manually. By incorporating automation into the process, the design cycle can be shortened considerably. The Fortran program functions and subroutines in this dissertation, in addition to the design and optimization methodologies described above, contribute to the reduction of manpower required for the initial sizing process. By combining the new design process mentioned above and the PPVF as the objective function, an optimization study is conducted on the design of a 20-seat regional jet. Handbook methods for aircraft design are written into a Fortran code. A genetic algorithm is used as the optimization scheme. The result of the optimization shows that aircraft designed to this PPVF index can be competitive compared to existing turboprop commuter aircraft. The process developed can be applied to other classes of aircraft with the designer modifying the cost function based upon the design goals.
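The weighted-exponent form of the value factor lends itself to a compact implementation. The attribute names, exponents, and the price-per-value-factor expression below are hypothetical stand-ins used only to illustrate the structure of such an index, not the calibrated values from the thesis.

```python
import math

# Hypothetical exponents: positive for desirable attributes, negative for penalties
# such as mission fuel and field length. Values are placeholders, not the thesis's.
EXPONENTS = {
    "productivity":   0.8,
    "cabin_volume":   0.3,
    "cabin_height":   0.2,
    "pressurization": 0.1,
    "mission_fuel":  -0.5,
    "field_length":  -0.3,
}

def value_factor(attrs):
    """Multiplicative value index: product of attribute_i ** exponent_i."""
    return math.prod(attrs[name] ** exp for name, exp in EXPONENTS.items())

def price_per_value_factor(price, attrs):
    """PPVF-style figure of merit: purchase price divided by the value factor."""
    return price / value_factor(attrs)

candidate = {"productivity": 9000.0, "cabin_volume": 28.0, "cabin_height": 1.8,
             "pressurization": 0.75, "mission_fuel": 950.0, "field_length": 1400.0}
print(f"PPVF = {price_per_value_factor(6.5e6, candidate):,.0f}")
```

In an optimization loop such as the genetic-algorithm study described above, this figure of merit would serve as the objective to be minimized across candidate aircraft sizings.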
Oxygen Compatibility Assessment of Components and Systems
NASA Technical Reports Server (NTRS)
Stoltzfus, Joel; Sparks, Kyle
2010-01-01
Fire hazards are inherent in oxygen systems and a storied history of fires in rocket engine propulsion components exists. To detect and mitigate these fire hazards requires careful, detailed, and thorough analyses applied during the design process. The oxygen compatibility assessment (OCA) process designed by NASA Johnson Space Center (JSC) White Sands Test Facility (WSTF) can be used to determine the presence of fire hazards in oxygen systems and the likelihood of a fire. This process may be used as both a design guide and during the approval process to ensure proper design features and material selection. The procedure for performing an OCA is a structured step-by-step process to determine the most severe operating conditions; assess the flammability of the system materials at the use conditions; evaluate the presence and efficacy of ignition mechanisms; assess the potential for a fire to breach the system; and determine the reaction effect (the potential loss of life, mission, and system functionality as the result of a fire). This process should be performed for each component in a system. The results of each component assessment, and the overall system assessment, should be recorded in a report that can be used in the short term to communicate hazards and their mitigation and to aid in system/component development and, in the long term, to solve anomalies that occur during engine testing and operation.
The effects of DRIE operational parameters on vertically aligned micropillar arrays
NASA Astrophysics Data System (ADS)
Miller, Kane; Li, Mingxiao; Walsh, Kevin M.; Fu, Xiao-An
2013-03-01
Vertically aligned silicon micropillar arrays have been created by deep reactive ion etching (DRIE) and used for a number of microfabricated devices including microfluidic devices, micropreconcentrators and photovoltaic cells. This paper delineates an experimental design performed on the Bosch process of DRIE of micropillar arrays. The arrays are fabricated with direct-write optical lithography without a photomask, and the effects of DRIE process parameters, including etch cycle time, passivation cycle time, platen power and coil power, on profile angle, scallop depth and scallop peak-to-peak distance are studied by statistical design of experiments. Scanning electron microscope images are used for measuring the resultant profile angles and characterizing the scalloping effect on the pillar sidewalls. The experimental results indicate the effects of the determining factors, etch cycle time, passivation cycle time and platen power, on the micropillar profile angles and scallop depths. An optimized DRIE process recipe for creating nearly 90° profiles with smooth sidewalls (invisible scalloping) has been obtained as a result of the statistical design of experiments.
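The factor-effect analysis described above can be sketched as an ordinary least-squares fit to a coded two-level design. The runs and profile-angle responses below are synthetic; only the analysis pattern mirrors the paper's statistical design of experiments.

```python
import numpy as np

# Coded two-level factorial: A = etch cycle time, B = passivation cycle time,
# C = platen power (each scaled to -1/+1). Responses are made-up profile angles.
runs = np.array([
    [-1, -1, -1],
    [ 1, -1, -1],
    [-1,  1, -1],
    [ 1,  1, -1],
    [-1, -1,  1],
    [ 1, -1,  1],
    [-1,  1,  1],
    [ 1,  1,  1],
], dtype=float)
profile_angle_deg = np.array([88.1, 89.4, 87.6, 88.9, 88.8, 90.1, 88.2, 89.6])

# Main-effects model: angle = b0 + bA*A + bB*B + bC*C
X = np.column_stack([np.ones(len(runs)), runs])
coeffs, *_ = np.linalg.lstsq(X, profile_angle_deg, rcond=None)
for name, b in zip(["intercept", "etch time", "passivation time", "platen power"], coeffs):
    print(f"{name:17s}: {b:+.2f} deg")
```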
Adaptive design of visual perception experiments
NASA Astrophysics Data System (ADS)
O'Connor, John D.; Hixson, Jonathan; Thomas, James M., Jr.; Peterson, Matthew S.; Parasuraman, Raja
2010-04-01
Meticulous experimental design may not always prevent confounds from affecting experimental data acquired during visual perception experiments. Although experimental controls reduce the potential effects of foreseen sources of interference, interaction, or noise, they are not always adequate for preventing the confounding effects of unforeseen forces. Visual perception experimentation is vulnerable to unforeseen confounds because of the nature of the associated cognitive processes involved in the decision task. Some confounds are beyond the control of experimentation, such as what a participant does immediately prior to experimental participation, or the participant's attitude or emotional state. Other confounds may occur through ignorance of practical control methods on the part of the experiment's designer. The authors conducted experiments related to experimental fatigue and initially achieved significant results that were, upon re-examination, attributable to a lack of adequate controls. Re-examination of the original results and the processes and events that led to them yielded a second experimental design with more experimental controls and significantly different results. The authors propose that designers of visual perception experiments can benefit from planning to use a test-fix-test or adaptive experimental design cycle, so that unforeseen confounds in the initial design can be remedied.
Design and Analysis of a Static Aeroelastic Experiment
NASA Astrophysics Data System (ADS)
Hou, Ying-Yu; Yuan, Kai-Hua; Lv, Ji-Nan; Liu, Zi-Qiang
2016-06-01
Static aeroelastic experiments are common in the United States and Russia. Their objective is to investigate the deformation and loads of an elastic structure in a flow field, and they normally presuppose that the stiffness distribution of the structure is known. This paper describes a method for designing experimental models when the stiffness distribution and boundary conditions of the real aircraft are both uncertain. The stiffness distribution of the structure is estimated via finite element modeling and simulation, and F141 steel and rigid foam are used to build the elastic model. The design and manufacturing process of the static aeroelastic models is presented: an experimental model was designed to simulate the stiffness of the designed wings, and a set of experiments was designed to check the results. The test results show that the experimental method can effectively complete the design of the elastic model. The paper presents the whole static aeroelastic experiment process, analyzes the experimental results, and establishes a static aeroelasticity experiment technique and an experimental model targeting the swept wing of a large-aspect-ratio aircraft.
Low-SWaP coincidence processing for Geiger-mode LIDAR video
NASA Astrophysics Data System (ADS)
Schultz, Steven E.; Cervino, Noel P.; Kurtz, Zachary D.; Brown, Myron Z.
2015-05-01
Photon-counting Geiger-mode lidar detector arrays provide a promising approach for producing three-dimensional (3D) video at full motion video (FMV) data rates, resolution, and image size from long ranges. However, coincidence processing required to filter raw photon counts is computationally expensive, generally requiring significant size, weight, and power (SWaP) and also time. In this paper, we describe a laboratory test-bed developed to assess the feasibility of low-SWaP, real-time processing for 3D FMV based on Geiger-mode lidar. First, we examine a design based on field programmable gate arrays (FPGA) and demonstrate proof-of-concept results. Then we examine a design based on a first-of-its-kind embedded graphical processing unit (GPU) and compare performance with the FPGA. Results indicate feasibility of real-time Geiger-mode lidar processing for 3D FMV and also suggest utility for real-time onboard processing for mapping lidar systems.
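A minimal version of the coincidence idea, assuming photon events arrive as (pixel x, pixel y, range bin) triples, is sketched below; the window sizes and the neighbor threshold are arbitrary illustration values, and the quadratic scan stands in for the binned, hardware-friendly implementations an FPGA or GPU would actually use.

```python
import numpy as np

def coincidence_filter(events, range_window=3, pixel_window=1, min_neighbors=2):
    """Keep only photon events supported by nearby events in pixel space and range."""
    events = np.asarray(events)                      # columns: x, y, range_bin
    keep = np.zeros(len(events), dtype=bool)
    for i, (x, y, r) in enumerate(events):
        near = (np.abs(events[:, 0] - x) <= pixel_window) & \
               (np.abs(events[:, 1] - y) <= pixel_window) & \
               (np.abs(events[:, 2] - r) <= range_window)
        keep[i] = near.sum() - 1 >= min_neighbors    # exclude the event itself
    return events[keep]

raw = [(10, 12, 400), (10, 13, 401), (11, 12, 399), (42, 7, 953)]
print(coincidence_filter(raw))                       # the isolated event is rejected
```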
Cost Models for MMC Manufacturing Processes
NASA Technical Reports Server (NTRS)
Elzey, Dana M.; Wadley, Haydn N. G.
1996-01-01
The quality cost modeling (QCM) tool is intended to be a relatively simple-to-use device for obtaining a first-order assessment of the quality-cost relationship for a given process-material combination. The QCM curve is a plot of cost versus quality (an index indicating microstructural quality), which is unique for a given process-material combination. The QCM curve indicates the tradeoff between cost and performance, thus enabling one to evaluate affordability. Additionally, the effect of changes in process design, raw materials, and process conditions on the cost-quality relationship can be evaluated. Such results might indicate the most efficient means to obtain improved quality at reduced cost by process design refinements, the implementation of sensors and models for closed loop process control, or improvement in the properties of raw materials being fed into the process. QCM also allows alternative processes for producing the same or similar material to be compared in terms of their potential for producing competitively priced, high quality material. Aside from demonstrating the usefulness of the QCM concept, this is one of the main foci of the present research program, namely to compare processes for making continuous fiber reinforced, metal matrix composites (MMC's). Two processes, low pressure plasma spray deposition and tape casting are considered for QCM development. This document consists of a detailed look at the design of the QCM approach, followed by discussion of the application of QCM to each of the selected MMC manufacturing processes along with results, comparison of processes, and finally, a summary of findings and recommendations.
Investigation of improved designs for rotational micromirrors using multiuser MEMS processes
NASA Astrophysics Data System (ADS)
Lin, Julianna E.; Michael, Feras S. J.; Kirk, Andrew G.
2001-04-01
In recent years, the design of rotational micromirrors for use in optical cross connects has received much attention. Although several companies have already produced and marketed a number of torsional mirror devices, more work is still needed to determine how these mirrors can be integrated into optical systems to form compact optical switches. However, recently several commercial MEMS foundry services have become available. Thus, due to the low cost of these prototyping services, new devices can be fabricated in short amounts of time and the designs adapted to meet the needs of different applications. The purpose of this work is to investigate the fabrication of new micromirror designs using the Multi-User MEMS Processes (MUMPs) foundry service available from Cronos Integrated Microsystems (North Carolina, USA). Several sets of mirror designs were submitted for fabrication and the resulting structures were characterized using a phase-shifting Mirau interferometer. Characterization results for these devices are presented.
Revenäs, Åsa; Martin, Cathrin; H Opava, Christina; Brusewitz, Maria; Keller, Christina; Åsenlöf, Pernilla
2015-09-17
User involvement in the development of health care services is important for the viability, usability, and effectiveness of services. This study reports on the second step of the co-design process. The aim was to explore the significant challenges in advancing the co-design process during the requirements specification phase of a mobile Internet service for the self-management of physical activity (PA) in rheumatoid arthritis (RA). A participatory action research design was used to involve lead users and stakeholders as co-designers. Lead users (n=5), a clinical physiotherapist (n=1), researchers (n=2) with knowledge in PA in RA and behavioral learning theories, an eHealth strategist (n=1), and an officer from the patient organization (n=1) collaborated in 4 workshops. Data-collection methods included video recordings and naturalistic observations. The inductive qualitative video-based analysis resulted in 1 overarching theme, merging perspectives, and 2 subthemes reflecting different aspects of merging: (1) finding a common starting point and (2) deciding on design solutions. Seven categories illustrated the specific challenges: reaching shared understanding of goals, clarifying and handling the complexity of participants' roles, clarifying terminology related to system development, establishing the rationale for features, negotiating features, transforming ideas into concrete features, and participants' alignment with the agreed goal and task. Co-designing the system requirements of a mobile Internet service including multiple stakeholders was a complex and extensive collaborative decision-making process. Considering, valuing, counterbalancing, and integrating different perspectives into agreements and solutions (ie, the merging of participants' perspectives) were crucial for moving the process forward and were considered the core challenges of co-design. Further research is needed to replicate the results and to increase knowledge on key factors for a successful co-design of health care services.
NASA Technical Reports Server (NTRS)
Mavris, Dimitri; Osburg, Jan
2005-01-01
An important enabler of the new national Vision for Space Exploration is the ability to rapidly and efficiently develop optimized concepts for the manifold future space missions that this effort calls for. The design of such complex systems requires a tight integration of all the engineering disciplines involved, in an environment that fosters interaction and collaboration. The research performed under this grant explored areas where the space systems design process can be enhanced: by integrating risk models into the early stages of the design process, and by including rapid-turnaround variable-fidelity tools for key disciplines. Enabling early assessment of mission risk will allow designers to perform trades between risk and design performance during the initial design space exploration. Entry into planetary atmospheres will require an increased emphasis of the critical disciplines of aero- and thermodynamics. This necessitates the pulling forward of EDL disciplinary expertise into the early stage of the design process. Radiation can have a large potential impact on overall mission designs, in particular for the planned nuclear-powered robotic missions under Project Prometheus and for long-duration manned missions to the Moon, Mars and beyond under Project Constellation. This requires that radiation and associated risk and hazards be assessed and mitigated at the earliest stages of the design process. Hence, RPS is another discipline needed to enhance the engineering competencies of conceptual design teams. Researchers collaborated closely with NASA experts in those disciplines, and in overall space systems design, at Langley Research Center and at the Jet Propulsion Laboratory. This report documents the results of this initial effort.
NASA Astrophysics Data System (ADS)
Halbe, Johannes; Pahl-Wostl, Claudia; Adamowski, Jan
2018-01-01
Multiple barriers constrain the widespread application of participatory methods in water management, including the more technical focus of most water agencies, additional cost and time requirements for stakeholder involvement, as well as institutional structures that impede collaborative management. This paper presents a stepwise methodological framework that addresses the challenges of context-sensitive initiation, design and institutionalization of participatory modeling processes. The methodological framework consists of five successive stages: (1) problem framing and stakeholder analysis, (2) process design, (3) individual modeling, (4) group model building, and (5) institutionalized participatory modeling. The Management and Transition Framework is used for problem diagnosis (Stage One), context-sensitive process design (Stage Two) and analysis of requirements for the institutionalization of participatory water management (Stage Five). Conceptual modeling is used to initiate participatory modeling processes (Stage Three) and ensure a high compatibility with quantitative modeling approaches (Stage Four). This paper describes the proposed participatory model building (PMB) framework and provides a case study of its application in Québec, Canada. The results of the Québec study demonstrate the applicability of the PMB framework for initiating and designing participatory model building processes and analyzing barriers towards institutionalization.
Herasevich, Vitaly
2017-01-01
Background: The new sepsis definition has increased the need for frequent sequential organ failure assessment (SOFA) score recalculation, and the clerical burden of information retrieval makes this score ideal for automated calculation. Objective: The aim of this study was to (1) estimate the clerical workload of manual SOFA score calculation through a time-motion analysis and (2) describe a user-centered design process for an electronic medical record (EMR) integrated, automated SOFA score calculator with a subsequent usability evaluation study. Methods: First, we performed a time-motion analysis by recording time-to-task-completion for the manual calculation of 35 baseline and 35 current SOFA scores by 14 internal medicine residents over a 2-month period. Next, we used an agile development process to create a user interface for a previously developed automated SOFA score calculator. The final user interface usability was evaluated by clinician end users with the Computer Systems Usability Questionnaire. Results: The overall mean (SD) time to complete a manual SOFA score calculation was 61.6 (33) s. Among the 24% (12/50) usability survey respondents, our user-centered user interface design process resulted in >75% favorability of survey items in the domains of system usability, information quality, and interface quality. Conclusions: Early stakeholder engagement in our agile design process resulted in a user interface for an automated SOFA score calculator that reduced clinician workload and met clinicians’ needs at the point of care. Emerging interoperable platforms may facilitate dissemination of similarly useful clinical score calculators and decision support algorithms as “apps.” A user-centered design process and usability evaluation should be considered during creation of these tools. PMID:28526675
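The automated calculator's core is a set of rule tables applied to values pulled from the EMR; SOFA itself is the sum of six organ subscores, each worth 0-4 points. The sketch below shows only two organ systems with placeholder cut points chosen for illustration; it is not a clinical reference and not the interface the study evaluated.

```python
# Illustrative subscore tables (NOT clinical reference values).
COAGULATION_BINS = [(150, 0), (100, 1), (50, 2), (20, 3)]   # platelets, x10^3/uL lower bounds
RENAL_THRESHOLDS = [(1.2, 1), (2.0, 2), (3.5, 3), (5.0, 4)] # creatinine, mg/dL

def coagulation_subscore(platelets):
    for lower_bound, points in COAGULATION_BINS:
        if platelets >= lower_bound:
            return points
    return 4

def renal_subscore(creatinine):
    score = 0
    for threshold, points in RENAL_THRESHOLDS:
        if creatinine >= threshold:
            score = points
    return score

def partial_sofa(platelets, creatinine):
    """Sum of the two illustrated organ subscores; a full SOFA adds four more systems."""
    return coagulation_subscore(platelets) + renal_subscore(creatinine)

print(partial_sofa(platelets=90, creatinine=2.4))   # 2 + 2 = 4
```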
"Glitch Logic" and Applications to Computing and Information Security
NASA Technical Reports Server (NTRS)
Stoica, Adrian; Katkoori, Srinivas
2009-01-01
This paper introduces a new method of information processing in digital systems, and discusses its potential benefits to computing and information security. The new method exploits glitches caused by delays in logic circuits for carrying and processing information. Glitch processing is hidden from conventional logic analyses and undetectable by traditional reverse engineering techniques. It enables the creation of new logic design methods that allow for an additional controllable "glitch logic" processing layer embedded into conventional synchronous digital circuits as a hidden/covert information flow channel. The combination of synchronous logic with specific glitch logic design acting as an additional computing channel reduces the number of equivalent logic designs resulting from synthesis, thus implicitly reducing the possibility of modification and/or tampering with the design. The hidden information channel produced by the glitch logic can be used: 1) for covert computing/communication, 2) to prevent reverse engineering, tampering, and alteration of design, and 3) to act as a channel for information infiltration/exfiltration and propagation of viruses/spyware/Trojan horses.
A Multidisciplinary Approach to Mixer-Ejector Analysis and Design
NASA Technical Reports Server (NTRS)
Hendricks, Eric S.; Seidel, Jonathan A.
2012-01-01
The design of an engine for a civil supersonic aircraft presents a difficult multidisciplinary problem to propulsion system engineers. There are numerous competing requirements for the engine, such as being efficient during cruise while remaining quiet enough at takeoff to meet airport noise regulations. The use of mixer-ejector nozzles presents one possible solution to this challenge. However, designing a mixer-ejector which will successfully address both of these concerns is a difficult proposition. Presented in this paper is an integrated multidisciplinary approach to the analysis and design of these systems. A process that uses several low-fidelity tools to evaluate both the performance and acoustics of mixer-ejector nozzles is described. This process is further expanded to include system-level modeling of engines and aircraft to determine the effects on mission performance and noise near airports. The overall process is developed in the OpenMDAO framework currently being developed by NASA. From the developed process, sample results are given for a notional mixer-ejector design, thereby demonstrating the capabilities of the method.
Research on the design of fixture for motor vibration test
NASA Astrophysics Data System (ADS)
Shen, W. X.; Ma, W. S.; Zhang, L. W.
2018-03-01
The vibration reliability of a new-energy vehicle motor plays a very important role in driving safety, so testing the vibration durability of the motor is essential. The fixture is critical in the vibration test process: the simulated road-spectrum vibration must be transmitted to the motor through the fixture without distortion, so fixture design directly affects the result of the vibration endurance test. Based on the concrete structure of a new-energy electric vehicle motor, this article puts forward two fixture design and installation schemes, a lateral cantilever type and a base-bearing type, and summarizes the selection of material, the weighting process, the middle alignment process, and the manufacturing process. Modal analysis and frequency response calculations of the fixture are carried out in this design, and, combined with the influence of fixture height and structural profile on response frequency, the response frequency of each order of the fixture is calculated, ultimately achieving the purpose of guiding the design.
Cycle time reduction by HTML report in mask checking flow
NASA Astrophysics Data System (ADS)
Chen, Jian-Cheng; Lu, Min-Ying; Fang, Xiang; Shen, Ming-Feng; Ma, Shou-Yuan; Yang, Chuen-Huei; Tsai, Joe; Lee, Rachel; Deng, Erwin; Lin, Ling-Chieh; Liao, Hung-Yueh; Tsai, Jenny; Bowhill, Amanda; Vu, Hien; Russell, Gordon
2017-07-01
The Mask Data Correctness Check (MDCC) is a reticle-level, multi-layer DRC-like check evolved from mask rule check (MRC). The MDCC uses extended job deck (EJB) to achieve mask composition and to perform a detailed check of the positioning and integrity of each component of the reticle. Different design patterns on the mask are mapped to different layers, so users can review the whole reticle and check the interactions between different designs before the final mask pattern file is available. However, many types of MDCC check results, such as errors from overlapping patterns, usually have very large and complex-shaped highlighted areas covering the boundary of the design. Users have to load the result OASIS file, overlay it on the original database assembled in the MDCC process in a layout viewer, and then search for the details of the check results. We introduce a quick result-reviewing method based on an HTML-format report generated by Calibre® RVE. In the report generation process, we analyze and extract the essential part of the result OASIS file to a result database (RDB) file using standard verification rule format (SVRF) commands. Calibre® RVE automatically loads the assembled reticle pattern and generates screen shots of these check results. All the processes are triggered automatically as soon as the MDCC process finishes. Users only have to open the HTML report to get the information they need: for example, the check summary, captured images of results, and their coordinates.
Advanced digital SAR processing study
NASA Technical Reports Server (NTRS)
Martinson, L. W.; Gaffney, B. P.; Liu, B.; Perry, R. P.; Ruvin, A.
1982-01-01
A highly programmable, land-based, real-time synthetic aperture radar (SAR) processor requiring a processed pixel rate of 2.75 MHz or more in a four-look system was designed. Variations in range and azimuth compression, number of looks, range swath, range migration, and SAR mode were specified. Alternative range and azimuth processing algorithms were examined in conjunction with projected integrated circuit, digital architecture, and software technologies. The advanced digital SAR processor (ADSP) employs an FFT convolver algorithm for both range and azimuth processing in a parallel architecture configuration. Algorithm performance comparisons, system design, implementation tradeoffs, and the results of a supporting survey of integrated circuit and digital architecture technologies are reported. Cost tradeoffs and projections with alternate implementation plans are presented.
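The FFT-convolver idea at the heart of the ADSP, matched filtering of each range line against the transmitted chirp by multiplication in the frequency domain, can be sketched in a few lines. The chirp parameters, array sizes, and noise level are illustrative, not ADSP specifications.

```python
import numpy as np

fs, pulse_len, bandwidth = 20e6, 10e-6, 10e6
t = np.arange(0, pulse_len, 1 / fs)
chirp = np.exp(1j * np.pi * (bandwidth / pulse_len) * t**2)      # linear-FM replica

echo = np.zeros(4096, dtype=complex)
echo[500:500 + chirp.size] += 0.8 * chirp                        # point target at bin 500
echo += 0.05 * (np.random.randn(echo.size) + 1j * np.random.randn(echo.size))

# Range compression: correlate the echo with the replica via FFT multiplication.
n = echo.size + chirp.size - 1
compressed = np.fft.ifft(np.fft.fft(echo, n) * np.conj(np.fft.fft(chirp, n)))
print("peak range bin:", int(np.argmax(np.abs(compressed))))     # ~500
```

Azimuth compression follows the same pattern with a Doppler-history reference function, which is why a single FFT-convolver pipeline can serve both dimensions.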
77 FR 38580 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-28
... CEDS planning process and resulting CEDS is designed to guide the economic growth of an area and...: Economic Development Administration (EDA). Title: Comprehensive Economic Development Strategies. OMB... and Economic Adjustment programs, applicants must undertake a planning process that results in a...
How Instructional Design Experts Use Knowledge and Experience to Solve Ill-Structured Problems
ERIC Educational Resources Information Center
Ertmer, Peggy A.; Stepich, Donald A.; York, Cindy S.; Stickman, Ann; Wu, Xuemei (Lily); Zurek, Stacey; Goktas, Yuksel
2008-01-01
This study examined how instructional design (ID) experts used their prior knowledge and previous experiences to solve an ill-structured instructional design problem. Seven experienced designers used a think-aloud procedure to articulate their problem-solving processes while reading a case narrative. Results, presented in the form of four…
What are you trying to learn? Study designs and the appropriate analysis for your research question
USDA-ARS?s Scientific Manuscript database
One fundamental necessity in the entire process of a well-performed study is the experimental design. A well-designed study can help researchers understand and have confidence in their results and analyses, and additionally the agreement or disagreement with the stated hypothesis. This well-designed...
Design and validation of the eyesafe ladar testbed (ELT) using the LadarSIM system simulator
NASA Astrophysics Data System (ADS)
Neilsen, Kevin D.; Budge, Scott E.; Pack, Robert T.; Fullmer, R. Rees; Cook, T. Dean
2009-05-01
The development of an experimental full-waveform LADAR system has been enhanced with the assistance of the LadarSIM system simulation software. The Eyesafe LADAR Test-bed (ELT) was designed as a raster scanning, single-beam, energy-detection LADAR with the capability of digitizing and recording the return pulse waveform at up to 2 GHz for 3D off-line image formation research in the laboratory. To assist in the design phase, the full-waveform LADAR simulation in LadarSIM was used to simulate the expected return waveforms for various system design parameters, target characteristics, and target ranges. Once the design was finalized and the ELT constructed, the measured specifications of the system and experimental data captured from the operational sensor were used to validate the behavior of the system as predicted during the design phase. This paper presents the methodology used, and lessons learned from this "design, build, validate" process. Simulated results from the design phase are presented, and these are compared to simulated results using measured system parameters and operational sensor data. The advantages of this simulation-based process are also presented.
A low-cost fabrication method for sub-millimeter wave GaAs Schottky diode
NASA Astrophysics Data System (ADS)
Jenabi, Sarvenaz; Deslandes, Dominic; Boone, Francois; Charlebois, Serge A.
2017-10-01
In this paper, a submillimeter-wave Schottky diode is designed and simulated. The effect of the Schottky layer thickness on the cut-off frequency is studied. A novel microfabrication process is proposed and implemented. The presented microfabrication process avoids electron-beam (e-beam) lithography, which reduces the cost. Also, this process provides more flexibility in the selection of design parameters and allows significant reduction in the device parasitic capacitance. A key feature of the process is that the Schottky contact, the air-bridges, and the transmission lines are fabricated in a single lift-off step. This process relies on a planarization method that is suitable for trenches 1-10 μm deep and is tolerant to end-point variations. The fabricated diode is measured and the results are compared with simulations. Very good agreement between simulation and measurement results is observed.
Optimal Design of Material and Process Parameters in Powder Injection Molding
NASA Astrophysics Data System (ADS)
Ayad, G.; Barriere, T.; Gelin, J. C.; Song, J.; Liu, B.
2007-04-01
The paper is concerned with optimization and parametric identification for the different stages of the Powder Injection Molding process, which consists first of injection of a powder mixture with a polymer binder and then of sintering of the resulting powder part by solid-state diffusion. The first part describes an original methodology to optimize the process and geometry parameters of the injection stage, based on the combination of design of experiments and adaptive Response Surface Modeling. The second part of the paper describes the identification strategy proposed for the sintering stage, using the identification of sintering parameters from dilatometric curves followed by the optimization of the sintering process. The proposed approaches are applied to the optimization of material and process parameters for manufacturing a ceramic femoral implant, and they are shown to give satisfactory results.
NASA Technical Reports Server (NTRS)
Ankenman, Bruce; Ermer, Donald; Clum, James A.
1994-01-01
Modules dealing with statistical experimental design (SED), process modeling and improvement, and response surface methods have been developed and tested in two laboratory courses. One course was a manufacturing processes course in Mechanical Engineering and the other course was a materials processing course in Materials Science and Engineering. Each module is used as an 'experiment' in the course with the intent that subsequent course experiments will use SED methods for analysis and interpretation of data. Evaluation of the modules' effectiveness has been done by both survey questionnaires and inclusion of the module methodology in course examination questions. Results of the evaluation have been very positive. Those evaluation results and details of the modules' content and implementation are presented. The modules represent an important component for updating laboratory instruction and to provide training in quality for improved engineering practice.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jang, Junhwan; Hwang, Sungui; Park, Kyihwan, E-mail: khpark@gist.ac.kr
To utilize a time-of-flight-based laser scanner as a distance measurement sensor, the measurable distance and accuracy are the most important performance parameters to consider. For these purposes, the optical system and electronic signal processing of the laser scanner should be optimally designed in order to reduce the distance error caused by optical crosstalk and wide-dynamic-range input. An optical system design for removing the optical crosstalk problem is proposed in this work. Intensity control is also considered to solve the problem of phase-shift variation in the signal processing circuit caused by object reflectivity. Experimental results for the optical system and signal processing design are obtained using 3D measurements.
Gong, Xingchu; Zhang, Ying; Pan, Jianyang; Qu, Haibin
2014-01-01
A solvent recycling reflux extraction process for Panax notoginseng was optimized using a design space approach to improve the batch-to-batch consistency of the extract. Saponin yields, total saponin purity, and pigment yield were defined as the process critical quality attributes (CQAs). Ethanol content, extraction time, and the ratio of the recycling ethanol flow rate and initial solvent volume in the extraction tank (RES) were identified as the critical process parameters (CPPs) via quantitative risk assessment. Box-Behnken design experiments were performed. Quadratic models between CPPs and process CQAs were developed, with determination coefficients higher than 0.88. As the ethanol concentration decreases, saponin yields first increase and then decrease. A longer extraction time leads to higher yields of the ginsenosides Rb1 and Rd. The total saponin purity increases as the ethanol concentration increases. The pigment yield increases as the ethanol concentration decreases or extraction time increases. The design space was calculated using a Monte-Carlo simulation method with an acceptable probability of 0.90. Normal operation ranges to attain process CQA criteria with a probability of more than 0.914 are recommended as follows: ethanol content of 79–82%, extraction time of 6.1–7.1 h, and RES of 0.039–0.040 min−1. Most of the results of the verification experiments agreed well with the predictions. The verification experiment results showed that the selection of proper operating ethanol content, extraction time, and RES within the design space can ensure that the CQA criteria are met. PMID:25470598
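The Monte-Carlo design-space calculation reported above follows a simple pattern: evaluate the fitted quadratic model at a candidate operating point, add simulated model error, and count the fraction of simulations meeting the CQA criterion. The coefficients, error standard deviation, and acceptance limit in this sketch are placeholders rather than the paper's fitted values.

```python
import numpy as np

rng = np.random.default_rng(0)

def purity_model(ethanol_pct, time_h, res):
    """Placeholder quadratic CQA model in coded factors (not the fitted coefficients)."""
    x1, x2, x3 = (ethanol_pct - 80) / 5, (time_h - 6.5) / 1.5, (res - 0.04) / 0.005
    return 40 + 3.0 * x1 + 1.2 * x2 - 0.4 * x3 - 0.8 * x1 * x1 + 0.3 * x1 * x2

def prob_meeting_criterion(ethanol_pct, time_h, res, limit=38.0, sd=1.0, n=5000):
    sims = purity_model(ethanol_pct, time_h, res) + rng.normal(0.0, sd, n)
    return (sims >= limit).mean()

# Operating points with probability >= 0.90 would belong to the design space.
for ethanol in (76, 79, 82):
    p = prob_meeting_criterion(ethanol, time_h=6.5, res=0.040)
    print(f"ethanol {ethanol}%  P(CQA met) = {p:.2f}")
```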
QMI: Rising to the Space Station Design Challenge
NASA Astrophysics Data System (ADS)
Carswell, W. E.; Farmer, J.; Coppens, C.; Breeding, S.; Rose, F.
2002-01-01
The Quench Module Insert (QMI) materials processing furnace is being designed to operate for 8000 hours over four years on the International Space Station as part of the first Materials Science Research Rack of the Materials Science Research Facility. The Bridgman-type furnace is being built for the directional solidification processing of metals and alloys in the microgravity environment of space. Most notably it will be used for processing aluminum and related alloys. Designing for the space station environment presents intriguing design challenges in the form of a ten-year life requirement coupled with both limited opportunities for maintenance and resource constraints in the form of limited power and space. The long life requirement has driven the design of several features in the furnace, including the design of the heater core, the selection and placement of the thermocouples, overall performance monitoring, and the design of the chill block. The power and space limitations have been addressed through a compact furnace design using efficient vacuum insulation. Details on these design features, as well as development test performance results to date, are presented.
QMI: Rising to the Space Station Design Challenge
NASA Technical Reports Server (NTRS)
Carswell, W. E.; Farmer, J.; Coppens, C.; Breeding, S.; Rose, F.; Curreri, Peter A. (Technical Monitor)
2002-01-01
The Quench Module Insert (QMI) materials processing furnace is being designed to operate for 8000 hours over four years on the International Space Station (ISS) as part of the first Materials Science Research Rack (MSRR-1) of the Materials Science Research Facility (MSRF). The Bridgman-type furnace is being built for the directional solidification processing of metals and alloys in the microgravity environment of space. Most notably it will be used for processing aluminum and related alloys. Designing for the space station environment presents intriguing design challenges in the form of a ten-year life requirement coupled with both limited opportunities for maintenance and resource constraints in the form of limited power and space. The long life requirement has driven the design of several features in the furnace, including the design of the heater core, the selection and placement of the thermocouples, overall performance monitoring, and the design of the chill block. The power and space limitations have been addressed through a compact furnace design using efficient vacuum insulation. Details on these design features, as well as development test performance results to date, are presented.
Plasma contactor research, 1989
NASA Technical Reports Server (NTRS)
Williams, John D.
1990-01-01
The characteristics of double layers observed by researchers investigating magnetospheric phenomena are contrasted to those observed in plasma contacting experiments. Experiments in the electron collection mode of the plasma contacting process were performed and the results confirm a simple model of this process for current levels ranging to 3 A. Experimental results were also obtained in a study of the process of electron emission from a hollow cathode plasma contactor. High energy ions are observed coming from the cathode in addition to the electrons and a phenomenological model that suggests a mechanism by which this could occur is presented. Experimental results showing the effects of the design parameters of the ambient plasma simulator on the plasma potential, electron temperature, electron density and plasma noise levels induced in plasma contacting experiments are presented. A preferred simulator design is selected on the basis of these results.
A system level model for preliminary design of a space propulsion solid rocket motor
NASA Astrophysics Data System (ADS)
Schumacher, Daniel M.
Preliminary design of space propulsion solid rocket motors entails a combination of components and subsystems. Expert design tools exist to find near optimal performance of subsystems and components. Conversely, there is no system level preliminary design process for space propulsion solid rocket motors that is capable of synthesizing customer requirements into a high utility design for the customer. The preliminary design process for space propulsion solid rocket motors typically builds on existing designs and pursues feasible rather than the most favorable design. Classical optimization is an extremely challenging method when dealing with the complex behavior of an integrated system. The complexity and combinations of system configurations make the number of the design parameters that are traded off unreasonable when manual techniques are used. Existing multi-disciplinary optimization approaches generally address estimating ratios and correlations rather than utilizing mathematical models. The developed system level model utilizes the Genetic Algorithm to perform the necessary population searches to efficiently replace the human iterations required during a typical solid rocket motor preliminary design. This research augments, automates, and increases the fidelity of the existing preliminary design process for space propulsion solid rocket motors. The system level aspect of this preliminary design process, and the ability to synthesize space propulsion solid rocket motor requirements into a near optimal design, is achievable. The process of developing the motor performance estimate and the system level model of a space propulsion solid rocket motor is described in detail. The results of this research indicate that the model is valid for use and able to manage a very large number of variable inputs and constraints towards the pursuit of the best possible design.
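A toy version of the genetic-algorithm search loop is sketched below. The three-variable design vector, bounds, and merit function are invented placeholders; in the actual model the merit evaluation would call the motor performance estimate and apply the customer requirements as constraints.

```python
import random

BOUNDS = [(0.05, 0.30), (0.5, 3.0), (3.0, 10.0)]      # throat dia (m), grain length (m), chamber pressure (MPa)

def merit(design):
    throat, length, pressure = design
    impulse_proxy = pressure * length                  # stand-in for delivered impulse
    mass_proxy = 50 * throat + 20 * length + 2 * pressure
    return impulse_proxy - 0.5 * mass_proxy            # higher is better

def random_design():
    return [random.uniform(lo, hi) for lo, hi in BOUNDS]

def crossover_and_mutate(parent_a, parent_b, mutation_rate=0.2):
    child = [random.choice(pair) for pair in zip(parent_a, parent_b)]
    for i, (lo, hi) in enumerate(BOUNDS):
        if random.random() < mutation_rate:
            child[i] = random.uniform(lo, hi)
    return child

population = [random_design() for _ in range(40)]
for _ in range(100):
    population.sort(key=merit, reverse=True)
    parents = population[:10]                          # truncation selection
    offspring = [crossover_and_mutate(random.choice(parents), random.choice(parents))
                 for _ in range(30)]
    population = parents + offspring

print("best design vector:", [round(x, 3) for x in max(population, key=merit)])
```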
Icon and user interface design for emergency medical information systems: a case study.
Salman, Y Batu; Cheng, Hong-In; Patterson, Patrick E
2012-01-01
A usable medical information system should allow for reliable and accurate interaction between users and the system in emergencies. A participatory design approach was used to develop a medical information system in two Turkish hospitals. The process consisted of task and user analysis, an icon design survey, initial icon design, final icon design and evaluation, and installation of the iconic medical information system with the icons. We observed work sites to note working processes and tasks related to the information system and interviewed medical personnel. Emergency personnel then participated in the design process to develop a usable graphical user interface, by drawing icon sketches for 23 selected tasks. Similar sketches were requested for specific tasks such as family medical history, contact information, translation, addiction, required inspections, requests and applications, and nurse observations. The sketches were analyzed and redesigned into computer icons by professional designers and the research team. A second group of physicians and nurses then tested the understandability of the icons. The user interface layout was examined and evaluated by system users, followed by the system's installation. Medical personnel reported the participatory design process was interesting and believed the resulting designs would be more familiar and friendlier.
Systematic procedure for designing processes with multiple environmental objectives.
Kim, Ki-Joo; Smith, Raymond L
2005-04-01
Evaluation of multiple objectives is very important in designing environmentally benign processes. It requires a systematic procedure for solving multiobjective decision-making problems due to the complex nature of the problems, the need for complex assessments, and the complicated analysis of multidimensional results. In this paper, a novel systematic procedure is presented for designing processes with multiple environmental objectives. This procedure has four steps: initialization, screening, evaluation, and visualization. The first two steps are used for systematic problem formulation based on mass and energy estimation and order of magnitude analysis. In the third step, an efficient parallel multiobjective steady-state genetic algorithm is applied to design environmentally benign and economically viable processes and to provide more accurate and uniform Pareto optimal solutions. In the last step a new visualization technique for illustrating multiple objectives and their design parameters on the same diagram is developed. Through these integrated steps the decision-maker can easily determine design alternatives with respect to his or her preferences. Most importantly, this technique is independent of the number of objectives and design parameters. As a case study, acetic acid recovery from aqueous waste mixtures is investigated by minimizing eight potential environmental impacts and maximizing total profit. After applying the systematic procedure, the most preferred design alternatives and their design parameters are easily identified.
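The screening behind any such multiobjective search is Pareto dominance: a design survives only if no other design is at least as good on every objective and better on at least one. A minimal sketch, assuming all objectives are minimized, follows.

```python
def dominates(a, b):
    """True if objective vector a is no worse than b everywhere and better somewhere."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(designs):
    """designs: list of (name, objective_tuple); returns the non-dominated subset."""
    return [(name, obj) for name, obj in designs
            if not any(dominates(other, obj) for _, other in designs if other != obj)]

# Hypothetical alternatives scored on (environmental impact index, negative profit).
candidates = [
    ("A", (0.8, 120.0)),
    ("B", (0.6, 150.0)),
    ("C", (0.9, 110.0)),
    ("D", (0.7, 160.0)),   # dominated by B, so it drops out
]
print(pareto_front(candidates))
```

The decision-maker then applies preferences only to the surviving front, which is what the visualization step in the procedure is meant to support.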
Fox-7 for Insensitive Boosters
2010-08-01
...cavitation, and therefore nucleation, to occur at each frequency. As well as producing ultrasound at different frequencies, the method of delivery of... FOX-7 booster formulations. Also included are particle processing techniques using ultrasound, designed to optimise FOX-7 crystal size and morphology to improve booster formulations, and results from these...
A Path to Successful Energy Retrofits: Early Collaboration through Integrated Project Delivery Teams
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parrish, Kristen
2012-10-01
This document guides you through a process for the early design phases of retrofit projects to help you mitigate frustrations commonly experienced by building owners and designers. It outlines the value of forming an integrated project delivery team and developing a communication and information-sharing infrastructure that fosters collaboration. This guide does not present a complete process for designing an energy retrofit for a building. Instead, it focuses on the early design phase tasks related to developing and selecting energy efficiency measures (EEMs) that benefit from collaboration, and highlights the resulting advantages.
Artifact-Based Transformation of IBM Global Financing
NASA Astrophysics Data System (ADS)
Chao, Tian; Cohn, David; Flatgard, Adrian; Hahn, Sandy; Linehan, Mark; Nandi, Prabir; Nigam, Anil; Pinel, Florian; Vergo, John; Wu, Frederick Y.
IBM Global Financing (IGF) is transforming its business using the Business Artifact Method, an innovative business process modeling technique that identifies key business artifacts and traces their life cycles as they are processed by the business. IGF is a complex, global business operation with many business design challenges. The Business Artifact Method is a fundamental shift in how to conceptualize, design and implement business operations. The Business Artifact Method was extended to solve the problem of designing a global standard for a complex, end-to-end process while supporting local geographic variations. Prior to employing the Business Artifact method, process decomposition, Lean and Six Sigma methods were each employed on different parts of the financing operation. Although they provided critical input to the final operational model, they proved insufficient for designing a complete, integrated, standard operation. The artifact method resulted in a business operations model that was at the right level of granularity for the problem at hand. A fully functional rapid prototype was created early in the engagement, which facilitated an improved understanding of the redesigned operations model. The resulting business operations model is being used as the basis for all aspects of business transformation in IBM Global Financing.
Advanced composite rudders for DC-10 aircraft: Design, manufacturing, and ground tests
NASA Technical Reports Server (NTRS)
Lehman, G. M.; Purdy, D. M.; Cominsky, A.; Hawley, A. V.; Amason, M. P.; Kung, J. T.; Palmer, R. J.; Purves, N. B.; Marra, P. J.; Hancock, G. R.
1976-01-01
Design synthesis, tooling and process development, manufacturing, and ground testing of a graphite epoxy rudder for the DC-10 commercial transport are discussed. The composite structure was fabricated using a unique processing method in which the thermal expansion characteristics of rubber tooling mandrels were used to generate curing pressures during an oven cure cycle. The ground test program resulted in certification of the rudder for passenger-carrying flights. Results of the structural and environmental tests are interpreted and detailed development of the rubber tooling and manufacturing process is described. Processing, tooling, and manufacturing problems encountered during fabrication of four development rudders and ten flight-service rudders are discussed and the results of corrective actions are described. Non-recurring and recurring manufacturing labor man-hours are tabulated at the detailed operation level. A weight reduction of 13.58 kg (33 percent) was attained in the composite rudder.
Pflug, Irving J
2010-05-01
The incidence of botulism in canned food in the last century is reviewed along with the background science; a few conclusions are reached based on analysis of published data. There are two primary aspects to botulism control: the design of an adequate process and the delivery of the adequate process to containers of food. The probability that the designed process will not be adequate to control Clostridium botulinum is very small, probably less than 1.0 × 10^-6, based on containers of food, whereas the failure of the operator of the processing equipment to deliver the specified process to containers of food may be of the order of 1 in 40 to 1 in 100, based on processing units (retort loads). In the commercial food canning industry, failure to deliver the process will probably be of the order of 1.0 × 10^-4 to 1.0 × 10^-6 when U.S. Food and Drug Administration (FDA) regulations are followed. Botulism incidents have occurred in food canning plants that have not followed the FDA regulations. It is possible but very rare to have botulism result from postprocessing contamination. It may thus be concluded that botulism incidents in canned food are primarily the result of human failure in the delivery of the designed or specified process to containers of food that, in turn, result in the survival, outgrowth, and toxin production of C. botulinum spores. Therefore, efforts in C. botulinum control should be concentrated on reducing human errors in the delivery of the specified process to containers of food.
High-performance data processing using distributed computing on the SOLIS project
NASA Astrophysics Data System (ADS)
Wampler, Stephen
2002-12-01
The SOLIS solar telescope collects data at a high rate, resulting in 500 GB of raw data each day. The SOLIS Data Handling System (DHS) has been designed to quickly process this data down to 156 GB of reduced data. The DHS design uses pools of distributed reduction processes that are allocated to different observations as needed. A farm of 10 dual-cpu Linux boxes contains the pools of reduction processes. Control is through CORBA and data is stored on a fibre channel storage area network (SAN). Three other Linux boxes are responsible for pulling data from the instruments using SAN-based ringbuffers. Control applications are Java-based while the reduction processes are written in C++. This paper presents the overall design of the SOLIS DHS and provides details on the approach used to control the pooled reduction processes. The various strategies used to manage the high data rates are also covered.
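A single-machine analogy of the pooled-reduction idea can be written with a process pool: workers are handed whatever frames arrive next, so busy instruments get workers and idle ones consume none. The real DHS uses CORBA-controlled C++ reducers on a Linux farm with SAN ring buffers; the instrument labels and the reduction step below are placeholders.

```python
from multiprocessing import Pool

def reduce_frame(frame):
    """Stand-in for a reduction process: collapse a raw frame to one reduced value."""
    instrument, pixels = frame
    return instrument, sum(pixels) / len(pixels)

if __name__ == "__main__":
    frames = [("inst_a", [1, 2, 3, 4]), ("inst_b", [10, 20, 30]), ("inst_a", [5, 6, 7])]
    with Pool(processes=4) as pool:
        for instrument, value in pool.imap_unordered(reduce_frame, frames):
            print(f"{instrument}: reduced value {value:.2f}")
```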
A Web-Based Monitoring System for Multidisciplinary Design Projects
NASA Technical Reports Server (NTRS)
Rogers, James L.; Salas, Andrea O.; Weston, Robert P.
1998-01-01
In today's competitive environment, both industry and government agencies are under pressure to reduce the time and cost of multidisciplinary design projects. New tools have been introduced to assist in this process by facilitating the integration of and communication among diverse disciplinary codes. One such tool, a framework for multidisciplinary computational environments, is defined as a hardware and software architecture that enables integration, execution, and communication among diverse disciplinary processes. An examination of current frameworks reveals weaknesses in various areas, such as sequencing, displaying, monitoring, and controlling the design process. The objective of this research is to explore how Web technology, integrated with an existing framework, can improve these areas of weakness. This paper describes a Web-based system that optimizes and controls the execution sequence of design processes; and monitors the project status and results. The three-stage evolution of the system with increasingly complex problems demonstrates the feasibility of this approach.
A Web-Based System for Monitoring and Controlling Multidisciplinary Design Projects
NASA Technical Reports Server (NTRS)
Salas, Andrea O.; Rogers, James L.
1997-01-01
In today's competitive environment, both industry and government agencies are under enormous pressure to reduce the time and cost of multidisciplinary design projects. A number of frameworks have been introduced to assist in this process by facilitating the integration of and communication among diverse disciplinary codes. An examination of current frameworks reveals weaknesses in various areas such as sequencing, displaying, monitoring, and controlling the design process. The objective of this research is to explore how Web technology, in conjunction with an existing framework, can improve these areas of weakness. This paper describes a system that executes a sequence of programs, monitors and controls the design process through a Web-based interface, and visualizes intermediate and final results through the use of Java™ applets. A small sample problem, which includes nine processes with two analysis programs that are coupled to an optimizer, is used to demonstrate the feasibility of this approach.
Signal processing of anthropometric data
NASA Astrophysics Data System (ADS)
Zimmermann, W. J.
1983-09-01
The Anthropometric Measurements Laboratory has accumulated a large body of data from a number of previous experiments. The data is very noisy; therefore, it requires the application of some signal processing schemes. Moreover, it was not regarded as time series measurements but as positional information; hence, the data is stored as coordinate points as defined by the motion of the human body. The accumulated data defines two groups or classes. Some of the data was collected from an experiment designed to measure the flexibility of the limbs, referred to as radial movement. The remaining data was collected from experiments designed to determine the surface of the reach envelope. An interactive signal processing package was designed and implemented. Since the data does not include time, this package does not include a time-series element. Presently, the results are restricted to processing data obtained from the experiments designed to measure flexibility.
Signal processing of anthropometric data
NASA Technical Reports Server (NTRS)
Zimmermann, W. J.
1983-01-01
The Anthropometric Measurements Laboratory has accumulated a large body of data from a number of previous experiments. The data is very noisy; therefore, it requires the application of some signal processing schemes. Moreover, it was not regarded as time series measurements but as positional information; hence, the data is stored as coordinate points as defined by the motion of the human body. The accumulated data defines two groups or classes. Some of the data was collected from an experiment designed to measure the flexibility of the limbs, referred to as radial movement. The remaining data was collected from experiments designed to determine the surface of the reach envelope. An interactive signal processing package was designed and implemented. Since the data does not include time, this package does not include a time-series element. Presently, the results are restricted to processing data obtained from the experiments designed to measure flexibility.
Liese, Eric; Zitney, Stephen E.
2017-06-26
A multi-stage centrifugal compressor model is presented with emphasis on analyzing use of an exit flow coefficient vs. an inlet flow coefficient performance parameter to predict off-design conditions in the critical region of a supercritical carbon dioxide (CO2) power cycle. A description of the performance parameters is given along with their implementation in a design model (number of stages, basic sizing, etc.) and a dynamic model (for use in transient studies). A design case is shown for two compressors, a bypass compressor and a main compressor, as defined in a process simulation of a 10 megawatt (MW) supercritical CO2 recompression Brayton cycle. Simulation results are presented for a simple open cycle and closed cycle process with changes to the inlet temperature of the main compressor, which operates near the CO2 critical point. Results showed some difference between the exit and inlet flow coefficient corrections; however, it was not significant for the range of conditions examined. This paper also serves as a reference for future works, including a full process simulation of the 10 MW recompression Brayton cycle.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liese, Eric; Zitney, Stephen E.
A multi-stage centrifugal compressor model is presented with emphasis on analyzing the use of an exit flow coefficient vs. an inlet flow coefficient performance parameter to predict off-design conditions in the critical region of a supercritical carbon dioxide (CO2) power cycle. A description of the performance parameters is given along with their implementation in a design model (number of stages, basic sizing, etc.) and a dynamic model (for use in transient studies). A design case is shown for two compressors, a bypass compressor and a main compressor, as defined in a process simulation of a 10 megawatt (MW) supercritical CO2 recompression Brayton cycle. Simulation results are presented for a simple open cycle and closed cycle process with changes to the inlet temperature of the main compressor, which operates near the CO2 critical point. Results showed some difference between the exit and inlet flow coefficient corrections; however, it was not significant for the range of conditions examined. This paper also serves as a reference for future work, including a full process simulation of the 10 MW recompression Brayton cycle.
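For orientation, one common non-dimensional form of the flow coefficient is phi = Q/(N*D^3); whether the paper uses exactly this definition is not stated here, and the numbers below are purely illustrative. The sketch shows why inlet and exit values differ near the CO2 critical point, where density changes strongly across the stage:

```python
def flow_coefficient(m_dot, rho, N_rad_s, D):
    """phi = Q / (N * D^3), one common non-dimensional flow coefficient.
    m_dot: mass flow [kg/s], rho: density [kg/m^3],
    N_rad_s: shaft speed [rad/s], D: impeller diameter [m]."""
    Q = m_dot / rho
    return Q / (N_rad_s * D**3)

# Illustrative (not the paper's) values for a small sCO2 main compressor stage:
m_dot = 90.0            # kg/s
N = 2 * 3.14159 * 450   # about 27,000 rpm, in rad/s
D = 0.10                # m
rho_inlet = 600.0       # kg/m^3 near the critical point (assumed)
rho_exit = 750.0        # kg/m^3 after compression (assumed)

phi_in = flow_coefficient(m_dot, rho_inlet, N, D)
phi_out = flow_coefficient(m_dot, rho_exit, N, D)
print(f"inlet phi = {phi_in:.4f}, exit phi = {phi_out:.4f}")
```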
Trajectory Dispersed Vehicle Process for Space Launch System
NASA Technical Reports Server (NTRS)
Statham, Tamara; Thompson, Seth
2017-01-01
The Space Launch System (SLS) vehicle is part of NASA's deep space exploration plans, which include manned missions to Mars. Manufacturing uncertainties in design parameters are key considerations throughout SLS development, as they have significant effects on focus parameters such as lift-off thrust-to-weight, vehicle payload, maximum dynamic pressure, and compression loads. This presentation discusses how the SLS program captures these uncertainties by utilizing a 3 degree of freedom (DOF) process called Trajectory Dispersed (TD) analysis. This analysis biases nominal trajectories to identify extremes in the design parameters for various potential SLS configurations and missions. The process utilizes a Design of Experiments (DOE) and response surface methodologies (RSM) to statistically sample uncertainties, and a Maximum Likelihood Estimate (MLE) process for targeting uncertainty biases to develop the resulting vehicles. These vehicles represent various missions and configurations, which are used as key inputs to a variety of analyses in the SLS design process, including 6 DOF dispersions, separation clearances, and engine-out failure studies.
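A minimal sketch of the generic DOE-plus-response-surface pattern described above, with made-up factor names and a synthetic response standing in for a 3-DOF trajectory metric:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two hypothetical dispersed parameters (coded to [-1, 1]):
# x1 ~ engine thrust uncertainty, x2 ~ dry mass uncertainty (illustrative only).
X = rng.uniform(-1, 1, size=(40, 2))

def response(x1, x2):
    """Stand-in for a 3-DOF trajectory metric, e.g. max dynamic pressure."""
    return 1.0 + 0.3 * x1 - 0.2 * x2 + 0.15 * x1 * x2 + 0.1 * x2**2

y = response(X[:, 0], X[:, 1]) + rng.normal(scale=0.01, size=len(X))

# Quadratic response-surface basis: 1, x1, x2, x1*x2, x1^2, x2^2
A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                     X[:, 0] * X[:, 1], X[:, 0]**2, X[:, 1]**2])
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
print("fitted RSM coefficients:", np.round(coeffs, 3))

# The fitted surface can then be evaluated cheaply over the full dispersion
# space to locate the biasing extremes used for vehicle development.
```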
Portalés, Cristina; Casas, Sergio; Gimeno, Jesús; Fernández, Marcos; Poza, Montse
2018-04-19
Energy-efficient Buildings (EeB) are demanded in today's construction, fulfilling the requirements for green cities. Pre-fab buildings, which are modularly fully-built in factories, are a good example of this. Although this kind of building is quite new, the in situ inspection is documented using traditional tools, mainly based on paper annotations. Thus, the inspection process is not taking advantage of new technologies. In this paper, we present the preliminary results of the SIRAE project, which aims to provide an Augmented Reality (AR) tool that can seamlessly aid in the regular processes of pre-fab building inspections to detect and eliminate possible existing quality and energy efficiency deviations. In this regard, we describe the current inspection process and how an interactive tool can be designed and adapted to it. Our first results show the design and implementation of our tool, which is highly interactive and involves AR visualizations and 3D data-gathering, allowing the inspectors to quickly manage it without altering the way the inspection process is done. First trials in a real environment show that the tool is promising for massive inspection processes.
Fernández, Marcos; Poza, Montse
2018-01-01
Energy-efficient Buildings (EeB) are demanded in today's construction, fulfilling the requirements for green cities. Pre-fab buildings, which are modularly fully-built in factories, are a good example of this. Although this kind of building is quite new, the in situ inspection is documented using traditional tools, mainly based on paper annotations. Thus, the inspection process is not taking advantage of new technologies. In this paper, we present the preliminary results of the SIRAE project, which aims to provide an Augmented Reality (AR) tool that can seamlessly aid in the regular processes of pre-fab building inspections to detect and eliminate possible existing quality and energy efficiency deviations. In this regard, we describe the current inspection process and how an interactive tool can be designed and adapted to it. Our first results show the design and implementation of our tool, which is highly interactive and involves AR visualizations and 3D data-gathering, allowing the inspectors to quickly manage it without altering the way the inspection process is done. First trials in a real environment show that the tool is promising for massive inspection processes. PMID:29671799
NASA Technical Reports Server (NTRS)
Zawadzki, Mark; Rengarajan, Sembiam; Hodges, Richard E.
2005-01-01
While the design of waveguide slot arrays is not new, this particular design effort shows that very good results can be achieved on a first attempt using established slot array design techniques and commercial software for the waveguide power divider network. The presentation will discuss this design process in detail.
ERIC Educational Resources Information Center
Bollen, Lars; van der Meij, Hans; Leemkuil, Henny; McKenney, Susan
2015-01-01
A digital learning and performance support environment for university student design tasks was developed. This paper describes the design rationale, process, and usage results to arrive at a core set of design principles for the construction of such an environment. We present a collection of organizational, technical, and course-related…
Approach to design space from retrospective quality data.
Puñal Peces, Daniel; García-Montoya, Encarna; Manich, Albert; Suñé-Negre, Josep Maria; Pérez-Lozano, Pilar; Miñarro, Montse; Ticó, Josep Ramon
2016-01-01
Nowadays, the entire manufacturing process is based on the current GMPs, which emphasize the reproducibility of the process, and companies have a large amount of recorded data about their processes. This work addresses the establishment of the design space (DS) from retrospective data for a wet compression process. A design of experiments (DoE) with historical data from 4 years of industrial production has been carried out, using as experimental factors the results of the previous risk analysis and eight key parameters (quality specifications) that encompassed process and quality control data. Software Statgraphics 5.0 was applied, and data were processed to obtain eight DS as well as their safe and working ranges. Experience shows that it is possible to determine the DS retrospectively, the greatest difficulty being the handling and processing of large amounts of data; nevertheless, the practicality of this approach is considerable, since it provides the DS with minimal investment in experiments because actual production batch data are processed statistically.
Experimental design of a twin-column countercurrent gradient purification process.
Steinebach, Fabian; Ulmer, Nicole; Decker, Lara; Aumann, Lars; Morbidelli, Massimo
2017-04-07
As typical for separation processes, single unit batch chromatography exhibits a trade-off between purity and yield. The twin-column MCSGP (multi-column countercurrent solvent gradient purification) process allows alleviating such trade-offs, particularly in the case of difficult separations. In this work an efficient and reliable procedure for the design of the twin-column MCSGP process is developed. This is based on a single batch chromatogram, which is selected as the design chromatogram. The derived MCSGP operation is not intended to provide optimal performance; rather, it provides the target product in the selected fraction of the batch chromatogram, but with higher yield. The design procedure is illustrated for the isolation of the main charge isoform of a monoclonal antibody from Protein A eluate with ion-exchange chromatography. The main charge isoform was obtained at a purity and yield larger than 90%. At the same time, process-related impurities such as HCP and leached Protein A, as well as aggregates, were at least equally well removed. Additionally, the impact of several design parameters on the process performance in terms of purity, yield, productivity and buffer consumption is discussed. The obtained results can be used for further fine-tuning of the process parameters so as to improve performance. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Darmawan, Tofiq Dwiki; Priadythama, Ilham; Herdiman, Lobes
2018-02-01
Welding and drilling are the main processes in making chair frames from metal. Chair frame construction commonly includes many arcs, which complicate the welding and drilling processes. In the UNS industrial engineering integrated practicum there are welding fixtures used to fix the positions of frame components for welding. To achieve exact hole positions for assembly, manual drilling was conducted after the frame was joined. Unfortunately, after welding the frame material becomes harder, which increases drilling tool wear and reduces hole position accuracy. The previous welding fixture was not equipped with a clamping system and could not accommodate drilling. To solve this problem, our idea is to reorder the drilling process so that it can be executed before welding. Thus, this research aims to propose a conceptual design of a modular fixture that integrates the welding and drilling processes. We used the Generic Product Development Process to develop the design concept. We collected design requirements from three sources: jig and fixture theoretical concepts, user requirements, and clamping part standards. Of two alternative fixture tables, we propose the first, which is equipped with mounting slots instead of holes. We tested the concept by building a full-sized prototype and using it for the welding and drilling of a student chair frame. Results from the welding and drilling trials showed that the holes are in precise positions after welding. Based on this result, we conclude that the concept can be considered for application in the UNS Industrial Engineering Integrated Practicum.
Developing a Model for ePortfolio Design: A Studio Approach
ERIC Educational Resources Information Center
Carpenter, Russell; Apostel, Shawn; Hyndman, June Overton
2012-01-01
After developing and testing a model for integrative collaboration at Eastern Kentucky University's Noel Studio for Academic Creativity, we offer results that highlight the potential for peer review to significantly and positively impact the ePortfolio design process for students. The results of this classroom/studio collaboration suggest that…
Optimization of chlorine fluxing process for magnesium removal from molten aluminum
NASA Astrophysics Data System (ADS)
Fu, Qian
High-throughput and low operational cost are the keys to a successful industrial process. Much aluminum is now recycled in the form of used beverage cans and this aluminum is of alloys that contain high levels of magnesium. It is common practice to "demag" the metal by injecting chlorine that preferentially reacts with the magnesium. In the conventional chlorine fluxing processes, low reaction efficiency results in excessive reactive gas emissions. In this study, through an experimental investigation of the reaction kinetics involved in this process, a mathematical model is set up for the purpose of process optimization. A feedback controlled chlorine reduction process strategy is suggested for demagging the molten aluminum to the desired magnesium level without significant gas emissions. This strategy also needs the least modification of the existing process facility. The suggested process time will only be slightly longer than conventional methods and chlorine usage and emissions will be reduced. In order to achieve process optimization through novel designs in any fluxing process, a system is necessary for measuring the bubble distribution in liquid metals. An electro-resistivity probe described in the literature has low accuracy and its capability to measure bubble distribution has not yet been fully demonstrated. A capacitance bubble probe was designed for bubble measurements in molten metals. The probe signal was collected and processed digitally. Higher accuracy was obtained by higher discrimination against corrupted signals. A single-size bubble experiment in Belmont metal was designed to reveal the characteristic response of the capacitance probe. This characteristic response fits well with a theoretical model. It is suggested that using a properly designed deconvolution process, the actual bubble size distribution can be calculated. The capacitance probe was used to study some practical bubble generation devices. Preliminary results on bubble distribution generated by a porous plug in Belmont metal showed bubbles much bigger than those in a water model. Preliminary results in molten aluminum showed that the probe was applicable in this harsh environment. An interesting bubble coalescence phenomenon was also observed in both Belmont metal and molten aluminum.
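A minimal sketch, not the thesis model, of the kind of magnesium mass balance that underlies a feedback-controlled demag strategy. The stoichiometry (Mg + Cl2 -> MgCl2) is standard; the reaction efficiency, injection rate, and melt figures below are assumed for illustration only:

```python
# Molar masses (g/mol): Mg = 24.3, Cl2 = 70.9; reaction Mg + Cl2 -> MgCl2
MG_PER_CL2 = 24.3 / 70.9   # kg Mg removed per kg Cl2 at 100% efficiency

def demag(melt_kg, mg_wt0, mg_target, cl2_rate_kg_min, efficiency, dt=0.5):
    """Euler integration of Mg removal by chlorine injection.
    `efficiency` is the assumed fraction of injected Cl2 that reacts with Mg;
    in practice it drops as the Mg level falls, which is what motivates
    feedback control of the chlorine rate."""
    mg_kg = melt_kg * mg_wt0 / 100.0
    t, cl2_used = 0.0, 0.0
    while mg_kg / melt_kg * 100.0 > mg_target:
        reacted = cl2_rate_kg_min * dt * efficiency * MG_PER_CL2
        mg_kg = max(mg_kg - reacted, 0.0)
        cl2_used += cl2_rate_kg_min * dt
        t += dt
    return t, cl2_used

t, cl2 = demag(melt_kg=50000, mg_wt0=1.2, mg_target=0.10,
               cl2_rate_kg_min=10.0, efficiency=0.8)
print(f"time ~ {t:.0f} min, Cl2 used ~ {cl2:.0f} kg (illustrative numbers)")
```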
NASA Astrophysics Data System (ADS)
Donders, S.; Pluymers, B.; Ragnarsson, P.; Hadjit, R.; Desmet, W.
2010-04-01
In the vehicle design process, design decisions are more and more based on virtual prototypes. Due to competitive and regulatory pressure, vehicle manufacturers are forced to improve product quality, to reduce time-to-market and to launch an increasing number of design variants on the global market. To speed up the design iteration process, substructuring and component mode synthesis (CMS) methods are commonly used, involving the analysis of substructure models and the synthesis of the substructure analysis results. Substructuring and CMS enable efficient decentralized collaboration across departments and allow designers to benefit from the availability of parallel computing environments. However, traditional CMS methods become prohibitively inefficient when substructures are coupled along large interfaces, i.e. with a large number of degrees of freedom (DOFs) at the interface between substructures. The reason is that the analysis of substructures involves the calculation of a number of enrichment vectors, one for each interface degree of freedom (DOF). Since large interfaces are common in vehicles (e.g. the continuous line connections to connect the body with the windshield, roof or floor), this interface bottleneck poses a clear limitation in the vehicle noise, vibration and harshness (NVH) design process. Therefore there is a need to describe the interface dynamics more efficiently. This paper presents a wave-based substructuring (WBS) approach, which allows reducing the interface representation between substructures in an assembly by expressing the interface DOFs in terms of a limited set of basis functions ("waves"). As the number of basis functions can be much lower than the number of interface DOFs, this greatly facilitates the substructure analysis procedure and results in faster design predictions. The waves are calculated once from a full nominal assembly analysis, but these nominal waves can be re-used for the assembly of modified components. The WBS approach thus enables efficient structural modification predictions of the global modes, so that efficient vibro-acoustic design modification, optimization and robust design become possible. The results show that wave-based substructuring offers a clear benefit for vehicle design modifications, by improving both the speed of component reduction processes and the efficiency and accuracy of design iteration predictions, as compared to conventional substructuring approaches.
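A minimal sketch, with a toy stiffness matrix and an assumed sinusoidal basis, of the basic reduction step: interface displacements are approximated as u_int ~ W q, so the large interface block of the system matrices is projected onto a handful of generalized wave coordinates:

```python
import numpy as np

rng = np.random.default_rng(1)

n_int = 200          # interface DOFs along a long connection line
n_waves = 10         # retained basis functions ("waves")

# Toy symmetric positive-definite interface stiffness block.
A = rng.standard_normal((n_int, n_int))
K_int = A @ A.T + n_int * np.eye(n_int)

# Toy wave basis: low-order sinusoids along the interface coordinate,
# standing in for waves extracted from a nominal assembly analysis.
s = np.linspace(0, 1, n_int)
W = np.column_stack([np.sin((k + 1) * np.pi * s) for k in range(n_waves)])

# With u_int approximated as W @ q, the interface stiffness reduces
# from (200 x 200) to (10 x 10) generalized coordinates.
K_red = W.T @ K_int @ W
print("full interface block:", K_int.shape, "-> reduced:", K_red.shape)
```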
Detailed Modeling of Distillation Technologies for Closed-Loop Water Recovery Systems
NASA Technical Reports Server (NTRS)
Allada, Rama Kumar; Lange, Kevin E.; Anderson, Molly S.
2011-01-01
Detailed chemical process simulations are a useful tool in designing and optimizing complex systems and architectures for human life support. Dynamic and steady-state models of these systems help contrast the interactions of various operating parameters and hardware designs, which become extremely useful in trade-study analyses. NASA's Exploration Life Support technology development project recently made use of such models to complement a series of tests on different waste water distillation systems. This paper presents efforts to develop chemical process simulations for three technologies: the Cascade Distillation System (CDS), the Vapor Compression Distillation (VCD) system and the Wiped-Film Rotating Disk (WFRD), using the Aspen Custom Modeler and Aspen Plus process simulation tools. The paper discusses system design, modeling details, and modeling results for each technology and presents some comparisons between the model results and recent test data. Following these initial comparisons, some general conclusions and forward work are discussed.
Tchamna, Rodrigue; Lee, Moonyong
2018-01-01
This paper proposes a novel optimization-based approach for the design of an industrial two-term proportional-integral (PI) controller for the optimal regulatory control of unstable processes subjected to three common operational constraints related to the process variable, manipulated variable and its rate of change. To derive analytical design relations, the constrained optimal control problem in the time domain was transformed into an unconstrained optimization problem in a new parameter space via an effective parameterization. The resulting optimal PI controller has been verified to yield optimal performance and stability of an open-loop unstable first-order process under operational constraints. The proposed analytical design method explicitly takes into account the operational constraints in the controller design stage and also provides useful insights into the optimal controller design. Practical procedures for designing optimal PI parameters and a feasible constraint set exclusive of complex optimization steps are also proposed. The proposed controller was compared with several other PI controllers to illustrate its performance. The robustness of the proposed controller against plant-model mismatch has also been investigated. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
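A minimal sketch of the constraint handling targeted by the paper: a discrete PI loop on a first-order unstable process, with the manipulated variable clamped in magnitude and in rate of change. The gains, limits, and process parameters are illustrative and are not the paper's analytical design relations:

```python
# Discrete PI control of an unstable first-order process dy/dt = a*y + b*u,
# with clamps on the manipulated variable u and on its rate of change.
a, b = 0.5, 1.0          # unstable open-loop pole at +0.5 (illustrative)
Kc, Ti = 3.0, 2.0        # PI gains (illustrative, not the paper's optimum)
u_min, u_max = -5.0, 5.0 # manipulated-variable limits
du_max = 10.0            # rate-of-change limit per unit time
dt, setpoint = 0.05, 1.0

y, u, integral = 0.0, 0.0, 0.0
for k in range(400):
    error = setpoint - y
    integral += error * dt
    u_unclamped = Kc * (error + integral / Ti)
    # Apply rate-of-change and magnitude constraints.
    u_new = max(u - du_max * dt, min(u + du_max * dt, u_unclamped))
    u_new = max(u_min, min(u_max, u_new))
    # Simple anti-windup: stop integrating while the output is clamped.
    if u_new != u_unclamped:
        integral -= error * dt
    u = u_new
    y += (a * y + b * u) * dt   # Euler step of the process
print(f"final output y = {y:.3f} (setpoint {setpoint})")
```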
Noise tolerant illumination optimization applied to display devices
NASA Astrophysics Data System (ADS)
Cassarly, William J.; Irving, Bruce
2005-02-01
Display devices have historically been designed through an iterative process using numerous hardware prototypes. This process is effective but the number of iterations is limited by the time and cost to make the prototypes. In recent years, virtual prototyping using illumination software modeling tools has replaced many of the hardware prototypes. Typically, the designer specifies the design parameters, builds the software model, predicts the performance using a Monte Carlo simulation, and uses the performance results to repeat this process until an acceptable design is obtained. What is highly desired, and now possible, is to use illumination optimization to automate the design process. Illumination optimization provides the ability to explore a wider range of design options while also providing improved performance. Since Monte Carlo simulations are often used to calculate the system performance but those predictions have statistical uncertainty, the use of noise tolerant optimization algorithms is important. The use of noise tolerant illumination optimization is demonstrated by considering display device designs that extract light using 2D paint patterns as well as 3D textured surfaces. A hybrid optimization approach that combines a mesh feedback optimization with a classical optimizer is demonstrated. Displays with LED sources and cold cathode fluorescent lamps are considered.
Design Process of Flight Vehicle Structures for a Common Bulkhead and an MPCV Spacecraft Adapter
NASA Technical Reports Server (NTRS)
Aggarwal, Pravin; Hull, Patrick V.
2015-01-01
Designing and manufacturing space flight vehicle structures is a skillset that has grown considerably at NASA during the last several years. Beginning with the Ares program and continuing with the Space Launch System (SLS), in-house designs were produced for both the Upper Stage and the SLS Multipurpose Crew Vehicle (MPCV) spacecraft adapter. Specifically, critical design review (CDR) level analysis and flight production drawings were produced for the above-mentioned hardware. In particular, the experience of this in-house design work led to increased manufacturing infrastructure at both Marshall Space Flight Center (MSFC) and Michoud Assembly Facility (MAF), improved skillsets in both analysis and design, and hands-on experience in building and testing full-scale (MSA) hardware. The hardware design and development processes, from initiation to CDR and finally flight, resulted in many challenges and experiences that produced valuable lessons. This paper builds on NASA's recent experience in designing and fabricating flight hardware and examines the design/development processes used, as well as the challenges and lessons learned, i.e., from the initial design, loads estimation and mass constraints, to structural optimization/affordability, to release of production drawings, to hardware manufacturing. While there are many documented design processes which a design engineer can follow, these unique experiences can offer insight into designing hardware in current program environments and present solutions to many of the challenges experienced by the engineering team.
Dorati, Rossella; DeTrizio, Antonella; Genta, Ida; Grisoli, Pietro; Merelli, Alessia; Tomasi, Corrado; Conti, Bice
2016-01-01
The present paper takes into account the DOE application to the preparation process of biodegradable microspheres for osteomyelitis local therapy. With this goal gentamicin loaded polylactide-co-glycolide-copolyethyleneglycol (PLGA-PEG) microspheres were prepared and investigated. Two preparation protocols (o/w and w/o/w) with different process conditions, and three PLGA-PEG block copolymers with different compositions of lactic and glycolic acids and PEG, were tested. A Design Of Experiment (DOE) screening design was applied as an approach to scale up manufacturing step. The results of DOE screening design confirmed that w/o/w technique, the presence of salt and the 15%w/v polymer concentration positively affected the EE% (72.1-97.5%), and span values of particle size distribution (1.03-1.23), while salt addition alone negatively affected the yield process. Process scale up resulted in a decrease of gentamicin EE% that can be attributed to the high volume of water used to remove PVA and NaCl residues. The results of in vitro gentamicin release study show prolonged gentamicin release up to three months from the microspheres prepared with salt addition in the dispersing phase; the behavior being consistent with their highly compact structure highlighted by scanning electron microscopy analysis. The prolonged release of gentamicin is maintained even after embedding the biodegradable microspheres into a thermosetting composite gel made of chitosan and acellular bovine bone matrix (Orthoss® granules), and the microbiologic evaluation demonstrated the efficacy of the gentamicin loaded microspheres on Escherichia coli. The collected results confirm the feasibility of the scale up of microsphere manufacturing process and the high potential of the microparticulate drug delivery system to be used for the local antibiotic delivery to bone.
NASA Technical Reports Server (NTRS)
Siarto, Jeff; Reese, Mark; Shum, Dana; Baynes, Katie
2016-01-01
User experience and visual design are greatly improved when usability testing is performed on a periodic basis. Design decisions should be tested by real users so that application owners can understand the effectiveness of each decision and identify areas for improvement. It is important that applications be tested not just once, but as a part of a continuing process that looks to build upon previous tests. NASA's Earthdata Search Client has undergone a usability study to ensure its users' needs are being met and that users understand how to use the tool efficiently and effectively. This poster will highlight the process followed for usability study, the results of the study, and what has been implemented in light of the results to improve the application's interface.
Algorithme intelligent d'optimisation d'un design structurel de grande envergure
NASA Astrophysics Data System (ADS)
Dominique, Stephane
The implementation of an automated decision support system in the field of design and structural optimisation can give a significant advantage to any industry working on mechanical designs. Indeed, by providing solution ideas to a designer or by upgrading existing design solutions while the designer is not at work, the system may reduce the project cycle time, or allow more time to produce a better design. This thesis presents a new approach to automate a design process based on Case-Based Reasoning (CBR), in combination with a new genetic algorithm named Genetic Algorithm with Territorial core Evolution (GATE). This approach was developed in order to reduce the operating cost of the process. However, as the system implementation cost is quite high, the approach is better suited to large-scale design problems, and particularly to design problems that the designer plans to solve for many different specification sets. First, the CBR process uses a databank filled with every known solution to similar design problems. Then, the closest solutions to the current problem in terms of specifications are selected. After this, during the adaptation phase, an artificial neural network (ANN) interpolates amongst known solutions to produce an additional solution to the current problem using the current specifications as inputs. Each solution produced and selected by the CBR is then used to initialize the population of an island of the genetic algorithm. The algorithm will optimise the solution further during the refinement phase. Using progressive refinement, the algorithm starts using only the most important variables for the problem. Then, as the optimisation progresses, the remaining variables are gradually introduced, layer by layer. The genetic algorithm that is used is a new algorithm specifically created during this thesis to solve optimisation problems from the field of mechanical device structural design. The algorithm is named GATE, and is essentially a real number genetic algorithm that prevents new individuals from being born too close to previously evaluated solutions. The restricted area becomes smaller or larger during the optimisation to allow global or local search when necessary. Also, a new search operator named Substitution Operator is incorporated in GATE. This operator allows an ANN surrogate model to guide the algorithm toward the most promising areas of the design space. The suggested CBR approach and GATE were tested on several simple test problems, as well as on the industrial problem of designing a gas turbine engine rotor's disc. These results are compared to other results obtained for the same problems by many other popular optimisation algorithms, such as (depending on the problem) gradient algorithms, binary genetic algorithm, real number genetic algorithm, genetic algorithm using multiple-parent crossovers, differential evolution genetic algorithm, Hooke & Jeeves generalized pattern search method and POINTER from the software I-SIGHT 3.5. Results show that GATE is quite competitive, giving the best results for 5 of the 6 constrained optimisation problems. GATE also provided the best results of all on problems produced by a Maximum Set Gaussian landscape generator. Finally, GATE provided a disc 4.3% lighter than the best other tested algorithm (POINTER) for the gas turbine engine rotor's disc problem. One drawback of GATE is a lesser efficiency for highly multimodal unconstrained problems, for which it gave quite poor results relative to its implementation cost.
To conclude, according to the preliminary results obtained during this thesis, the suggested CBR process, combined with GATE, seems to be a very good candidate to automate and accelerate the structural design of mechanical devices, potentially reducing significantly the cost of industrial preliminary design processes.
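A minimal sketch, not the thesis implementation, of the territorial idea described above: a real-coded GA that rejects candidate offspring falling within an exclusion radius of any previously evaluated point, so the search keeps probing new regions of the design space. The toy objective stands in for an expensive structural analysis:

```python
import numpy as np

rng = np.random.default_rng(2)

def objective(x):
    """Toy objective standing in for an expensive structural analysis."""
    return float(np.sum(x**2))

def territorial_ga(dim=5, pop=20, gens=40, radius=0.05, bounds=(-5.0, 5.0)):
    lo, hi = bounds
    X = rng.uniform(lo, hi, size=(pop, dim))
    fit = np.array([objective(x) for x in X])
    archive = [x.copy() for x in X]              # every point ever evaluated

    for _ in range(gens):
        children = []
        while len(children) < pop:
            for attempt in range(50):
                i, j = rng.choice(pop, size=2, replace=False)
                alpha = rng.uniform(size=dim)
                child = alpha * X[i] + (1.0 - alpha) * X[j]   # blend crossover
                if rng.random() < 0.2:                        # occasional mutation
                    child += rng.normal(scale=0.1 * (hi - lo), size=dim)
                child = np.clip(child, lo, hi)
                # Territorial core: reject children too close to evaluated points.
                if min(np.linalg.norm(child - a) for a in archive) > radius:
                    break
            children.append(child)
        child_fit = np.array([objective(c) for c in children])
        archive.extend(children)
        # Elitist replacement: keep the best `pop` of parents plus children.
        all_x = np.vstack([X, children])
        all_f = np.concatenate([fit, child_fit])
        keep = np.argsort(all_f)[:pop]
        X, fit = all_x[keep], all_f[keep]
    return X[0], fit[0]

best_x, best_f = territorial_ga()
print("best objective value:", round(best_f, 4))
```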
National meeting to review IPAD status and goals. [Integrated Programs for Aerospace-vehicle Design]
NASA Technical Reports Server (NTRS)
Fulton, R. E.
1980-01-01
A joint NASA/industry project called Integrated Programs for Aerospace-vehicle Design (IPAD) is described, which has the goal of raising aerospace-industry productivity through the application of computers to integrate company-wide management of engineering data. Basically a general-purpose interactive computing system developed to support engineering design processes, the IPAD design is composed of three major software components: the executive, data management, and geometry and graphics software. Results of IPAD activities include a comprehensive description of a future representative aerospace vehicle design process and its interface to manufacturing, and requirements and preliminary design of a future IPAD software system to integrate engineering activities of an aerospace company having several products under simultaneous development.
The service blueprint as a tool for designing innovative pharmaceutical services.
Holdford, D A; Kennedy, D T
1999-01-01
To describe service blueprints, discuss their need and design, and provide examples of their use in advancing pharmaceutical care. Service blueprints are pictures or maps of service processes that permit the people involved in designing, providing, managing, and using the service to better understand them and deal with them objectively. A service blueprint simultaneously depicts the service process and the roles of consumers, service providers, and supporting services. Service blueprints can be useful in pharmacy because many of the obstacles to pharmaceutical care are a result of insufficient planning by service designers and/or poor communication between those designing services and those implementing them. One consequence of this poor design and communication is that many consumers and third party payers are uninformed about pharmacist roles. Service blueprints can be used by pharmacists to promote the value of pharmaceutical care to consumers and other decision makers. They can also assist in designing better pharmaceutical services. Blueprints are designed by identifying and mapping a process from the consumer's point of view, mapping employee actions and support activities, and adding visible evidence of service at each consumer action step. Key components of service blueprints are consumer actions, "onstage" and "backstage" employee actions, and support processes. Blueprints can help pharmacy managers identify and correct problems with the service process, provide pharmacy employees an opportunity to offer feedback in the planning stages of services, and demonstrate the value of pharmaceutical services to consumers. Service blueprints can be a valuable tool for designing, implementing, and evaluating pharmacy services.
A human-oriented framework for developing assistive service robots.
McGinn, Conor; Cullinan, Michael F; Culleton, Mark; Kelly, Kevin
2018-04-01
Multipurpose robots that can perform a range of useful tasks have the potential to increase the quality of life for many people living with disabilities. Owing to factors such as high system complexity, as-yet unresolved research questions and current technology limitations, there is a need for effective strategies to coordinate the development process. Integrating established methodologies based on human-centred design and universal design, a framework was formulated to coordinate the robot design process over successive iterations of prototype development. An account is given of how the framework was practically applied to the problem of developing a personal service robot. Application of the framework led to the formation of several design goals which addressed a wide range of identified user needs. The resultant prototype solution, which consisted of several component elements, succeeded in demonstrating the performance stipulated by all of the proposed metrics. Application of the framework resulted in the development of a complex prototype that addressed many aspects of the functional and usability requirements of a personal service robot. Following the process led to several important insights which directly benefit the development of subsequent prototypes. Implications for Rehabilitation This research shows how universal design might be used to formulate usability requirements for assistive service robots. A framework is presented that guides the process of designing service robots in a human-centred way. Through practical application of the framework, a prototype robot system that addressed a range of identified user needs was developed.
NASA Astrophysics Data System (ADS)
Subara, Deni; Jaswir, Irwandi; Alkhatib, Maan Fahmi Rashid; Noorbatcha, Ibrahim Ali
2018-01-01
The aim of this experiment is to screen and understand the process variables in the fabrication of fish gelatin nanoparticles using a quality-by-design approach. The most influential process variables were screened using a Plackett-Burman design. Mean particle size, size distribution, and zeta potential were found to be 240±9.76 nm, 0.3, and -9 mV, respectively. Statistical results showed that the concentration of acetone, the pH of the solution during the precipitation step, and the volume of cross-linker had the most significant effects on the particle size of the fish gelatin nanoparticles. Time and chemical consumption were found to be lower than in previous research. This study revealed the potential of quality by design in understanding the effects of process variables on fish gelatin nanoparticle production.
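A minimal sketch of the screening step: the 12-run Plackett-Burman matrix built from the commonly tabulated generator row (verify against a reference before use), whose columns would be assigned to process variables such as acetone concentration, pH at precipitation, and cross-linker volume; unused columns serve as dummy factors:

```python
import numpy as np

# Commonly tabulated generator row for the 12-run Plackett-Burman design
# (up to 11 two-level factors). Each subsequent row is a cyclic shift;
# the final row is all -1. Levels: +1 = high, -1 = low.
GENERATOR_12 = [+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1]

def plackett_burman_12():
    rows = [np.roll(GENERATOR_12, k) for k in range(11)]
    rows.append(-np.ones(11, dtype=int))
    return np.array(rows, dtype=int)

design = plackett_burman_12()
print(design)

# Main-effect estimate for factor j from observed responses y (length 12):
#   effect_j = mean(y where column j is +1) - mean(y where column j is -1)
```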
NASA Technical Reports Server (NTRS)
Nagaraja, K. S.; Kraft, R. H.
1999-01-01
The HSCT Flight Controls Group has developed longitudinal control laws, utilizing PTC aeroelastic flexible models to minimize aeroservoelastic interaction effects, for a number of flight conditions. The control law design process resulted in a higher order controller and utilized a large number of sensors distributed along the body for minimizing the flexibility effects. Processes were developed to implement these higher order control laws for performing the dynamic gust loads and flutter analyses. The processes and their validation were documented in Reference 2 for a selected flight condition. The analytical results for additional flight conditions are presented in this document for further validation.
Kralisch, Dana; Streckmann, Ina; Ott, Denise; Krtschil, Ulich; Santacesaria, Elio; Di Serio, Martino; Russo, Vincenzo; De Carlo, Lucrezia; Linhart, Walter; Christian, Engelbert; Cortese, Bruno; de Croon, Mart H J M; Hessel, Volker
2012-02-13
The simple transfer of established chemical production processes from batch to flow chemistry does not automatically result in more sustainable ones. Detailed process understanding and the motivation to scrutinize known process conditions are necessary factors for success. Although the focus is usually "only" on intensifying transport phenomena to operate under intrinsic kinetics, there is also a large intensification potential in chemistry under harsh conditions and in the specific design of flow processes. Such an understanding and proposed processes are required at an early stage of process design because decisions on the best-suited tools and parameters required to convert green engineering concepts into practice-typically with little chance of substantial changes later-are made during this period. Herein, we present a holistic and interdisciplinary process design approach that combines the concept of novel process windows with process modeling, simulation, and simplified cost and lifecycle assessment for the deliberate development of a cost-competitive and environmentally sustainable alternative to an existing production process for epoxidized soybean oil. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
An approach to quantitative sustainability assessment in the early stages of process design.
Tugnoli, Alessandro; Santarelli, Francesco; Cozzani, Valerio
2008-06-15
A procedure was developed for the quantitative assessment of key performance indicators suitable for the sustainability analysis of alternative processes, mainly addressing the early stages of process design. The methodology was based on the calculation of a set of normalized impact indices allowing a direct comparison of the additional burden of each process alternative on a selected reference area. Innovative reference criteria were developed to compare and aggregate the impact indicators on the basis of the site-specific impact burden and sustainability policy. An aggregation procedure also allows the calculation of overall sustainability performance indicators and of an "impact fingerprint" of each process alternative. The final aim of the method is to support the decision making process during process development, providing a straightforward assessment of the expected sustainability performances. The application of the methodology to case studies concerning alternative waste disposal processes allowed a preliminary screening of the expected critical sustainability impacts of each process. The methodology was shown to provide useful results to address sustainability issues in the early stages of process design.
Real-Time Data Processing Onboard Remote Sensor Platforms: Annual Review #3 Data Package
NASA Technical Reports Server (NTRS)
Cook, Sid; Harsanyi, Joe
2003-01-01
The current program status reviewed by this viewgraph presentation includes: 1) New Evaluation Results; 2) Algorithm Improvement Investigations; 3) Electronic Hardware Design; 4) Software Hardware Interface Design.
Tao, Ling; Aden, Andy; Elander, Richard T; Pallapolu, Venkata Ramesh; Lee, Y Y; Garlock, Rebecca J; Balan, Venkatesh; Dale, Bruce E; Kim, Youngmi; Mosier, Nathan S; Ladisch, Michael R; Falls, Matthew; Holtzapple, Mark T; Sierra, Rocio; Shi, Jian; Ebrik, Mirvat A; Redmond, Tim; Yang, Bin; Wyman, Charles E; Hames, Bonnie; Thomas, Steve; Warner, Ryan E
2011-12-01
Six biomass pretreatment processes to convert switchgrass to fermentable sugars and ultimately to cellulosic ethanol are compared on a consistent basis in this technoeconomic analysis. The six pretreatment processes are ammonia fiber expansion (AFEX), dilute acid (DA), lime, liquid hot water (LHW), soaking in aqueous ammonia (SAA), and sulfur dioxide-impregnated steam explosion (SO(2)). Each pretreatment process is modeled in the framework of an existing biochemical design model so that systematic variations of process-related changes are consistently captured. The pretreatment area process design and simulation are based on the research data generated within the Biomass Refining Consortium for Applied Fundamentals and Innovation (CAFI) 3 project. Overall ethanol production, total capital investment, and minimum ethanol selling price (MESP) are reported along with selected sensitivity analysis. The results show limited differentiation between the projected economic performances of the pretreatment options, except for processes that exhibit significantly lower monomer sugar and resulting ethanol yields. Copyright © 2011 Elsevier Ltd. All rights reserved.
Virtual design and construction of plumbing systems
NASA Astrophysics Data System (ADS)
Filho, João Bosco P. Dantas; Angelim, Bruno Maciel; Guedes, Joana Pimentel; de Castro, Marcelo Augusto Farias; Neto, José de Paula Barros
2016-12-01
Traditionally, the design coordination process is carried out by overlaying and comparing 2D drawings made by different project participants. Detecting information errors from a composite drawing is especially challenging and error prone. This procedure usually leaves many design errors undetected until construction begins, and typically leads to rework. Correcting conflict issues that were not identified during the design and coordination phase reduces the overall productivity of everyone involved in the construction process. The identification of construction issues in the field generates Requests for Information (RFIs), which are one cause of delays. The application of Virtual Design and Construction (VDC) tools to the coordination process can bring significant value to architecture, structure, and mechanical, electrical, and plumbing (MEP) designs in terms of a reduced number of undetected errors and requests for information. This paper focuses on evaluating requests for information (RFIs) associated with the water/sanitary facilities of a BIM model. It is thus expected to improve water/sanitary facility designs, as well as to assist the virtual construction team in noticing and identifying design problems. This is exploratory and descriptive research using a qualitative methodology. The study classifies RFIs into six categories: correction, omission, validation of information, modification, divergence of information, and verification. The results demonstrate VDC's contribution to improving plumbing system designs. Recommendations are suggested to identify and avoid these RFI types in the plumbing system design process or during virtual construction.
On processing development for fabrication of fiber reinforced composite, part 2
NASA Technical Reports Server (NTRS)
Hou, Tan-Hung; Hou, Gene J. W.; Sheen, Jeen S.
1989-01-01
Fiber-reinforced composite laminates are used in many aerospace and automobile applications. The magnitudes and durations of the cure temperature and the cure pressure applied during the curing process have significant consequences for the performance of the finished product. The objective of this study is to exploit the potential of applying the optimization technique to the cure cycle design. Using the compression molding of a filled polyester sheet molding compound (SMC) as an example, a unified Computer Aided Design (CAD) methodology, consisting of three uncoupled modules (i.e., optimization, analysis, and sensitivity calculations), is developed to systematically generate optimal cure cycle designs. Various optimization formulations for the cure cycle design are investigated. The uniformities in the distributions of the temperature and the degree of cure are compared with those resulting from conventional isothermal processing conditions with pre-warmed platens. Recommendations with regard to further research in the computerization of the cure cycle design are also addressed.
Canard configured aircraft with 2-D nozzle
NASA Technical Reports Server (NTRS)
Child, R. D.; Henderson, W. P.
1978-01-01
A closely-coupled canard fighter with vectorable two-dimensional nozzle was designed for enhanced transonic maneuvering. The HiMAT maneuver goal of a sustained 8g turn at a free-stream Mach number of 0.9 and 30,000 feet was the primary design consideration. The aerodynamic design process was initiated with a linear theory optimization minimizing the zero percent suction drag including jet effects and refined with three-dimensional nonlinear potential flow techniques. Allowances were made for mutual interference and viscous effects. The design process to arrive at the resultant configuration is described, and the design of a powered 2-D nozzle model to be tested in the LRC 16-foot Propulsion Wind Tunnel is shown.
Promoting Teacher Adoption of GIS Using Teacher-Centered and Teacher-Friendly Design
ERIC Educational Resources Information Center
Hong, Jung Eun
2014-01-01
This article reports the results of a case study that employed user-centered design to develop training tutorials for helping middle school social studies teachers use Web-based GIS in their classrooms. This study placed teachers in the center of the design process in planning, designing, and developing the tutorials. This article describes how…
Talking to Texts and Sketches: The Function of Written and Graphic Mediation in Engineering Design.
ERIC Educational Resources Information Center
Lewis, Barbara
2000-01-01
Describes the author's research that explores the role of language, particularly texts, in the engineering design process. Notes that results of this case study support a new "mediated" model of engineering design as an inventional activity in which designers use talk, written language, and other symbolic representations as tools to think about…
The development of a collaborative virtual environment for finite element simulation
NASA Astrophysics Data System (ADS)
Abdul-Jalil, Mohamad Kasim
Communication between geographically distributed designers has been a major hurdle in traditional engineering design. Conventional methods of communication, such as video conferencing, telephone, and email, are less efficient, especially when dealing with complex design models. Complex shapes, intricate features and hidden parts are often difficult to describe verbally or even using traditional 2-D or 3-D visual representations. Virtual Reality (VR) and Internet technologies have provided a substantial potential to bridge the present communication barrier. VR technology allows designers to immerse themselves in a virtual environment to view and manipulate a model just as in real life. Fast Internet connectivity has enabled fast data transfer between remote locations. Although various collaborative virtual environment (CVE) systems have been developed in the past decade, they are limited to high-end technology that is not accessible to typical designers. The objective of this dissertation is to discover and develop a new approach to increase the efficiency of the design process, particularly for large-scale applications wherein participants are geographically distributed. A multi-platform and easily accessible collaborative virtual environment (CVRoom) is developed to accomplish the stated research objective. Geographically dispersed designers can meet in a single shared virtual environment to discuss issues pertaining to the engineering design process and to make trade-off decisions more quickly than before, thereby speeding the entire process. This 'faster' design process will be achieved through the development of capabilities to better enable the multidisciplinary analysis and modeling of the trade-off decisions that are so critical before launching into a formal detailed design. The features of the environment developed as a result of this research include the ability to view design models, use voice interaction, and link engineering analysis modules (such as the Finite Element Analysis module demonstrated in this work). One of the major issues in developing a CVE system for engineering design purposes is to obtain pertinent simulation results in real time. This is critical so that the designers can make decisions based on these results quickly. For example, in a finite element analysis, if a design model is changed or perturbed, the analysis results must be obtained in real-time or near real-time to make the virtual meeting environment realistic. In this research, the finite difference-based Design Sensitivity Analysis (DSA) approach is employed to approximate structural responses (e.g., stress, displacement), so as to demonstrate the applicability of CVRoom for engineering design trade-offs. This DSA approach provides for fast approximation and is well-suited for the virtual meeting environment where fast response time is required. The DSA-based approach is tested on several example test problems to show its applicability and limitations. This dissertation demonstrates that an increase in efficiency and reduction of time required for a complex design process can be accomplished using the approach developed in this dissertation research. Several implementations of CVRoom by students working on common design tasks were investigated. All participants confirmed a preference for using the collaborative virtual environment developed in this dissertation work (CVRoom) over other modes of interaction.
It is proposed here that CVRoom is representative of the type of collaborative virtual environment that will be used by most designers in the future to reduce the time required in a design cycle and thereby reduce the associated cost.
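A minimal sketch of the finite-difference design sensitivity idea: evaluate the response once at the nominal design, estimate its derivative by a small perturbation, and answer subsequent "what if" design changes with a first-order Taylor expansion instead of a full re-analysis. The cantilever stress formula and numbers below stand in for an expensive finite element response and are illustrative only:

```python
def tip_stress(thickness):
    """Stand-in for an expensive finite element response: bending stress of a
    cantilever of rectangular section, sigma = 6*F*L / (b*t^2)."""
    F, L, b = 100.0, 1.0, 0.05   # load [N], length [m], width [m] (assumed)
    return 6.0 * F * L / (b * thickness**2)

t0 = 0.010                       # nominal thickness [m]
h = 1e-5                         # finite-difference step
sigma0 = tip_stress(t0)
dsigma_dt = (tip_stress(t0 + h) - sigma0) / h   # forward difference

def approx_stress(t):
    """First-order approximation, cheap enough to evaluate for any small
    design change requested during a virtual design meeting."""
    return sigma0 + dsigma_dt * (t - t0)

for t in (0.0095, 0.0105, 0.012):
    exact, approx = tip_stress(t), approx_stress(t)
    print(f"t={t:.4f} m  exact={exact/1e6:.2f} MPa  approx={approx/1e6:.2f} MPa")
```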
TaN resistor process development and integration.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Romero, Kathleen; Martinez, Marino John; Clevenger, Jascinda
This paper describes the development and implementation of an integrated resistor process based on reactively sputtered tantalum nitride. Image reversal lithography was shown to be a superior method for liftoff patterning of these films. The results of a response surface DOE for the sputter deposition of the films are discussed. Several approaches to stabilization baking were examined and the advantages of the hot plate method are shown. In support of a new capability to produce special-purpose HBT-based Small-Scale Integrated Circuits (SSICs), we developed our existing TaN resistor process, designed for research prototyping, into one with greater maturity and robustness. Included in this work was the migration of our TaN deposition process from a research-oriented tool to a tool more suitable for production. Also included was implementation and optimization of a liftoff process for the sputtered TaN to avoid the complicating effects of subtractive etching over potentially sensitive surfaces. Finally, the method and conditions for stabilization baking of the resistors were experimentally determined to complete the full implementation of the resistor module. Much of the work to be described involves the migration between sputter deposition tools, from a Kurt J. Lesker CMS-18 to a Denton Discovery 550. Though they use nominally the same deposition technique (reactive sputtering of Ta with N+ in an RF-excited Ar plasma), they differ substantially in their design and produce clearly different results in terms of resistivity, conformity of the film and the difference between as-deposited and stabilized films. We will describe the design of and results from the design of experiments (DOE)-based method of process optimization on the new tool and compare this to what had been used on the old tool.
Automation of the CFD Process on Distributed Computing Systems
NASA Technical Reports Server (NTRS)
Tejnil, Ed; Gee, Ken; Rizk, Yehia M.
2000-01-01
A script system was developed to automate and streamline portions of the CFD process. The system was designed to facilitate the use of CFD flow solvers on supercomputer and workstation platforms within a parametric design event. Integrating solver pre- and postprocessing phases, the fully automated ADTT script system marshalled the required input data, submitted the jobs to available computational resources, and processed the resulting output data. A number of codes were incorporated into the script system, which itself was part of a larger integrated design environment software package. The IDE and scripts were used in a design event involving a wind tunnel test. This experience highlighted the need for efficient data and resource management in all parts of the CFD process. To facilitate the use of CFD methods to perform parametric design studies, the script system was developed using UNIX shell and Perl languages. The goal of the work was to minimize the user interaction required to generate the data necessary to fill a parametric design space. The scripts wrote out the required input files for the user-specified flow solver, transferred all necessary input files to the computational resource, submitted and tracked the jobs using the resource queuing structure, and retrieved and post-processed the resulting dataset. For computational resources that did not run queueing software, the script system established its own simple first-in-first-out queueing structure to manage the workload. A variety of flow solvers were incorporated in the script system, including INS2D, PMARC, TIGER and GASP. Adapting the script system to a new flow solver was made easier through the use of object-oriented programming methods. The script system was incorporated into an ADTT integrated design environment and evaluated as part of a wind tunnel experiment. The system successfully generated the data required to fill the desired parametric design space. This stressed the computational resources required to compute and store the information. The scripts were continually modified to improve the utilization of the computational resources and reduce the likelihood of data loss due to failures. An ad-hoc file server was created to manage the large amount of data being generated as part of the design event. Files were stored and retrieved as needed to create new jobs and analyze the results. Additional information is contained in the original.
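The original ADTT scripts were written in UNIX shell and Perl; the following is a minimal Python sketch of the fall-back first-in-first-out queue idea for resources without queueing software, with placeholder commands standing in for flow-solver runs and a small fixed pool of worker slots:

```python
import queue
import subprocess
import sys
import threading

def worker(job_queue, results):
    """Pull jobs in FIFO order and run them; one thread per compute slot."""
    while True:
        try:
            name, cmd = job_queue.get_nowait()
        except queue.Empty:
            return
        proc = subprocess.run(cmd, capture_output=True, text=True)
        results[name] = proc.returncode
        job_queue.task_done()

# Hypothetical parametric-study cases; the print stands in for a solver run.
jobs = [(f"case_{i:02d}", [sys.executable, "-c", f"print('running case {i}')"])
        for i in range(8)]

job_queue = queue.Queue()
for job in jobs:
    job_queue.put(job)

results = {}
threads = [threading.Thread(target=worker, args=(job_queue, results))
           for _ in range(2)]   # two "compute slots"
for t in threads:
    t.start()
for t in threads:
    t.join()

print("completed:", sorted(results))
```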
Study on the Preliminary Design of ARGO-M Operation System
NASA Astrophysics Data System (ADS)
Seo, Yoon-Kyung; Lim, Hyung-Chul; Rew, Dong-Young; Jo, Jung Hyun; Park, Jong-Uk; Park, Eun-Seo; Park, Jang-Hyun
2010-12-01
Korea Astronomy and Space Science Institute has been developing a mobile satellite laser ranging system named the Accurate Ranging system for Geodetic Observation-Mobile (ARGO-M). The preliminary design of the ARGO-M operation system (AOS), one of the ARGO-M subsystems, was completed in 2009. The preliminary design results are carried into the following development phase, where the detailed design is performed by analyzing the pre-defined requirements and the derived specifications. This paper addresses the preliminary design of the whole AOS. The design results for the operation and control part, a key part of the operation system, are described in detail. Analysis results for the interface between the operation-supporting hardware and the control computer, which are necessary for defining the requirements of the operation-supporting hardware, are summarized. Results of this study are expected to be used in the critical design phase to finalize the design process.
Lakin, Matthew R.; Brown, Carl W.; Horwitz, Eli K.; Fanning, M. Leigh; West, Hannah E.; Stefanovic, Darko; Graves, Steven W.
2014-01-01
The development of large-scale molecular computational networks is a promising approach to implementing logical decision making at the nanoscale, analogous to cellular signaling and regulatory cascades. DNA strands with catalytic activity (DNAzymes) are one means of systematically constructing molecular computation networks with inherent signal amplification. Linking multiple DNAzymes into a computational circuit requires the design of substrate molecules that allow a signal to be passed from one DNAzyme to another through programmed biochemical interactions. In this paper, we chronicle an iterative design process guided by biophysical and kinetic constraints on the desired reaction pathways and use the resulting substrate design to implement heterogeneous DNAzyme signaling cascades. A key aspect of our design process is the use of secondary structure in the substrate molecule to sequester a downstream effector sequence prior to cleavage by an upstream DNAzyme. Our goal was to develop a concrete substrate molecule design to achieve efficient signal propagation with maximal activation and minimal leakage. We have previously employed the resulting design to develop high-performance DNAzyme-based signaling systems with applications in pathogen detection and autonomous theranostics. PMID:25347066
A Formalized Design Process for Bacterial Consortia That Perform Logic Computing
Sun, Rui; Xi, Jingyi; Wen, Dingqiao; Feng, Jingchen; Chen, Yiwei; Qin, Xiao; Ma, Yanrong; Luo, Wenhan; Deng, Linna; Lin, Hanchi; Yu, Ruofan; Ouyang, Qi
2013-01-01
The concept of microbial consortia is highly attractive in synthetic biology. Despite all its benefits, however, there are still problems remaining for large-scale multicellular gene circuits, for example, how to reliably design and distribute the circuits in microbial consortia with a limited number of well-behaved genetic modules and wiring quorum-sensing molecules. To manage this problem, here we propose a formalized design process: (i) determine the basic logic units (AND, OR and NOT gates) based on mathematical and biological considerations; (ii) establish rules to search for and distribute the simplest logic design; (iii) assemble the assigned basic logic units in each logic operating cell; and (iv) fine-tune the circuiting interface between logic operators. We analyzed gene circuits in silico with inputs ranging from two to four, comparing our method with pre-existing ones. Results showed that this formalized design process is more feasible concerning the number of cells required. Furthermore, as a proof of principle, an Escherichia coli consortium that performs the XOR function, a typical complex computing operation, was designed. The construction and characterization of logic operators is independent of "wiring" and provides predictive information for fine-tuning. This formalized design process provides guidance for the design of microbial consortia that perform distributed biological computation. PMID:23468999
Flight Tests of N.A.C.A. Nose-slot Cowlings on the BFC-1 Airplane
NASA Technical Reports Server (NTRS)
Stickle, George W
1939-01-01
The results of flight tests of four nose-slot cowling designs with several variations in each design are presented. The tests were made in the process of developing the nose-slot cowling. The results demonstrate that a nose-slot cowling may be successfully applied to an airplane and that it utilizes the increased slipstream velocity of low-speed operation to produce increased cooling pressure across the engine. A sample design calculation using results from wind-tunnel, flight, and ground tests is given in an appendix to illustrate the design procedure.
Research on an autonomous vision-guided helicopter
NASA Technical Reports Server (NTRS)
Amidi, Omead; Mesaki, Yuji; Kanade, Takeo
1994-01-01
Integration of computer vision with on-board sensors to autonomously fly helicopters was researched. The key components developed were custom-designed vision processing hardware and an indoor testbed. The custom-designed hardware provided flexible integration of on-board sensors with real-time image processing, resulting in a significant improvement in vision-based state estimation. The indoor testbed provided a convenient, calibrated environment for experimentation in constructing real autonomous systems.
Zhang, Jian; Zhang, Xin; Bi, Yu-An; Xu, Gui-Hong; Huang, Wen-Zhe; Wang, Zhen-Zhong; Xiao, Wei
2017-09-01
The "design space" method was used to optimize the purification process of Resina Draconis phenol extracts by using the concept of "quality derived from design" (QbD). The content and transfer rate of laurin B and 7,4'-dihydroxyflavone and yield of extract were selected as the critical quality attributes (CQA). Plackett-Burman design showed that the critical process parameters (CPP) were concentration of alkali, the amount of alkali and the temperature of alkali dissolution. Then the Box-Behnken design was used to establish the mathematical model between CQA and CPP. The variance analysis results showed that the P values of the five models were less than 0.05 and the mismatch values were all greater than 0.05, indicating that the model could well describe the relationship between CQA and CPP. Finally, the control limits of the above 5 indicators (content and transfer rate of laurine B and 7,4'-dihydroxyflavone, as well as the extract yield) were set, and then the probability-based design space was calculated by Monte Carlo simulation and verified. The results of the design space validation showed that the optimized purification method can ensure the stability of the Resina Draconis phenol extracts refining process, which would help to improve the quality uniformity between batches of phenol extracts and provide data support for production automation control. Copyright© by the Chinese Pharmaceutical Association.
An empirical evaluation of graphical interfaces to support flight planning
NASA Technical Reports Server (NTRS)
Smith, Philip J.; Mccoy, Elaine; Layton, Chuck; Bihari, Tom
1995-01-01
Whether optimization techniques or expert systems technologies are used, the underlying inference processes and the model or knowledge base for a computerized problem-solving system are likely to be incomplete for any given complex, real-world task. To deal with the resultant brittleness, it has been suggested that 'cooperative' rather than 'automated' problem-solving systems be designed. Such cooperative systems are proposed to explicitly enhance the collaboration of people and the computer system when working in partnership to solve problems. This study evaluates the impact of alternative design concepts on the performance of airline pilots interacting with such a cooperative system designed to support enroute flight planning. Thirty pilots were studied using three different versions of the system. The results clearly demonstrate that different system design concepts can strongly influence the cognitive processes of users. Indeed, one of the designs studied caused four times as many pilots to accept a poor flight amendment. Based on think-aloud protocols, cognitive models are proposed to account for how features of the computer system interacted with specific types of scenarios to influence exploration and decision-making by the pilots. The results are then used to develop recommendations for guiding the design of cooperative systems.
Mathaes, Roman; Mahler, Hanns-Christian; Roggo, Yves; Huwyler, Joerg; Eder, Juergen; Fritsch, Kamila; Posset, Tobias; Mohl, Silke; Streubel, Alexander
2016-01-01
Capping equipment used in good manufacturing practice (GMP) manufacturing features different designs and a variety of adjustable process parameters. The overall capping result is a complex interplay of the different capping process parameters and is insufficiently described in the literature. It remains poorly studied how the different capping equipment designs and capping equipment process parameters (e.g., pre-compression force, capping plate height, turntable rotating speed) contribute to the final residual seal force of a sealed container closure system and how that relates to container closure integrity and other drug product quality parameters. Stopper compression, measured by computed tomography, correlated with residual seal force measurements. In our studies, we used different container closure system configurations from different GMP drug product fill & finish facilities to investigate the influence of differences in primary packaging, that is, vial size and rubber stopper design, on the capping process and the capped drug product. In addition, we compared two large-scale GMP capping machines and different capping equipment settings and their impact on product quality and integrity, as determined by residual seal force. The capping plate to plunger distance had a major influence on the obtained residual seal force values of a sealed vial, whereas the capping pre-compression force and the turntable rotation speed showed only a minor influence. Capping process parameters could not easily be transferred between capping equipment from different manufacturers. However, the residual seal force tester did provide a valuable tool to compare the capping performance of different capping equipment. No vial showed any leakage greater than 10^-8 mbar L/s as measured by a helium mass spectrometry system, suggesting that container closure integrity was ensured over the residual seal force range tested for the tested container closure systems. © PDA, Inc. 2016.
SpaceCube v2.0 Space Flight Hybrid Reconfigurable Data Processing System
NASA Technical Reports Server (NTRS)
Petrick, Dave
2014-01-01
This paper details the design architecture, design methodology, and advantages of the SpaceCube v2.0 high performance data processing system for space applications. The purpose in building the SpaceCube v2.0 system is to create a superior high performance, reconfigurable, hybrid data processing system that can be used in a multitude of applications, including those that require a radiation hardened and reliable solution. The SpaceCube v2.0 system leverages seven years of board design, avionics systems design, and space flight application experience. This paper shows how SpaceCube v2.0 solves the increasing computing demands of space data processing applications that cannot be met with a standalone processor approach. The main objective during the design stage is to find a good system balance between power, size, reliability, cost, and data processing capability. These design variables directly impact each other, and it is important to understand how to achieve a suitable balance. This paper will detail how these critical design factors were managed, including the construction of an Engineering Model for an experiment on the International Space Station to test out design concepts. We will describe the designs for the processor card, power card, backplane, and a mission-unique interface card. The mechanical design for the box will also be detailed, since it is critical in meeting the stringent thermal and structural requirements imposed by the processing system. In addition, the mechanical design uses advanced thermal conduction techniques to solve the internal thermal challenges. The SpaceCube v2.0 processing system is based on an extended version of the 3U cPCI standard form factor, where each card is 190 mm x 100 mm in size. The typical power draw of the processor card is 8 to 10 W and scales with application complexity. The SpaceCube v2.0 data processing card features two Xilinx Virtex-5 QV Field Programmable Gate Arrays (FPGAs), eight memory modules, a monitor FPGA with analog monitoring, Ethernet, configurable interconnect to the Xilinx FPGAs including gigabit transceivers, and the necessary voltage regulation. The processor board uses a back-to-back design methodology for common parts that maximizes the available board real estate. This paper will show how to meet the IPC 6012B Class 3A standard with a 22-layer board that has two column grid array devices with 1.0 mm pitch. All layout trades, such as stack-up options, via selection, and FPGA signal breakout, will be discussed with feature size results. The overall board design process will be discussed, including parts selection, circuit design, proper signal termination, layout placement and route planning, signal integrity design and verification, and power integrity results. The radiation mitigation techniques will also be detailed, including configuration scrubbing options, Xilinx circuit mitigation and FPGA functional monitoring, and memory protection. Finally, this paper will describe how this system is being used to solve the extreme challenges of a robotic satellite servicing mission where typical space-rated processors are not sufficient to meet the intensive data processing requirements. The SpaceCube v2.0 is the main payload control computer and is required to control critical subsystems such as autonomous rendezvous and docking using a suite of vision sensors, and object avoidance when controlling two robotic arms.
ASRM propellant and igniter propellant development and process scale-up
NASA Technical Reports Server (NTRS)
Landers, L. C.; Booth, D. W.; Stanley, C. B.; Ricks, D. W.
1993-01-01
A program of formulation and process development for ANB-3652 motor propellant was conducted to validate design concepts and screen critical propellant composition and process parameters. Design experiments resulted in the selection of a less active grade of ferric oxide to provide better burning rate control, the establishment of AP fluidization conditions that minimized the adverse effects of particle attrition, and the selection of a higher mix temperature to improve mechanical properties. It is shown that the propellant can be formulated with AP and aluminum powder from various producers. An extended duration pilot plant run demonstrated stable equipment operation and excellent reproducibility of propellant properties. A similar program of formulation and process optimization culminating in large batch scale-up was conducted for ANB-3672 igniter propellant. The results for both ANB-3652 and ANB-3672 confirmed that their processing characteristics are compatible with full-scale production.
Development of Crystallizer for Advanced Aqueous Reprocessing Process
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tadahiro Washiya; Atsuhiro Shibata; Toshiaki Kikuchi
2006-07-01
Crystallization is one of the promising technologies for future fuel reprocessing, offering safety and economic advantages. The Japan Atomic Energy Agency (JAEA) (former Japan Nuclear Cycle Development Institute), Mitsubishi Material Corporation, and Saitama University have been developing the crystallization process. In previous studies, we carried out experiments under uranium, MOX, and spent fuel conditions, and flowsheet analysis was considered. In association with these studies, an innovative continuous crystallizer and its system were developed to ensure high process performance. From the design study, an annular type continuous crystallizer was selected as the most promising design, and its performance was confirmed by small-scale tests and an engineering-scale demonstration under uranium crystallization conditions. In this paper, the design study and the demonstration test results are described. (authors)
NASA Astrophysics Data System (ADS)
Li, Qing; Lin, Haibo; Xiu, Yu-Feng; Wang, Ruixue; Yi, Chuijie
A test platform for wheat precision seeding based on image processing techniques was designed to support development of a high-efficiency, high-precision wheat seed metering device. Using image processing techniques, the platform gathers images of wheat seeds on a conveyor belt as they fall from the seed metering device. These data are then processed and analyzed to calculate the qualified rate, reseeding rate, leakage (missed) sowing rate, etc. This paper introduces the overall structure and design parameters of the platform and the hardware and software of the image acquisition system, as well as the method of seed identification and seed-spacing measurement based on image thresholding and locating each seed's center. Analysis of the experimental results shows that the measurement error is less than ±1 mm.
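A minimal sketch of the seed-identification and spacing-measurement idea described above, assuming bright seeds on a darker belt and a known image scale; the threshold, the millimetres-per-pixel factor, and the synthetic test image are placeholders rather than parameters of the actual platform.

```python
# Hypothetical sketch (not the platform's software): threshold a grayscale belt
# image, locate seed centers, and measure seed spacing along the belt axis.
import numpy as np
from scipy import ndimage

def seed_spacings(gray_image, threshold=128, mm_per_pixel=0.2):
    """Return sorted seed-center positions along the belt and the gaps between them (mm)."""
    binary = gray_image > threshold                      # seeds assumed brighter than belt
    labels, n_seeds = ndimage.label(binary)              # connected-component labeling
    centers = ndimage.center_of_mass(binary, labels, range(1, n_seeds + 1))
    xs = sorted(c[1] * mm_per_pixel for c in centers)    # column coordinate -> mm along belt
    return xs, np.diff(xs)

# Synthetic example: three bright "seeds" on a dark belt.
img = np.zeros((100, 400), dtype=np.uint8)
for x in (50, 180, 330):
    img[45:55, x:x + 10] = 255
positions, gaps = seed_spacings(img)
print(positions, gaps)    # gaps can be compared against the target seed spacing
```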
2009-01-01
Background There are few studies that examine the processes that interdisciplinary teams engage in and how we can design health information systems (HIS) to support those team processes. This was an exploratory study with two purposes: (1) to develop a framework for interdisciplinary team communication based on structures, processes and outcomes that were identified as having occurred during weekly team meetings; and (2) to use the framework to guide 'e-teams' HIS design to support interdisciplinary team meeting communication. Methods An ethnographic approach was used to collect data on two interdisciplinary teams. Qualitative content analysis was used to analyze the data according to structures, processes and outcomes. Results We present details for the team meta-concepts of structures, processes and outcomes and the concepts and sub-concepts within each meta-concept. We also provide an exploratory framework for interdisciplinary team communication and describe how the framework can guide HIS design to support 'e-teams'. Conclusion The structures, processes and outcomes that describe interdisciplinary teams are complex and often occur in a non-linear fashion. Electronic data support, process facilitation and team video conferencing are three HIS tools that can enhance team function. PMID:19754966
Montague, Enid; Mohr, David C
2013-01-01
Background To our knowledge, there is no well-articulated process for the design of culturally informed behavioral intervention technologies. Objective This paper describes the early stages of such a process, illustrated by the methodology for the ongoing development of a behavioral intervention technology targeting generalized anxiety disorder and major depression among young sexual minority men. Methods We integrated instructional design for Internet behavioral intervention technologies with greater detail on information sources that can identify user needs in understudied populations, as well as advances in the understanding of technology-specific behavioral intervention technology dimensions that may need to be culturally tailored. Results General psychological theory describing how to effect change in the clinical target is first integrated with theory describing potentially malleable factors that help explain the clinical problem within the population. Additional information sources are then used to (1) evaluate the theory, (2) identify population-specific factors that may affect users’ ability to relate to and benefit from the behavioral intervention technology, and (3) establish specific skills, attitudes, knowledge, etc, required to change malleable factors posited in the theory. User needs result from synthesis of this information. Product requirements are then generated through application of the user needs to specific behavioral intervention technology dimensions (eg, technology platform). We provide examples of considerations relevant to each stage of this process and how they were applied. Conclusions This process can guide the initial design of other culturally informed behavioral intervention technologies. This first attempt to create a systematic design process can spur development of guidelines for design of behavioral intervention technologies aimed to reduce health disparities. PMID:24311444
Knowledge Interaction Design for Creative Knowledge Work
NASA Astrophysics Data System (ADS)
Nakakoji, Kumiyo; Yamamoto, Yasuhiro
This paper describes our approach for the development of application systems for creative knowledge work, particularly for the early stages of information design tasks. Being a cognitive tool serving as a means of externalization, an application system affects how the user is engaged in the creative process through its visual interaction design. Knowledge interaction design, described in this paper, is a framework in which a set of application systems for different information design domains are developed based on an interaction model, which is designed for a particular model of a thinking process. We have developed two sets of application systems using the knowledge interaction design framework: one includes systems for linear information design, such as writing, movie-editing, and video-analysis; the other includes systems for network information design, such as file-system navigation and hypertext authoring. Our experience shows that the resulting systems encourage users to follow a certain cognitive path through a graceful user experience.
Redesigning the continuous vacuum sealer packaging machine to improve the processing speed
NASA Astrophysics Data System (ADS)
Belo, J. B.; Widyanto, S. A.; Jamari, J.
2017-01-01
A vacuum sealer is a packaging tool for food products that evacuates the air from the plastic package containing the product, lowering the internal pressure. Under this condition the optimal heating temperature is reached in a shorter time, so damage to the plastic seal of vacuum-packed food products can be prevented more effectively and efficiently. The purpose of this redesign is to create a continuous vacuum sealer packaging machine with a conveyor mechanism, improving packaging quality and the processing speed of vacuuming food products in plastic packages. The design process was carried out through several stages of design and construction until the machine was ready to operate. Data analysis was performed through vacuum and seal quality tests on plastic films of 75 µm, 80 µm, and 100 µm thickness at temperatures of 170°C, 180°C, and 190°C and vacuum durations of 5 seconds, 8 seconds, and 60 seconds. The results of the redesign indicate that the vacuum sealer works more practically and optimally, with a vacuum processing time of 0 to 1 minute and a vacuum suction pressure down to 1e-5 MPa. Tensile strength tests gave a maximum of 32,796 N/mm2 and a minimum of 20,155 N/mm2, and the plastic composite was analyzed with EDX. These results show that the vacuum pressure and the seal quality are better and more efficient.
Finding the right way: DFM versus area efficiency for 65 nm gate layer lithography
NASA Astrophysics Data System (ADS)
Sarma, Chandra S.; Scheer, Steven; Herold, Klaus; Fonseca, Carlos; Thomas, Alan; Schroeder, Uwe P.
2006-03-01
DFM (Design for Manufacturing) has become a buzzword for lithography since the 90 nm node. Implementing DFM intelligently can significantly boost yield rates and reliability in semiconductor manufacturing. However, any restriction on the design space will always result in an area loss, thus diminishing the effective shrink factor for a given technology. For a lithographer, the key task is to develop a manufacturable process while not sacrificing too much area. We have developed a high-performing lithography process for attenuated gate level lithography that is based on aggressive illumination and newly optimized SRAF placement schemes. In this paper we present our methodology and results for this optimization, using an anchored simulation model. The wafer results largely confirm the predictions of the simulations. The use of an aggressive SRAF (sub-resolution assist feature) strategy reduces forbidden-pitch regions without any SRAF printing. The data show that our OPC is capable of correcting the PC tip-to-tip distance without bridging between the tips in dense SRAM cells. The SRAF strategy for various 2D cases has also been verified on wafer. We have shown that aggressive illumination schemes yielding a high-performing lithography process can be employed without sacrificing area. By carefully choosing processing conditions, we were able to develop a process that imposes very few restrictions on design. In our approach, the remaining issues can be addressed by DFM, partly in data prep procedures, which are largely area neutral and transparent to the designers. Hence, we have successfully shown that DFM and effective technology shrinks are not mutually exclusive.
NASA Technical Reports Server (NTRS)
Benzie, M. A.
1998-01-01
The objective of this research project was to examine processing and design parameters in the fabrication of composite components to obtain a better understanding of, and attempt to minimize, the springback associated with composite materials. To accomplish this, both processing and design parameters were included in a Taguchi-designed experiment. Composite angled panels were fabricated by hand layup techniques, and the fabricated panels were inspected for springback effects. This experiment yielded several significant results. The confirmation experiment validated the reproducibility of the factorial effects, accounted for the recognized error, and established the experiment as reliable. The material used in the design of tooling needs to be a major consideration when fabricating composite components, as expected. The factors dealing with resin flow, however, raise several potentially serious material and design questions. These questions must be dealt with up front in order to minimize springback: the viscosity of the resin, the vacuum bagging of the part for cure, and the curing method selected. These factors directly affect design, material selection, and processing methods.
Simulation models and designs for advanced Fischer-Tropsch technology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Choi, G.N.; Kramer, S.J.; Tam, S.S.
1995-12-31
Process designs and economics were developed for three grass-roots indirect Fischer-Tropsch coal liquefaction facilities. A baseline and an alternate upgrading design were developed for a mine-mouth plant located in southern Illinois using Illinois No. 6 coal, and one for a mine-mouth plant located in Wyoming using Powder River Basin coal. The alternate design used close-coupled ZSM-5 reactors to upgrade the vapor stream leaving the Fischer-Tropsch reactor. ASPEN process simulation models were developed for all three designs. These results have been reported previously. In this study, the ASPEN process simulation model was enhanced to improve the vapor/liquid equilibrium calculations for the products leaving the slurry bed Fischer-Tropsch reactors. This significantly improved the predictions for the alternate ZSM-5 upgrading design. Another model was developed for the Wyoming coal case using ZSM-5 upgrading of the Fischer-Tropsch reactor vapors. To date, this is the best indirect coal liquefaction case. Sensitivity studies showed that additional cost reductions are possible.
A review of parametric approaches specific to aerodynamic design process
NASA Astrophysics Data System (ADS)
Zhang, Tian-tian; Wang, Zhen-guo; Huang, Wei; Yan, Li
2018-04-01
Parametric modeling of aircraft plays a crucial role in the aerodynamic design process. Effective parametric approaches provide a large design space with few variables. Parametric methods in common use are summarized in this paper, and their principles are introduced briefly. Two-dimensional parametric methods include the B-spline method, the Class/Shape function Transformation method, the Parametric Section method, the Hicks-Henne method, and the Singular Value Decomposition method, all of which are widely used in airfoil design. This survey compares their capabilities in airfoil design, and the results show that the Singular Value Decomposition method has the best parametric accuracy. The development of three-dimensional parametric methods is limited, and the most popular is the Free-Form Deformation method. Methods extended from two-dimensional parametric methods show promise for aircraft modeling. Since different parametric methods differ in their characteristics, a real design process requires a flexible choice among them to suit the subsequent optimization procedure.
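As one concrete example of the two-dimensional methods surveyed, the sketch below implements the Class/Shape function Transformation idea for a single airfoil surface: a fixed class function supplies the round nose and sharp trailing edge, and a Bernstein-polynomial shape function with a handful of weights supplies the design variables. The specific weights and the four-term expansion are illustrative assumptions, not values from the survey.

```python
# Hypothetical sketch (not from the paper): CST parameterization of one airfoil surface.
import numpy as np
from math import comb

def cst_surface(x, weights, n1=0.5, n2=1.0, y_te=0.0):
    """y-coordinates of an airfoil surface at chordwise stations x in [0, 1]."""
    n = len(weights) - 1
    class_fn = x**n1 * (1.0 - x)**n2                       # round nose, sharp trailing edge
    shape_fn = sum(w * comb(n, i) * x**i * (1.0 - x)**(n - i)
                   for i, w in enumerate(weights))          # Bernstein expansion
    return class_fn * shape_fn + x * y_te                   # trailing-edge thickness term

x = np.linspace(0.0, 1.0, 101)
upper = cst_surface(x, weights=[0.17, 0.16, 0.15, 0.14])    # illustrative weights
lower = cst_surface(x, weights=[-0.14, -0.12, -0.10, -0.06])
print(float(upper.max()), float(lower.min()))               # rough thickness check
```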
Novel Round Energy Director for Use with Servo-driven Ultrasonic Welder
NASA Astrophysics Data System (ADS)
Savitski, Alex; Klinstein, Leo; Holt, Kenneth
Increasingly stringent process repeatability and assembly precision requirements are common in high-volume manufacturing for the electronic, automotive, and especially medical device industries, in which components for disposable medication delivery devices are produced in the hundreds of millions annually. Ultrasonic welding, one of the most efficient plastic welding processes and quite possibly the one most broadly adopted for high-volume assembly, often joins these small plastic parts together. The most fundamental factor in ultrasonic welding process performance is a proper joint design, most commonly one utilizing an energy director. Keeping the energy director size and shape consistent on a part-to-part basis in high-volume, multi-cavity operations presents a constant challenge to molded part vendors, as dimensional variations from cavity to cavity and variations in the molding process are always present. A newly developed energy director design, in which the tip of the energy director is round, addresses these problems, as the round energy director is significantly easier to mold and maintains its dimensional consistency. It also eliminates a major source of process variability for assembly operations. Realizing the benefits of the new joint design became possible with the introduction of servo-driven ultrasonic welders, which allow unprecedented control of material flow during the welding cycle and result in significantly improved process repeatability. This article summarizes the results of recent studies focused on evaluating the performance of the round energy director and investigating the main factors responsible for joint quality.
The Goal-Based Scenario Builder: Experiences with Novice Instructional Designers.
ERIC Educational Resources Information Center
Bell, Benjamin; Korcuska, Michael
Creating educational software generally requires a great deal of computer expertise, and as a result, educators lacking such knowledge have largely been excluded from the design process. Recently, researchers have been designing tools for automating some aspects of building instructional applications. These tools typically aim for generality,…
Integrated Design Tools Reduce Risk, Cost
NASA Technical Reports Server (NTRS)
2012-01-01
Thanks in part to an SBIR award with Langley Research Center, Phoenix Integration Inc., based in Wayne, Pennsylvania, modified and advanced software for process integration and design automation. For NASA, the tool has resulted in lower project costs and reductions in design time; clients of Phoenix Integration are experiencing the same rewards.
Scandurra, Isabella; Hägglund, Maria
2009-01-01
Introduction Integrated care involves different professionals, belonging to different care provider organizations and requires immediate and ubiquitous access to patient-oriented information, supporting an integrated view on the care process [1]. Purpose To present a method for development of usable and work process-oriented information and communication technology (ICT) systems for integrated care. Theory and method Based on Human-computer Interaction Science and in particular Participatory Design [2], we present a new collaborative design method in the context of health information systems (HIS) development [3]. This method implies a thorough analysis of the entire interdisciplinary cooperative work and a transformation of the results into technical specifications, via user validated scenarios, prototypes and use cases, ultimately leading to the development of appropriate ICT for the variety of occurring work situations for different user groups, or professions, in integrated care. Results and conclusions Application of the method in homecare of the elderly resulted in an HIS that was well adapted to the intended user groups. Conducted in multi-disciplinary seminars, the method captured and validated user needs and system requirements for different professionals, work situations, and environments not only for current work; it also aimed to improve collaboration in future (ICT supported) work processes. A holistic view of the entire care process was obtained and supported through different views of the HIS for different user groups, resulting in improved work in the entire care process as well as for each collaborating profession [4].
Design and fabrication of a freeform phase plate for high-order ocular aberration correction
NASA Astrophysics Data System (ADS)
Yi, Allen Y.; Raasch, Thomas W.
2005-11-01
In recent years it has become possible to measure and in some instances to correct the high-order aberrations of human eyes. We have investigated the correction of wavefront error of human eyes by using phase plates designed to compensate for that error. The wavefront aberrations of the four eyes of two subjects were experimentally determined, and compensating phase plates were machined with an ultraprecision diamond-turning machine equipped with four independent axes. A slow-tool servo freeform trajectory was developed for the machine tool path. The machined phase-correction plates were measured and compared with the original design values to validate the process. The position of the phase-plate relative to the pupil is discussed. The practical utility of this mode of aberration correction was investigated with visual acuity testing. The results are consistent with the potential benefit of aberration correction but also underscore the critical positioning requirements of this mode of aberration correction. This process is described in detail from optical measurements, through machining process design and development, to final results.
NASA Astrophysics Data System (ADS)
Whitehouse, C. R.; Barnett, S. J.; Soley, D. E. J.; Quarrell, J.; Aldridge, S. J.; Cullis, A. G.; Emeny, M. T.; Johnson, A. D.; Clarke, G. F.; Lamb, W.; Tanner, B. K.; Cottrell, S.; Lunn, B.; Hogg, C.; Hagston, W.
1992-01-01
This paper describes a unique combined UHV MBE growth x-ray topography facility designed to allow the first real-time synchrotron radiation x-ray topography study of strained-layer III-V growth processes. This system will enable unambiguous determination of dislocation nucleation and multiplication processes as a function of controlled variations in growth conditions, and also during post-growth thermal processing. The planned experiments have placed very stringent demands upon the engineering design of the system, and design details regarding the growth chamber; sample manipulator, x-ray optics, and real-time imaging systems are described. Results obtained during a feasibility study are also presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whitehouse, C.R.; Barnett, S.J.; Soley, D.E.J.
1992-01-01
This paper describes a unique combined UHV MBE growth x-ray topography facility designed to allow the first real-time synchrotron radiation x-ray topography study of strained-layer III-V growth processes. This system will enable unambiguous determination of dislocation nucleation and multiplication processes as a function of controlled variations in growth conditions, and also during post-growth thermal processing. The planned experiments have placed very stringent demands upon the engineering design of the system, and design details regarding the growth chamber, sample manipulator, x-ray optics, and real-time imaging systems are described. Results obtained during a feasibility study are also presented.
Optimization of rotor shaft shrink fit method for motor using "Robust design"
NASA Astrophysics Data System (ADS)
Toma, Eiji
2018-01-01
This research is a collaborative investigation with a general-purpose motor manufacturer. To review the assembly method used in the production process, we applied the parameter design method of quality engineering and approached the optimization of the assembly method. Conventionally, a press-fitting method has been adopted for fitting the rotor core and shaft, which are the main components of the motor, but quality defects such as core/shaft deflection occurred at the time of press fitting. In this research, as a result of the design optimization of a "shrink fitting method by high-frequency induction heating" devised as a new assembly method, the method proved feasible and it was possible to extract the optimum processing conditions.
Prediction and Estimation of Scaffold Strength with different pore size
NASA Astrophysics Data System (ADS)
Muthu, P.; Mishra, Shubhanvit; Sri Sai Shilpa, R.; Veerendranath, B.; Latha, S.
2018-04-01
This paper emphasizes the significance of predicting and estimating the mechanical strength of 3D functional scaffolds before the manufacturing process. Prior evaluation of the mechanical strength and structural properties of the scaffold reduces fabrication cost and eases the design process. Detailed analysis and investigation of various mechanical properties, including shear stress equivalence, have helped to estimate the effect of porosity and pore size on the functionality of the scaffold. The influence of variation in porosity was examined by a computational approach using finite element analysis (FEA) in the ANSYS application software. The results indicate the potential of this computational approach for regulating and optimizing the intricate engineering design process.
NASA Astrophysics Data System (ADS)
Tuzkaya, Umut R.; Eser, Arzum; Argon, Goner
2004-02-01
Today, growing amounts of waste due to the fast consumption rate of products have caused irreversible environmental pollution and damage. A considerable part of this waste is caused by packaging material. With the realization of this fact, important steps have been taken in waste policy. Here we consider a firm for which waste aluminum constitutes the majority of its raw materials. In order to achieve a profitable recycling process, the plant layout should be well designed. In this study, we propose a two-step approach involving the Analytic Hierarchy Process (AHP) and Data Envelopment Analysis (DEA) to solve facility layout design problems. A case example is considered to demonstrate the results achieved.
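To illustrate the AHP half of the proposed two-step approach, the sketch below derives priority weights for three candidate layouts from a single pairwise comparison matrix and checks judgment consistency. The matrix entries and the use of only three alternatives and one criterion are illustrative assumptions; the paper's actual hierarchy, criteria, and DEA stage are not reproduced here.

```python
# Hypothetical sketch (not the authors' model): AHP priority weights from a
# pairwise comparison matrix (Saaty scale), with a consistency-ratio check.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])            # layout1 vs layout2 vs layout3 (placeholder)

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                               # priority weights of the three layouts

lambda_max = eigvals.real[k]
n = A.shape[0]
ci = (lambda_max - n) / (n - 1)            # consistency index
cr = ci / 0.58                             # random index for n = 3
print(w, cr)                               # judgments usually accepted if CR < 0.1
```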
Cell design and manufacturing changes during the past decade
NASA Technical Reports Server (NTRS)
Baer, D. A.
1978-01-01
Eight of the most important changes that occurred in the GE 12 AH cell over the past ten years, and which are currently in use, are evaluated, and a systematic approach for comparing their relative merits is presented. Typical positive thickness, typical negative thickness, positive loading, negative loading, final KOH quantity, and precharge as adjustment are shown for the control cell and for the following variables: Teflon treatment; silver treatment; light loading; no PQ treatment; polypropylene separator; the A.K. 1968 plate design, no PQ, old elec process, no decarb process; and the A.K. 1968 plate design, no PQ, present aerospace processes. The acceptance test cell voltage and cell pressure performance and capacity test results are included.
Towards Zero-Waste Furniture Design.
Koo, Bongjin; Hergel, Jean; Lefebvre, Sylvain; Mitra, Niloy J
2017-12-01
In traditional design, shapes are first conceived and then fabricated. While this decoupling simplifies the design process, it can result in unwanted material wastage, especially where off-cut pieces are hard to reuse. In the absence of explicit feedback on material usage, the designer cannot effectively adapt the design, even when design variations exist. We investigate waste-minimizing furniture design, wherein, based on the current design, the user is presented with design variations that result in less material wastage. Technically, we dynamically analyze the material space layout to determine which parts to change and how, while maintaining the original design intent specified in the form of design constraints. We evaluate the approach on various design scenarios and demonstrate effective material usage that is difficult, if not impossible, to achieve without computational support.
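The feedback loop argued for above depends on being able to quantify off-cut waste for a given design. The sketch below is a deliberately crude, hypothetical stand-in for that feedback: it estimates waste when 1D parts are cut from fixed-length stock boards using a first-fit heuristic, which is far simpler than the 2D material-space layout analysis in the paper; all lengths are placeholders.

```python
# Hypothetical sketch (not the paper's algorithm): first-fit estimate of off-cut
# waste for 1D parts cut from fixed-length stock boards, as simple design feedback.
def waste_fraction(part_lengths, stock_length):
    boards = []                                    # remaining length of each opened board
    for part in sorted(part_lengths, reverse=True):
        for i, rest in enumerate(boards):
            if part <= rest:
                boards[i] = rest - part            # place part in first board that fits
                break
        else:
            boards.append(stock_length - part)     # open a new board
    purchased = len(boards) * stock_length
    return sum(boards) / purchased                 # fraction of purchased stock wasted

design_a = [1.8, 1.8, 0.9, 0.9, 0.6]               # part lengths in metres (placeholders)
design_b = [2.0, 1.2, 1.2, 0.8, 0.8]               # a variation of the same furniture
print(waste_fraction(design_a, 2.44), waste_fraction(design_b, 2.44))
```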
Michaels-Igbokwe, Christine; Lagarde, Mylene; Cairns, John; Terris-Prestholt, Fern
2014-03-01
The process of designing and developing discrete choice experiments (DCEs) is often under reported. The need to adequately report the results of qualitative work used to identify attributes and levels used in a DCE is recognised. However, one area that has received relatively little attention is the exploration of the choice question of interest. This paper provides a case study of the process used to design a stated preference survey to assess youth preferences for integrated sexual and reproductive health (SRH) and HIV outreach services in Malawi. Development and design consisted of six distinct but overlapping and iterative stages. Stage one was a review of the literature. Stage two involved developing a decision map to conceptualise the choice processes involved. Stage three included twelve focus group discussions with young people aged 15-24 (n = 113) and three key informant interviews (n = 3) conducted in Ntcheu District, Malawi. Stage four involved analysis of qualitative data and identification of potential attributes and levels. The choice format and experimental design were selected in stages five and six. The results of the literature review were used to develop a decision map outlining the choices that young people accessing SRH services may face. For youth that would like to use services two key choices were identified: the choice between providers and the choice of service delivery attributes within a provider type. Youth preferences for provider type are best explored using a DCE with a labelled design, while preferences for service delivery attributes associated with a particular provider are better understood using an unlabelled design. Consequently, two DCEs were adopted to jointly assess preferences in this context. Used in combination, the results of the literature review, the decision mapping process and the qualitative work provided robust approach to designing the DCEs individually and as complementary pieces of work. Copyright © 2014 Elsevier Ltd. All rights reserved.
Simulation-Driven Design Approach for Design and Optimization of Blankholder
NASA Astrophysics Data System (ADS)
Sravan, Tatipala; Suddapalli, Nikshep R.; Johan, Pilthammar; Mats, Sigvant; Christian, Johansson
2017-09-01
Reliable design of stamping dies is desired for efficient and safe production. The design of stamping dies is today mostly based on casting feasibility, although it can also be based on criteria such as fatigue, stiffness, safety, and economy. The current work presents an approach built on simulation-driven design, enabling design optimization to address this issue. A structural finite element model of a stamping die, used to produce doors for the Volvo V70/S80 car models, is studied. This die had developed cracks during its use. To understand the stress distribution in the stamping die, structural analysis of the die is conducted and critical regions with high stresses are identified. The results from the structural FE models are compared with analytical calculations pertaining to the fatigue properties of the material. To arrive at an optimum design with increased stiffness and lifetime, topology and free-shape optimization are performed. In the optimization routine, the identified critical regions of the die are set as design variables. Other optimization variables are set to maintain manufacturability of the resulting stamping die. Thereafter a CAD model is built based on the geometrical results from the topology and free-shape optimizations. The CAD model is then subjected to structural analysis to visualize the new stress distribution. This process is iterated until a satisfactory result is obtained. The final results show a reduction in stress levels of 70% with a more homogeneous distribution. Even though the mass of the die increased by 17%, overall a stiffer die with a longer lifetime is obtained. Finally, by reflecting on the entire process, a coordinated approach to handle such situations efficiently is presented.
Design and Development of a Real-Time Model Attitude Measurement System for Hypersonic Facilities
NASA Technical Reports Server (NTRS)
Jones, Thomas W.; Lunsford, Charles B.
2005-01-01
A series of wind tunnel tests have been conducted to evaluate a multi-camera videogrammetric system designed to measure model attitude in hypersonic facilities. The technique utilizes processed video data and applies photogrammetric principles for point tracking to compute model position including pitch, roll and yaw variables. A discussion of the constraints encountered during the design, development, and testing process, including lighting, vibration, operational range and optical access is included. Initial measurement results from the NASA Langley Research Center (LaRC) 31-Inch Mach 10 tunnel are presented.
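A minimal sketch of the attitude-computation idea, assuming the multi-camera system has already triangulated 3D target coordinates on the model: a least-squares rigid-body (Kabsch/SVD) fit between reference-pose and observed points yields a rotation matrix, from which pitch, roll, and yaw are extracted. The point sets, the Euler-angle convention, and the synthetic check are assumptions, not details of the NASA LaRC system.

```python
# Hypothetical sketch (not the NASA system): model attitude from tracked 3D targets
# via a least-squares rigid-body (Kabsch/SVD) fit.
import numpy as np

def attitude_from_points(ref_pts, obs_pts):
    """Best-fit rotation taking reference-pose points to observed points; angles in degrees."""
    P = ref_pts - ref_pts.mean(axis=0)
    Q = obs_pts - obs_pts.mean(axis=0)
    U, _, Vt = np.linalg.svd(P.T @ Q)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                                  # best-fit rotation matrix
    pitch = np.degrees(np.arcsin(-R[2, 0]))             # ZYX Euler convention assumed
    roll = np.degrees(np.arctan2(R[2, 1], R[2, 2]))
    yaw = np.degrees(np.arctan2(R[1, 0], R[0, 0]))
    return pitch, roll, yaw

# Synthetic check: targets on a model pitched up by 10 degrees about the y-axis.
ref = np.array([[0.0, 0.0, 0.0], [0.3, 0.0, 0.0], [0.0, 0.1, 0.0], [0.1, 0.05, 0.02]])
a = np.radians(10.0)
Ry = np.array([[np.cos(a), 0, np.sin(a)], [0, 1, 0], [-np.sin(a), 0, np.cos(a)]])
print(attitude_from_points(ref, ref @ Ry.T))            # approximately (10, 0, 0)
```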
Design and Development of a Real-Time Model Attitude Measurement System for Hypersonic Facilities
NASA Technical Reports Server (NTRS)
Jones, Thomas W.; Lunsford, Charles B.
2004-01-01
A series of wind tunnel tests have been conducted to evaluate a multi-camera videogrammetric system designed to measure model attitude in hypersonic facilities. The technique utilizes processed video data and applies photogrammetric principles for point tracking to compute model position including pitch, roll and yaw variables. A discussion of the constraints encountered during the design, development, and testing process, including lighting, vibration, operational range and optical access is included. Initial measurement results from the NASA Langley Research Center (LaRC) 31-Inch Mach 10 tunnel are presented.
A Taguchi study of the aeroelastic tailoring design process
NASA Technical Reports Server (NTRS)
Bohlmann, Jonathan D.; Scott, Robert C.
1991-01-01
A Taguchi study was performed to determine the important players in the aeroelastic tailoring design process and to find the best composition of the optimization's objective function. The Wing Aeroelastic Synthesis Procedure (TSO) was used to ascertain the effects that factors such as composite laminate constraints, roll effectiveness constraints, and built-in wing twist and camber have on the optimum, aeroelastically tailored wing skin design. The results show the Taguchi method to be a viable engineering tool for computational inquiries, and provide some valuable lessons about the practice of aeroelastic tailoring.
Design of the storage location based on the ABC analyses
NASA Astrophysics Data System (ADS)
Jemelka, Milan; Chramcov, Bronislav; Kříž, Pavel
2016-06-01
The paper focuses on process efficiency and reducing storage costs. Maintaining inventory through a putaway strategy takes personnel time and costs money. The aim is to control inventory in the best possible way. ABC classification, based on Vilfredo Pareto's theory, is used for the design of the warehouse layout. The new design of storage locations reduces forklift travel distance and total costs and increases inventory process efficiency. The suggested solutions and an evaluation of the achieved results are described in detail. The proposed solutions were implemented in a real warehouse operation.
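A minimal sketch of the ABC classification step, assuming annual pick counts per SKU and the usual Pareto-style cumulative cut-offs of roughly 80%/95%; the cut-offs and demo data are placeholders, and assigning A items to the storage locations closest to dispatch would follow from the resulting classes.

```python
# Hypothetical sketch (not the authors' procedure): ABC classification of SKUs by
# cumulative share of picks, the basis for placing A items nearest to dispatch.
def abc_classes(picks_per_sku, a_cut=0.8, b_cut=0.95):
    """Map SKU -> 'A'/'B'/'C' from pick counts using Pareto cumulative shares."""
    total = sum(picks_per_sku.values())
    classes, cumulative = {}, 0.0
    for sku, picks in sorted(picks_per_sku.items(), key=lambda kv: kv[1], reverse=True):
        cumulative += picks / total
        classes[sku] = "A" if cumulative <= a_cut else "B" if cumulative <= b_cut else "C"
    return classes

demo = {"sku1": 900, "sku2": 450, "sku3": 120, "sku4": 60, "sku5": 20, "sku6": 10}
print(abc_classes(demo))   # A items would then be assigned the closest storage locations
```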
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Neil, D.J.; Colcord, A.R.; Bery, M.K.
The objective of this project is to design, fabricate, and operate a fermentation facility which will demonstrate on a pilot-scale level (3 oven-dry tons (ODT) per day of feedstock) the economic and technical feasibility of producing anhydrous ethyl alcohol from lignocellulosic biomass residues (wood, corn stover, and wheat straw principally). The resultant process development unit (PDU) will be flexibly designed so as to evaluate current and projected unit operations, materials of construction, chemical and enzymatic systems which offer the potential of significant technological and economic breakthroughs in alcohol production from biomass. The principal focus of the project is to generate fuels from biomass. As such, in addition to alcohol which can be used as a transportation fuel, by-products are to be directed where possible to fuel applications. The project consists of two parts: (1) conceptual design, and (2) detailed engineering design. The first quarter's activities have focused on a critical review of several aspects of the conceptual design of the 3 ODT/day PDU, viz.: (1) biomass cost, availability, and characterization; (2) pretreatment processes for lignocellulosic residues; (3) hydrolytic processes (enzymatic and acidic); (4) fermentation processes; (5) alcohol recovery systems; (6) by-product streams utilization; and (7) process economics.
A noninterference blade vibration measurement system for gas turbine engines
NASA Astrophysics Data System (ADS)
Watkins, William B.; Chi, Ray M.
1987-06-01
A noninterfering blade vibration system has been demonstrated in tests of a gas turbine first-stage fan. Conceptual design of the system, including its theory, the design of case-mounted probes, and the data acquisition and signal processing hardware, was done in a previous effort. The current effort involved instrumentation of an engine fan stage with strain gages; data acquisition using shaft-mounted reference and case-mounted optical probes; recording of data on a wideband tape recorder; and posttest processing using off-line analysis in a facility computer and a minicomputer-based readout system designed for near-real-time readout. Results are presented in terms of true blade vibration frequencies, time- and frequency-dependent vibration amplitudes, and a comparison of the optical noninterference results with strain gage readings.
User-centric design of a personal assistance robot (FRASIER) for active aging.
Padir, Taşkin; Skorinko, Jeanine; Dimitrov, Velin
2015-01-01
We present our preliminary results from the design process for developing the Worcester Polytechnic Institute's personal assistance robot, FRASIER, as an intelligent service robot for enabling active aging. The robot's capabilities include vision-based object detection, user tracking, and help with carrying heavy items such as grocery bags or cafeteria trays. This work-in-progress report outlines our motivation and approach to developing the next generation of service robots for the elderly. Our main contribution in this paper is the development of a set of specifications based on the adopted user-centered design process, and the realization of a prototype system designed to meet these specifications.
Design and calibration of the carousel wind tunnel
NASA Technical Reports Server (NTRS)
Leach, R. N.; Greeley, R.; Iversen, J.; White, B.; Marshall, J. R.
1986-01-01
In the study of planetary aeolian processes the effect of gravity is not readily modeled. Gravity appears in the equations of particle motion along with interparticle forces, but the two terms are not separable. A wind tunnel that permitted variable gravity would allow separation of the forces and aid greatly in understanding planetary aeolian processes. The design of the Carousel Wind Tunnel (CWT) allows for a long flow distance in a small tunnel, since the test section is a continuous circuit, and allows for a variable pseudo-gravity. A prototype design was built and calibrated to gain some understanding of the characteristics of the design, and the results are presented.
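One way to see why the two effects cannot be separated at fixed gravity is to write an illustrative expression (not taken from this report) for the threshold friction speed, in which a gravity term and an interparticle-cohesion term add under a single square root:

$$ u_{*t} \approx A \sqrt{\frac{(\rho_p - \rho)\, g\, d}{\rho} + \frac{\Gamma}{\rho\, d}} $$

where $\rho_p$ and $\rho$ are the particle and fluid densities, $d$ is the particle diameter, $g$ is gravity, $\Gamma$ is a cohesion parameter, and $A$ is an empirical coefficient. At fixed $g$, any fit to measured thresholds confounds the two terms; only by varying the effective gravity independently, as the carousel tunnel's pseudo-gravity allows, can their contributions be distinguished.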
Design and calibration of the carousel wind tunnel
NASA Technical Reports Server (NTRS)
Leach, R. N.; Greeley, Ronald; Iversen, James D.; White, Bruce R.; Marshall, John R.
1987-01-01
In the study of planetary aeolian processes the effect of gravity is not readily modeled. Gravity appears in the equations of particle motion along with interparticle forces, but the two terms are not separable. A wind tunnel that permitted variable gravity would allow separation of the forces and aid greatly in understanding planetary aeolian processes. The design of the Carousel Wind Tunnel (CWT) allows for a long flow distance in a small tunnel, since the test section is a continuous circuit, and allows for a variable pseudo-gravity. A prototype design was built and calibrated to gain some understanding of the characteristics of the design, and the results are presented.
A process for prototyping onboard payload displays for Space Station Freedom
NASA Technical Reports Server (NTRS)
Moore, Loretta A.
1992-01-01
Significant advances have been made in the area of Human-Computer Interface design. However, there is no well-defined process for going from user interface requirements to user interface design. Developing and designing a clear and consistent user interface for medium- to large-scale systems is a very challenging and complex task. The task becomes increasingly difficult when there is very little guidance, and few defined procedures, on how the development process should flow from one stage to the next. Without a specific sequence of development steps, each design becomes difficult to repeat, to evaluate, to improve, and to articulate to others. This research contributes a process which identifies the phases of development, and the products produced as a result of each phase, for a rapid prototyping process to be used to develop requirements for the onboard payload displays for Space Station Freedom. The functional components of a dynamic prototyping environment in which this process can be carried out are also discussed. Some of the central questions which are answered here include: How does one go from specifications to an actual prototype? How is a prototype evaluated? How is usability defined and thus measured? How do we use the information from evaluation in redesign of an interface? And are there techniques which allow for convergence on a design?
NASA Technical Reports Server (NTRS)
Yang, Genevie Velarde; Mohr, David; Kirby, Charles E.
2008-01-01
To keep Cassini on its complex trajectory, more than 200 orbit trim maneuvers (OTMs) have been planned from July 2004 to July 2010. With only a few days between many of these OTMs, the operations process of planning and executing the necessary commands had to be automated. The resulting Maneuver Automation Software (MAS) process minimizes the workforce required for, and maximizes the efficiency of, the maneuver design and uplink activities. The MAS process is a well-organized and logically constructed interface between Cassini's Navigation (NAV), Spacecraft Operations (SCO), and Ground Software teams. Upon delivery of an orbit determination (OD) from NAV, the MAS process can generate a maneuver design and all related uplink and verification products within 30 minutes. To date, all 112 OTMs executed by the Cassini spacecraft have been successful. MAS was even used to successfully design and execute a maneuver while the spacecraft was in safe mode.
Multi-Attribute Tradespace Exploration in Space System Design
NASA Astrophysics Data System (ADS)
Ross, A. M.; Hastings, D. E.
2002-01-01
The complexity inherent in space systems necessarily requires intense expenditures of resources, both human and monetary. The high level of ambiguity present in the early design phases of these systems causes long, highly iterative, and costly design cycles. This paper looks at incorporating decision theory methods into the early design processes to streamline communication of wants and needs among stakeholders and between levels of design. Communication channeled through formal utility interviews and analysis enables engineers to better understand the key drivers for the system and allows a more thorough exploration of the design tradespace. Multi-Attribute Tradespace Exploration (MATE), an evolving process incorporating decision theory into model- and simulation-based design, has been applied to several space system case studies at MIT. Preliminary results indicate that this process can improve the quality of communication to more quickly resolve project ambiguity, and enable the engineer to discover better value designs for multiple stakeholders. MATE is also being integrated into a concurrent design environment to facilitate the transfer of knowledge about important drivers into higher-fidelity design phases. Formal utility theory provides a mechanism to bridge the language barrier between experts of different backgrounds and differing needs (e.g. scientists, engineers, managers, etc). MATE with concurrent design couples decision makers more closely to the design, and most importantly, maintains their presence between formal reviews.
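A minimal sketch of how multi-attribute utility can rank candidate designs in a tradespace, assuming an additive utility form, linear single-attribute utility curves, and made-up attributes, weights, and design data; in MATE these functions and weights are elicited from stakeholders through formal utility interviews rather than assumed.

```python
# Hypothetical sketch (not the MATE toolset): scoring candidate space-system designs
# with a weighted, additive multi-attribute utility for tradespace exploration.

def u_linear(x, worst, best):
    """Single-attribute utility in [0, 1], linear between worst and best levels."""
    return max(0.0, min(1.0, (x - worst) / (best - worst)))

weights = {"coverage": 0.5, "data_rate": 0.3, "lifetime": 0.2}   # placeholder stakeholder weights

def multi_attribute_utility(design):
    u = {
        "coverage": u_linear(design["coverage"], worst=0.2, best=0.9),
        "data_rate": u_linear(design["data_rate"], worst=1.0, best=50.0),   # Mbps
        "lifetime": u_linear(design["lifetime"], worst=2.0, best=10.0),     # years
    }
    return sum(weights[a] * u[a] for a in weights)    # additive utility form assumed

designs = [
    {"name": "small sat", "coverage": 0.4, "data_rate": 5.0, "lifetime": 3.0, "cost": 80},
    {"name": "large sat", "coverage": 0.8, "data_rate": 30.0, "lifetime": 8.0, "cost": 300},
]
for d in designs:
    print(d["name"], round(multi_attribute_utility(d), 3), d["cost"])   # utility vs cost tradespace
```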
PRECIPITATION CHEMISTRY OF MAGNESIUM SULFITE HYDRATES IN MAGNESIUM OXIDE SCRUBBING
The report gives results of laboratory studies defining the precipitation chemistry of MgSO3 hydrates. The results apply to the design of Mg-based scrubbing processes for SO2 removal from combustion flue gas. In Mg-based scrubbing processes, MgSO3 precipitates as either trihydrat...
Conceptual design of single turbofan engine powered light aircraft
NASA Technical Reports Server (NTRS)
Snyder, F. S.; Voorhees, C. G.; Heinrich, A. M.; Baisden, D. N.
1977-01-01
The conceptual design of a four-place, single-turbofan-engine-powered light aircraft was accomplished utilizing contemporary light aircraft conventional design techniques as a means of evaluating the NASA-Ames General Aviation Synthesis Program (GASP) as a preliminary design tool. In certain areas, disagreements or exclusions were found to exist between the results of the conventional design process and the GASP process. A detailed discussion of these points, along with the associated contemporary design methodology, is presented.
Reusable Launch Vehicle Tank/Intertank Sizing Trade Study
NASA Technical Reports Server (NTRS)
Dorsey, John T.; Myers, David E.; Martin, Carl J.
2000-01-01
A tank and intertank sizing tool that includes effects of major design drivers, and which allows parametric studies to be performed, has been developed and calibrated against independent representative results. Although additional design features, such as bulkheads and field joints, are not currently included in the process, the improved level of fidelity has allowed parametric studies to be performed which have resulted in understanding of key tank and intertank design drivers, design sensitivities, and definition of preferred design spaces. The sizing results demonstrated that there were many interactions between the configuration parameters of internal/external payload, vehicle fineness ratio (half body angle), fuel arrangement (LOX-forward/LOX-aft), number of tanks, and tank shape/arrangement (number of lobes).
Efficient Robust Optimization of Metal Forming Processes using a Sequential Metamodel Based Strategy
NASA Astrophysics Data System (ADS)
Wiebenga, J. H.; Klaseboer, G.; van den Boogaard, A. H.
2011-08-01
The coupling of Finite Element (FE) simulations to mathematical optimization techniques has contributed significantly to product improvements and cost reductions in the metal forming industries. The next challenge is to bridge the gap between deterministic optimization techniques and the industrial need for robustness. This paper introduces a new and generally applicable structured methodology for modeling and solving robust optimization problems. Stochastic design variables or noise variables are taken into account explicitly in the optimization procedure. The metamodel-based strategy is combined with a sequential improvement algorithm to efficiently increase the accuracy of the objective function prediction. This is only done at regions of interest containing the optimal robust design. Application of the methodology to an industrial V-bending process resulted in valuable process insights and an improved robust process design. Moreover, a significant improvement of the robustness (>2σ) was obtained by minimizing the deteriorating effects of several noise variables. The robust optimization results demonstrate the general applicability of the robust optimization strategy and underline the importance of including uncertainty and robustness explicitly in the numerical optimization procedure.
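For illustration, the sketch below shows the general pattern of a sequential metamodel-based robust optimization loop of the kind the abstract describes: an inexpensive metamodel is fit over design and noise variables, a robust objective (mean plus a multiple of the standard deviation over sampled noise) is minimized on the metamodel, and new simulations are added only near the predicted robust optimum. The simulation function, the quadratic metamodel, the variable ranges, and all numbers are hypothetical stand-ins, not the paper's FE models or data.

```python
# Minimal sketch of a sequential metamodel-based robust optimization loop (assumed setup).
import numpy as np
from scipy.optimize import minimize_scalar

def simulate(x, z):
    """Placeholder for an expensive FE response f(design x, noise z)."""
    return (x - 1.5) ** 2 + 0.8 * z * x + 0.1 * z ** 2

def fit_quadratic(X, Z, Y):
    """Least-squares fit of y ~ 1, x, z, x^2, z^2, x*z (a simple metamodel)."""
    A = np.column_stack([np.ones_like(X), X, Z, X**2, Z**2, X * Z])
    coef, *_ = np.linalg.lstsq(A, Y, rcond=None)
    return coef

def predict(coef, x, z):
    return coef @ np.array([1.0, x, z, x**2, z**2, x * z])

def robustness(coef, x, z_samples, k=3.0):
    """Robust objective: mean + k*std over sampled noise (smaller is better)."""
    y = np.array([predict(coef, x, z) for z in z_samples])
    return y.mean() + k * y.std()

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 3.0, 12)          # initial design-variable samples
Z = rng.normal(0.0, 1.0, 12)           # noise-variable samples
Y = np.array([simulate(x, z) for x, z in zip(X, Z)])
z_mc = rng.normal(0.0, 1.0, 200)       # noise samples for the robustness estimate

for it in range(5):                    # sequential improvement iterations
    coef = fit_quadratic(X, Z, Y)
    res = minimize_scalar(lambda x: robustness(coef, x, z_mc),
                          bounds=(0.0, 3.0), method="bounded")
    x_star = res.x
    # Infill: add new simulations only near the predicted robust optimum,
    # improving metamodel accuracy in the region of interest.
    z_new = rng.normal(0.0, 1.0, 3)
    x_new = np.clip(x_star + rng.normal(0.0, 0.1, 3), 0.0, 3.0)
    X = np.concatenate([X, x_new]); Z = np.concatenate([Z, z_new])
    Y = np.concatenate([Y, [simulate(x, z) for x, z in zip(x_new, z_new)]])
    print(f"iteration {it}: robust design x* ≈ {x_star:.3f}")
```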
WaveJava: Wavelet-based network computing
NASA Astrophysics Data System (ADS)
Ma, Kun; Jiao, Licheng; Shi, Zhuoer
1997-04-01
Wavelet is a powerful theory, but its successful application still needs suitable programming tools. Java is a simple, object-oriented, distributed, interpreted, robust, secure, architecture-neutral, portable, high-performance, multi-threaded, dynamic language. This paper addresses the design and development of a cross-platform software environment for experimenting with and applying wavelet theory. WaveJava, a wavelet class library designed with object-oriented programming, is developed to take advantage of wavelet features such as multi-resolution analysis and parallel processing in network computing. A new application architecture is designed for the net-wide distributed client-server environment. The data are transmitted as multi-resolution packets. At the distributed sites around the net, these data packets undergo matching or recognition processing in parallel, and the results are fed back to determine the next operation, so more robust results can be reached quickly. WaveJava is easy to use and to extend for special applications. This paper gives a solution for a distributed fingerprint information processing system. It also fits other net-based multimedia information processing, such as network libraries, remote teaching, and filmless picture archiving and communications.
NASA Astrophysics Data System (ADS)
Srinivasagupta, Deepak; Kardos, John L.
2004-05-01
Injected pultrusion (IP) is an environmentally benign continuous process for low-cost manufacture of prismatic polymer composites. IP has been of recent regulatory interest as an option to achieve significant vapour emissions reduction. This work describes the design of the IP process with multiple design objectives. In our previous work (Srinivasagupta D et al 2003 J. Compos. Mater. at press), an algorithm for economic design using a validated three-dimensional physical model of the IP process was developed, subject to controllability considerations. In this work, this algorithm was used in a multi-objective optimization approach to simultaneously meet economic, quality related, and environmental objectives. The retrofit design of a bench-scale set-up was considered, and the concept of exergy loss in the process, as well as in vapour emission, was introduced. The multi-objective approach was able to determine the optimal values of the processing parameters such as heating zone temperatures and resin injection pressure, as well as the equipment specifications (die dimensions, heater, puller and pump ratings) that satisfy the various objectives in a weighted sense, and result in enhanced throughput rates. The economic objective did not coincide with the environmental objective, and a compromise became necessary. It was seen that most of the exergy loss is in the conversion of electric power into process heating. Vapour exergy loss was observed to be negligible for the most part.
ERIC Educational Resources Information Center
McKee, Richard Lee
This master's thesis reports the results of a survey submitted to over 30 colleges and universities that currently offer computer graphics courses or are in the planning stage of curriculum design. Intended to provide a profile of the computer graphics programs and insight into the process of curriculum design, the survey gathered data on program…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liese, Eric; Zitney, Stephen E.
A multi-stage centrifugal compressor model is presented with emphasis on analyzing the use of an exit flow coefficient vs. an inlet flow coefficient performance parameter to predict off-design conditions in the critical region of a supercritical carbon dioxide (CO2) power cycle. A description of the performance parameters is given along with their implementation in a design model (number of stages, basic sizing, etc.) and a dynamic model (for use in transient studies). A design case is shown for two compressors, a bypass compressor and a main compressor, as defined in a process simulation of a 10 megawatt (MW) supercritical CO2 recompression Brayton cycle. Simulation results are presented for a simple open cycle and a closed cycle process with changes to the inlet temperature of the main compressor, which operates near the CO2 critical point. Results showed some difference when using the exit vs. inlet flow coefficient correction; however, it was not significant for the range of conditions examined. This paper also serves as a reference for future works, including a full process simulation of the 10 MW recompression Brayton cycle.
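As a purely illustrative aside, one common textbook form of a flow coefficient is phi = Q / (U * D^2), where Q is the volumetric flow, U the impeller tip speed, and D the impeller diameter; the "inlet" and "exit" variants then differ only in which density is used to convert mass flow to volumetric flow. The definition, geometry, and densities below are assumptions for illustration and are not the correction actually implemented in the paper.

```python
# Illustrative sketch of inlet vs. exit flow coefficients under an assumed definition.
def flow_coefficient(m_dot, rho, U, D):
    """phi = Q / (U * D^2), with Q = m_dot / rho (assumed textbook definition)."""
    Q = m_dot / rho
    return Q / (U * D**2)

m_dot = 90.0        # kg/s, hypothetical CO2 mass flow
D = 0.25            # m, hypothetical impeller diameter
N = 3000.0          # rad/s, hypothetical shaft speed
U = 0.5 * D * N     # impeller tip speed

rho_inlet = 600.0   # kg/m^3, near the CO2 critical point (illustrative value)
rho_exit = 750.0    # kg/m^3, after compression (illustrative value)

phi_in = flow_coefficient(m_dot, rho_inlet, U, D)
phi_ex = flow_coefficient(m_dot, rho_exit, U, D)
print(f"inlet flow coefficient: {phi_in:.4f}")
print(f"exit flow coefficient:  {phi_ex:.4f}")
# Near the critical point the inlet density changes strongly with inlet temperature,
# so the two coefficients can diverge at off-design conditions -- one plausible
# reason for comparing the two parameters, though the paper's own conclusion was
# that the difference was not significant for the range examined.
```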
Moraes, A S P; Arezes, P M; Vasconcelos, R
2012-01-01
The development of ergonomics recommendations, guidelines, and standards represents an attempt to promote the integration of ergonomics into industrial contexts. Such developments result from several sources and professionals and represent the effort that has been made to develop healthier and safer work environments. However, the availability of a large amount of data and documents regarding ergonomics does not guarantee their applicability. The main goal of this paper is to use a specific case to demonstrate how ergonomics criteria were developed in order to contribute to the design of workplaces. Based on the results obtained from research undertaken in a tire company, it was observed that ergonomics criteria should be presented as design specifications in order to be used by engineers and designers. In conclusion, it is observed that the multiple-constraint environment impeded the application of the ergonomics criteria. It was also observed that knowledge of technical design, acquaintance with ergonomic standards, the level of integration in the design team, and the ability to communicate with workers and other technical staff are of paramount importance in integrating ergonomics criteria into the design process.
NASA Astrophysics Data System (ADS)
Qin, Xunpeng; Gao, Kai; Zhu, Zhenhua; Chen, Xuliang; Wang, Zhou
2017-09-01
The spot continual induction hardening (SCIH) process, a modified induction hardening process, can be mounted on a five-axis cooperating computer numerical control machine tool to strengthen more than one small area, or a relatively large area, on a complicated component surface. In this study, a response surface method was used to optimize the phase transformation region after the SCIH process. The effects of five process parameters, including feed velocity, input power, gap, curvature, and flow rate, on temperature, microstructure, microhardness, and phase transformation geometry were investigated. Central composite design, a second-order response surface design, was employed to systematically estimate the empirical models of temperature and phase transformation geometry. The analysis results indicated that feed velocity has a dominant effect on the uniformity of microstructure and microhardness, domain size, oxidized track width, and phase transformation width and height in the SCIH process, while curvature has the largest effect on center temperature in the design space. The optimum operating conditions, with desirability values of 0.817, 0.845, and 0.773, are expected to minimize the tempering-region ratio and maximize the phase transformation width for concave, flat, and convex surface workpieces, respectively. The verification result indicated that the process parameters obtained by the model were reliable.
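To make the response surface step concrete, the sketch below fits a second-order polynomial model to runs laid out as a face-centered central composite design. Only two of the five factors are shown for brevity, and the coded levels and response values are hypothetical illustrations rather than the study's measurements.

```python
# Minimal sketch of fitting a second-order response surface to CCD runs (assumed data).
import numpy as np

# Coded levels for a two-factor face-centered CCD: factorial, axial, and center points.
factorial = [(-1, -1), (-1, 1), (1, -1), (1, 1)]
axial = [(-1, 0), (1, 0), (0, -1), (0, 1)]
center = [(0, 0)] * 3
design = np.array(factorial + axial + center, dtype=float)

# Hypothetical response, e.g. phase-transformation width in mm, one value per run.
y = np.array([2.1, 2.9, 1.6, 2.4, 2.5, 1.9, 1.8, 2.8, 2.2, 2.3, 2.2])

# Second-order model: y ~ b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
x1, x2 = design[:, 0], design[:, 1]
A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print("fitted coefficients (b0, b1, b2, b11, b22, b12):", np.round(coef, 3))

def predict(v, p):
    """Predicted response at coded feed velocity v and input power p."""
    return coef @ np.array([1.0, v, p, v**2, p**2, v * p])

# Simple grid search over the coded design space for the largest predicted width.
grid = np.linspace(-1, 1, 41)
best = max(((predict(v, p), v, p) for v in grid for p in grid), key=lambda t: t[0])
print(f"max predicted width {best[0]:.2f} at coded (velocity, power) = ({best[1]:.2f}, {best[2]:.2f})")
```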
Modeling and Simulation for Mission Operations Work System Design
NASA Technical Reports Server (NTRS)
Sierhuis, Maarten; Clancey, William J.; Seah, Chin; Trimble, Jay P.; Sims, Michael H.
2003-01-01
Work system analysis and design is complex and non-deterministic. In this paper we describe Brahms, a multiagent modeling and simulation environment for designing complex interactions in human-machine systems. Brahms was originally conceived as a business process design tool that simulates work practices, including the social systems of work. We describe our modeling and simulation method for mission operations work systems design, based on a research case study in which we used Brahms to design mission operations for a proposed discovery mission to the Moon. We then describe the results of an actual method application project: the Brahms Mars Exploration Rover. Space mission operations are similar to operations of traditional organizations; we show that the application of Brahms for space mission operations design is relevant and transferable to other types of business processes in organizations.
Building configuration and seismic design: The architecture of earthquake resistance
NASA Astrophysics Data System (ADS)
Arnold, C.; Reitherman, R.; Whitaker, D.
1981-05-01
The architecture of a building in relation to its ability to withstand earthquakes is examined. Aspects of ground motion that are significant to building behavior are discussed. Results of a survey of configuration decisions that affect the performance of buildings, with a focus on the architectural aspects of configuration design, are provided. Configuration derivation, building type as it relates to seismic design, and seismic issues in the design process are examined. Case studies of the Veterans' Administration Hospital in Loma Linda, California, and the Imperial Hotel in Tokyo, Japan, are presented. The seismic design process is described, paying special attention to configuration issues. The need is stressed for guidelines, codes, and regulations to ensure design solutions that respect and balance the full range of architectural, engineering, and material influences on seismic hazards.
Modeling, simulation, and control of an extraterrestrial oxygen production plant
NASA Technical Reports Server (NTRS)
Schooley, L.; Cellier, F.; Zeigler, B.; Doser, A.; Farrenkopf, G.
1991-01-01
The immediate objective is the development of a new methodology for simulation of process plants used to produce oxygen and/or other useful materials from local planetary resources. Computer communication, artificial intelligence, smart sensors, and distributed control algorithms are being developed and implemented so that the simulation or an actual plant can be controlled from a remote location. The ultimate result of this research will provide the capability for teleoperation of such process plants which may be located on Mars, Luna, an asteroid, or other objects in space. A very useful near-term result will be the creation of an interactive design tool, which can be used to create and optimize the process/plant design and the control strategy. This will also provide a vivid, graphic demonstration mechanism to convey the results of other researchers to the sponsor.
Teżyk, Michał; Jakubowska, Emilia; Milanowski, Bartłomiej; Lulek, Janina
2017-10-01
The aim of this study was to optimize the tablet compression process and to identify the film-coating critical process parameters (CPPs) affecting critical quality attributes (CQAs), using a quality by design (QbD) approach. Design of experiments (DOE) and regression methods were employed to investigate the hardness, disintegration time, and thickness of uncoated tablets depending on the slugging and tableting compression forces (CPPs). A Plackett-Burman experimental design was applied to identify critical coating process parameters among the selected ones (drying and preheating time, atomization air pressure, spray rate, air volume, inlet air temperature, and drum pressure) that may influence the hardness and disintegration time of coated tablets. As a result of the research, a design space was established to facilitate an in-depth understanding of the existing relationship between CPPs and CQAs of the intermediate product (uncoated tablets). Screening revealed that spray rate and inlet air temperature are the two most important factors affecting the hardness of coated tablets. Simultaneously, none of the tested coating factors had an influence on disintegration time. This observation was confirmed by conducting film coating of pilot-size batches.
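For readers unfamiliar with Plackett-Burman screening, the sketch below generates a standard 8-run design for seven two-level factors and estimates each factor's main effect as the difference between the mean responses at its high and low levels. Treating "drying and preheating time" as two separate factors (to reach seven) is an assumption for illustration, and the hardness responses are hypothetical numbers, not the study's data.

```python
# Minimal sketch of an 8-run Plackett-Burman screening analysis (assumed data).
import numpy as np

factors = ["drying_time", "preheating_time", "atomization_pressure",
           "spray_rate", "air_volume", "inlet_temperature", "drum_pressure"]

generator = np.array([1, 1, 1, -1, 1, -1, -1])       # standard N=8 PB generator row
rows = [np.roll(generator, i) for i in range(7)]     # cyclic shifts give rows 1..7
design = np.vstack(rows + [-np.ones(7, dtype=int)])  # final row of all low levels

# Hypothetical coated-tablet hardness (N) measured for each of the 8 runs.
hardness = np.array([82.0, 75.0, 88.0, 79.0, 91.0, 77.0, 84.0, 73.0])

for j, name in enumerate(factors):
    hi = hardness[design[:, j] == 1].mean()
    lo = hardness[design[:, j] == -1].mean()
    print(f"{name:22s} main effect = {hi - lo:+.2f} N")
# Factors with the largest absolute effects (in the study, spray rate and inlet air
# temperature) would be flagged as critical coating process parameters.
```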
Finite element analysis as a design tool for thermoplastic vulcanizate glazing seals
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gase, K.M.; Hudacek, L.L.; Pesevski, G.T.
1998-12-31
There are three materials that are commonly used in commercial glazing seals: EPDM, silicone, and thermoplastic vulcanizates (TPVs). TPVs are a high-performance class of thermoplastic elastomers (TPEs), where TPEs have elastomeric properties with thermoplastic processability. TPVs have emerged as materials well suited for use in glazing seals due to ease of processing, economics, and part design flexibility. The part design and development process is critical to ensure that the chosen TPV provides economics, quality, and function in demanding environments. In the design and development process, there is great value in utilizing dual-durometer systems to capitalize on the benefits of soft and rigid materials. Computer-aided design tools, such as Finite Element Analysis (FEA), are effective in minimizing development time and predicting system performance. Examples of TPV glazing seals will illustrate the benefits of utilizing FEA to take full advantage of the material characteristics, which results in functional performance and quality while reducing development iterations. FEA will be performed on two glazing seal profiles to confirm optimum geometry.
NASA Astrophysics Data System (ADS)
Mansur, A.; Janari, D.; Setiawan, N.
2016-02-01
Biofuel is developed as an alternative source of second-generation energy that can be obtained from organic waste. This research aims to create an applicable and inexpensive portable digester unit for society. The screening of the design concepts, carried out under the considerations of the experts, is summarized as follows: design 1 with a final weight score of 1, design 2 with a final weight score of -1, design 3 with a final weight score of 2, design 4 with a final weight score of 3, design 5 with a final weight score of -1, and design 6 with a final weight score of 0. The designs accepted for further concept assessment are designs 1, 2, and 6. The concept assessment applies weighting to the scoring: design 1 scores 2.67, design 2 scores 2.15, and design 3 scores 2.52. Design 1 is concluded to be the design with the highest score, 2.67. Its specification is as follows: a tank capacity of 60 liters, a manual rotating crank pivot, a tank material of plastic with symbol 1, an axle swivel arm of grey cast iron, and 2 mm rotary blades with holes. Experiment 1 yielded 23.78% methane and 13.65% carbon dioxide in the content test.
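The weighted concept assessment described above follows the familiar scoring-matrix pattern. The sketch below reproduces only the method (a weighted sum of criterion scores); the criteria, weights, and raw scores are hypothetical and are not the values used in the study.

```python
# Minimal sketch of a weighted concept-scoring matrix (assumed criteria and scores).
weights = {"cost": 0.30, "ease_of_use": 0.25, "capacity": 0.20,
           "durability": 0.15, "portability": 0.10}

# Raw scores (1-5 scale) for each accepted concept against each criterion.
concepts = {
    "design_1": {"cost": 4, "ease_of_use": 3, "capacity": 3, "durability": 3, "portability": 4},
    "design_2": {"cost": 3, "ease_of_use": 2, "capacity": 3, "durability": 2, "portability": 3},
    "design_6": {"cost": 3, "ease_of_use": 3, "capacity": 2, "durability": 3, "portability": 2},
}

totals = {name: sum(weights[c] * score[c] for c in weights)
          for name, score in concepts.items()}
for name, total in sorted(totals.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: weighted score = {total:.2f}")
# The concept with the highest weighted score is carried forward to detailed design,
# as design 1 was in the study.
```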
Overview of the Integrated Programs for Aerospace Vehicle Design (IPAD) project
NASA Technical Reports Server (NTRS)
Venneri, S. L.
1983-01-01
To respond to national needs for improved productivity in engineering design and manufacturing, a NASA-supported joint industry/government project, denoted Integrated Programs for Aerospace Vehicle Design (IPAD), is underway. The objective is to improve engineering productivity through better use of computer technology. It focuses on the development of data base management technology and associated software for integrated, company-wide management of engineering and manufacturing information. Results to date on the IPAD project include in-depth documentation of a representative design process for a large engineering project, the definition and design of computer-aided design software needed to support that process, and the release of prototype software to manage engineering information. This paper provides an overview of the IPAD project and summarizes progress to date and future plans.
DHM simulation in virtual environments: a case-study on control room design.
Zamberlan, M; Santos, V; Streit, P; Oliveira, J; Cury, R; Negri, T; Pastura, F; Guimarães, C; Cid, G
2012-01-01
This paper will present the workflow developed for the application of serious games in the design of complex cooperative work settings. The project was based on ergonomic studies and the development of a control room through a participative design process. Our main concerns were the 3D virtual human representation acquired from 3D scanning, human interaction, workspace layout, and equipment designed considering ergonomics standards. Using the Unity3D platform to design the virtual environment, the virtual human model can be controlled by users in a dynamic scenario in order to evaluate the new work settings and simulate work activities. The results obtained showed that this virtual technology can drastically change the design process by improving the level of interaction between final users, managers, and the human factors team.
FILTSoft: A computational tool for microstrip planar filter design
NASA Astrophysics Data System (ADS)
Elsayed, M. H.; Abidin, Z. Z.; Dahlan, S. H.; Cholan N., A.; Ngu, Xavier T. I.; Majid, H. A.
2017-09-01
Filters are key components of any communication system, used to control spectrum and suppress interference. Designing a filter involves a long process as well as a good understanding of the underlying hardware technology. Hence this paper introduces an automated design tool based on a Matlab GUI, called FILTSoft (an acronym for Filter Design Software), to ease the process. FILTSoft is a user-friendly filter design tool to aid, guide, and expedite calculations from the lumped-element level to the microstrip structure. Users only have to provide the required filter specifications as well as the material description. FILTSoft will calculate and display the lumped-element details, the planar filter structure, and the expected filter response. An example lowpass filter design was calculated using FILTSoft, and the results were validated against prototype measurements for comparison purposes.
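The lumped-element stage that a tool of this kind automates is, at its simplest, the classical prototype calculation: normalized element values for the chosen response are scaled to the specified cutoff frequency and system impedance. The sketch below shows that textbook step for a Butterworth lowpass filter; FILTSoft's internal algorithms are not reproduced, and the order, cutoff, and impedance are arbitrary example inputs.

```python
# Illustrative textbook Butterworth lowpass prototype scaled to f_c and Z0.
import math

def butterworth_prototype(n):
    """Normalized element values g_1..g_n for a Butterworth lowpass prototype."""
    return [2.0 * math.sin((2 * k - 1) * math.pi / (2 * n)) for k in range(1, n + 1)]

def scale_lowpass(g, f_c, z0, first_element="series"):
    """Scale prototype values to inductances (H) and capacitances (F)."""
    w_c = 2 * math.pi * f_c
    elements, kind = [], first_element
    for gk in g:
        if kind == "series":
            elements.append(("L", gk * z0 / w_c))   # series inductor
            kind = "shunt"
        else:
            elements.append(("C", gk / (z0 * w_c))) # shunt capacitor
            kind = "series"
    return elements

g = butterworth_prototype(5)  # 5th-order example
for i, (etype, value) in enumerate(scale_lowpass(g, f_c=2.0e9, z0=50.0), start=1):
    unit, scale = ("nH", 1e9) if etype == "L" else ("pF", 1e12)
    print(f"element {i}: {etype} = {value * scale:.2f} {unit}")
# A planar realization would then map these L and C values onto high- and
# low-impedance microstrip line sections for the chosen substrate.
```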
Parametric Design within an Atomic Design Process (ADP) applied to Spacecraft Design
NASA Astrophysics Data System (ADS)
Ramos Alarcon, Rafael
This thesis describes research investigating the development of a model for the initial design of complex systems, with application to spacecraft design. The design model is called an atomic design process (ADP) and contains four fundamental stages (specifications, configurations, trade studies and drivers) that constitute the minimum steps of an iterative process that helps designers find a feasible solution. Representative design models from the aerospace industry are reviewed and are compared with the proposed model. The design model's relevance, adaptability and scalability features are evaluated through a focused design task exercise with two undergraduate teams and a long-term design exercise performed by a spacecraft payload team. The implementation of the design model is explained in the context in which the model has been researched. This context includes the organization (a student-run research laboratory at the University of Michigan), its culture (academically oriented), members that have used the design model and the description of the information technology elements meant to provide support while using the model. This support includes a custom-built information management system that consolidates relevant information that is currently being used in the organization. The information is divided in three domains: personnel development history, technical knowledge base and laboratory operations. The focused study with teams making use of the design model to complete an engineering design exercise consists of the conceptual design of an autonomous system, including a carrier and a deployable lander that form the payload of a rocket with an altitude range of over 1000 meters. Detailed results from each of the stages of the design process while implementing the model are presented, and an increase in awareness of good design practices in the teams while using the model is explained. A long-term investigation using the design model, consisting of the successful characterization of an imaging system for a spacecraft, is presented. The spacecraft is designed to take digital color images from low Earth orbit. The dominant drivers from each stage of the design process are indicated as they were identified, with the accompanying hardware development leading to the final configuration that comprises the flight spacecraft.
Automating expert role to determine design concept in Kansei Engineering
NASA Astrophysics Data System (ADS)
Lokman, Anitawati Mohd; Haron, Mohammad Bakri Che; Abidin, Siti Zaleha Zainal; Khalid, Noor Elaiza Abd
2016-02-01
Affect has become imperative in product quality. In the affective design field, Kansei Engineering (KE) has been recognized as a technology that enables the discovery of consumers' emotions and the formulation of guides to design products that win consumers in the competitive market. Albeit a powerful technology, there is no rule of thumb in its analysis and interpretation process. KE expertise is required to determine sets of related Kansei and the significant concept of emotion. Many research endeavors are hampered by the limited number of available and accessible KE experts. This work simulates the role of experts with the use of the Natphoric algorithm, thus providing a sound solution to the complexity and flexibility in KE. The algorithm is designed to learn the process by implementing training datasets taken from previous KE research works. A framework for automated KE is then designed to realize the development of an automated KE system. A comparative analysis is performed to determine the feasibility of the developed prototype to automate the process. The results show that the significant Kansei determined by manual KE implementation and by the automated process are highly similar. KE research advocates will benefit from this system to automatically determine significant design concepts.
NASA Astrophysics Data System (ADS)
Subramanian, Tenkasi R.
In the current day, with the rapid advancement in technology, engineering design is growing in complexity. Nowadays, engineers have to deal with design problems that are large, complex, and involve multi-level decision analyses. With the increase in complexity and size of systems, production and development costs tend to overshoot the allocated budget and resources. This often results in project delays and project cancellations, and is particularly true for aerospace systems. Value Driven Design (VDD) proves to be a means to strengthen the design process and help counter such trends. VDD is a novel framework for optimization which puts stakeholder preferences at the forefront of the design process, capturing their true preferences in order to present system alternatives that are consistent with the stakeholders' expectations. Traditional systems engineering techniques promote communication of stakeholder preferences in the form of requirements, which confines the design space by imposing additional constraints on it. This results in a design that does not capture the true preferences of the stakeholder. Value Driven Design provides an alternative approach to design wherein a value function is created that corresponds to the true preferences of the stakeholder. The applicability of VDD is broad, but it is imperative to first explore its feasibility to ensure the development of an efficient, robust, and elegant system design. The key to understanding the usability of VDD is to investigate the formation, propagation, and use of a value function. This research investigates the use of rank correlation metrics to ensure consistent rank ordering of design alternatives while investigating the fidelity of the value function, as well as the impact of design uncertainties on rank ordering. A satellite design system consisting of a satellite, ground station, and launch vehicle is used to demonstrate the use of the metrics to aid in decision support during the design process.
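The rank-correlation check mentioned above can be illustrated with standard statistics: compare how a cheap value function and a higher-fidelity analysis rank the same set of alternatives, and use Spearman's rho or Kendall's tau as the agreement metric. The alternative scores below are hypothetical placeholders, not results from the satellite case study.

```python
# Minimal sketch of rank-correlation metrics for value-function fidelity (assumed data).
import numpy as np
from scipy.stats import kendalltau, spearmanr

# Value scores for eight hypothetical satellite design alternatives.
value_low_fidelity = np.array([0.62, 0.71, 0.55, 0.80, 0.47, 0.66, 0.74, 0.59])
value_high_fidelity = np.array([0.60, 0.69, 0.58, 0.78, 0.45, 0.61, 0.75, 0.57])

rho, _ = spearmanr(value_low_fidelity, value_high_fidelity)
tau, _ = kendalltau(value_low_fidelity, value_high_fidelity)
print(f"Spearman rho = {rho:.3f}, Kendall tau = {tau:.3f}")
# Values near 1 indicate that the cheaper value function preserves the rank ordering
# of alternatives, so it can support down-selection despite its lower fidelity.
```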
Effect of processing parameters on reaction bonding of silicon nitride
NASA Technical Reports Server (NTRS)
Richman, M. H.; Gregory, O. J.; Magida, M. B.
1980-01-01
Reaction-bonded silicon nitride was developed. The relationship between the various processing parameters and the resulting microstructures was investigated in order to design and synthesize reaction-bonded materials with improved room-temperature mechanical properties.
NASA Astrophysics Data System (ADS)
Choirunnisa, N. L.; Prabowo, P.; Suryanti, S.
2018-01-01
The main objective of this study is to describe the effectiveness of 5E instructional model-based learning for improving primary school students' science process skills. Science process skills are important for students as they are the foundation for enhancing the mastery of concepts and the thinking skills needed in the 21st century. The design of this study was experimental, involving a one-group pre-test and post-test design. The results of this study show that (1) in both classes, IVA and IVB, the percentage of learning implementation increased, which indicates a better quality of learning, and (2) the percentage of students' science process skills test results on the aspects of observing, formulating hypotheses, determining variables, interpreting data, and communicating increased as well.
Processing experiments on non-Czochralski silicon sheet
NASA Technical Reports Server (NTRS)
Pryor, R. A.; Grenon, L. A.; Sakiotis, N. G.; Pastirik, E. M.; Sparks, T. O.; Legge, R. N.
1981-01-01
A program is described which supports and promotes the development of processing techniques which may be successfully and cost-effectively applied to low-cost sheets for solar cell fabrication. Results are reported in the areas of process technology, cell design, cell metallization, and production cost simulation.
Automated assembly of fast-axis collimation (FAC) lenses for diode laser bar modules
NASA Astrophysics Data System (ADS)
Miesner, Jörn; Timmermann, Andre; Meinschien, Jens; Neumann, Bernhard; Wright, Steve; Tekin, Tolga; Schröder, Henning; Westphalen, Thomas; Frischkorn, Felix
2009-02-01
Laser diodes and diode laser bars are key components in high power semiconductor lasers and solid state laser systems. During manufacture, the assembly of the fast axis collimation (FAC) lens is a crucial step. The goal of our activities is to design an automated assembly system for high volume production. In this paper the results of an intermediate milestone will be reported: a demonstration system was designed, realized and tested to prove the feasibility of all of the system components and process features. The demonstration system consists of a high precision handling system, metrology for process feedback, a powerful digital image processing system and tooling for glue dispensing, UV curing and laser operation. The system components as well as their interaction with each other were tested in an experimental system in order to glean design knowledge for the fully automated assembly system. The adjustment of the FAC lens is performed by a series of predefined steps monitored by two cameras concurrently imaging the far field and the near field intensity distributions. Feedback from these cameras, processed by a powerful and efficient image processing algorithm, controls a five-axis precision motion system to optimize the fast axis collimation of the laser beam. Automated cementing of the FAC to the diode bar completes the process. The presentation will show the system concept, the algorithm of the adjustment as well as experimental results. A critical discussion of the results will close the talk.
Design of high-performance parallelized gene predictors in MATLAB.
Rivard, Sylvain Robert; Mailloux, Jean-Gabriel; Beguenane, Rachid; Bui, Hung Tien
2012-04-10
This paper proposes a method of implementing parallel gene prediction algorithms in MATLAB. The proposed designs are based on either Goertzel's algorithm or on FFTs and have been implemented using varying amounts of parallelism on a central processing unit (CPU) and on a graphics processing unit (GPU). Results show that an implementation using a straightforward approach can require over 4.5 h to process 15 million base pairs (bps) whereas a properly designed one could perform the same task in less than five minutes. In the best case, a GPU implementation can yield these results in 57 s. The present work shows how parallelism can be used in MATLAB for gene prediction in very large DNA sequences to produce results that are over 270 times faster than a conventional approach. This is significant as MATLAB is typically overlooked due to its apparent slow processing time even though it offers a convenient environment for bioinformatics. From a practical standpoint, this work proposes two strategies for accelerating genome data processing which rely on different parallelization mechanisms. Using a CPU, the work shows that direct access to the MEX function increases execution speed and that the PARFOR construct should be used in order to take full advantage of the parallelizable Goertzel implementation. When the target is a GPU, the work shows that data needs to be segmented into manageable sizes within the GFOR construct before processing in order to minimize execution time.
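The spectral idea behind Goertzel- or FFT-based gene prediction is that protein-coding regions show elevated power at frequency 1/3 (the period-3 property) when the bases are mapped to indicator sequences. The sketch below shows that core computation; Python is used here purely for illustration, and the toy sequence, window length, and step size are assumptions rather than the paper's MATLAB/GPU implementation or data.

```python
# Minimal sketch of period-3 detection with Goertzel's algorithm (assumed parameters).
import math

def goertzel_power(x, k, n):
    """Spectral power |X[k]|^2 of the length-n window x via Goertzel's recursion."""
    w = 2.0 * math.pi * k / n
    coeff = 2.0 * math.cos(w)
    s_prev, s_prev2 = 0.0, 0.0
    for sample in x:
        s = sample + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev**2 + s_prev2**2 - coeff * s_prev * s_prev2

def period3_power(dna, start, n=351):
    """Sum of |X[n/3]|^2 over the four base-indicator sequences of one window."""
    window = dna[start:start + n]
    total = 0.0
    for base in "ACGT":
        indicator = [1.0 if b == base else 0.0 for b in window]
        total += goertzel_power(indicator, k=n // 3, n=n)
    return total

# Toy sequence: a repetitive "coding-like" stretch followed by an AT repeat.
dna = "ATGGCC" * 200 + "ATATATAT" * 100
scores = [period3_power(dna, s) for s in range(0, len(dna) - 351, 117)]
print([round(s, 1) for s in scores])  # peaks mark candidate coding windows
```

A full predictor would slide this window over millions of base pairs, which is exactly the workload the paper parallelizes with PARFOR on the CPU and GFOR on the GPU.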
Applying the Ottawa Charter to inform health promotion programme design.
Fry, Denise; Zask, Avigdor
2017-10-01
There is evidence of a correlation between adoption of the Ottawa Charter's framework of five action areas and health promotion programme effectiveness, but the Charter's framework has not been as fully implemented as hoped, nor is it generally used by formal programme design models. In response, we aimed to translate the Charter's framework into a method to inform programme design. Our resulting design process uses detailed definitions of the Charter's action areas and evidence of predicted effectiveness to prompt greater consideration and use of the Charter's framework. We piloted the process by applying it to the design of four programmes of the Healthy Children's Initiative in New South Wales, Australia; refined the criteria via consensus; and made consensus decisions on the extent to which programme designs reflected the Charter's framework. The design process has broad potential applicability to health promotion programmes, facilitating greater use of the Ottawa Charter framework, which evidence indicates can increase programme effectiveness.
Optimal design of an alignment-free two-DOF rehabilitation robot for the shoulder complex.
Galinski, Daniel; Sapin, Julien; Dehez, Bruno
2013-06-01
This paper presents the optimal design of an alignment-free exoskeleton for rehabilitation of the shoulder complex. The robot structure consists of two actuated joints and is linked to the arm through passive degrees of freedom (DOFs) to drive the flexion-extension and abduction-adduction movements of the upper arm. The optimal design of this structure is performed in two steps. The first step is a multi-objective optimization process aiming to find the best parameters characterizing the robot and its position relative to the patient. The second step is a comparison process aiming to select the best solution from the optimization results on the basis of several criteria related to practical considerations. The optimal design process leads to a solution that outperforms an existing solution on aspects such as kinematics and ergonomics while being simpler.
Job Skills Education Program. Design Specifications
1985-03-01
...training approach is supplied in part by research based on the depth-of-processing paradigm (Craik & Lockhart, 1972; Craik & Tulving, 1975), which... discussion here develops a rationale for the approach, which is consistent with research on incidental learning (Craik & Lockhart, 1972; Craik & Tulving, 1975)... this meeting, a plan evolved to integrate available RCA results and contract products into the JSEP design. During the Task 1 in-process review, the...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bernardin, John D; Baca, Allen G
This paper presents the mechanical design, fabrication, and dynamic testing of an electrostatic analyzer spacecraft instrument. The functional and environmental requirements, combined with limited spacecraft accommodations, resulted in complex component geometries, unique material selections, and difficult fabrication processes. The challenging aspects of the mechanical design and several of the more difficult production processes are discussed. In addition, the successes, failures, and lessons learned from acoustic and random vibration testing of a full-scale prototype instrument are presented.
Application of the user-centred design process according ISO 9241-210 in air traffic control.
König, Christina; Hofmann, Thomas; Bruder, Ralph
2012-01-01
Designing a usable human-machine interface for air traffic control is challenging and should follow approved methods. The ISO 9241-210 standard promises high usability of products by integrating future users and following an iterative process. This contribution describes the procedure and first results of the analysis and application of ISO 9241-210 to develop a planning tool for air traffic controllers.
NASA Astrophysics Data System (ADS)
Aksenova, Olesya; Nikolaeva, Evgenia; Cehlár, Michal
2017-11-01
This work investigates the effectiveness of mathematical and three-dimensional computer modeling tools in the planning of fuel and energy complex processes at the planning and design phase of a thermal power plant (TPP). A solution for the purification of gas emissions at the design development phase of waste treatment systems is proposed, employing mathematical and three-dimensional computer modeling: the E-nets apparatus and the development of a 3D model of the future gas emission purification system. This allows the designed result to be visualized, an economically feasible technology to be selected and scientifically justified, and a high environmental and social effect of the developed waste treatment system to be ensured. The authors present the results of treating the planned technological processes and the gas emission purification system in terms of E-nets, using mathematical modeling in the Simulink application, which allowed a model of a device to be created from the library of standard blocks and calculations to be performed. A three-dimensional model of the gas emission purification system has been constructed. It allows the technological processes to be visualized and compared with the theoretical calculations at the design phase of a TPP and, if necessary, adjustments to be made.
Terminating DNA Tile Assembly with Nanostructured Caps.
Agrawal, Deepak K; Jiang, Ruoyu; Reinhart, Seth; Mohammed, Abdul M; Jorgenson, Tyler D; Schulman, Rebecca
2017-10-24
Precise control over the nucleation, growth, and termination of self-assembly processes is a fundamental tool for controlling product yield and assembly dynamics. Mechanisms for altering these processes programmatically could allow the use of simple components to self-assemble complex final products or to design processes allowing for dynamic assembly or reconfiguration. Here we use DNA tile self-assembly to develop general design principles for building complexes that can bind to a growing biomolecular assembly and terminate its growth by systematically characterizing how different DNA origami nanostructures interact with the growing ends of DNA tile nanotubes. We find that nanostructures that present binding interfaces for all of the binding sites on a growing facet can bind selectively to growing ends and stop growth when these interfaces are presented on either a rigid or floppy scaffold. In contrast, nucleation of nanotubes requires the presentation of binding sites in an arrangement that matches the shape of the structure's facet. As a result, it is possible to build nanostructures that can terminate the growth of existing nanotubes but cannot nucleate a new structure. The resulting design principles for constructing structures that direct nucleation and termination of the growth of one-dimensional nanostructures can also serve as a starting point for programmatically directing two- and three-dimensional crystallization processes using nanostructure design.
Compressor and Turbine Multidisciplinary Design for Highly Efficient Micro-gas Turbine
NASA Astrophysics Data System (ADS)
Barsi, Dario; Perrone, Andrea; Qu, Yonglei; Ratto, Luca; Ricci, Gianluca; Sergeev, Vitaliy; Zunino, Pietro
2018-06-01
Multidisciplinary design optimization (MDO) is widely employed to enhance turbomachinery component efficiency. The aim of this work is to describe a complete tool for the aero-mechanical design of a radial inflow turbine and a centrifugal compressor. The high rotational speed of such machines and the high exhaust gas temperature (only for the turbine) expose blades to very high stresses, and therefore the aerodynamic design has to be coupled with the mechanical one through an integrated procedure. The described approach employs a fully 3D Reynolds Averaged Navier-Stokes (RANS) solver for the aerodynamics and an open source Finite Element Analysis (FEA) solver for the mechanical integrity assessment. Due to the high computational cost of these two solvers, a meta model, such as an artificial neural network (ANN), is used to speed up the optimization design process. The interaction between the two codes, the mesh generation, and the post-processing of the results are achieved via in-house developed scripting modules. The obtained results are widely presented and discussed.
Piezoresistive in-line integrated force sensors for on-chip measurement and control
NASA Astrophysics Data System (ADS)
Teichert, Kendall; Waterfall, Tyler; Jensen, Brian; Howell, Larry; McLain, Tim
2007-04-01
This paper presents the design, fabrication, and testing of a force sensor for integrated use with thermomechanical in-plane microactuators. The force sensor is designed to be integrated with the actuator and fabricated in the same batch fabrication process. This sensor uses the piezoresistive property of silicon as a sensing signal by directing the actuation force through two thin legs, producing a tensile stress. This tensile load produces a resistance change in the thin legs by the piezoresistive effect. The resistance change is linearly correlated with the applied force. The device presented was designed by considering both its piezoresistive sensitivity and out-of-plane torsional stability. A design trade-off exists between these two objectives in that longer legs are more sensitive yet less stable. Fabrication of the sensor design was done using the MUMPs process. This paper presents experimental results from this device and a basic model for comparison with previously attained piezoresistive data. The results validate the concept of integral sensing using the piezoresistive property of silicon.
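The first-order sensing relation described above can be summarized as: force loads the two legs in tension, the resulting axial stress changes their resistance through the piezoresistive effect (roughly dR/R = pi_l * sigma for a longitudinal coefficient pi_l), and the measured resistance change is mapped back to force. The sketch below illustrates that inversion; the leg geometry and the piezoresistive coefficient are representative assumptions, not the device's actual parameters or calibration.

```python
# Illustrative first-order inversion from resistance change to force (assumed values).
def force_from_resistance_change(delta_R_over_R,
                                 leg_width=2e-6,        # m, hypothetical
                                 leg_thickness=3.5e-6,  # m, hypothetical
                                 n_legs=2,
                                 pi_l=72e-11):          # 1/Pa, representative p-type Si value
    """Invert dR/R = pi_l * sigma, with sigma = F / (n_legs * A), for the force F."""
    area = leg_width * leg_thickness
    sigma = delta_R_over_R / pi_l      # axial stress in each leg, Pa
    return sigma * n_legs * area       # total applied force, N

dR_R = 0.005  # example: a 0.5% resistance change
print(f"estimated force ≈ {force_from_resistance_change(dR_R) * 1e6:.1f} µN")
```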
NASA Astrophysics Data System (ADS)
Ghasem, Nayef
2016-07-01
This paper illustrates a teaching technique employed in a computer applications in chemical engineering course for designing various unit operation processes, where the students learn about unit operations by designing them. The aim of the course is not to teach design, but rather to teach the fundamentals and the function of unit operation processes through simulators. A case study presenting the teaching method was evaluated using student surveys and faculty assessments, which were designed to measure the quality and effectiveness of the teaching method. The results of the questionnaire conclusively demonstrate that this method is an extremely efficient way of teaching a simulator-based course. In addition, this teaching method can easily be generalised and used in other courses. A student's final mark is determined by a combination of in-class assessments conducted based on cooperative and peer learning, progress tests, and a final exam. Results revealed that peer learning can improve the overall quality of student learning and enhance student understanding.
Theory and Practice Meets in Industrial Process Design -Educational Perspective-
NASA Astrophysics Data System (ADS)
Aramo-Immonen, Heli; Toikka, Tarja
Software engineers should see themselves as business process designers in enterprise resource planning (ERP) system re-engineering projects, and software engineers and managers should engage in design dialogue. The objective of this paper is to discuss the motives for studying design research in connection with management education, in order to envision and understand the soft human issues in the management context. A second goal is to develop means of practicing social skills between designers and managers. This article explores the affective components of design thinking in the industrial management domain. The conceptual part of this paper discusses the concepts of the network and project economy, creativity, communication, use of metaphors, and design thinking. Finally, the empirical research plan and first empirical results from design method experiments among multi-disciplinary groups of master-level students of industrial engineering and management and software engineering are introduced.
Defense Acquisitions: Assessments of Selected Weapon Programs
2012-03-01
knowledge-based practices. As a result, most of these programs will carry technology, design, and production risks into subsequent phases of the acquisition process that could result in cost growth or schedule delays. GAO also assessed the implementation of selected acquisition reforms and found...
NASA Astrophysics Data System (ADS)
Adams, Christopher; Tate, Derrick
Patent textual descriptions provide a wealth of information that can be used to understand the underlying design approaches that result in the generation of novel and innovative technology. This article will discuss a new approach for estimating Degree of Ideality and Level of Invention metrics from the theory of inventive problem solving (TRIZ) using patent textual information. Patent text includes information that can be used to model both the functions performed by a design and the associated costs and problems that affect a design’s value. The motivation of this research is to use patent data with calculation of TRIZ metrics to help designers understand which combinations of system components and functions result in creative and innovative design solutions. This article will discuss in detail methods to estimate these TRIZ metrics using natural language processing and machine learning with the use of neural networks.
NASA Astrophysics Data System (ADS)
Mandala, Mahender Arjun
A cornerstone of design and design education is frequent situated feedback. With increasing class sizes and shrinking financial and human resources, providing rich feedback to students becomes increasingly difficult. In the field of writing, web-based peer review--the process of utilizing equal status learners within a class to provide feedback to each other on their work using networked computing systems--has been shown to be a reliable and valid source of feedback in addition to improving student learning. Designers communicate in myriad ways, using the many languages of design and combining visual and descriptive information. This complex discourse of design intent makes peer reviews by design students ambiguous and often not helpful to the receivers of this feedback. Furthermore, engaging students in the review process itself is often difficult. Teams can complement individual diversity and may help novice designers collectively resolve complex tasks. However, teams often incur production losses and may be impacted by individual biases. In the current work, we look at utilizing a collaborative team of reviewers, working collectively and synchronously, in generating web-based peer reviews in a sophomore engineering design class. Students participated in a cross-over design, conducting peer reviews as individuals and collaborative teams in parallel sequences. Raters coded the feedback generated on the basis of its appropriateness and accuracy. Self-report surveys and passive observation of teams conducting reviews captured student opinion on the process, its value, and the contrasting experience they had conducting team and individual reviews. We found team reviews generated better quality feedback in comparison to individual reviews. Furthermore, students preferred conducting reviews in teams, finding the process 'fun' and engaging. We observed several learning benefits of using collaboration in reviewing, including improved understanding of the assessment criteria, roles, and expectations, and increased team reflection. These results provide insight into how to improve the review process for instructors and researchers, and form a basis for future research work in this area. With respect to facilitating the peer review process in design-based classrooms, we also present recommendations for creating effective review system design and implementation in the classroom, supported by research and practical experience.
Redesigning a risk-management process for tracking injuries.
Wenzel, G R
1998-01-01
The changing responsibilities of registered nurses are challenging even the most dedicated professionals. To survive within her newly-defined roles, one nurse used a total quality improvement model to understand, analyze, and improve a medical center's system for tracking inpatient injuries. This process led to the drafting of an original software design that implemented a nursing informatics tracking system. It has resulted in significant savings of time and money and has far surpassed the accuracy, efficiency, and scope of the previous method. This article presents an overview of the design process.
Leveraging pattern matching to solve SRAM verification challenges at advanced nodes
NASA Astrophysics Data System (ADS)
Kan, Huan; Huang, Lucas; Yang, Legender; Zou, Elaine; Wan, Qijian; Du, Chunshan; Hu, Xinyi; Liu, Zhengfang; Zhu, Yu; Zhang, Recoo; Huang, Elven; Muirhead, Jonathan
2018-03-01
Memory is a critical component in today's system-on-chip (SoC) designs. Static random-access memory (SRAM) blocks are assembled by combining intellectual property (IP) blocks that come from SRAM libraries developed and certified by the foundries for both functionality and a specific process node. Customers place these SRAM IP in their designs, adjusting as necessary to achieve DRC-clean results. However, any changes a customer makes to these SRAM IP during implementation, whether intentionally or in error, can impact yield and functionality. Physical verification of SRAM has always been a challenge, because these blocks usually contain smaller feature sizes and spacing constraints compared to traditional logic or other layout structures. At advanced nodes, critical dimension becomes smaller and smaller, until there is almost no opportunity to use optical proximity correction (OPC) and lithography to adjust the manufacturing process to mitigate the effects of any changes. The smaller process geometries, reduced supply voltages, increasing process variation, and manufacturing uncertainty mean accurate SRAM physical verification results are not only reaching new levels of difficulty, but also new levels of criticality for design success. In this paper, we explore the use of pattern matching to create an SRAM verification flow that provides both accurate, comprehensive coverage of the required checks and visual output to enable faster, more accurate error debugging. Our results indicate that pattern matching can enable foundries to improve SRAM manufacturing yield, while allowing designers to benefit from SRAM verification kits that can shorten the time to market.
CAD/CAM approach to improving industry productivity gathers momentum
NASA Technical Reports Server (NTRS)
Fulton, R. E.
1982-01-01
Recent results and planning for the NASA/industry Integrated Programs for Aerospace-Vehicle Design (IPAD) program for improving productivity with CAD/CAM methods are outlined. The industrial group work is being done mainly by Boeing, and progress has been made in defining the designer work environment, developing requirements and a preliminary design for a future CAD/CAM system, and developing CAD/CAM technology. The work environment was defined by conducting a detailed study of a reference design process, and key software elements for a CAD/CAM system have been defined, specifically for interactive design or experiment control processes. Further work is proceeding on executive, data management, geometry and graphics, and general utility software, and dynamic aspects of the programs being developed are outlined.
Business Performer-Centered Design of User Interfaces
NASA Astrophysics Data System (ADS)
Sousa, Kênia; Vanderdonckt, Jean
Business Performer-Centered Design of User Interfaces is a new design methodology that adopts business process (BP) definition and a business performer perspective for managing the life cycle of user interfaces of enterprise systems. In this methodology, when the organization has a business process culture, the business processes of the organization are first defined according to a traditional methodology for this kind of artifact. These business processes are then transformed into a series of task models that represent the interactive parts of the business processes that will ultimately lead to interactive systems. When the organization has its enterprise systems but not yet its business processes modeled, the user interfaces of the systems help derive task models, which are then used to derive the business processes. The double linking between a business process and a task model, and between a task model and a user interface model, makes it possible to ensure traceability of the artifacts along multiple paths and enables more active participation of business performers in analyzing the resulting user interfaces. In this paper, we outline how a human perspective is used, tied to a model-driven perspective.
Taguchi experimental design to determine the taste quality characteristic of candied carrot
NASA Astrophysics Data System (ADS)
Ekawati, Y.; Hapsari, A. A.
2018-03-01
Robust parameter design is used to design products that are robust to noise factors, so that product performance fits the target and delivers better quality. In the process of designing and developing the innovative candied carrot product, robust parameter design is carried out using the Taguchi method. The method is used to determine an optimal quality design based on the process and the composition of product ingredients that are in accordance with consumer needs and requirements. According to the identification of consumer needs from the previous research, the quality dimensions that need to be assessed are the taste and texture of the product; the quality dimension assessed in this research is limited to taste. Organoleptic testing is used for this assessment, specifically hedonic testing, which makes assessments based on consumer preferences. The data processing uses mean and signal-to-noise ratio calculations and optimal level setting to determine the optimal process and composition of product ingredients. The optimal values are analyzed using confirmation experiments to prove that the proposed product matches consumer needs and requirements. The result of this research is the identification of factors that affect the product's taste and the optimal product quality according to the Taguchi method.
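For readers unfamiliar with the Taguchi calculation, the sketch below shows the mean and signal-to-noise (S/N) computation with the "larger-is-better" formulation, which is the natural choice for hedonic taste scores. The factors, levels, and panel scores are hypothetical illustrations, not the study's experimental data.

```python
# Minimal sketch of Taguchi mean and S/N analysis, larger-is-better (assumed data).
import math

def sn_larger_is_better(scores):
    """S/N = -10*log10(mean(1/y^2)); higher means better and less noise-sensitive."""
    return -10.0 * math.log10(sum(1.0 / y**2 for y in scores) / len(scores))

# Hedonic scores (1-7) from panelists for four trial runs of candied carrot,
# each run a different combination of sugar level (S) and drying time (D).
runs = {
    ("S1", "D1"): [5, 6, 5, 6],
    ("S1", "D2"): [4, 5, 4, 4],
    ("S2", "D1"): [6, 6, 7, 6],
    ("S2", "D2"): [5, 5, 6, 5],
}

for levels, scores in runs.items():
    mean = sum(scores) / len(scores)
    print(f"{levels}: mean = {mean:.2f}, S/N = {sn_larger_is_better(scores):.2f} dB")
# For each factor, the level with the highest average S/N across its runs is selected;
# the combination of those levels is the predicted optimal setting, checked afterwards
# with a confirmation experiment as described above.
```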
Landing Gear Integration in Aircraft Conceptual Design. Revision
NASA Technical Reports Server (NTRS)
Chai, Sonny T.; Mason, William H.
1997-01-01
The design of the landing gear is one of the more fundamental aspects of aircraft design. The design and integration process encompasses numerous engineering disciplines, e.g., structures, weights, runway design, and economics, and has become extremely sophisticated in the last few decades. Although the design process is well documented, no attempt had been made until now to develop a design methodology that can be used within an automated environment. As a result, the process remains a key responsibility of the configuration designer and is largely experience-based and graphically oriented. However, as industry and government try to incorporate multidisciplinary design optimization (MDO) methods in the conceptual design phase, the need for a more systematic procedure has become apparent. The development of an MDO-capable design methodology as described in this work is focused on providing the conceptual designer with tools to help automate the disciplinary analyses, i.e., geometry, kinematics, flotation, and weight. Documented design procedures and analyses were examined to determine their applicability and to ensure compliance with current practices and regulations. Using the latest information obtained from industry during an initial industry survey, the analyses were in turn modified and expanded to accommodate the design criteria associated with advanced large subsonic transports. Algorithms were then developed based on the updated analysis procedures to be incorporated into existing MDO codes.
Designing Image Analysis Pipelines in Light Microscopy: A Rational Approach.
Arganda-Carreras, Ignacio; Andrey, Philippe
2017-01-01
With the progress of microscopy techniques and the rapidly growing amounts of acquired imaging data, there is an increased need for automated image processing and analysis solutions in biological studies. Each new application requires the design of a specific image analysis pipeline, by assembling a series of image processing operations. Many commercial or free bioimage analysis software packages are now available, and several textbooks and reviews have presented the mathematical and computational fundamentals of image processing and analysis. Tens, if not hundreds, of algorithms and methods have been developed and integrated into image analysis software, resulting in a combinatorial explosion of possible image processing sequences. This paper presents a general guideline methodology to rationally address the design of image processing and analysis pipelines. The originality of the proposed approach is to follow an iterative, backwards procedure from the target objectives of analysis. The proposed goal-oriented strategy should help biologists to better apprehend image analysis in the context of their research and should allow them to efficiently interact with image processing specialists.
Improving scanner wafer alignment performance by target optimization
NASA Astrophysics Data System (ADS)
Leray, Philippe; Jehoul, Christiane; Socha, Robert; Menchtchikov, Boris; Raghunathan, Sudhar; Kent, Eric; Schoonewelle, Hielke; Tinnemans, Patrick; Tuffy, Paul; Belen, Jun; Wise, Rich
2016-03-01
In the process nodes of 10nm and below, the patterning complexity along with the processing and materials required has resulted in a need to optimize alignment targets in order to achieve the required precision, accuracy and throughput performance. Recent industry publications on the metrology target optimization process have shown a move from the expensive and time consuming empirical methodologies, towards a faster computational approach. ASML's Design for Control (D4C) application, which is currently used to optimize YieldStar diffraction based overlay (DBO) metrology targets, has been extended to support the optimization of scanner wafer alignment targets. This allows the necessary process information and design methodology, used for DBO target designs, to be leveraged for the optimization of alignment targets. In this paper, we show how we applied this computational approach to wafer alignment target design. We verify the correlation between predictions and measurements for the key alignment performance metrics and finally show the potential alignment and overlay performance improvements that an optimized alignment target could achieve.
DOE Office of Scientific and Technical Information (OSTI.GOV)
van Rij, Jennifer A; Yu, Yi-Hsiang; Guo, Yi
This study explores and verifies the generalized body-modes method for evaluating the structural loads on a wave energy converter (WEC). Historically, WEC design methodologies have focused primarily on accurately evaluating hydrodynamic loads, while methodologies for evaluating structural loads have yet to be fully considered and incorporated into the WEC design process. As wave energy technologies continue to advance, however, it has become increasingly evident that an accurate evaluation of the structural loads will enable an optimized structural design, as well as the potential utilization of composites and flexible materials, and hence reduce WEC costs. Although there are many computational fluid dynamics, structural analysis and fluid-structure-interaction (FSI) codes available, the application of these codes is typically too computationally intensive to be practical in the early stages of the WEC design process. The generalized body-modes method, however, is a reduced order, linearized, frequency-domain FSI approach, performed in conjunction with the linear hydrodynamic analysis, with computation times that could realistically be incorporated into the WEC design process. The objective of this study is to verify, against high-fidelity FSI simulations, that the generalized body-modes approach accurately predicts structural deflections and stress loads in a WEC. Two verification cases are considered, a free-floating barge and a fixed-bottom column. Details for both the generalized body-modes models and FSI models are first provided. Results for each of the models are then compared and discussed. Finally, based on the verification results obtained, future plans for incorporating the generalized body-modes method into the WEC simulation tool, WEC-Sim, and the overall WEC design process are discussed.
Development of a Prototype Model-Form Uncertainty Knowledge Base
NASA Technical Reports Server (NTRS)
Green, Lawrence L.
2016-01-01
Uncertainties are generally classified as either aleatory or epistemic. Aleatory uncertainties are those attributed to random variation, either naturally or through manufacturing processes. Epistemic uncertainties are generally attributed to a lack of knowledge. One type of epistemic uncertainty is called model-form uncertainty. The term model-form means that among the choices to be made during a design process within an analysis, there are different forms of the analysis process, which each give different results for the same configuration at the same flight conditions. Examples of model-form uncertainties include the grid density, grid type, and solver type used within a computational fluid dynamics code, or the choice of the number and type of model elements within a structures analysis. The objectives of this work are to identify and quantify a representative set of model-form uncertainties and to make this information available to designers through an interactive knowledge base (KB). The KB can then be used during probabilistic design sessions, so as to enable the possible reduction of uncertainties in the design process through resource investment. An extensive literature search has been conducted to identify and quantify typical model-form uncertainties present within aerospace design. An initial attempt has been made to assemble the results of this literature search into a searchable KB, usable in real time during probabilistic design sessions. A concept of operations and the basic structure of a model-form uncertainty KB are described. Key operations within the KB are illustrated. Current limitations in the KB, and possible workarounds are explained.
Modeling workflow to design machine translation applications for public health practice
Turner, Anne M.; Brownstein, Megumu K.; Cole, Kate; Karasz, Hilary; Kirchhoff, Katrin
2014-01-01
Objective Provide a detailed understanding of the information workflow processes related to translating health promotion materials for limited English proficiency individuals in order to inform the design of context-driven machine translation (MT) tools for public health (PH). Materials and Methods We applied a cognitive work analysis framework to investigate the translation information workflow processes of two large health departments in Washington State. Researchers conducted interviews, performed a task analysis, and validated results with PH professionals to model translation workflow and identify functional requirements for a translation system for PH. Results The study resulted in a detailed description of work related to translation of PH materials, an information workflow diagram, and a description of attitudes towards MT technology. We identified a number of themes that hold design implications for incorporating MT in PH translation practice. A PH translation tool prototype was designed based on these findings. Discussion This study underscores the importance of understanding the work context and information workflow for which systems will be designed. Based on themes and translation information workflow processes, we identified key design guidelines for incorporating MT into PH translation work. Primary amongst these is that MT should be followed by human review for translations to be of high quality and for the technology to be adopted into practice. Conclusion The time and costs of creating multilingual health promotion materials are barriers to translation. PH personnel were interested in MT's potential to improve access to low-cost translated PH materials, but expressed concerns about ensuring quality. We outline design considerations and a potential machine translation tool to best fit MT systems into PH practice. PMID:25445922
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-09
... Standards (UPCS) inspection protocol was designed to be a uniform inspection process and standard for HUD's... frequency of inspections based on the results of the UPCS inspection. UPCS was designed to assess the condition... physical assessment score. HUD Response: The UPCS inspection protocol as designed assesses the physical...
Hall, Martha L; Lobo, Michele A
2017-05-25
Children with a variety of diagnoses have impairments that limit their arm function. Despite the fact that arm function is important for early learning and activities of daily living, there are few tools to assist movement for these children, and existing devices have challenges related to cost, accessibility, comfort, and aesthetics. In this article, we describe the design process and development of the first garment-based exoskeleton to assist arm movement in young children with movement impairments: the Playskin Lift™. We outline our design process, which contrasts with the traditional medical model in that it is interdisciplinary, user-centered, and addresses the broad needs of users rather than device function alone. Then we report the results of field-testing the initial prototype, with respect to our design metrics, with a toddler with significant bilateral arm movement impairments. Finally, we summarize our ongoing development aimed at increasing comfort, aesthetics, and accessibility of the garment. The interdisciplinary, user-centered approach to assistive technology design presented here can result in innovative and impactful design solutions that translate to the real world.
Chip Design Process Optimization Based on Design Quality Assessment
NASA Astrophysics Data System (ADS)
Häusler, Stefan; Blaschke, Jana; Sebeke, Christian; Rosenstiel, Wolfgang; Hahn, Axel
2010-06-01
Nowadays, managing product development projects is increasingly challenging. Especially the IC design of ASICs with both analog and digital components (mixed-signal design) is becoming more and more complex, while the time-to-market window narrows at the same time. Still, high quality standards must be fulfilled. Projects and their status are becoming less transparent due to this complexity. This makes the planning and execution of projects rather difficult. Therefore, there is a need for efficient project control. A main challenge is the objective evaluation of the current development status. Are all requirements successfully verified? Are all intermediate goals achieved? Companies often develop special solutions that are not reusable in other projects. This makes the quality measurement process itself less efficient and produces too much overhead. The method proposed in this paper is a contribution to solving these issues. It is applied at a German design house for analog mixed-signal IC design. This paper presents the results of a case study and introduces an optimized project scheduling on the basis of quality assessment results.
Bühler, Mira; Vollstädt-Klein, Sabine; Klemen, Jane; Smolka, Michael N
2008-01-01
Background Existing brain imaging studies, investigating sexual arousal via the presentation of erotic pictures or film excerpts, have mainly used blocked designs with long stimulus presentation times. Methods To clarify how experimental functional magnetic resonance imaging (fMRI) design affects stimulus-induced brain activity, we compared brief event-related presentation of erotic vs. neutral stimuli with blocked presentation in 10 male volunteers. Results Brain activation differed depending on design type in only 10% of the voxels showing task related brain activity. Differences between blocked and event-related stimulus presentation were found in occipitotemporal and temporal regions (Brodmann Area (BA) 19, 37, 48), parietal areas (BA 7, 40) and areas in the frontal lobe (BA 6, 44). Conclusion Our results suggest that event-related designs might be a potential alternative when the core interest is the detection of networks associated with immediate processing of erotic stimuli. Additionally, blocked, compared to event-related, stimulus presentation allows the emergence and detection of non-specific secondary processes, such as sustained attention, motor imagery and inhibition of sexual arousal. PMID:18647397
Towards Application of NASA Standard for Models and Simulations in Aeronautical Design Process
NASA Astrophysics Data System (ADS)
Vincent, Luc; Dunyach, Jean-Claude; Huet, Sandrine; Pelissier, Guillaume; Merlet, Joseph
2012-08-01
Even powerful computational techniques like simulation have limitations in their validity domain. Consequently, using simulation models requires caution to avoid making biased design decisions for new aeronautical products on the basis of inadequate simulation results. Thus the fidelity, accuracy and validity of simulation models shall be monitored in context all along the design phases to build confidence in achievement of the goals of modelling and simulation. In the CRESCENDO project, we adapt the Credibility Assessment Scale method from the NASA standard for models and simulations, originally developed for the space programme, to aircraft design in order to assess the quality of simulations. The proposed eight quality assurance metrics aggregate information to indicate the levels of confidence in results. They are displayed in a management dashboard and can secure design trade-off decisions at programme milestones. The application of this technique is illustrated in an aircraft design context with a specific thermal Finite Element Analysis. This use case shows how to judge the fitness-for-purpose of simulation as a virtual testing means and then green-light the continuation of the Simulation Lifecycle Management (SLM) process.
Design of an FMCW radar baseband signal processing system for automotive application.
Lin, Jau-Jr; Li, Yuan-Ping; Hsu, Wei-Chiang; Lee, Ta-Sung
2016-01-01
For a typical FMCW automotive radar system, a new design of the baseband signal processing architecture and algorithms is proposed to overcome ghost-target and overlapping problems in the multi-target detection scenario. To satisfy the short measurement time constraint without increasing the RF front-end loading, a three-segment waveform with different slopes is utilized. By introducing a new pairing mechanism and a spatial filter design algorithm, the proposed detection architecture not only provides high accuracy and reliability, but also requires low pairing time and computational loading. The proposed baseband signal processing architecture and algorithms balance performance and complexity, and are suitable for implementation in a real automotive radar system. Field measurement results demonstrate that the proposed automotive radar signal processing system performs well in a realistic application scenario.
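The abstract does not give the pairing equations, but the principle behind multi-slope FMCW processing can be sketched as follows: each chirp slope yields a beat frequency that mixes range and Doppler, so two slopes give a solvable linear system and a third slope can be used to reject ghost pairings. The sign convention, carrier frequency and tolerance below are assumptions made only for illustration.

```python
import numpy as np

C = 3e8       # speed of light [m/s]
F0 = 77e9     # carrier frequency [Hz]; typical automotive band (assumption)

def solve_range_velocity(fb1, fb2, s1, s2):
    """Solve range R and radial velocity v from beat frequencies of two chirp slopes.

    Illustrative model (sign convention depends on the mixer): fb_i = 2*s_i*R/C + 2*F0*v/C
    """
    A = np.array([[2 * s1 / C, 2 * F0 / C],
                  [2 * s2 / C, 2 * F0 / C]])
    R, v = np.linalg.solve(A, np.array([fb1, fb2]))
    return R, v

def consistent(fb3, s3, R, v, tol=1e3):
    """Third slope as a consistency check: ghost pairings fail to predict its beat frequency."""
    return abs(2 * s3 * R / C + 2 * F0 * v / C - fb3) < tol
```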
Shen, Jin-Jing; Gong, Xing-Chu; Pan, Jian-Yang; Qu, Hai-Bin
2017-03-01
Design space approach was applied in this study to optimize the lime milk precipitation process of Lonicera japonica (Jinyinhua) aqueous extract. The evaluation indices for this process were total organic acid purity and the amounts of 6 organic acids obtained per unit mass of medicinal materials. Four critical process parameters (CPPs), including drop speed of lime milk, pH value after adding lime milk, settling time and settling temperature, were identified by using the weighted standardized partial regression coefficient method. Quantitative models between process evaluation indices and CPPs were established by a stepwise regression analysis. A design space was calculated by a Monte-Carlo simulation method, and then verified. The verification test results showed that operation within the design space can guarantee the stability of the lime milk precipitation process. The recommended normal operation space is as follows: drop speed of lime milk of 1.00-1.25 mL•min⁻¹, pH value of 11.5-11.7, settling time of 1.0-1.2 h, and settling temperature of 10-20 ℃. Copyright© by the Chinese Pharmaceutical Association.
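The fitted regression coefficients are not reproduced here, but the Monte-Carlo step of a design-space calculation can be sketched generically: sample the CPPs over a candidate operating region, push the samples through the fitted quality models, and estimate the probability that all evaluation indices meet their criteria. The model below is a hypothetical placeholder, not the paper's equation.

```python
import numpy as np

rng = np.random.default_rng(0)

def purity_model(x):
    # Hypothetical stand-in for a fitted regression model: purity vs. CPPs.
    speed, pH, time, temp = x.T
    return 0.50 + 0.01 * pH + 0.02 * time - 0.001 * temp

def prob_of_success(lows, highs, criteria, n=100_000):
    """Estimate P(all quality criteria met) over a candidate operating region."""
    x = rng.uniform(lows, highs, size=(n, len(lows)))   # sample CPPs uniformly
    ok = np.ones(n, dtype=bool)
    for model, threshold in criteria:
        ok &= model(x) >= threshold
    return ok.mean()

# Illustrative ranges: drop speed, pH, settling time, settling temperature.
p = prob_of_success(lows=[1.00, 11.5, 1.0, 10.0], highs=[1.25, 11.7, 1.2, 20.0],
                    criteria=[(purity_model, 0.60)])
```

Regions whose estimated probability exceeds a chosen acceptance level form the design space; shrinking them further gives a normal operating space like the one recommended above.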
Urban Planning by Le Corbusier According to Praxeological Knowledge
NASA Astrophysics Data System (ADS)
Dzwierzynska, Jolanta; Prokopska, Aleksandra
2017-12-01
The city is formed as a mosaic of various elements which affect its attractiveness. These elements range from location attributes, through economic opportunities, to social aspects. Therefore, urbanity and urban planning should be considered in a multi-dimensional context. In the paper we address the problem of urban planning by Le Corbusier according to praxeological and system knowledge. From a praxeological point of view, an active human being takes a choice between various possibilities by preferring one of them to the others, and manifests that choice through his or her actions. The same applies to the design process. Due to this fact, the scientific design process can be treated as a systematic rational reconstruction of the designer's behaviour. Such a reconstruction requires previous reflection on the designer's work, as well as some consideration and design experience, thus know-how knowledge based on methodological knowledge. In the paper several city visions of Le Corbusier, as well as the characteristics and organisation of his design process, are analysed. Le Corbusier's innovative design ideas resulted from industrialisation changes and the accelerating progress of motorisation, which gave foundation to a new urban array. This array was based on strict geometric forms, regularity and repetition, which determined a standard. Thanks to his theories, Le Corbusier established principles of modern city construction and planning. Although some doubts were expressed as to the scale of centralisation of the cities designed by him and his class-based conception, he was aware that the overall welfare of the individual living in a city depended on the quality of the built environment. Therefore, his designs were not only functional but also evoked emotion. The analysis of his prolific design activities allows us to state that the organisation of his architectural and urban planning process was very efficient and complex. The city concepts proposed by him were the subject of analysis by generations of designers. Even now, they can still be the basis for modelling virtual and navigable cities by modern planners. Le Corbusier's comprehensive approach to modern city planning showed that research activities, that is theoretical thinking, and production activities, that is practice, are linked methodically. Therefore, urban planning should be understood not only as a projection of the possibilities of architecture, but as a multidisciplinary process. Due to this fact, an urban plan, as a result of that process, should be a synthesis of various social, industrial and economic aspects.
ERIC Educational Resources Information Center
Christian, C. A.; Eisenhamer, B.; Eisenhamer, Jonathan; Teays, Terry
2001-01-01
Introduces the Amazing Space program which is designed to enhance student mathematics, science, and technology skills using recent data and results from the National Aeronautics and Space Administration's (NASA) Hubble Space Telescope mission. Explains the process of designing multi-media resources in a five-week summer workshop that partners…
DOT National Transportation Integrated Search
1981-02-01
This volume documents the results of an analysis of the impact that various truck size and weight limits have on the carrier equipment selection process as a result of changes, in the design payload and design density of individual trucks. An analysi...
Computer-Aided Process Planning for the Layered Fabrication of Porous Scaffold Matrices
NASA Astrophysics Data System (ADS)
Starly, Binil
Rapid Prototyping (RP) technology promises to have a tremendous impact on the design and fabrication of porous tissue replacement structures for applications in tissue engineering and regenerative medicine. The layer-by-layer fabrication technology enables the design of patient-specific medical implants and complex structures for diseased tissue replacement strategies. Combined with advancements in imaging modalities and bio-modeling software, physicians can engage themselves in advanced solutions for craniofacial and mandibular reconstruction. For example, prior to the advancement of RP technologies, solid titanium parts used as implants for mandibular reconstruction were fashioned out of molding or CNC-based machining processes (Fig. 3.1). Titanium implants built using this process are often heavy, leading to increased patient discomfort. In addition, the Young's modulus of titanium is almost five times that of healthy cortical bone resulting in stress shielding effects [1,2]. With the advent of CAD/CAM-based tools, the virtual reconstruction of the implants has resulted in significant design improvements. The new generation of implants can be porous, enabling the in-growth of healthy bone tissue for additional implant fixation and stabilization. Newer implants would conform to the external shape of the defect site that is intended to be filled in. More importantly, the effective elastic modulus of the implant can be designed to match that of surrounding tissue. Ideally, the weight of the implant can be designed to equal the weight of the tissue that is being replaced resulting in increased patient comfort. Currently, such porous structures for reconstruction can only be fabricated using RP-based metal fabrication technologies such as Electron Beam Melting (EBM), Selective Laser Sintering (SLS®), and 3D™ Printing processes.
Process optimization by use of design of experiments: Application for liposomalization of FK506.
Toyota, Hiroyasu; Asai, Tomohiro; Oku, Naoto
2017-05-01
Design of experiments (DoE) can accelerate the optimization of drug formulations, especially complex formulations such as drugs that use delivery systems. Administration of FK506 encapsulated in liposomes (FK506 liposomes) is an effective approach to treat acute stroke in animal studies. To provide FK506 liposomes as a brain protective agent, it is necessary to manufacture these liposomes with good reproducibility. The objective of this study was to confirm the usefulness of DoE for the process-optimization study of FK506 liposomes. The Box-Behnken design was used to evaluate the effect of the process parameters on the properties of FK506 liposomes. The results of multiple regression analysis showed that there was an interaction between the hydration temperature and the freeze-thaw cycle on both the particle size and the encapsulation efficiency. An increase in the PBS hydration volume resulted in an increase in encapsulation efficiency. Process parameters had no effect on the ζ-potential. The multiple regression equation showed good predictability of the particle size and the encapsulation efficiency. These results indicated that manufacturing conditions must be taken into consideration to prepare liposomes with desirable properties. DoE would thus be a promising approach to optimize the conditions for the manufacturing of liposomes. Copyright © 2017 Elsevier B.V. All rights reserved.
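As a hedged illustration of the statistical machinery involved (not the authors' actual data or model), the sketch below generates a three-factor Box-Behnken design in coded units and fits a full quadratic response-surface model, including the interaction terms from which effects such as a temperature-by-freeze-thaw interaction would be read.

```python
import itertools
import numpy as np

def box_behnken_3(n_center=3):
    """Box-Behnken design for 3 coded factors: edge midpoints of the cube plus center runs."""
    runs = []
    for i, j in itertools.combinations(range(3), 2):
        for a, b in itertools.product((-1.0, 1.0), repeat=2):
            row = [0.0, 0.0, 0.0]
            row[i], row[j] = a, b
            runs.append(row)
    runs += [[0.0, 0.0, 0.0]] * n_center
    return np.array(runs)

def quadratic_features(X):
    """Columns [1, x_i, x_i*x_j (i<j), x_i^2] for a full quadratic response-surface model."""
    cols = [np.ones(len(X))]
    cols += [X[:, i] for i in range(X.shape[1])]
    cols += [X[:, i] * X[:, j] for i, j in itertools.combinations(range(X.shape[1]), 2)]
    cols += [X[:, i] ** 2 for i in range(X.shape[1])]
    return np.column_stack(cols)

X = box_behnken_3()   # coded settings, e.g. hydration temperature, freeze-thaw cycles, PBS volume (illustrative)
y = np.random.default_rng(1).normal(size=len(X))      # stand-in for a measured response such as particle size
coef, *_ = np.linalg.lstsq(quadratic_features(X), y, rcond=None)
```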
Design of virtual simulation experiment based on key events
NASA Astrophysics Data System (ADS)
Zhong, Zheng; Zhou, Dongbo; Song, Lingxiu
2018-06-01
Considering the complex content of and lack of guidance in virtual simulation experiments, key-event technology from VR narrative theory was introduced into virtual simulation experiments to enhance the fidelity and vividness of the process. Based on VR narrative technology, an event transition structure was designed to meet the needs of the experimental operation process, and an interactive event processing model was used to generate key events in the interactive scene. The experiment "margin value of bees foraging", based on biological morphology, was taken as an example; many objects, behaviors and other contents were reorganized. The result shows that this method can enhance the user's experience and ensure that the experimental process is complete and effective.
Computer-aided software development process design
NASA Technical Reports Server (NTRS)
Lin, Chi Y.; Levary, Reuven R.
1989-01-01
The authors describe an intelligent tool designed to aid managers of software development projects in planning, managing, and controlling the development process of medium- to large-scale software projects. Its purpose is to reduce uncertainties in the budget, personnel, and schedule planning of software development projects. It is based on a dynamic model of the software development and maintenance life-cycle process. This dynamic process is composed of a number of time-varying, interacting developmental phases, each characterized by its intended functions and requirements. System dynamics is used as a modeling methodology. The resulting Software LIfe-Cycle Simulator (SLICS) and the hybrid expert simulation system of which it is a subsystem are described.
Structural Technology and Analysis Program (STAP) Delivery Order 0004: Durability Patch
NASA Astrophysics Data System (ADS)
Ikegami, Roy; Haugse, Eric; Trego, Angela; Rogers, Lynn; Maly, Joe
2001-06-01
Structural cracks in secondary structure, resulting from a high cycle fatigue (HCF) environment, are often referred to as nuisance cracks. This type of damage can result in costly inspections and repair. The repairs often do not last long because the repaired structure continues to respond in a resonant fashion to the environment. Although the use of materials for passive damping applications is well understood, there are few applications to high-cycle fatigue problems. This is because design information such as the characterization temperature, resonant response frequency, and strain levels is difficult to determine. The Durability Patch and Damage Dosimeter Program addressed these problems by: (1) Developing a damped repair design process which includes a methodology for designing the material and application characteristics required to optimally damp the repair. (2) Designing and developing a rugged, small, and lightweight data acquisition unit called the damage dosimeter. This is a battery-operated, single-board computer, capable of collecting three channels of strain and one channel of temperature, processing this data by user-developed algorithms written in the C programming language, and storing the processed data in resident memory. The dosimeter is used to provide flight data needed to characterize the vibration environment. The vibration environment is then used to design the damping material characteristics and repair. The repair design methodology and dosimeter were demonstrated on B-52, C-130, and F-15 aircraft applications.
49 CFR 236.1009 - Procedural requirements.
Code of Federal Regulations, 2014 CFR
2014-10-01
... fraud; (ii) Potentially invalidated assumptions determined as a result of in-service experience or one... inspect processes, procedures, facilities, documents, records, design and testing materials, artifacts, training materials and programs, and any other information used in the design, development, manufacture...
Design of a low cost earth resources system
NASA Technical Reports Server (NTRS)
Faust, N. L.; Furman, M. D.; Spann, G. W. (Principal Investigator)
1978-01-01
The author has identified the following significant results. Survey results indicated that users of remote sensing data in the Southeastern U.S. were increasingly turning to digital processing techniques. All the states surveyed have had some involvement in projects using digitally processed data. Even those states which do not yet have in-house capabilities for digital processing were extremely interested in and were planning to develop such capabilities.
Inauen, A; Jenny, G J; Bauer, G F
2012-06-01
This article focuses on organizational analysis in workplace health promotion (WHP) projects. It shows how this analysis can be designed such that it provides rational data relevant to the further context-specific and goal-oriented planning of WHP and equally supports individual and organizational change processes implied by WHP. Design principles for organizational analysis were developed on the basis of a narrative review of the guiding principles of WHP interventions and organizational change as well as the scientific principles of data collection. Further, the practical experience of WHP consultants who routinely conduct organizational analysis was considered. This resulted in a framework with data-oriented and change-oriented design principles, addressing the following elements of organizational analysis in WHP: planning the overall procedure, data content, data-collection methods and information processing. Overall, the data-oriented design principles aim to produce valid, reliable and representative data, whereas the change-oriented design principles aim to promote motivation, coherence and a capacity for self-analysis. We expect that the simultaneous consideration of data- and change-oriented design principles for organizational analysis will strongly support the WHP process. We finally illustrate the applicability of the design principles to health promotion within a WHP case study.
Further Development and Assessment of a Broadband Liner Optimization Process
NASA Technical Reports Server (NTRS)
Nark, Douglas M.; Jones, Michael G.; Sutliff, Daniel L.
2016-01-01
The utilization of advanced fan designs (including higher bypass ratios) and shorter engine nacelles has highlighted a need for increased fan noise reduction over a broader frequency range. Thus, improved broadband liner designs must account for these constraints and, where applicable, take advantage of advanced manufacturing techniques that have opened new possibilities for novel configurations. This work focuses on the use of an established broadband acoustic liner optimization process to design a variable-depth, multi-degree of freedom liner for a high speed fan. Specifically, in-duct attenuation predictions with a statistical source model are used to obtain optimum impedance spectra over the conditions of interest. The predicted optimum impedance information is then used with acoustic liner modeling tools to design a liner aimed at producing impedance spectra that most closely match the predicted optimum values. The multi-degree of freedom design is carried through design, fabrication, and testing. In-duct attenuation predictions compare well with measured data and the multi-degree of freedom liner is shown to outperform a more conventional liner over a range of flow conditions. These promising results provide further confidence in the design tool, as well as the enhancements made to the overall design process.
DyNAMiC Workbench: an integrated development environment for dynamic DNA nanotechnology
Grun, Casey; Werfel, Justin; Zhang, David Yu; Yin, Peng
2015-01-01
Dynamic DNA nanotechnology provides a promising avenue for implementing sophisticated assembly processes, mechanical behaviours, sensing and computation at the nanoscale. However, design of these systems is complex and error-prone, because the need to control the kinetic pathway of a system greatly increases the number of design constraints and possible failure modes for the system. Previous tools have automated some parts of the design workflow, but an integrated solution is lacking. Here, we present software implementing a three ‘tier’ design process: a high-level visual programming language is used to describe systems, a molecular compiler builds a DNA implementation and nucleotide sequences are generated and optimized. Additionally, our software includes tools for analysing and ‘debugging’ the designs in silico, and for importing/exporting designs to other commonly used software systems. The software we present is built on many existing pieces of software, but is integrated into a single package—accessible using a Web-based interface at http://molecular-systems.net/workbench. We hope that the deep integration between tools and the flexibility of this design process will lead to better experimental results, fewer experimental design iterations and the development of more complex DNA nanosystems. PMID:26423437
Gossett, Andrea; Mirza, Mansha; Barnds, Ann Kathleen; Feidt, Daisy
2009-11-01
A growing emphasis has been placed on providing equal opportunities for all people, particularly people with disabilities, to support participation. Barriers to participation are represented in part by physical space restrictions. This article explores the decision-making process during the construction of a new office building housing a disability-rights organization. The building project featured in this study was developed on the principles of universal design, maximal accessibility, and sustainability to support access and participation. A qualitative case study approach was used involving collection of data through in-depth interviews with key decision-makers; non-participant observations at design meetings; and on-site tours. Qualitative thematic analysis along with the development of a classification system was used to understand specific building elements and the relevant decision processes from which they resulted. Recording and analyzing the design process revealed several key issues including grassroots involvement of stakeholders; interaction between universal design and sustainable design; addressing diversity through flexibility and universality; and segregationist accessibility versus universal design. This case study revealed complex interactions between accessibility, universal design, and sustainability. Two visual models were proposed to understand and analyze these complexities.
Modelling Feedback in Virtual Patients: An Iterative Approach.
Stathakarou, Natalia; Kononowicz, Andrzej A; Henningsohn, Lars; McGrath, Cormac
2018-01-01
Virtual Patients (VPs) offer learners the opportunity to practice clinical reasoning skills and have recently been integrated in Massive Open Online Courses (MOOCs). Feedback is a central part of a branched VP, allowing the learner to reflect on the consequences of their decisions and actions. However, there is insufficient guidance on how to design feedback models within VPs and especially in the context of their application in MOOCs. In this paper, we share our experiences from building a feedback model for a bladder cancer VP in a Urology MOOC, following an iterative process in three steps. Our results demonstrate how we can systematize the process of improving the quality of VP components by the application of known literature frameworks and extend them with a feedback module. We illustrate the design and re-design process and exemplify with content from our VP. Our results can act as a starting point for discussions on modelling feedback in VPs and invite future research on the topic.
Ma, En; Xu, Zhenming
2013-12-15
In this study, a technology process including vacuum pyrolysis and vacuum chlorinated separation was proposed to convert waste liquid crystal display (LCD) panels into useful resources using self-designed apparatuses. The suitable pyrolysis temperature and pressure were first determined to be 300 °C and 50 Pa. The organic parts of the panels were converted to oil (79.10 wt%) and gas (2.93 wt%). Then the technology for separating indium was optimized by central composite design (CCD) under response surface methodology (RSM). The results indicated that the indium recovery ratio was 99.97% when the particle size is less than 0.16 mm, the weight percentage of NH4Cl to glass powder is 50 wt% and the temperature is 450 °C. The research results show that the organic materials, indium and glass of the LCD panel can be recovered during the recovery process efficiently and in an environmentally friendly manner. Copyright © 2013 Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Ostuzzi, Francesca; Conradie, Peter; De Couvreur, Lieven; Detand, Jan; Saldien, Jelle
2016-01-01
This case study explores the opportunities for students of Industrial Design Engineering to engage with direct and indirect stakeholders by making their design process and results into open-ended designed solutions. The reported case study involved 47 students during a two-weeks intensive course on the topic of urban gardening. Observations were…
NASA Astrophysics Data System (ADS)
Schuck, Miller Harry
Automotive head-up displays require compact, bright, and inexpensive imaging systems. In this thesis, a compact head-up display (HUD) utilizing liquid-crystal-on-silicon microdisplay technology is presented from concept to implementation. The thesis comprises three primary areas of HUD research: the specification, design and implementation of a compact HUD optical system; the development of a wafer planarization process to enhance reflective device brightness and light immunity; and the design, fabrication and testing of an inexpensive 640 x 512 pixel active matrix backplane intended to meet the HUD requirements. The thesis addresses the HUD problem at three levels: the systems level, the device level, and the materials level. At the systems level, the optical design of an automotive HUD must meet several competing requirements, including high image brightness, compact packaging, video-rate performance, and low cost. An optical system design which meets the competing requirements has been developed utilizing a fully-reconfigurable reflective microdisplay. The design consists of two optical stages, the first a projector stage which magnifies the display, and a second stage which forms the virtual image eventually seen by the driver. A key component of the optical system is a diffraction grating/field lens which forms a large viewing eyebox while reducing the optical system complexity. Image quality, biocular disparity and luminous efficacy were analyzed, and results of the optical implementation are presented. At the device level, the automotive HUD requires a reconfigurable, video-rate, high resolution image source for applications such as navigation and night vision. The design of a 640 x 512 pixel active matrix backplane which meets the requirements of the HUD is described. The backplane was designed to produce digital field sequential color images at video rates utilizing fast switching liquid crystal as the modulation layer. The design methodology is discussed, and the example of a clock generator is described from design to implementation. Electrical and optical test results of the fabricated backplane are presented. At the materials level, a planarization method was developed to meet the stringent brightness requirements of automotive HUDs. The research efforts described here have resulted in a simple, low cost post-processing method for planarizing microdisplay substrates based on a spin-cast polymeric resin, benzocyclobutene (BCB). Six-fold reductions in substrate step height were accomplished with a single coating. Via masking and dry etching methods were developed. High reflectivity metal was deposited and patterned over the planarized substrate to produce high aperture pixel mirrors. The process is simple, rapid, and results in microdisplays better able to meet the stringent requirements of high brightness display systems. Methods and results of the post-processing are described.
“A System for Automatically Maintaining Pressure in a Commercial Truck Tire”
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maloney, John
2017-07-07
Under-inflated tires significantly reduce a vehicle's fuel efficiency by increasing rolling resistance (drag force). The Air Maintenance Technology ("AMT") system developed through this project replenishes lost air and maintains optimal tire cavity pressure whenever the tire is rolling in service, thus improving overall fuel economy by reducing the tire's rolling resistance. The system consists of an inlet air filter, an air pump driven by tire deformation during rotation, and a pressure regulating device. Pressurized air in the tire cavity naturally escapes by diffusion through the tire and wheel, leaks in tire seating, and through the filler valve and its seating. As a result, tires require constant maintenance to replenish lost air. Since manual tire inflation maintenance is both labor intensive and time consuming, it is frequently overlooked or ignored. By automating the maintenance of optimal tire pressure, the tire's contribution to the vehicle's overall fuel economy can be maximized. The work was divided into three phases. The objectives of Phase 1, Planning and Initial Design, were to produce an effective project plan and to create a baseline design. The objectives for Phase 2, Design and Process Optimization, were: to identify the finalized design for the pump, regulator and filter components; to identify a process to build prototype tires; to assemble prototype tires; and to test prototype tires and document results. The objectives of Phase 3, Design Release and Industrialization, were to finalize the system tire assembly, perform release testing and industrialize the assembly process.
Participatory design in Parkinson's research with focus on the symptomatic domains to be measured.
Serrano, J Artur; Larsen, Frank; Isaacs, Tom; Matthews, Helen; Duffen, Joy; Riggare, Sara; Capitanio, Fulvio; Ferreira, Joaquim J; Domingos, Josefa; Maetzler, Walter; Graessner, Holm
2015-01-01
There is a growing interest in the objective assessment of health related outcomes using technology providing quality measurements to be applied not only in daily clinical practice, but also in scientific research. Differences in the understandings of the condition and the terminology used between people with Parkinson's (PwPs), clinicians and technical developers may influence the progress of a participatory design process. This paper reports on a participatory design process to achieve a consensus among PwPs, clinicians and technologists over the selection of a set of symptomatic domains to be continuously assessed, in order to provide results relevant to both PwPs and clinicians. The methods used were a Web based user survey, end-user focus groups, ranking by combined methods, a Delphi process performed among clinicians and scientists, and prioritization of the results in a concertation workshop for PwPs, clinicians and technologists. The following symptomatic domains were commonly agreed by PwPs and clinicians to be of central importance in a system of continuous assessment: hypokinesia/bradykinesia, tremor, sway, gait, sleep and cognition. This list satisfied both the needs of the PwPs and the concerns of the clinicians regarding the means of advancing new strategies in assessment and interventions in PD. A participatory design strategy allowed the definition of a consensual list of symptomatic domains. Both the strategy and the achieved results may be of relevance for similar interdisciplinary approaches in the field of PD using a participatory design involving patients, clinicians and technologists.
celerite: Scalable 1D Gaussian Processes in C++, Python, and Julia
NASA Astrophysics Data System (ADS)
Foreman-Mackey, Daniel; Agol, Eric; Ambikasaran, Sivaram; Angus, Ruth
2017-09-01
celerite provides fast and scalable Gaussian Process (GP) Regression in one dimension and is implemented in C++, Python, and Julia. The celerite API is designed to be familiar to users of george and, like george, celerite is designed to efficiently evaluate the marginalized likelihood of a dataset under a GP model. This can then be used alongside a non-linear optimization or posterior inference library for the best results.
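A minimal usage sketch, assuming the celerite (v1) Python interface as documented by the authors (argument names may differ between releases) and entirely synthetic data:

```python
import numpy as np
import celerite
from celerite import terms

rng = np.random.default_rng(2)
t = np.sort(rng.uniform(0.0, 10.0, 200))          # irregularly sampled 1D inputs
yerr = 0.1 * np.ones_like(t)
y = np.sin(t) + yerr * rng.normal(size=len(t))

kernel = terms.RealTerm(log_a=0.0, log_c=0.0)     # simple exponential covariance term
gp = celerite.GP(kernel, mean=float(np.mean(y)))
gp.compute(t, yerr)                               # fast O(N) factorization
print("log likelihood:", gp.log_likelihood(y))    # handed to an external optimizer or sampler
```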
Development of a Low-Cost UAV Doppler Radar Data System
NASA Technical Reports Server (NTRS)
Knuble, Joseph; Li, Lihua; Heymsfield, Gerry
2005-01-01
A viewgraph presentation on the design of a low cost unmanned aerial vehicle (UAV) doppler radar data system is presented. The topics include: 1) Science and Mission Background; 2) Radar Requirements and Specs; 3) Radar Realization: RF System; 4) Processing of RF Signal; 5) Data System Design Process; 6) Can We Remove the DSP? 7) Determining Approximate Speed Requirements; 8) Radar Realization: Data System; 9) Data System Operation; and 10) Results.
Design of High Quality Chemical XOR Gates with Noise Reduction.
Wood, Mackenna L; Domanskyi, Sergii; Privman, Vladimir
2017-07-05
We describe a chemical XOR gate design that realizes a gate-response function with filtering properties. Such a gate-response function is flat (has small gradients) at and in the vicinity of all four binary-input logic points, resulting in analog noise suppression. The gate functioning involves a cross-reaction of the inputs, represented by pairs of chemicals, to produce a practically zero output when both are present and nearly equal. This cross-reaction processing step is also designed to result in filtering at low output intensities by canceling out the inputs if one of the latter has low intensity compared with the other. The remaining inputs, which were not reacted away, are processed to produce the output XOR signal by chemical steps that result in filtering at large output signal intensities. We analyze the tradeoff resulting from filtering, which involves loss of signal intensity. We also discuss practical aspects of realizations of such XOR gates. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Technical Reports Server (NTRS)
Woodbury, Sarah K.
2008-01-01
The introduction of United Space Alliance's Human Engineering Modeling and Performance Laboratory began in early 2007 in an attempt to address the problematic workspace design issues that the Space Shuttle has imposed on technicians performing maintenance and inspection operations. The Space Shuttle was not expected to require the extensive maintenance it undergoes between flights. As a result, extensive, costly resources have been expended on workarounds and modifications to accommodate ground processing personnel. Consideration of basic human factors principles for design of maintenance is essential during the design phase of future space vehicles, facilities, and equipment. Simulation will be needed to test and validate designs before implementation.
Stochastic simulation and robust design optimization of integrated photonic filters
NASA Astrophysics Data System (ADS)
Weng, Tsui-Wei; Melati, Daniele; Melloni, Andrea; Daniel, Luca
2017-01-01
Manufacturing variations are becoming an unavoidable issue in modern fabrication processes; therefore, it is crucial to be able to include stochastic uncertainties in the design phase. In this paper, integrated photonic coupled ring resonator filters are considered as an example of significant interest. The sparsity structure in photonic circuits is exploited to construct a sparse combined generalized polynomial chaos model, which is then used to analyze related statistics and perform robust design optimization. Simulation results show that the optimized circuits are more robust to fabrication process variations and achieve a reduction of 11%-35% in the mean square errors of the 3 dB bandwidth compared to unoptimized nominal designs.
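The paper's sparse combined generalized polynomial chaos construction is not reproduced here; the following one-parameter sketch only illustrates the underlying idea of polynomial chaos: fit a Hermite expansion in a standard-normal germ by regression on Monte Carlo samples, then read the output mean and variance directly from the coefficients. The bandwidth model is a hypothetical placeholder for the circuit simulator.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermevander
from math import factorial

rng = np.random.default_rng(4)

def bandwidth_model(delta):
    # Placeholder for the photonic circuit response: 3 dB bandwidth vs. a width variation.
    return 1.0 + 0.3 * delta - 0.05 * delta**2

deg = 4
xi = rng.standard_normal(500)              # standard-normal germ modelling process variation
samples = bandwidth_model(xi)
Phi = hermevander(xi, deg)                 # probabilists' Hermite basis He_0 .. He_deg
coef, *_ = np.linalg.lstsq(Phi, samples, rcond=None)

mean = coef[0]                                                       # E[He_k] = 0 for k >= 1
var = sum(coef[k]**2 * factorial(k) for k in range(1, deg + 1))      # E[He_k^2] = k!
```

The same surrogate, once fitted, can replace the expensive simulator inside a robust optimization loop, which is the role the gPC model plays in the paper.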
DOE Office of Scientific and Technical Information (OSTI.GOV)
D. Apelian
2007-07-23
The objective of the project is to develop an integrated process for fast, high-temperature carburizing. The new process results in an order of magnitude reduction in cycle time compared to conventional carburizing and represents significant energy savings in addition to a corresponding reduction of scrap associated with distortion free carburizing steels.
Word Processing and the Writing Process: Enhancement or Distraction?
ERIC Educational Resources Information Center
Dalton, David W.; Watson, James F.
This study examined the effects of a year-long word processing program on learners' holistic writing skills. Based on results of a writing pretest, 80 seventh grade students were designated as relatively high or low in prior writing achievement and assigned to one of two groups: a word processing treatment and a conventional writing process…
Reduction of oxygen concentration by heater design during Czochralski Si growth
NASA Astrophysics Data System (ADS)
Zhou, Bing; Chen, Wenliang; Li, Zhihui; Yue, Ruicun; Liu, Guowei; Huang, Xinming
2018-02-01
Oxygen is one of the highest-concentration impurities in single crystals grown by the Czochralski (CZ) process, and seriously impairs the quality of the Si wafer. In this study, computer simulations were applied to design a new CZ system. A more appropriate thermal field was acquired by optimization of the heater structure. The simulation results showed that, compared with the conventional system, the oxygen concentration in the newly designed CZ system was reduced significantly throughout the entire CZ process because of the lower crucible wall temperature and optimized convection. To verify the simulation results, experiments were conducted on an industrial single-crystal furnace. The experimental results showed that the oxygen concentration was reduced significantly, especially at the top of the CZ-Si ingot. Specifically, the oxygen concentration was 6.19 × 10¹⁷ atoms/cm³ at the top of the CZ-Si ingot with the newly designed CZ system, compared with 9.22 × 10¹⁷ atoms/cm³ with the conventional system. The corresponding light-induced degradation of solar cells based on the top of crystals from the newly designed CZ system was 1.62%, a reduction of 0.64% compared with crystals from the conventional system (2.26%).
Applied Integrated Design in Composite UAV Development
NASA Astrophysics Data System (ADS)
Vasić, Zoran; Maksimović, Stevan; Georgijević, Dragutin
2018-04-01
This paper presents a modern approach to the integrated development of an Unmanned Aerial Vehicle made of laminated composite materials, from conceptual design, through detail design, strength and stiffness analyses, definition and management of design and production data, detailed test results and other activities related to the development of laminated composite structures, with the main particularities of such structures in comparison to metal structures. Special attention in this work is focused on the management of product data during the life cycle of an UAV and on experimental tests of its composite wing. Experience shows that automation of product data management processes across the life cycle, as well as of manufacturing processes, is inevitable if a company wants to obtain cheaper, high-quality composite aircraft structures. One of the most effective ways of successfully managing product data today is Product Lifecycle Management (PLM). In terms of PLM, a spectrum of special measures and provisions has to be implemented when defining fiber-reinforced composite material structures in comparison to designing with metals, which is elaborated in the paper.
Integrating reliability and maintainability into a concurrent engineering environment
NASA Astrophysics Data System (ADS)
Phillips, Clifton B.; Peterson, Robert R.
1993-02-01
This paper describes the results of a reliability and maintainability study conducted at the University of California, San Diego and supported by private industry. Private industry considered the study important and provided the university access to innovative tools under a cooperative agreement. The current capability of reliability and maintainability tools, and how they fit into the design process, is investigated. The evolution of design methodologies leading up to today's capability is reviewed for ways to enhance the design process while keeping cost under control. A method for measuring the consequences of reliability and maintainability policy for design configurations in an electronic environment is provided. The interaction of selected modern computer tool sets is described for reliability, maintainability, operations, and other elements of the engineering design process. These tools provide a robust system evaluation capability that brings life cycle performance improvement information to engineers and their managers before systems are deployed, and allows them to monitor and track performance while a system is in operation.
Safety Guided Design Based on Stamp/STPA for Manned Vehicle in Concept Design Phase
NASA Astrophysics Data System (ADS)
Ujiie, Ryo; Katahira, Masafumi; Miyamoto, Yuko; Umeda, Hiroki; Leveson, Nancy; Hoshino, Nobuyuki
2013-09-01
In manned vehicles, such as the Soyuz and the Space Shuttle, the crew and the computer system cooperate to return safely to Earth. While computers increase the functionality of the system, they also increase the complexity of the interaction between the controllers (human and computer) and the target dynamics. In some cases, this complexity can produce a serious accident. To prevent such losses, traditional hazard analysis such as FTA has been applied to system development; however, it can only be used after a detailed system design exists because it focuses on detailed component failures. As a result, it is more difficult to eliminate hazard causes early in the process, when doing so is most feasible. STAMP/STPA is a new hazard analysis that can be applied from the early development phase, with the analysis being refined as more detailed decisions are made. In essence, the analysis and design decisions are intertwined and go hand-in-hand. We have applied STAMP/STPA to a concept design of a new JAXA manned vehicle and tried safety guided design of the vehicle. As a result of this trial, it has been shown that STAMP/STPA can be accepted easily by system engineers and the design has been made more sophisticated from a safety viewpoint. The result also shows that the consequences of human errors on system safety can be analysed in the early development phase and the system designed to prevent them. Finally, the paper discusses an effective way to harmonize this safety guided design approach with the systems engineering process, based on the experience gained in this project.
Real, Kevin; Fay, Lindsey; Isaacs, Kathy; Carll-White, Allison; Schadler, Aric
2018-01-01
This study utilizes systems theory to understand how changes to physical design structures impact communication processes and patient and staff design-related outcomes. Many scholars and researchers have noted the importance of communication and teamwork for patient care quality. Few studies have examined changes to nursing station design within a systems theory framework. This study employed a multimethod, before-and-after, quasi-experimental research design. Nurses completed surveys in centralized units and later in decentralized units (N = 26 pre, N = 51 post). Patients completed surveys in centralized units (N = 62 pre) and later in decentralized units (N = 49 post). Surveys included quantitative measures and qualitative open-ended responses. Patients preferred the decentralized units because of larger single-occupancy rooms, greater privacy/confidentiality, and overall satisfaction with design. Nurses had a more complex response. Nurses approved of the patient rooms, unit environment, and noise levels in decentralized units. However, they reported reduced access to support spaces, lower levels of team/mentoring communication, and less satisfaction with design than in centralized units. Qualitative findings supported these results. Nurses were more positive about centralized units and patients were more positive toward decentralized units. The results of this study suggest a need to understand how system components operate in concert. A major contribution of this study is the inclusion of patient satisfaction with design, an important yet overlooked factor in patient satisfaction. Healthcare design researchers and practitioners may consider how changing system interdependencies can lead to unexpected changes to communication processes and system outcomes in complex systems.
Power processing methodology. [computerized design of spacecraft electric power systems
NASA Technical Reports Server (NTRS)
Fegley, K. A.; Hansen, I. G.; Hayden, J. H.
1974-01-01
Discussion of the interim results of a program to investigate the feasibility of formulating a methodology for the modeling and analysis of aerospace electrical power processing systems. The object of the total program is to develop a flexible engineering tool which will allow the power processor designer to effectively and rapidly assess and analyze the tradeoffs available by providing, in one comprehensive program, a mathematical model, an analysis of expected performance, simulation, and a comparative evaluation with alternative designs. This requires an understanding of electrical power source characteristics and the effects of load control, protection, and total system interaction.
Properties of a center/surround retinex. Part 1: Signal processing design
NASA Technical Reports Server (NTRS)
Rahaman, Zia-Ur
1995-01-01
The last version of Edwin Land's retinex model for human vision's lightness and color constancy has been implemented. Previous research has established the mathematical foundations of Land's retinex but has not examined specific design issues and their effects on the properties of the retinex operation. Here we describe the signal processing design of the retinex. We find that the placement of the logarithmic function is important and produces best results when placed after the surround formation. We also find that best rendition is obtained for a 'canonical' gain-offset applied after the retinex operation.
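The design point highlighted in the abstract, placing the logarithm after the surround formation, can be expressed as a short single-scale center/surround sketch (the Gaussian surround, its scale and the gain/offset values are illustrative assumptions, not the paper's calibrated choices):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def single_scale_retinex(image, sigma=80.0):
    """Center/surround retinex with the log applied after forming the Gaussian surround."""
    img = image.astype(float) + 1.0           # avoid log(0)
    surround = gaussian_filter(img, sigma)    # large-scale surround estimate
    out = np.log(img) - np.log(surround)      # center minus surround in log space
    # Fixed gain/offset applied after the retinex operation (values illustrative).
    return np.clip(128.0 + 96.0 * out, 0, 255).astype(np.uint8)
```

Applying the logarithm before the surround formation would instead blur log-intensities, which is the alternative ordering the paper reports as giving poorer rendition.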
Experiences on developing digital down conversion algorithms using Xilinx system generator
NASA Astrophysics Data System (ADS)
Xu, Chengfa; Yuan, Yuan; Zhao, Lizhi
2013-07-01
The Digital Down Conversion (DDC) algorithm is a classical signal processing method which is widely used in radar and communication systems. In this paper, the DDC function is implemented with the Xilinx System Generator tool on an FPGA. System Generator is an FPGA design tool provided by Xilinx Inc. and MathWorks Inc. It is very convenient for programmers to manipulate the design and debug the function, especially for complex algorithms. The development of the DDC function based on System Generator shows that System Generator is a fast and efficient tool for FPGA design.
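As a software model of the signal chain that such a System Generator design would map onto FPGA blocks (a generic DDC sketch, not the authors' implementation), the classical steps are an NCO mix to baseband, an anti-alias low-pass filter, and decimation:

```python
import numpy as np
from scipy import signal

def ddc(x, fs, f_center, decim, numtaps=101):
    """Digital down conversion: mix to baseband, low-pass filter, decimate."""
    n = np.arange(len(x))
    lo = np.exp(-2j * np.pi * f_center * n / fs)                    # numerically controlled oscillator
    baseband = x * lo                                                # complex mix to 0 Hz
    taps = signal.firwin(numtaps, cutoff=fs / (2 * decim), fs=fs)    # anti-alias low-pass FIR
    filtered = signal.lfilter(taps, 1.0, baseband)
    return filtered[::decim]                                         # reduce the sample rate
```

In a hardware flow, each of these stages corresponds to a System Generator block (DDS, FIR, downsampler) whose fixed-point word lengths are the main design decisions.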
NASA Technical Reports Server (NTRS)
Stone, M. S.; Mcadam, P. L.; Saunders, O. W.
1977-01-01
The results are presented of a 4 month study to design a hybrid analog/digital receiver for outer planet mission probe communication links. The scope of this study includes functional design of the receiver; comparisons between analog and digital processing; hardware tradeoffs for key components including frequency generators, A/D converters, and digital processors; development and simulation of the processing algorithms for acquisition, tracking, and demodulation; and detailed design of the receiver in order to determine its size, weight, power, reliability, and radiation hardness. In addition, an evaluation was made of the receiver's capabilities to perform accurate measurement of signal strength and frequency for radio science missions.
A Data Envelopment Analysis Model for Selecting Material Handling System Designs
NASA Astrophysics Data System (ADS)
Liu, Fuh-Hwa Franklin; Kuo, Wan-Ting
The material handling system under design is an unmanned job shop with an automated guided vehicle that transports loads among the processing machines. The engineering task is to select the design alternatives that are combinations of four design factors: the ratio of production time to transportation time, the mean job arrival rate to the system, the input/output buffer capacities at each processing machine, and the vehicle control strategies. Each of the design alternatives is simulated to collect the upper and lower bounds of five performance indices. We develop a Data Envelopment Analysis (DEA) model to assess the 180 designs with imprecise data on the five indices. A three-way factorial experimental analysis of the assessment results indicates that buffer capacity and the interaction of job arrival rate and buffer capacity significantly affect performance.
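As background, a plain (crisp-data) input-oriented CCR DEA model can be solved as one small linear program per design alternative, as sketched below with SciPy; the paper's model additionally handles imprecise interval data for the five performance indices, which this sketch does not attempt, and the input/output values are invented.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """X: (n_dmu, n_inputs), Y: (n_dmu, n_outputs); returns CCR efficiency of DMU o."""
    n, m = X.shape
    _, s = Y.shape
    c = np.concatenate([-Y[o], np.zeros(m)])              # maximize weighted outputs of DMU o
    A_ub = np.hstack([Y, -X])                              # u.y_j - v.x_j <= 0 for every DMU j
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), X[o]])[None, :]    # normalize: v.x_o = 1
    b_eq = [1.0]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
    return -res.fun

X = np.array([[2.0, 3.0], [4.0, 1.0], [3.0, 2.0]])   # inputs of 3 hypothetical design alternatives
Y = np.array([[1.0], [1.0], [1.2]])                   # a single output index
print([round(ccr_efficiency(X, Y, o), 3) for o in range(3)])
```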
Training Students’ Science Process Skills through Didactic Design on Work and Energy
NASA Astrophysics Data System (ADS)
Ramayanti, S.; Utari, S.; Saepuzaman, D.
2017-09-01
Science Process Skills (SPS) have not been optimally trained in students during learning activities. The aim of this research is to find ways to train SPS on the subject of Work and Energy. A one-shot case study design was used in this research, conducted on 32 students in one of the high schools in Bandung. The students' SPS responses were analyzed with SPS-based assessment portfolios developed for this purpose. The results showed that the didactic design intended to train the skills of identifying variables, formulating hypotheses, and carrying out experiments produced development, but the didactic design to improve the students' predicting skills showed that development was still not optimal. Therefore, future studies need to develop a didactic design on the subject of Work and Energy that exercises these skills.
Extraction of astaxanthin from microalgae: process design and economic feasibility study
NASA Astrophysics Data System (ADS)
Zgheib, Nancy; Saade, Roxana; Khallouf, Rindala; Takache, Hosni
2018-03-01
In this work, the process design and the economic feasibility of natural astaxanthin extraction from Haematococcus pluvialis species are reported. A complete process drawing was first produced, and then the process was designed, including five main steps: the harvesting process, cell disruption, spray drying, supercritical CO2 extraction and anaerobic digestion. The major components of the facility would include sedimentation tanks, a disk stack centrifuge, a bed miller, a spray dryer, a multistage compressor, an extractor, a pasteurizer and a digester. All units have been sized assuming 10 kg/h of dried biomass as a feedstock to produce nearly 2592 kg of astaxanthin per year. The investment payback time and the return on investment were estimated for different market prices of astaxanthin. Based on the results, the production process was found to become economically feasible for a market price higher than 1500/kg. Also, a payback period of 1 year and an ROI equal to 113% were estimated for an astaxanthin market price equal to 6000/kg.
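The profitability figures quoted in the abstract follow from the standard definitions of ROI and payback period; the sketch below shows the arithmetic with placeholder cost figures (only the production quantity and the price scenario are taken from the abstract).

```python
# Back-of-the-envelope plant economics; the investment and operating cost
# below are assumed values, not the paper's actual estimates.
capital_investment = 2_000_000.0          # total plant cost (assumed)
annual_production_kg = 2592.0             # from the abstract
market_price_per_kg = 6000.0              # price scenario considered in the abstract
annual_operating_cost = 13_000_000.0      # assumed annual operating cost

annual_profit = annual_production_kg * market_price_per_kg - annual_operating_cost
roi = annual_profit / capital_investment            # return on investment
payback_years = capital_investment / annual_profit  # simple payback period
print(f"ROI = {roi:.0%}, payback = {payback_years:.1f} years")
```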
NASA Astrophysics Data System (ADS)
Wang, Xiao; Zhang, Cheng; Li, Pin; Wang, Kai; Hu, Yang; Zhang, Peng; Liu, Huixia
2012-11-01
A central composite rotatable experimental design (CCRD) is conducted to design experiments for laser transmission joining of thermoplastic polycarbonate (PC). An artificial neural network was used to establish the relationships between the laser transmission joining process parameters (laser power, velocity, clamp pressure, scanning number) and the joint strength and joint seam width. The developed mathematical models are tested by the analysis of variance (ANOVA) method to check their adequacy, and the effects of the process parameters on the responses and the interaction effects of key process parameters on quality are analyzed and discussed. Finally, the desirability function coupled with a genetic algorithm is used to carry out the optimization of the joint strength and joint width. The results show that the predicted results of the optimization are in good agreement with the experimental results, so this study provides an effective method to enhance joint quality.
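The final optimization step mentioned in the abstract combines the responses through desirability functions; a minimal sketch of Derringer-type individual and composite desirabilities is given below, with limits, targets and example response values chosen for illustration rather than taken from the fitted models.

```python
import numpy as np

def d_larger_is_better(y, low, high, r=1.0):
    """Desirability for a response to be maximized."""
    return np.clip((y - low) / (high - low), 0.0, 1.0) ** r

def d_target_is_best(y, low, target, high, r=1.0):
    """Desirability for a response with a target value."""
    if y <= low or y >= high:
        return 0.0
    side = (y - low) / (target - low) if y <= target else (high - y) / (high - target)
    return side ** r

d1 = d_larger_is_better(y=950.0, low=600.0, high=1000.0)      # joint strength (illustrative, N)
d2 = d_target_is_best(y=1.1, low=0.8, target=1.0, high=1.6)   # seam width (illustrative, mm)
D = (d1 * d2) ** 0.5    # composite desirability = geometric mean, maximized by the GA
print(round(D, 3))
```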
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nabeel A. Riza
The goals of the first six months of this project were to lay the foundations for both the SiC front-end optical chip fabrication as well as the free-space laser beam interferometer designs and preliminary tests. In addition, a Phase I goal was to design and experimentally build the high temperature and pressure infrastructure and test systems that will be used in the next 6 months for proposed sensor experimentation and data processing. All these goals have been achieved and are described in detail in the report. Both design process and diagrams for the mechanical elements as well as the optical systems are provided. In addition, photographs of the fabricated SiC optical chips, the high temperature & pressure test chamber instrument, the optical interferometer, the SiC sample chip holder, and signal processing data are provided. The design and experimentation results are summarized to give positive conclusions on the proposed novel high temperature optical sensor technology.
Modeling induction heater temperature distribution in polymeric material
NASA Astrophysics Data System (ADS)
Sorokin, A. G.; Filimonova, O. V.
2017-10-01
An induction heating system has a number of inherent benefits compared to traditional heating systems due to its non-contact heating process. The main areas of interest in the induction heating process are the efficiency of energy usage, the choice of the plate material, and the different coil configurations used for a given application. Correctly designed, manufactured and maintained induction coils are critical to the overall efficiency of induction heating solutions. The paper describes how the induction heating system in plastic injection molding is designed. The use of numerical simulation in order to obtain the optimum design of the induction coil is shown. The purpose of this work is to consider various coil configurations used in the induction heating process, which is widely used in plastic molding. The results of the calculation are presented in the numerical model.
Design of optimum solid oxide membrane electrolysis cells for metals production
Guan, Xiaofei; Pal, Uday B.
2015-12-24
Oxide to metal conversion is one of the most energy-intensive steps in the value chain for metals production. The solid oxide membrane (SOM) electrolysis process provides a general route for directly reducing various metal oxides to their respective metals, alloys, or intermetallics. Because of its lower energy use and its ability to use an inert anode resulting in zero carbon emission, the SOM electrolysis process emerges as a promising technology that can replace the state-of-the-art metals production processes. In this paper, a careful study of the SOM electrolysis process using equivalent DC circuit modeling is performed and correlated to the experimental results. Finally, a discussion on the relative importance of each resistive element in the circuit and on possible ways of lowering the rate-limiting resistive elements provides a generic guideline for designing optimum SOM electrolysis cells.
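A minimal sketch of the kind of equivalent DC circuit reasoning described in the abstract: the cell current is the applied potential in excess of the oxide dissociation potential divided by the sum of series resistances, so the fractional contribution of each resistive element identifies the rate-limiting one. All numbers below are illustrative assumptions, not the paper's measured values.

```python
# Series-resistance equivalent circuit of an electrolysis cell (illustrative values).
e_applied = 3.5          # applied cell potential (V, assumed)
e_dissociation = 2.2     # thermodynamic dissociation potential of the oxide (V, assumed)
resistances = {
    "electrolyte (flux)": 0.6,            # ohm, assumed
    "SOM membrane": 0.9,                  # ohm, assumed
    "electrodes + charge transfer": 0.4,  # ohm, assumed
}
r_total = sum(resistances.values())
current = (e_applied - e_dissociation) / r_total      # Ohm's law on the excess potential
for name, r in resistances.items():
    print(f"{name}: {r / r_total:.0%} of total resistance")
print(f"cell current = {current:.2f} A")
```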
Process quality planning of quality function deployment for carrot syrup
NASA Astrophysics Data System (ADS)
Ekawati, Yurida; Noya, Sunday; Widjaja, Filemon
2017-06-01
Carrot products are rarely available in the market. Based on previous research that used QFD to generate the product design of carrot products, research to produce the process quality planning was carried out. The carrot product studied was carrot syrup. The research resulted in a process planning matrix for carrot syrup. The matrix gives information about the critical process plan and the priority of the critical process plan. The critical process plan for the production process of carrot syrup consists of carrot sorting, carrot peeling, carrot washing, the blanching process, carrot cutting, the making of pureed carrots, filtering of the carrot juice, the addition of sugar to the carrot juice, the addition of food additives to the carrot juice, syrup boiling, syrup filtering, syrup filling into the bottle, bottle closure and cooling. This information will help the design of the production process of carrot syrup.
NASA Astrophysics Data System (ADS)
Boravelli, Sai Chandra Teja
This thesis mainly focuses on the design and process development of a downdraft biomass gasification process. The objective is to develop a gasifier and a gasification process for continuous steady-state operation. A lab-scale downdraft gasifier was designed to develop the process and obtain an optimum operating procedure. Sustainable and dependable sources such as biomass are potential sources of renewable energy, and there is reasonable motivation to use them in developing small-scale energy production plants for countries such as Canada, where wood stocks are more reliable sources than fossil fuels. This thesis addresses the thermal conversion of biomass in a downdraft gasification reactor. Downdraft biomass gasifiers are relatively cheap and easy to operate because of their design. We constructed a simple biomass gasifier to study the steady-state process for different sizes of the reactor. The experimental part of this investigation looks at how operating conditions such as the feed rate, air flow, length of the bed, vibration of the reactor, and the height and density of the syngas flame in the combustion flare change for different sizes of the reactor. These experimental results also compare the trends of tar, char and syngas production for wood pellets in a steady-state process. This study also covers the biomass gasification process for different wood feedstocks, comparing how the shape, size and moisture content of different feedstocks affect the operating conditions of the gasification process. For this, Six Sigma DMAIC techniques were used to analyze and understand how each feedstock makes a significant impact on the process.
Supporting Teachers Learning Through the Collaborative Design of Technology-Enhanced Science Lessons
NASA Astrophysics Data System (ADS)
Kafyulilo, Ayoub C.; Fisser, Petra; Voogt, Joke
2015-12-01
This study used the Interconnected Model of Professional Growth (Clarke & Hollingsworth in Teaching and Teacher Education, 18, 947-967, 2002) to unravel how science teachers' technology integration knowledge and skills developed in a professional development arrangement. The professional development arrangement used Technological Pedagogical Content Knowledge as a conceptual framework and included collaborative design of technology-enhanced science lessons, implementation of the lessons and reflection on outcomes. Support to facilitate the process was offered in the form of collaboration guidelines, online learning materials, exemplary lessons and the availability of an expert. Twenty teachers participated in the intervention. Pre- and post-intervention results showed improvements in teachers' perceived and demonstrated knowledge and skills in integrating technology in science teaching. Collaboration guidelines helped the teams to understand the design process, while exemplary materials provided a picture of the product they had to design. The availability of relevant online materials simplified the design process. The expert was important in providing technological and pedagogical support during design and implementation, and reflected with teachers on how to cope with problems met during implementation.
Evaluating the Usability of a Professional Modeling Tool Repurposed for Middle School Learning
ERIC Educational Resources Information Center
Peters, Vanessa L.; Songer, Nancy Butler
2013-01-01
This paper reports the results of a three-stage usability test of a modeling tool designed to support learners' deep understanding of the impacts of climate change on ecosystems. The design process involved repurposing an existing modeling technology used by professional scientists into a learning tool specifically designed for middle school…
Effects of Cloud-Based m-Learning on Student Creative Performance in Engineering Design
ERIC Educational Resources Information Center
Chang, Yu-Shan; Chen, Si-Yi; Yu, Kuang-Chao; Chu, Yih-Hsien; Chien, Yu-Hung
2017-01-01
This study explored the effects of cloud-based m-learning on students' creative processes and products in engineering design. A nonequivalent pretest-posttest design was adopted, and 62 university students from Taipei City, Taiwan, were recruited as research participants in the study. The results showed that cloud-based m-learning had a positive…
Design-Driven Innovation as Seen in a Worldwide Values-Based Curriculum
ERIC Educational Resources Information Center
Hadlock, Camey Andersen; McDonald, Jason K.
2014-01-01
While instructional design's technological roots have given it many approaches for process and product improvement, in most cases designers still rely on instructional forms that do not allow them to develop instruction of a quality consistent with that expressed by the field's visionary leaders. As a result, often the teachers and students using…
Comprehensive assessment of the L-lysine production process from fermentation of sugarcane molasses.
Anaya-Reza, Omar; Lopez-Arenas, Teresa
2017-07-01
L-Lysine is an essential amino acid that can be produced by chemical processes from fossil raw materials, as well as by microbial fermentation, the latter being a more efficient and environmentally friendly procedure. In this work, the production process of L-lysine-HCl is studied using a systematic approach based on modeling and simulation, which supports decision making in the early stage of process design. The study considers two analysis stages: first, the dynamic analysis of the fermentation reactor, where the conversion of sugars from sugarcane molasses to L-lysine with a strain of Corynebacterium glutamicum is carried out. In this stage, the operation mode (either batch or fed-batch) and operating conditions of the fermentation reactor are defined to reach the maximum technical criteria. Afterwards, the second analysis stage relates to the industrial production process of L-lysine-HCl, where the fermentation reactor, upstream processing, and downstream processing are included. In this stage, the influence of key parameters on the overall process performance is scrutinized through the evaluation of several technical, economic, and environmental criteria, to determine a profitable and sustainable design of the L-lysine production process. The main results show how the operating conditions, process design, and selection of evaluation criteria can influence the conceptual design. The best plant design shows maximum product yield (0.31 g L-lysine/g glucose) and productivity (1.99 g/L/h), achieving 26.5% return on investment (ROI) with a payback period (PBP) of 3.8 years, decreasing water and energy consumption, and with a low potential environmental impact (PEI) index.
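As a quick consistency check, if ROI is taken as annual net profit over total capital investment and the payback period as its reciprocal (a simplification that ignores taxes, depreciation and the time value of money), the two reported figures agree:

```latex
\[
  \mathrm{PBP} \approx \frac{\text{total investment}}{\text{annual net profit}}
              = \frac{1}{\mathrm{ROI}}
              = \frac{1}{0.265} \approx 3.8~\text{years}.
\]
```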
PCL foamed scaffolds loaded with 5-fluorouracil anti-cancer drug prepared by an eco-friendly route.
Salerno, Aurelio; Domingo, Concepción; Saurina, Javier
2017-06-01
This study describes a new preparation method, which combines freeze drying and supercritical CO2 foaming approaches, for the preparation of drug delivery scaffolds of polycaprolactone (PCL) loaded with 5-fluorouracil (5-Fu), an anti-cancer drug with low solubility in scCO2. A principal objective of this work is to design an scCO2 strategy that reduces the limitations that the solubility of 5-Fu places on its homogeneous distribution into a PCL scaffold, through the design of an innovative processing method. The design of this process is considered valuable for the development of clean technology in pharmacy and medicine, since most of the active agents have a null solubility in scCO2. Supercritical CO2 is used as a blowing agent to induce polymer foaming by means of the low-temperature pressure quench process. The resulting samples have been prepared under different operational conditions focused on enhancing the performance of the release process. In this case, design of experiments (DOE) was considered for a more comprehensive and systematic optimization of the product. In particular, drug amount (4.8 or 9.1 wt%), process temperature (45 or 50°C) and depressurization rate (0.1 MPa/s or 2 MPa/s) were selected as the factors to be investigated by a three-factor, two-level full factorial design. Samples were characterized to establish porosity data, drug loading percentage and, especially, the release profile, monitored chromatographically. Results from the DOE identified the best samples, which provide sustained drug release for several days and may be of great interest for developing materials for tissue engineering and sustained release applications. Copyright © 2017 Elsevier B.V. All rights reserved.
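The three-factor, two-level full factorial design mentioned in the abstract contains eight runs, which can be enumerated directly; the sketch below uses the factor levels quoted in the abstract, though the actual run order and any replication are not reported here.

```python
from itertools import product

# Factor levels taken from the abstract; run order would normally be randomized.
levels = {
    "drug_amount_wt%":          (4.8, 9.1),
    "temperature_C":            (45, 50),
    "depressurization_MPa_per_s": (0.1, 2.0),
}
for run, combo in enumerate(product(*levels.values()), start=1):
    print(run, dict(zip(levels.keys(), combo)))    # prints the 8 runs of the 2^3 design
```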
Aakre, Christopher Ansel; Kitson, Jaben E; Li, Man; Herasevich, Vitaly
2017-05-18
The new sepsis definition has increased the need for frequent sequential organ failure assessment (SOFA) score recalculation, and the clerical burden of information retrieval makes this score ideal for automated calculation. The aim of this study was to (1) estimate the clerical workload of manual SOFA score calculation through a time-motion analysis and (2) describe a user-centered design process for an electronic medical record (EMR) integrated, automated SOFA score calculator with a subsequent usability evaluation study. First, we performed a time-motion analysis by recording time-to-task-completion for the manual calculation of 35 baseline and 35 current SOFA scores by 14 internal medicine residents over a 2-month period. Next, we used an agile development process to create a user interface for a previously developed automated SOFA score calculator. The final user interface usability was evaluated by clinician end users with the Computer Systems Usability Questionnaire. The overall mean (standard deviation, SD) time to complete a manual SOFA score calculation was 61.6 s (33). Among the 24% (12/50) of usability survey respondents, our user-centered user interface design process resulted in >75% favorability of survey items in the domains of system usability, information quality, and interface quality. Early stakeholder engagement in our agile design process resulted in a user interface for an automated SOFA score calculator that reduced clinician workload and met clinicians' needs at the point of care. Emerging interoperable platforms may facilitate dissemination of similarly useful clinical score calculators and decision support algorithms as "apps." A user-centered design process and usability evaluation should be considered during creation of these tools. ©Christopher Ansel Aakre, Jaben E Kitson, Man Li, Vitaly Herasevich. Originally published in JMIR Human Factors (http://humanfactors.jmir.org), 18.05.2017.
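For context, each of the six SOFA components is a simple threshold rule of the kind the automated calculator encodes; below is the coagulation (platelet) component using the standard SOFA cut-offs, as a sketch only. The paper's EMR-integrated implementation and data retrieval logic are not reproduced here.

```python
def sofa_coagulation(platelets_k_per_uL: float) -> int:
    """Coagulation sub-score of SOFA (platelets in 10^3/uL), standard thresholds."""
    if platelets_k_per_uL < 20:
        return 4
    if platelets_k_per_uL < 50:
        return 3
    if platelets_k_per_uL < 100:
        return 2
    if platelets_k_per_uL < 150:
        return 1
    return 0

print(sofa_coagulation(85))   # -> 2
```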
Surface acoustic wave resonators
NASA Astrophysics Data System (ADS)
Avitabile, Gianfranco; Roselli, Luca; Atzeni, Carlo; Manes, Gianfranco
1991-10-01
The development of surface acoustic wave (SAW) resonators is reviewed with attention given to the design of a simulation package for CAD-assisted SAW resonator design. Basic design configurations and operation parameters are set forth for the SAW resonators including the phase of the reflection factor, evaluation of the stopband center frequency, stopband width, and the free propagation speed. The use of synchronous designs is shown to reduce device sensitivity to variations in the technological process but generate higher insertion losses. The existence of transverse modes and propagation losses is shown to affect the rejection of spurious modes and the achievement of low insertion losses. Several SAW resonators are designed and fabricated with the CAD process, and the resonators in the VHF-UHF bands perform in a manner predicted by simulated results.
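For orientation, the stopband center frequency of a SAW reflector grating is commonly estimated from the Bragg condition: strips of period p reflect most strongly when the acoustic wavelength equals 2p, so for free propagation speed v (the CAD model described in the abstract refines this with reflection-phase and velocity corrections):

```latex
\[
  f_0 \approx \frac{v}{2p}.
\]
```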
Methodological considerations in the design and implementation of clinical trials.
Cirrincione, Constance T; Lavoie Smith, Ellen M; Pang, Herbert
2014-02-01
To review study design issues related to clinical trials led by oncology nurses, with special attention to those conducted within the cooperative group setting; to emphasize the importance of the statistician's role in the process of clinical trials. Studies available at clinicaltrials.gov using experimental designs that have been published in peer-reviewed journals; cooperative group trials are highlighted. The clinical trial is a primary means to test intervention efficacy. A properly designed and powered study with clear and measurable objectives is as important as the intervention itself. Collaboration among the study team, including the statistician, is central in developing and conducting appropriately designed studies. For optimal results, collaboration is an ongoing process that should begin early on. Copyright © 2014 Elsevier Inc. All rights reserved.
Experiences with the hydraulic design of the high specific speed Francis turbine
NASA Astrophysics Data System (ADS)
Obrovsky, J.; Zouhar, J.
2014-03-01
The high specific speed Francis turbine is still a suitable alternative for the refurbishment of older hydro power plants with lower heads and worse cavitation conditions. The paper introduces the design process of such a turbine together with a comparison of the results of homological model tests performed in the hydraulic laboratory of ČKD Blansko Engineering. The turbine runner was designed using the optimization algorithm and considering the high specific speed hydraulic profile; this means that the hydraulic profiles of the spiral case, the distributor and the draft tube were taken from a Kaplan turbine. The optimization was run as an automatic cycle and was based on a simplex optimization method as well as on a genetic algorithm. The number of blades is shown to be the parameter that changes the resulting specific speed of the turbine between ns = 425 and 455, together with the cavitation characteristics. Minimizing cavitation on the blade surface as well as on the inlet edge of the runner blade was taken into account during the design process. The results of the CFD analyses as well as the model tests are presented in the paper.
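As a toy stand-in for the simplex branch of the automatic optimization cycle, the sketch below runs a derivative-free Nelder-Mead search over two hypothetical runner parameters against an invented surrogate objective; the real cycle evaluates CFD analyses and cavitation criteria rather than this closed-form function.

```python
import numpy as np
from scipy.optimize import minimize

def surrogate_objective(x):
    """Invented stand-in for the CFD-based objective (efficiency + cavitation penalty)."""
    blade_angle, blade_length = x
    efficiency_penalty = (blade_angle - 17.0) ** 2 + 0.5 * (blade_length - 1.2) ** 2
    cavitation_penalty = max(0.0, 1.0 - blade_length) ** 2
    return efficiency_penalty + 10.0 * cavitation_penalty

result = minimize(surrogate_objective, x0=np.array([15.0, 1.0]), method="Nelder-Mead")
print(result.x, result.fun)   # best parameters and objective found by the simplex search
```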
Linnell, Jessica D; Zidenberg-Cherr, Sheri; Briggs, Marilyn; Scherr, Rachel E; Brian, Kelley M; Hillhouse, Carol; Smith, Martin H
2016-01-01
To examine the use of a systematic approach and theoretical framework to develop an inquiry-based, garden-enhanced nutrition curriculum for the Shaping Healthy Choices Program. Curriculum development occurred in 3 steps: identification of learning objectives, determination of evidence of learning, and activity development. Curriculum activities were further refined through pilot-testing, which was conducted in 2 phases. Formative data collected during pilot-testing resulted in improvements to activities. Using a systematic, iterative process resulted in a curriculum called Discovering Healthy Choices, which has a strong foundation in Social Cognitive Theory and constructivist learning theory. Furthermore, the Backward Design method provided the design team with a systematic approach to ensure activities addressed targeted learning objectives and overall Shaping Healthy Choices Program goals. The process by which a nutrition curriculum is developed may have a direct effect on student outcomes. Processes by which nutrition curricula are designed and learning objectives are selected, and how theory and pedagogy are applied should be further investigated so that effective approaches to developing garden-enhanced nutrition interventions can be determined and replicated. Copyright © 2016 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.
Annular beam shaping system for advanced 3D laser brazing
NASA Astrophysics Data System (ADS)
Pütsch, Oliver; Stollenwerk, Jochen; Kogel-Hollacher, Markus; Traub, Martin
2012-10-01
As laser brazing benefits from advantages such as smooth joints and small heat-affected zones, it has become established as a joining technology that is widely used in the automotive industry. When processing complex-shaped geometries, however, recently developed brazing heads suffer from the need for continuous reorientation of the optical system and/or limited accessibility due to lateral wire feeding. This motivates the development of a laser brazing head with coaxial wire feeding and enhanced functionality. An optical system is designed that allows an annular intensity distribution to be generated in the working zone. The utilization of complex optical components avoids obscuration of the optical path by the wire feeding. The new design overcomes the disadvantages of state-of-the-art brazing heads with lateral wire feeding and benefits from independence of direction while processing complex geometries. To increase the robustness of the brazing process, the beam path also includes a seam tracking system, leading to a more challenging design of the whole optical train. This paper mainly discusses the concept and the optical design of the coaxial brazing head, and also presents the results obtained with a prototype and selected application results.
NASA Astrophysics Data System (ADS)
Lebedev, V. A.; Serga, G. V.; Khandozhko, A. V.
2018-03-01
The article proposes technical solutions for increasing the efficiency of finishing-cleaning and hardening processing of parts on the basis of rotor-screw technological systems. The essence, design features and technological capabilities of the rotor-screw technological system with a rotating container are disclosed, which allows one to expand the range of the resulting displacement vectors of the granules of the abrasive medium and the processed parts. Ways of intensifying the processing on this basis by means of vibration activation of the process, providing a combined effect of large- and small-amplitude low-frequency oscillations on the loading mass, are proposed. The results of the experimental studies of the movement of bulk materials in a screw container are presented, which showed that Kv = 0.5-0.6 can be considered the optimal value of the container filling factor. An estimation of the application efficiency of screw containers, proceeding from their design features, is given.
Inflatable antenna for earth observing systems
NASA Astrophysics Data System (ADS)
Wang, Hong-Jian; Guan, Fu-ling; Xu, Yan; Yi, Min
2010-09-01
This paper describes the mechanical design, dynamic analysis, and deployment demonstration of the antenna, and the photogrammetry used to detect the RMS of the inflatable antenna surface; the possible errors resulting from the measurement are also analysed. Ticra's Grasp software is used to predict the inflatable antenna pattern based on the coordinates of the 460 points on the parabolic surface, and the final results verified the whole design process.
An Alternative View of Some FIA Sample Design and Analysis Issues
Paul C. Van Deusen
2005-01-01
Sample design and analysis decisions are the result of compromises and inputs from many sources. The end result would likely change if different individuals or groups were involved in the planning process. Discussed here are some alternatives to the procedures that are currently being used for the annual inventory. The purpose is to indicate that alternatives exist and...
NASA Technical Reports Server (NTRS)
Zeiler, Thomas A.; Pototzky, Anthony S.
1989-01-01
A theoretical basis and example calculations are given that demonstrate the relationship between the Matched Filter Theory approach to the calculation of time-correlated gust loads and Phased Design Load Analysis in common use in the aerospace industry. The relationship depends upon the duality between Matched Filter Theory and Random Process Theory and upon the fact that Random Process Theory is used in Phased Design Loads Analysis in determining an equiprobable loads design ellipse. Extensive background information describing the relevant points of Phased Design Loads Analysis, calculating time-correlated gust loads with Matched Filter Theory, and the duality between Matched Filter Theory and Random Process Theory is given. It is then shown that the time histories of two time-correlated gust load responses, determined using the Matched Filter Theory approach, can be plotted as parametric functions of time and that the resulting plot, when superposed upon the design ellipse corresponding to the two loads, is tangent to the ellipse. The question is raised of whether or not it is possible for a parametric load plot to extend outside the associated design ellipse. If it is possible, then the use of the equiprobable loads design ellipse will not be a conservative design practice in some circumstances.
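The equiprobable loads design ellipse referred to in the abstract is usually written, for two correlated load responses y1 and y2 with one-dimensional design values A1bar and A2bar and correlation coefficient rho12 (assuming both design values correspond to the same exceedance level), as the curve to which the parametric Matched Filter load plot is tangent:

```latex
\[
  \left(\frac{y_1}{\bar{A}_1}\right)^{2}
  - 2\rho_{12}\left(\frac{y_1}{\bar{A}_1}\right)\left(\frac{y_2}{\bar{A}_2}\right)
  + \left(\frac{y_2}{\bar{A}_2}\right)^{2}
  = 1 - \rho_{12}^{2}.
\]
```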
NASA Astrophysics Data System (ADS)
Sukendar, Irwan; Fatmawati, Wiwiek; Much Ibnu Subroto, Imam; Arigama, Rizki
2017-04-01
This paper studies the design of a business system model with a system modeling approach for small and medium enterprises (SMEs) in the furniture sector. The method used consists of five phases: identification of the actual business processes of the furniture SMEs, identification of deficiencies and improvement of the business processes, design of the algorithm and flowchart of the business processes, analysis of the elements of the system, and design of the data flow diagram (DFD). The analysis of the system elements gave the following results. The products and quantities ordered by consumers and the DP paid by consumers were identified as system inputs 1, 2 and 3. The result of the calculation, the payment slip and the mail order (SO) were identified as system outputs 1, 2 and 3. Acceptance of orders, checking of raw material stocks at the warehouse, calculation of raw material requirements and adequacy, the contract price and the due date, as well as the submission of the calculation results to consumers, were identified as system components 1, 2, 3 and 4. The admin taking orders, the admin checking raw material stocks at the warehouse, the admin making the calculation, and the admin conveying the calculation results to consumers were identified as system interactions 1, 2, 3 and 4. Consumers were identified as an element of the system environment, and the boundary between the SME and its consumers was identified as the system boundary.
Analysis and Test Correlation of Proof of Concept Box for Blended Wing Body-Low Speed Vehicle
NASA Technical Reports Server (NTRS)
Spellman, Regina L.
2003-01-01
The Low Speed Vehicle (LSV) is a 14.2% scale remotely piloted vehicle of the revolutionary Blended Wing Body concept. The design of the LSV includes an all composite airframe. Due to internal manufacturing capability restrictions, room temperature layups were necessary. An extensive materials testing and manufacturing process development effort was undertaken to establish a process that would achieve the high modulus/low weight properties required to meet the design requirements. The analysis process involved a loads development effort that incorporated aero loads to determine internal forces that could be applied to a traditional FEM of the vehicle and to conduct detailed component analyses. A new tool, Hypersizer, was added to the design process to address various composite failure modes and to optimize the skin panel thickness of the upper and lower skins for the vehicle. The analysis required an iterative approach as material properties were continually changing. As a part of the material characterization effort, test articles, including a proof of concept wing box and a full-scale wing, were fabricated. The proof of concept box was fabricated based on very preliminary material studies and tested in bending, torsion, and shear. The box was then tested to failure under shear. The proof of concept box was also analyzed using Nastran and Hypersizer. The results of both analyses were scaled to determine the predicted failure load. The test results were compared to both the Nastran and Hypersizer analytical predictions. The actual failure occurred at 899 lbs. The failure was predicted at 1167 lbs based on the Nastran analysis. The Hypersizer analysis predicted a lower failure load of 960 lbs. The Nastran analysis alone was not sufficient to predict the failure load because it does not identify local composite failure modes. This analysis has traditionally been done using closed form solutions. Although Hypersizer is typically used as an optimizer for the design process, the failure prediction was used to help gain acceptance and confidence in this new tool. The correlated models and process were to be used to analyze the full BWB-LSV airframe design. The analysis and correlation with test results of the proof of concept box are presented here, including the comparison of the Nastran and Hypersizer results.
NASA Astrophysics Data System (ADS)
Ayad, G.; Song, J.; Barriere, T.; Liu, B.; Gelin, J. C.
2007-05-01
The paper is concerned with the optimization and parametric identification of the Powder Injection Molding process, which consists first of the injection of a powder mixture with a polymer binder and then of the sintering of the resulting powder parts by solid-state diffusion. In the first part, an original methodology to optimize the injection stage is described, based on the combination of Design Of Experiments and adaptive Response Surface Modeling. The second part of the paper describes the identification strategy proposed for the sintering stage, using the identification of sintering parameters from dilatometer curves followed by the optimization of the sintering process. The proposed approaches are applied to the optimization of the manufacturing of a ceramic femoral implant. It is demonstrated that the proposed approach gives satisfactory results.
Work process and task-based design of intelligent assistance systems in German textile industry
NASA Astrophysics Data System (ADS)
Löhrer, M.; Ziesen, N.; Altepost, A.; Saggiomo, M.; Gloy, Y. S.
2017-10-01
The German textile industry, which is shaped by mid-sized companies, must face social challenges such as demographic change and changing technical processes. Interaction with intelligent systems (on machines) and increasing automation change processes, working structures and employees' tasks on all levels. Work contents are getting more complex, resulting in the necessity for diversified and enhanced competencies. Mobile devices like tablets or smartphones are increasingly finding their way into the workplace. Employees who grew up with new forms of media have certain advantages regarding the usage of modern technologies compared to older employees. Therefore, it is necessary to design new systems which help to adapt the competencies of both younger and older employees to new automated production processes in the digital work environment. The key to successful integration of technical assistance systems is user-oriented design and development that includes concepts for competency development under consideration of, e.g., ethical and legal aspects.
NASA Astrophysics Data System (ADS)
Najafi, Ali; Acar, Erdem; Rais-Rohani, Masoud
2014-02-01
The stochastic uncertainties associated with the material, process and product are represented and propagated to process and performance responses. A finite element-based sequential coupled process-performance framework is used to simulate the forming and energy absorption responses of a thin-walled tube in a manner that both material properties and component geometry can evolve from one stage to the next for better prediction of the structural performance measures. Metamodelling techniques are used to develop surrogate models for manufacturing and performance responses. One set of metamodels relates the responses to the random variables whereas the other relates the mean and standard deviation of the responses to the selected design variables. A multi-objective robust design optimization problem is formulated and solved to illustrate the methodology and the influence of uncertainties on manufacturability and energy absorption of a metallic double-hat tube. The results are compared with those of deterministic and augmented robust optimization problems.
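A common way to pose the multi-objective robust design problem on top of such mean/standard-deviation metamodels is the weighted, normalized formulation below; the weights, normalizing values and the constraint margin k are designer choices, and the paper's exact formulation for the double-hat tube may differ in detail.

```latex
\[
  \min_{\mathbf{x}} \; w_1\,\frac{\mu_f(\mathbf{x})}{\mu^{*}}
                    + w_2\,\frac{\sigma_f(\mathbf{x})}{\sigma^{*}}
  \quad \text{subject to} \quad
  \mu_g(\mathbf{x}) + k\,\sigma_g(\mathbf{x}) \le 0 .
\]
```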
Using Design Capability Indices to Satisfy Ranged Sets of Design Requirements
NASA Technical Reports Server (NTRS)
Chen, Wei; Allen, Janet K.; Simpson, Timothy W.; Mistree, Farrokh
1996-01-01
For robust design it is desirable to allow the design requirements to vary within a certain range rather than setting point targets. This is particularly important during the early stages of design when little is known about the system and its requirements. Toward this end, design capability indices are developed in this paper to assess the capability of a family of designs, represented by a range of top-level design specifications, to satisfy a ranged set of design requirements. Design capability indices are based on process capability indices from statistical process control and provide a single objective, alternate approach to the use of Taguchi's signal-to-noise ratio which is often used for robust design. Successful implementation of design capability indices ensures that a family of designs conforms to a given ranged set of design requirements. To demonstrate an application and the usefulness of design capability indices, the design of a solar powered irrigation system is presented. Our focus in this paper is on the development and implementation of design capability indices as an alternate approach to the use of the signal-to-noise ratio and not on the results of the example problem, per se.
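For reference, the process capability index from statistical process control on which design capability indices are modeled compares the limits of the ranged requirement (URL, LRL) with the mean and spread of the response over the family of designs; the paper's design capability indices are constructed by analogy, and their exact definitions are not reproduced here:

```latex
\[
  C_{pk} = \min\!\left(\frac{\mathrm{URL}-\mu}{3\sigma},\;
                       \frac{\mu-\mathrm{LRL}}{3\sigma}\right).
\]
```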
Spitzer Telemetry Processing System
NASA Technical Reports Server (NTRS)
Stanboli, Alice; Martinez, Elmain M.; McAuley, James M.
2013-01-01
The Spitzer Telemetry Processing System (SirtfTlmProc) was designed to address objectives of JPL's Multi-mission Image Processing Lab (MIPL) in processing spacecraft telemetry and distributing the resulting data to the science community. To minimize costs and maximize operability, the software design focused on automated error recovery, performance, and information management. The system processes telemetry from the Spitzer spacecraft and delivers Level 0 products to the Spitzer Science Center. SirtfTlmProc is a unique system with automated error notification and recovery, with a real-time continuous service that can go quiescent after periods of inactivity. The software can process 2 GB of telemetry and deliver Level 0 science products to the end user in four hours. It provides analysis tools so the operator can manage the system and troubleshoot problems. It automates telemetry processing in order to reduce staffing costs.
Guetterman, Timothy C.; Fetters, Michael D.; Legocki, Laurie J.; Mawocha, Samkeliso; Barsan, William G.; Lewis, Roger J.; Berry, Donald A.; Meurer, William J.
2015-01-01
Context The context for this study was the Adaptive Designs Advancing Promising Treatments Into Trials (ADAPT-IT) project, which aimed to incorporate flexible adaptive designs into pivotal clinical trials and to conduct an assessment of the trial development process. Little research provides guidance to academic institutions in planning adaptive trials. Objectives The purpose of this qualitative study was to explore the perspectives and experiences of stakeholders as they reflected back about the interactive ADAPT-IT adaptive design development process, and to understand their perspectives regarding lessons learned about the design of the trials and trial development. Materials and methods We conducted semi-structured interviews with ten key stakeholders and observations of the process. We employed qualitative thematic text data analysis to reduce the data into themes about the ADAPT-IT project and adaptive clinical trials. Results The qualitative analysis revealed four themes: education of the project participants, how the process evolved with participant feedback, procedures that could enhance the development of other trials, and education of the broader research community. Discussion and conclusions While participants became more likely to consider flexible adaptive designs, additional education is needed to both understand the adaptive methodology and articulate it when planning trials. PMID:26622163
IPAD: Integrated Programs for Aerospace-vehicle Design
NASA Technical Reports Server (NTRS)
1980-01-01
The conference was organized to promote wider awareness of the IPAD program and its coming impact on American industry. The program focuses on technology issues that are critical to computer aided design manufacturing. Included is a description of a representative aerospace design process and its interface with manufacturing, the design of a future IPAD integrated computer aided design system, results to date in developing IPAD products and associated technology, and industry experiences and plans to exploit these products.
Numerical aerodynamic simulation facility preliminary study, volume 2 and appendices
NASA Technical Reports Server (NTRS)
1977-01-01
Data to support results obtained in technology assessment studies are presented. Objectives, starting points, and future study tasks are outlined. Key design issues discussed in appendices include: data allocation, transposition network design, fault tolerance and trustworthiness, logic design, processing element of existing components, number of processors, the host system, alternate data base memory designs, number representation, fast div 521 instruction, architectures, and lockstep array versus synchronizable array machine comparison.
An Interactive Design Space Supporting Development of Vehicle Architecture Concept Models
2011-01-01
Gary Osborne (IMECE2011-64510, Denver, Colorado, USA) ... early in the development cycle. Optimization taking place later in the cycle usually occurs at the detail design level, and tends to result in ... architecture changes may be imposed, but such modifications are equivalent to a huge optimization cycle covering almost the entire design process, and
Development of Integrated Programs for Aerospace-Vehicle Design (IPAD) - IPAD user requirements
NASA Technical Reports Server (NTRS)
Anderton, G. L.
1979-01-01
Results of a requirements analysis task for Integrated Programs for Aerospace Vehicle Design (IPAD) are presented. User requirements which, in part, will shape the IPAD system design are given. Requirements considered were: generation, modification, storage, retrieval, communication, reporting, and protection of information. Data manipulation and controls on the system and the information were also considered. Specific needs relative to the product design process are also discussed.
Aviation Human-in-the-Loop Simulation Studies: Experimental Planning, Design, and Data Management
2014-01-01
Kevin W. Williams, Bonny Christopher, Gena ... describe the process by which we designed our human-in-the-loop (HITL) simulation study and the methodology used to collect and analyze the results
Process characteristics and design methods for a 300 deg quad OP amp
NASA Technical Reports Server (NTRS)
Beasom, J. D.; Patterson, R. B., III
1981-01-01
The results of process characterization, circuit design, and reliability studies for the development of a quad OP amplifier intended for use up to 300 C are presented. A dielectrically isolated complementary vertical bipolar process was chosen to fabricate the amplifier in order to eliminate isolation leakage and the possibility of latch up. Characterization of NPN and PNP junctions showed them to be suitable for use up to 300 C. Interconnect reliability was predicted to be greater than four years mean time between failure. Parasitic MOS formation was eliminated by isolation of each device.
Cahyadi, Christine; Heng, Paul Wan Sia; Chan, Lai Wah
2011-03-01
The aim of this study was to identify and optimize the critical process parameters of the newly developed Supercell quasi-continuous coater for optimal tablet coat quality. Design of experiments, aided by multivariate analysis techniques, was used to quantify the effects of various coating process conditions and their interactions on the quality of film-coated tablets. The process parameters varied included batch size, inlet temperature, atomizing pressure, plenum pressure, spray rate and coating level. An initial screening stage was carried out using a 2^(6-1) resolution IV fractional factorial design. Following these preliminary experiments, the optimization study was carried out using the Box-Behnken design. Main response variables measured included drug-loading efficiency, coat thickness variation, and the extent of tablet damage. Apparent optimum conditions were determined by using response surface plots. The process parameters exerted various effects on the different response variables. Hence, trade-offs between individual optima were necessary to obtain the best compromised set of conditions. The adequacy of the optimized process conditions in meeting the combined goals for all responses was indicated by the composite desirability value. By using response surface methodology and optimization, coating conditions which produced coated tablets of high drug-loading efficiency, low incidences of tablet damage and low coat thickness variation were defined. Optimal conditions were found to vary over a large spectrum when different responses were considered. Changes in processing parameters across the design space did not result in drastic changes to coat quality, thereby demonstrating robustness in the Supercell coating process. © 2010 American Association of Pharmaceutical Scientists
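For readers unfamiliar with the Box-Behnken layout used in the optimization stage, the coded runs for three factors are the edge midpoints of the factor cube plus replicated center points; the sketch below enumerates them, with the factor names and the number of center points chosen for illustration (the study varied additional factors in the earlier fractional-factorial screening).

```python
from itertools import combinations, product

factors = ["inlet_temperature", "spray_rate", "atomizing_pressure"]   # illustrative names
runs = []
for (i, j) in combinations(range(len(factors)), 2):      # each pair of factors
    for a, b in product((-1, +1), repeat=2):              # at their -1/+1 levels
        run = [0] * len(factors)                           # remaining factor held at center
        run[i], run[j] = a, b
        runs.append(run)
runs += [[0, 0, 0]] * 3                                    # replicated center points (assumed 3)
print(len(runs), "runs")                                   # 12 edge runs + 3 centers
for r in runs:
    print(dict(zip(factors, r)))
```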
Meegan, Daniel V; Honsberger, Michael J M
2005-05-01
Many neuroimaging studies have been designed to differentiate domain-specific processes in the brain. A common design constraint is to use identical stimuli for different domain-specific tasks. For example, an experiment investigating spatial versus identity processing would present compound spatial-identity stimuli in both spatial and identity tasks, and participants would be instructed to attend to, encode, maintain, or retrieve spatial information in the spatial task, and identity information in the identity task. An assumption in such studies is that spatial information will not be processed in the identity task, as it is irrelevant for that task. We report three experiments demonstrating violations of this assumption. Our results suggest that comparisons of spatial and identity tasks in existing neuroimaging studies have underestimated the amount of brain activation that is spatial-specific. For future neuroimaging studies, we recommend unique stimulus displays for each domain-specific task, and event-related measurement of post-stimulus processing.
Agile manufacturing: The factory of the future
NASA Technical Reports Server (NTRS)
Loibl, Joseph M.; Bossieux, Terry A.
1994-01-01
The factory of the future will require an operating methodology which effectively utilizes all of the elements of product design, manufacturing and delivery. The process must respond rapidly to changes in product demand, product mix, design changes or changes in the raw materials. To achieve agility in a manufacturing operation, the design and development of the manufacturing processes must focus on customer satisfaction. Achieving greatest results requires that the manufacturing process be considered from product concept through sales. This provides the best opportunity to build a quality product for the customer at a reasonable rate. The primary elements of a manufacturing system include people, equipment, materials, methods and the environment. The most significant and most agile element in any process is the human resource. Only with a highly trained, knowledgeable work force can the proper methods be applied to efficiently process materials with machinery which is predictable, reliable and flexible. This paper discusses the effect of each element on the development of agile manufacturing systems.
A Module Experimental Process System Development Unit (MEPSDU)
NASA Technical Reports Server (NTRS)
1981-01-01
Design work for a photovoltaic module, fabricated using single crystal silicon dendritic web sheet material, resulted in the identification of surface treatment to the module glass superstrate which improved module efficiencies. A final solar module environmental test, a simulated hailstone impact test, was conducted on full size module superstrates to verify that the module's tempered glass superstrate can withstand specified hailstone impacts near the corners and edges of the module. Process sequence design work on the metallization process selection, liquid dopant investigation, dry processing, and antireflective/photoresist application technique tasks, and the optimum thickness for Ti/Pd are discussed. A noncontact cleaning method for raw web cleaning was identified and antireflective and photoresist coatings for the dendritic webs were selected. The design of a cell string conveyor, an interconnect feed system, a rolling ultrasonic spot bonding head, and the identification of the optimal commercially available programmable control system are also discussed. An economic analysis to assess cost goals of the process sequence is also given.
40 CFR 93.107 - Relationship of transportation plan and TIP conformity with the NEPA process.
Code of Federal Regulations, 2010 CFR
2010-07-01
... degree of specificity required in the transportation plan and the specific travel network assumed for air... development studies. Should the NEPA process result in a project with design concept and scope significantly...
NASA Technical Reports Server (NTRS)
Withey, James V.
1986-01-01
The validity of real-time software is determined by its ability to execute on a computer within the time constraints of the physical system it is modeling. In many applications the time constraints are so critical that the details of process scheduling are elevated to the requirements analysis phase of the software development cycle. It is not uncommon to find specifications for a real-time cyclic executive program included in or assumed by such requirements. It was found that preliminary designs structured around this implementation obscure the data flow of the real world system that is modeled and that it is consequently difficult and costly to maintain, update and reuse the resulting software. A cyclic executive is a software component that schedules and implicitly synchronizes the real-time software through periodic and repetitive subroutine calls. Therefore a design method is sought that allows the deferral of process scheduling to the later stages of design. The appropriate scheduling paradigm must be chosen given the performance constraints, the largest environment and the software's lifecycle. The concept of process inversion is explored with respect to the cyclic executive.
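To make the scheduling paradigm under discussion concrete, a cyclic executive can be sketched as a fixed table of minor frames, each invoking its tasks in order and then idling until the frame boundary; the frame length, task set and frame assignments below are placeholders, not drawn from the report.

```python
import time

def read_sensors(): pass   # stand-ins for the application's periodic tasks
def control_law():  pass
def telemetry():    pass

MINOR_FRAME_S = 0.025                     # 25 ms minor frame (assumed)
SCHEDULE = [                              # one major frame = 4 minor frames
    [read_sensors, control_law],
    [read_sensors, control_law, telemetry],
    [read_sensors, control_law],
    [read_sensors, control_law],
]

def cyclic_executive(n_major_frames=10):
    for _ in range(n_major_frames):
        for frame_tasks in SCHEDULE:
            start = time.monotonic()
            for task in frame_tasks:
                task()                                        # run this frame's tasks in fixed order
            slack = MINOR_FRAME_S - (time.monotonic() - start)
            if slack > 0:
                time.sleep(slack)                             # idle until the next minor frame

cyclic_executive()
```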
A Framework for Automating Cost Estimates in Assembly Processes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Calton, T.L.; Peters, R.R.
1998-12-09
When a product concept emerges, the manufacturing engineer is asked to sketch out a production strategy and estimate its cost. The engineer is given an initial product design, along with a schedule of expected production volumes. The engineer then determines the best approach to manufacturing the product, comparing a variety of alternative production strategies. The engineer must consider capital cost, operating cost, lead-time, and other issues in an attempt to maximize profits. After making these basic choices and sketching the design of overall production, the engineer produces estimates of the required capital, operating costs, and production capacity. This process may iterate as the product design is refined in order to improve its performance or manufacturability. The focus of this paper is on the development of computer tools to aid manufacturing engineers in their decision-making processes. This computer software tool provides a framework in which accurate cost estimates can be seamlessly derived from design requirements at the start of any engineering project. The result is faster cycle times through first-pass success; lower life cycle cost due to requirements-driven design and accurate cost estimates derived early in the process.
Jupiter Icy Moons Orbiter Mission design overview
NASA Technical Reports Server (NTRS)
Sims, Jon A.
2006-01-01
An overview of the design of a possible mission to three large moons of Jupiter (Callisto, Ganymede, and Europa) is presented. The potential Jupiter Icy Moons Orbiter (JIMO) mission uses ion thrusters powered by a nuclear reactor to transfer from Earth to Jupiter and enter a low-altitude science orbit around each of the moons. The combination of very limited control authority and significant multibody dynamics resulted in some aspects of the trajectory design being different than for any previous mission. The results of several key trades, innovative trajectory types and design processes, and remaining issues are presented.
NASA Technical Reports Server (NTRS)
Radovcich, N. A.; Dreim, D.; Okeefe, D. A.; Linner, L.; Pathak, S. K.; Reaser, J. S.; Richardson, D.; Sweers, J.; Conner, F.
1985-01-01
Work performed in the design of a transport aircraft wing for maximum fuel efficiency is documented with emphasis on design criteria, design methodology, and three design configurations. The design database includes complete finite element model description, sizing data, geometry data, loads data, and inertial data. A design process which satisfies the economics and practical aspects of a real design is illustrated. The cooperative study relationship between the contractor and NASA during the course of the contract is also discussed.
Dynamic interactions of atmospheric and hydrological processes result in large spatiotemporal changes of precipitation and wind speed in coastal storm events under both current and future climates. This variability can impact the design and sustainability of water infrastructure ...
The Mark III Hypercube-Ensemble Computers
NASA Technical Reports Server (NTRS)
Peterson, John C.; Tuazon, Jesus O.; Lieberman, Don; Pniel, Moshe
1988-01-01
Mark III Hypercube concept applied in development of series of increasingly powerful computers. Processor of each node of Mark III Hypercube ensemble is specialized computer containing three subprocessors and shared main memory. Solves problem quickly by simultaneously processing part of problem at each such node and passing combined results to host computer. Disciplines benefitting from speed and memory capacity include astrophysics, geophysics, chemistry, weather, high-energy physics, applied mechanics, image processing, oil exploration, aircraft design, and microcircuit design.
Nonlinear Shaping Architecture Designed with Using Evolutionary Structural Optimization Tools
NASA Astrophysics Data System (ADS)
Januszkiewicz, Krystyna; Banachowicz, Marta
2017-10-01
The paper explores the possibilities of using Evolutionary Structural Optimization (ESO) digital tools in integrated structural and architectural design in response to current needs geared towards sustainability, combining ecological and economic efficiency. The first part of the paper defines the Evolutionary Structural Optimization tools, which were developed specifically for engineering purposes using finite element analysis as a framework. The development of ESO has led to several incarnations, which are all briefly discussed (Additive ESO, Bi-directional ESO, Extended ESO). The second part presents results of using these tools in structural and architectural design. Actual building projects which involve optimization as a part of the original design process will be presented (the Crematorium in Kakamigahara, Gifu, Japan, 2006; SANAA's Learning Centre, EPFL in Lausanne, Switzerland, 2008; among others). The conclusion emphasizes that structural engineering and architectural design mean directing attention to the solutions which are used by Nature, designing works that are optimally shaped and form their own environments. Architectural forms never constitute the optimum shape derived through a form-finding process driven only by structural optimization, but rather embody and integrate a multitude of parameters. It might be assumed that there is a similarity between these processes in nature and the presented design methods. Contemporary digital methods make the simulation of such processes possible, and thus enable us to refer back to the empirical methods of previous generations.
Decoupling Coupled Constraints Through Utility Design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, N; Marden, JR
2014-08-01
Several multiagent systems exemplify the need for establishing distributed control laws that ensure the resulting agents' collective behavior satisfies a given coupled constraint. This technical note focuses on the design of such control laws through a game-theoretic framework. In particular, this technical note provides two systematic methodologies for the design of local agent objective functions that guarantee all resulting Nash equilibria optimize the system level objective while also satisfying a given coupled constraint. Furthermore, the designed local agent objective functions fit into the framework of state based potential games. Consequently, one can appeal to existing results in game-theoretic learning to derive a distributed process that guarantees the agents will reach such an equilibrium.
On Intelligent Design and Planning Method of Process Route Based on Gun Breech Machining Process
NASA Astrophysics Data System (ADS)
Hongzhi, Zhao; Jian, Zhang
2018-03-01
The paper presents an approach to the intelligent design and planning of process routes based on the gun breech machining process, addressing several problems such as the complexity of gun breech machining, tedious route design, and the long, hard-to-manage cycle of traditional process route development. Based on the gun breech machining process, an intelligent process route design and planning system was developed using DEST and VC++. The system includes two functional modules: intelligent process route design and process route planning. The intelligent design module, through analysis of the gun breech machining process, summarizes breech process knowledge to build the knowledge base and inference engine, and then outputs the gun breech process route automatically. On the basis of the intelligent design module, the final process route is created, edited, and managed in the process route planning module.
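The knowledge-base-plus-inference-engine idea can be illustrated with a minimal forward-chaining sketch: rules inspect part features and, when their conditions hold, contribute an operation to the route. The part features and machining rules below are hypothetical placeholders, not the actual gun breech knowledge base built with DEST and VC++.

# Minimal forward-chaining sketch of rule-based process route generation.
def plan_route(features, rules):
    route = []
    for condition, operation in rules:
        if condition(features):
            route.append(operation)   # rule fires: add the operation to the route
    return route

rules = [
    (lambda f: f["stock"] == "forging",       "rough milling"),
    (lambda f: f["has_bore"],                 "deep-hole drilling"),
    (lambda f: f["bore_tolerance_mm"] < 0.05, "honing"),
    (lambda f: f["needs_heat_treatment"],     "quench and temper"),
]

part = {"stock": "forging", "has_bore": True,
        "bore_tolerance_mm": 0.02, "needs_heat_treatment": True}
print(plan_route(part, rules))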
Computer assisted blast design and assessment tools
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cameron, A.R.; Kleine, T.H.; Forsyth, W.W.
1995-12-31
In general, the software required by a blast designer includes tools that graphically present blast designs (surface and underground), can analyze a design or predict its result, and can assess blasting results. As computers develop and computer literacy continues to rise, the development and use of such tools will spread. Examples of the tools that are becoming available include: automatic blast pattern generation and underground ring design; blast design evaluation in terms of explosive distribution and detonation simulation; fragmentation prediction; blast vibration prediction and minimization; blast monitoring for assessment of dynamic performance; vibration measurement, display, and signal processing; evaluation of blast results in terms of fragmentation; and risk- and reliability-based blast assessment. The authors have identified a set of criteria that are essential in choosing appropriate blasting software tools.
Design Of Computer Based Test Using The Unified Modeling Language
NASA Astrophysics Data System (ADS)
Tedyyana, Agus; Danuri; Lidyawati
2017-12-01
Admission to Politeknik Negeri Bengkalis through the interest and talent search (PMDK), the joint admission test for state polytechnics (SB-UMPN), and the independent selection (UM-Polbeng) was conducted using a paper-based test (PBT). The paper-based test model has several weaknesses: it wastes paper, the questions can leak to the public, and the test results can be manipulated. This research aimed to create a computer-based test (CBT) model using the Unified Modeling Language (UML), consisting of use case diagrams, activity diagrams, and sequence diagrams. During the design of the application, particular attention was paid to protecting the test questions with a password before they are displayed, through an encryption and decryption process based on the RSA cryptography algorithm. The questions drawn from the question bank are then randomized using the Fisher-Yates shuffle method. The network architecture used in the computer-based test application is a client-server model over a Local Area Network (LAN). The result of the design is a computer-based test application for admission selection at Politeknik Negeri Bengkalis.
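The abstract names the Fisher-Yates shuffle for question randomization; a minimal sketch of that step follows. The question list is a placeholder, and the RSA handling of the question password is a separate concern not shown here.

# Minimal sketch of a Fisher-Yates shuffle over a question bank.
import random

def fisher_yates_shuffle(items, rng=random):
    items = list(items)                 # shuffle a copy, leave the bank untouched
    for i in range(len(items) - 1, 0, -1):
        j = rng.randint(0, i)           # pick a position from the not-yet-fixed prefix
        items[i], items[j] = items[j], items[i]
    return items

bank = [f"question-{n}" for n in range(1, 11)]
print(fisher_yates_shuffle(bank))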
Designing Indonesian Teacher Engagement Index (ITEI) applications based on Android
NASA Astrophysics Data System (ADS)
Manalu, S. R.; Sasmoko; Permai, S. D.; Widhoyoko, S. A.; Indrianti, Y.
2018-03-01
Teachers with a good level of engagement are able to produce students who are engaged and excel. The national level of teacher engagement needs to serve as a reference for educational success and the equity of national education. The geographic dispersion of Indonesian teachers in hard-to-reach areas is a barrier to these measurements. The ITEI Android application was developed with this geographical problem in mind, so that every teacher can participate wherever they are. The ITEI app is designed with Android on the client side and a load balancer on the server side. The Android ITEI client presents questionnaire items to teachers, while the load balancer distributes the answers across servers for processing, ensuring fast data processing and minimizing server failure. The results of the processing are sent back to the Android client in the form of a personal ITEI profile for each teacher, while the data stored on the servers can be used to measure the national level of teacher engagement. The result of this research is a design of the ITEI application that is ready to be implemented to support data collection on national teacher engagement.
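The client/balancer/server architecture can be illustrated schematically: the client submits questionnaire answers and the balancer hands each submission to the next server in rotation. The round-robin policy and the in-memory "servers" below are assumptions for illustration only; the paper does not specify the balancing algorithm.

# Schematic sketch of round-robin dispatch of questionnaire submissions.
from itertools import cycle

class RoundRobinBalancer:
    def __init__(self, servers):
        self._servers = cycle(servers)   # endless rotation over the server pool

    def dispatch(self, submission):
        server = next(self._servers)
        return server, submission        # in a real system: forward over the network

balancer = RoundRobinBalancer(["server-a", "server-b", "server-c"])
for answer_set in ({"teacher": 1}, {"teacher": 2}, {"teacher": 3}, {"teacher": 4}):
    print(balancer.dispatch(answer_set))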
Collaborating with Youth to Inform and Develop Tools for Psychotropic Decision Making
Murphy, Andrea; Gardner, David; Kutcher, Stan; Davidson, Simon; Manion, Ian
2010-01-01
Introduction: Youth-oriented and informed resources designed to support psychopharmacotherapeutic decision-making are essentially unavailable. This article outlines the approach taken to design such resources, the product that resulted from the approach, and the lessons learned from the process. Methods: A project team with psychopharmacology expertise was assembled. The project team reviewed best practices regarding medication educational materials and related tools to support decisions. Collaboration with key stakeholders who were thought of as primary end-users and target groups occurred. A graphic designer and a plain language consultant were also retained. Results: Through an iterative and collaborative process over approximately 6 months, Med Ed and Med Ed Passport were developed. Literature and input from key stakeholders, in particular youth, were instrumental to the development of the tools and materials within Med Ed. A training program utilizing a train-the-trainer model was developed to facilitate the implementation of Med Ed in Ontario, which is currently ongoing. Conclusion: An evidence-informed process that includes youth and key stakeholder engagement is required for developing tools to support psychopharmacotherapeutic decision-making. The development process fostered an environment of reciprocity between the project team and key stakeholders. PMID:21037916
Tuning algorithms for fractional order internal model controllers for time delay processes
NASA Astrophysics Data System (ADS)
Muresan, Cristina I.; Dutta, Abhishek; Dulf, Eva H.; Pinar, Zehra; Maxim, Anca; Ionescu, Clara M.
2016-03-01
This paper presents two tuning algorithms for fractional-order internal model control (IMC) controllers for time delay processes. The two tuning algorithms are based on two specific closed-loop control configurations: the IMC control structure and the Smith predictor structure. In the latter, the equivalency between IMC and Smith predictor control structures is used to tune a fractional-order IMC controller as the primary controller of the Smith predictor structure. Fractional-order IMC controllers are designed in both cases in order to enhance the closed-loop performance and robustness of classical integer order IMC controllers. The tuning procedures are exemplified for both single-input-single-output and multivariable processes, described by first-order and second-order transfer functions with time delays. Different numerical examples are provided, including a general multivariable time delay process. Integer order IMC controllers are designed in each case, as well as fractional-order IMC controllers. The simulation results show that the proposed fractional-order IMC controller ensures an increased robustness to modelling uncertainties. Experimental results are also provided, for the design of a multivariable fractional-order IMC controller in a Smith predictor structure for a quadruple-tank system.
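As background for the IMC structure the paper generalizes, the classical integer-order IMC-PI tuning for a first-order-plus-dead-time process G(s) = K e^(-theta*s)/(tau*s + 1) reduces, with an IMC filter 1/(lambda*s + 1) and a first-order treatment of the delay, to Kc = tau / (K*(lambda + theta)) and Ti = tau. The sketch below applies that textbook rule to an assumed example process; it does not reproduce the paper's fractional-order tuning algorithms.

# Worked sketch of classical integer-order IMC-PI tuning for a FOPDT process.
def imc_pi_tuning(K, tau, theta, lam):
    Kc = tau / (K * (lam + theta))   # proportional gain
    Ti = tau                         # integral time
    return Kc, Ti

# Illustrative FOPDT process: K = 2, tau = 10 s, theta = 3 s, filter lambda = 3 s.
Kc, Ti = imc_pi_tuning(K=2.0, tau=10.0, theta=3.0, lam=3.0)
print(f"Kc = {Kc:.3f}, Ti = {Ti:.1f} s")

The filter time constant lambda is the single tuning knob: smaller values give faster closed-loop response, larger values give more robustness to model error.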
Metal powder production by gas atomization
NASA Technical Reports Server (NTRS)
Ting, E. Y.; Grant, N. J.
1986-01-01
The confined liquid, gas-atomization process was investigated. Results from a two-dimensional water model showed the importance of atomization pressure, as well as delivery tube and atomizer design. The atomization process at the tip of the delivery tube was photographed. Results from the atomization of a modified 7075 aluminum alloy yielded up to 60 wt pct. powders that were finer than 45 microns in diameter. Two different atomizer designs were evaluated. The amount of fine powders produced was correlated to a calculated gas-power term. An optimal gas-power value existed for maximized fine powder production. Atomization at gas-power greater than or less than this optimal value produced coarser powders.
Optimization of extraction of chitin from Procambarus clarkii shell by Box-Behnken design
NASA Astrophysics Data System (ADS)
Dong, Fang; Qiu, Hailong; Jia, Shaoqian; Dai, Cuiping; Kong, Qingxin; Xu, Changliang
2018-06-01
This paper investigated the optimization of the extraction process for chitin from Procambarus clarkii shell using a Box-Behnken design. First, four independent variables were explored in single-factor experiments: concentration of hydrochloric acid, soaking time, concentration of sodium hydroxide, and reaction time. Then, based on the results of these experiments, a four-factor, three-level experiment was planned by Box-Behnken design. From the experimental results, a second-order polynomial equation was obtained using multiple regression analysis, and the model's optimum extraction conditions for chitin were determined: HCl concentration 1.54 mol/L, soaking time 19.87 h, NaOH concentration 2.9 mol/L, and reaction time 3.54 h. To verify the accuracy of the model, a confirmation experiment was carried out under the following conditions: hydrochloric acid concentration 1.5 mol/L, soaking time 20 h, sodium hydroxide concentration 3 mol/L, and reaction time 3.5 h. The actual chitin yield reached 18.76%, very close to the yield predicted by the model (18.66%). The results indicate that the optimized extraction process for chitin is feasible and practical.
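The regression step in the abstract, fitting a second-order polynomial to designed-experiment results, can be sketched with an ordinary least-squares fit over linear, quadratic, and interaction terms. The factor settings and yields below are synthetic placeholders, not the paper's experimental data.

# Minimal sketch of fitting a quadratic response surface by multiple regression.
import numpy as np
from itertools import combinations

def quadratic_design_matrix(X):
    # Columns: intercept, linear terms, squared terms, two-factor interactions.
    cols = [np.ones(len(X))]
    cols += [X[:, i] for i in range(X.shape[1])]
    cols += [X[:, i] ** 2 for i in range(X.shape[1])]
    cols += [X[:, i] * X[:, j] for i, j in combinations(range(X.shape[1]), 2)]
    return np.column_stack(cols)

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(29, 4))           # 29 coded runs, 4 factors (HCl, soak, NaOH, time)
y = 18 - (X ** 2).sum(axis=1) + 0.1 * rng.standard_normal(29)   # synthetic yield response
coeffs, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)
print(coeffs.round(3))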
Silicon production process evaluations
NASA Technical Reports Server (NTRS)
1982-01-01
Engineering design of the third distillation column in the process was accomplished. The initial design is based on a 94.35% recovery of dichlorosilane in the distillate and a 99.9% recovery of trichlorosilane in the bottoms. The specified separation is achieved at a reflux ratio of 15 with 20 trays (equilibrium stages). Additional specifications and results are reported, including equipment size, temperatures, and pressures. Specific raw material requirements necessary to produce the silicon in the process are presented. The primary raw materials include metallurgical-grade silicon, silicon tetrachloride, hydrogen, copper (catalyst), and lime (waste treatment). Hydrogen chloride is produced as a by-product of the silicon deposition. Cost analysis of the process was initiated during this reporting period.
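The stated recoveries imply a simple component balance around the column: 94.35% of the dichlorosilane fed leaves in the distillate and 99.9% of the trichlorosilane fed leaves in the bottoms. The sketch below applies that split to assumed feed rates; the feed values are illustrative placeholders, not figures from the design report.

# Back-of-the-envelope component balance for the specified recoveries.
def column_split(feed_kmol_h, dcs_to_distillate=0.9435, tcs_to_bottoms=0.999):
    dcs, tcs = feed_kmol_h["DCS"], feed_kmol_h["TCS"]
    distillate = {"DCS": dcs * dcs_to_distillate, "TCS": tcs * (1 - tcs_to_bottoms)}
    bottoms    = {"DCS": dcs * (1 - dcs_to_distillate), "TCS": tcs * tcs_to_bottoms}
    return distillate, bottoms

# Assumed feed: 10 kmol/h dichlorosilane, 90 kmol/h trichlorosilane.
distillate, bottoms = column_split({"DCS": 10.0, "TCS": 90.0})
print("distillate:", distillate)
print("bottoms:   ", bottoms)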
Integrating Human Factors into Crew Exploration Vehicle (CEV) Design
NASA Technical Reports Server (NTRS)
Whitmore, Mihriban; Holden, Kritina; Baggerman, Susan; Campbell, Paul
2007-01-01
The purpose of this design process is to apply Human Engineering (HE) requirements and guidelines to hardware/software and to provide HE design, analysis and evaluation of crew interfaces. The topics include: 1) Background/Purpose; 2) HE Activities; 3) CASE STUDY: Net Habitable Volume (NHV) Study; 4) CASE STUDY: Human Modeling Approach; 5) CASE STUDY: Human Modeling Results; 6) CASE STUDY: Human Modeling Conclusions; 7) CASE STUDY: Human-in-the-Loop Evaluation Approach; 8) CASE STUDY: Unsuited Evaluation Results; 9) CASE STUDY: Suited Evaluation Results; 10) CASE STUDY: Human-in-the-Loop Evaluation Conclusions; 11) Near-Term Plan; and 12) In Conclusion
The vectorization of a ray tracing program for image generation
NASA Technical Reports Server (NTRS)
Plunkett, D. J.; Cychosz, J. M.; Bailey, M. J.
1984-01-01
Ray tracing is a widely used method for producing realistic computer-generated images. Ray tracing involves firing an imaginary ray from a view point, through a point on an image plane, into a three-dimensional scene. The intersections of the ray with the objects in the scene determine what is visible at that point on the image plane. This process must be repeated many times, once for each point (commonly called a pixel) in the image plane. A typical image contains more than a million pixels, making this process computationally expensive. A traditional ray tracing program processes one ray at a time; in such a serial approach, as much as ninety percent of the execution time is spent computing the intersection of a ray with the surfaces in the scene. With the CYBER 205, many rays can be intersected with all the bodies in the scene with a single series of vector operations. Vectorization of this intersection process results in large decreases in computation time. The CADLAB's interest in ray tracing stems from the need to produce realistic images of mechanical parts. A high-quality image of a part during the design process can increase the productivity of the designer by helping visualize the results of the work. To be useful in the design process, these images must be produced in a reasonable amount of time. This discussion explains how the ray tracing process was vectorized and gives examples of the images obtained.
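The vectorization idea, replacing a per-body intersection loop with array operations, can be sketched with a ray-sphere test evaluated against every sphere at once. Spheres stand in for the general surfaces of the original code, and numpy array arithmetic stands in for the CYBER 205 vector hardware; this is an illustrative sketch, not the program described in the report.

# Sketch: intersect one ray with many spheres using vectorized operations.
import numpy as np

def ray_sphere_hits(origin, direction, centers, radii):
    d = direction / np.linalg.norm(direction)
    oc = origin - centers                       # (n, 3) vectors from each center to the origin
    b = oc @ d                                  # (n,) projections onto the ray direction
    disc = b ** 2 - (np.einsum("ij,ij->i", oc, oc) - radii ** 2)
    t = np.where(disc >= 0, -b - np.sqrt(np.maximum(disc, 0.0)), np.inf)
    return np.where(t > 0, t, np.inf)           # distance to each hit, inf where the ray misses

rng = np.random.default_rng(2)
centers = rng.uniform(-5, 5, size=(1000, 3))
radii = rng.uniform(0.1, 0.5, size=1000)
t = ray_sphere_hits(np.zeros(3), np.array([0.0, 0.0, 1.0]), centers, radii)
print("nearest hit index:", int(np.argmin(t)), "at t =", float(t.min()))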